Post by JAY6390 » Tue Apr 24, 2012 9:24 pm

Ah, easily overlooked, I do it myself all the time :)



User avatar
Guru Member

Posts

Joined
Wed May 26, 2010 11:47 pm
Location - United Kingdom

Post by jfn99 » Fri Apr 27, 2012 1:58 am

Jay, the extension is performing very well. Server load has been down since I installed it.
One question: the size of the cachefiles folder has been growing very fast since I set the expiry time to 3.
Is there a way to have the cache files auto-deleted after some time?
Thanks


Post by JAY6390 » Fri Apr 27, 2012 2:03 am

The expiry time won't delete them, merely replace them once they go stale. The easiest way to reduce their size is to change

    define('PAGE_COMPRESS', FALSE);

to

    define('PAGE_COMPRESS', TRUE);

in your pagecache/caching.php file
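For a rough sense of why compression shrinks the folder, here's an illustrative sketch (Python, purely for demonstration; the mod itself is PHP and this is not its code): repetitive storefront HTML compresses extremely well with gzip.

```python
import gzip

# Build a chunk of repetitive HTML, the kind a product listing produces.
html = ("<html><body>"
        + "<div class='product'>Example product row</div>" * 200
        + "</body></html>").encode("utf-8")

compressed = gzip.compress(html)

# Repetitive markup typically compresses to a small fraction of its size,
# so gzipped cache files take far less disk space than raw HTML.
ratio = len(compressed) / len(html)
```

On markup like this the compressed copy is a small fraction of the original, which is why the cachefiles folder stops ballooning once compression is on.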


Post by czosnek24 » Tue May 01, 2012 4:33 am

Hey Jay
Will your module help to speed up my website www[dot]itcave.co.uk with this fancy Ajax attribute menu? It has its own cache, so I don't know if there will be some interference. The website might be OK right now, but when I import or update stock for 20-50k products in a cron job, the site gets really slow, even when the products are disabled. I tried to optimise the MySQL config but there was no improvement.
The website runs on a VPS, and CPU and memory usage are low. Before buying I need your advice. Please help!

Quality Electronic Cigarettes in UK
Shopecigarette



Post by JAY6390 » Tue May 01, 2012 5:48 am

Hi

I can't say for certain; however, it's worth noting that the page cache mod intentionally doesn't cache Ajax requests, so my guess would be no for speeding up the menu, sorry.

Kind regards
Jay


Post by czosnek24 » Tue May 01, 2012 8:11 am

Hi Jay
Thanks for the quick response. I will probably wait for the guys from hostjars to help me with their import tool. To me it seems to be a problem with the SQL communication or the SEO names. :(




Post by JAY6390 » Tue May 01, 2012 8:17 am

Ah, I see. I've just taken a look at the site, and it appears the menu isn't actually using Ajax at all. If that's the case, then the mod will be of benefit to your store. Not sure if that's intentional or not?


Post by czosnek24 » Tue May 01, 2012 8:48 am

I had to disable Ajax in the options because there is a bug with the menu category pages; I am waiting for a fix from the developer. Have a look again and I can enable it. Do you think the W3 errors might have a big impact on speed? Could you check if there is anything really bad, since this is HTML5?
http://validator.w3.org/check?uri=www.i ... ne&group=0
The menu error is easy to reproduce: go to Components, and at the bottom of the page change to page no. 2; you will see the page numbers have increased, and the fun begins ^^
I am also getting x24 in the select option.
So instead of trading I am struggling with bugs and performance issues :D

Quality Electronic Cigarettes in UK
Shopecigarette



Post by JAY6390 » Tue May 01, 2012 8:59 am

Ah, I see. Those errors shouldn't really affect it, though the developer of the mod should really be using data-dnd and data-ajaxurl as attributes rather than the ones currently in use, to reduce the W3 validation issues.
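To illustrate (hypothetical markup, not the mod's actual output): HTML5 treats any attribute named data-* as valid author-defined data, while made-up attribute names get flagged by the validator.

```html
<!-- Flagged by the W3 validator: "ajaxurl" is not a standard attribute -->
<ul class="menu" ajaxurl="index.php?route=module/menu"></ul>

<!-- Valid HTML5: any data-* attribute is allowed -->
<ul class="menu" data-ajaxurl="index.php?route=module/menu"></ul>
```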


Post by czosnek24 » Tue May 01, 2012 9:14 am

Thanks, I will pass the message on and see if there is another solution for the slow pages; if not, I will give your module a try. I am sure it will help.




Post by MrTech » Thu May 10, 2012 11:07 pm

Hi Jay, I've been using Page Cache for the better part of a month now and it's making a big difference. However, I am a one-man show, and I add a half dozen products to my site daily as time allows, hit clear cache, and then my site goes right back to being slow again.

There is a previous post asking if the caching could be automated, but a definite answer was never found. I have been playing with the idea for over a week and I'm still coming up empty. I've tried sitemap crawlers and something called Link Sleuth, but none of them seem to actually re-cache the pages; when I view the source, it still shows up without cache data. I then spend time manually opening my 10 most obvious pages so they get cached, but it's not very productive to do this every day.

Would you consider putting some kind of "rebuild cache" function on the roadmap for future versions? And in the meantime, if it's not too much to ask, could you look into this topic again and maybe advise whether you think there may be a quick solution? I'm sure I am not the only one who deletes the cache often.

Thanks for your time as always...

~
Install Extensions OR OpenCart Fast Service! PayPal Accepted
I will professionally install and configure any free or purchased theme, module or extension.

Visit http://www.mrtech.ca if you need an OpenCart webmaster
~



Post by JAY6390 » Thu May 10, 2012 11:17 pm

Hey MrTech

Thanks for posting. Incorporating a rebuilder would essentially mean building a site crawler of my own, which is a massive task. There is software available to do this for you, such as sitemap-building software. This is something I've just found (but not tried)
http://gsitecrawler.com/
I'm sure there are hundreds more out there that do the same thing.

Please note that this is going to put a strain on both your server and likely your own system. It's also worth noting that pointing such software at a shared host could get your account suspended for essentially bombarding it with requests.

The solution offered previously was the single-page cache delete toolbar, which lets you delete a page's cache file while viewing it logged in as an admin. I would advise simply going to the product page, deleting the cache file, and then doing the same for any category/categories it's in.

Kind regards
Jay


Post by scanreg » Fri May 11, 2012 12:52 am

Not sure if this will help, but some of the Joomla SEF extensions rebuild all the URLs once a page is requested after a full cache flush. Here's where you can download the free version of Artio SEF; I think this rebuilds everything automagically:

http://www.artio.net/downloads/joomla/joomsef

Does this help ?

Here is the Joomla SEF extension list:

http://extensions.joomla.org/extensions ... gement/sef


Post by JAY6390 » Fri May 11, 2012 1:09 am

Hi Scanreg

Thanks for the links. To be honest, something like that can only go so far with re-indexing. With OC you can in theory have hundreds of links going to the same page(s), so it's pretty much impossible to cache all of them. A prime example of this is a product page, where the links could be

    /product-name
    /category-a/product-name
    /category-a/category-b/product-name
    /category-a/category-b/category-c/product-name
    /category-a/category-b/category-c/category-d/product-name
    /manufacturer-name/product-name

which are all cached as individual pages, since they can each have slight variances in content (the breadcrumb is what I'm mainly thinking of here). This mod also caches search queries, which are limitless in what could potentially be saved to the cache.
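As a sketch of why each variant becomes its own file (illustrative Python, not the mod's actual PHP), a file-based page cache typically keys the cache file on the full request URI, so every route to the same product produces a distinct file:

```python
import hashlib

def cache_filename(uri: str) -> str:
    """Derive a cache file name from the full request URI (illustrative;
    hashing the URI is one common way file-based caches build their keys)."""
    return hashlib.md5(uri.encode("utf-8")).hexdigest() + ".html"

# Six routes to the same product each map to a separate cache file.
uris = [
    "/product-name",
    "/category-a/product-name",
    "/category-a/category-b/product-name",
    "/category-a/category-b/category-c/product-name",
    "/category-a/category-b/category-c/category-d/product-name",
    "/manufacturer-name/product-name",
]
filenames = {cache_filename(u) for u in uris}
```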


Post by stf » Tue May 15, 2012 12:24 am

Hello Jay

I have run into kind of a big problem. I received an email from my hosting company saying I am using too many resources and it's affecting other customers.

Before I got the email I was installing an extension for GA analytics...

The funny thing is, I have no traffic on the site because I haven't launched the store yet.

They said something about too many MySQL queries, and they told me to optimize the SQL, which I have absolutely no clue how to do...

Will your mod help with this issue, or is it recommended to move the site to a dedicated server to avoid future hassles like these?

Thank you!


Post by JAY6390 » Tue May 15, 2012 12:35 am

Hi stf. This will definitely reduce the number of connections to MySQL, yes. I can't guarantee it will stop the issue completely, as that depends on a number of factors, but it should reduce it significantly.


Post by stf » Tue May 15, 2012 1:46 am

I took a look at the status of the database in phpMyAdmin and I'm getting a few red flags:

Handler_read_rnd_next: 169. The number of requests to read the next row in the data file. This is high if you are doing a lot of table scans; generally it suggests your tables are not properly indexed or your queries are not written to take advantage of the indexes you have.

Qcache_lowmem_prunes: 25 M. The number of queries removed from the cache to free up memory for caching new queries. This can help you tune the query cache size; the query cache uses a least recently used (LRU) strategy to decide which queries to remove.

Slow_launch_threads: 2. The number of threads that have taken more than slow_launch_time seconds to create.

Key_reads: 103 M. The number of physical reads of a key block from disk. If Key_reads is big, your key_buffer_size value is probably too small. The cache miss rate can be calculated as Key_reads/Key_read_requests.

Table_locks_waited: 170 k. The number of times a table lock could not be acquired immediately and a wait was needed. If this is high and you have performance problems, first optimize your queries, then either split your table(s) or use replication.


Can these be fixed with your mod ?

Thanks!


Post by JAY6390 » Tue May 15, 2012 1:50 am

The mod doesn't change any PHP or MySQL settings whatsoever; it simply caches pages, reducing queries by skipping all DB querying altogether for cached pages. Changing those settings is likely something only your host can do, and they will have locked them down to limit over-use of resources.
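The serve-or-render flow described here can be sketched like this (Python for illustration only; the directory name and lifetime are made-up values, and this is not the mod's actual code):

```python
import os
import tempfile
import time

CACHE_DIR = os.path.join(tempfile.gettempdir(), "page_cache_demo")  # hypothetical location
CACHE_LIFETIME = 3 * 86400  # hypothetical 3-day expiry

def get_page(path, render):
    """Serve a fresh cached copy if one exists; otherwise call render()
    (the only step that runs DB queries) and overwrite the cache file."""
    os.makedirs(CACHE_DIR, exist_ok=True)
    name = path.strip("/").replace("/", "_") or "home"
    cache_file = os.path.join(CACHE_DIR, name + ".html")
    fresh = (os.path.exists(cache_file)
             and time.time() - os.path.getmtime(cache_file) < CACHE_LIFETIME)
    if fresh:
        with open(cache_file) as f:   # cache hit: no database work at all
            return f.read()
    html = render()                   # cache miss: full page build hits the DB
    with open(cache_file, "w") as f:  # stale files are overwritten, not deleted
        f.write(html)
    return html
```

A cache hit never touches MySQL, which is where the reduction in queries comes from; expired files are simply replaced on the next visit.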


Post by Ampeter » Thu May 17, 2012 8:53 pm

Still loving the extension, but...
Is there a way to build the cache from a predefined list of URLs? Say you write down the 50 most visited URLs in your store; every time you clear the cache, you would have the option of re-caching those 50 pages by feeding the list to the function that does the caching, so you get an instant boost to your page speed :D

That would simplify things IMMENSELY for those who update every day.
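That idea can be sketched as a small stand-alone warm-up script (Python for illustration; the URLs are placeholders and this isn't an existing feature of the mod): run it right after clearing the cache, and each fetched page gets re-cached by the normal request path.

```python
from urllib.request import urlopen

# Placeholder list: put your own busiest URLs here.
TOP_URLS = [
    "https://example.com/",
    "https://example.com/category-a",
    "https://example.com/best-selling-product",
]

def warm_cache(urls):
    """Fetch each URL once; a plain GET is enough for a page cache to
    store a fresh copy. Failures are reported rather than fatal."""
    for url in urls:
        try:
            urlopen(url, timeout=10).read()
            print("warmed:", url)
        except OSError as err:
            print("failed:", url, err)
```

Calling warm_cache(TOP_URLS) from a cron job straight after the cache clear would give the instant boost described above.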

MrTech wrote: [quoted in full earlier in the thread]
The only thing that worked for me was using ScrapBook (a Firefox extension) to download the whole site (except for images and scripts). It's FAR from a decent solution, but at least I don't have to click each link individually to get a quick speed boost after clearing the cache.

I still don't understand why link crawlers don't work. They should be generating cache files, but I must be missing something.

Anyone got any ideas?


Post by JAY6390 » Thu May 17, 2012 9:26 pm

Hi Ampeter

That is a good idea, and I may develop something to accommodate it; however, it is likely to be a separate add-on for a small fee for those who wish to use the feature. I'm not sure when I will be able to get around to it, though, as I have a lot on my plate at the minute.

Kind regards
Jay
