Jay, the extension is performing very well. Server load has been down since I installed it.
One question: the size of the cachefiles folder has been increasing very fast since I set the expiry time to 3.
Is there a way to have the cache files auto-deleted after some time?
Thanks
The expiring time will not delete them, merely replace them. The easiest way to reduce the size is to change

define('PAGE_COMPRESS', FALSE);

to

define('PAGE_COMPRESS', TRUE);

in your pagecache/caching.php file.
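For anyone who does want the files actually deleted rather than replaced: a scheduled cleanup job can prune stale cache files. A hedged sketch, assuming a POSIX shell and that your cache folder is somewhere like pagecache/cachefiles (the function name and paths here are my own, not part of the extension):

```shell
# prune_cache DIR MINUTES: delete regular cache files in DIR whose
# modification time is more than MINUTES minutes ago.
prune_cache() {
  find "$1" -type f -mmin +"$2" -delete
}

# Demo on a throwaway directory (point it at your real cachefiles
# folder in practice):
demo=$(mktemp -d)
touch -t 202001010000 "$demo/stale.html"   # pretend this was cached ages ago
touch "$demo/fresh.html"                   # just cached
prune_cache "$demo" 180
ls "$demo"                                 # only fresh.html survives
```

Run it from cron (e.g. hourly) and the folder never grows unbounded, whatever the expiry setting does.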
Hey Jay
Will your module help speed up my website www[dot]itcave.co.uk, which has this fancy AJAX attribute menu? It has its own cache, so I don't know if there will be some interference. The website might be OK right now, but when I import or update stock for 20-50k products in a cron job, the website gets really slow, even when the products are disabled. I tried to optimize the MySQL config but there was no improvement.
The website is running on a VPS, and CPU and memory usage are low. Before buying I need your advice. Please help!
Quality Electronic Cigarettes in UK
Shopecigarette
Hi Jay
Thanks for the quick response. I will probably wait for the guys from hostjars to help me with their import tool. To me it seems to be a problem with SQL communication or SEO names.
I had to disable AJAX in the options because there is a bug with the menu on category pages. I am waiting for a developer fix; have a look again and I can enable it. Do you think the W3 errors might have a big impact on speed performance? Could you check if there is something really bad, since this is HTML5?
http://validator.w3.org/check?uri=www.i ... ne&group=0
The menu error is easy to reproduce: go to Components and at the bottom of the page switch to page no. 2. You will see the page numbers have increased, and the fun begins ^^
I am also getting x24 in the select option.
So instead of trading I am struggling with bugs and performance issues.
Hi Jay, I've been using Page Cache for the better part of a month now and it's making a big difference. However, I am just a one-man show: I try to add a half dozen products to my site daily as time allows, hit Clear Cache, and then my site goes right back to being slow again.
There is a previous post asking if the caching could be automated, but a definite answer was never found. I have been playing around with the idea for over a week and I'm still coming up empty. I've tried sitemap crawlers, something called Link Sleuth, but none of them seem to actually re-cache the pages. When I view the source it still shows up without cache data. I then spend time manually opening my 10 most obvious pages so they get cached, but it's not very productive to do this every day.
Would you consider putting some kind of "re-build cache" function on the roadmap for future versions? And in the meantime, if it's not too much to ask, could you look into this topic again and maybe advise if you think there may be a quick solution? I'm sure I am not the only one that deletes the cache often.
Thanks for your time as always...
~
Install Extensions OR OpenCart Fast Service! PayPal Accepted
I will professionally install and configure any free or purchased theme, module or extension.
Visit http://www.mrtech.ca if you need an OpenCart webmaster
~
Hey MrTech
Thanks for the post. Incorporating a rebuilder would essentially mean having to build a site crawler of my own, which is a massive task. There is software available to do this for you, such as sitemap-building software. This is something I've just found (but not tried):
http://gsitecrawler.com/
I'm sure there are hundreds more out there that do the same thing.
Please note that this is going to put a strain on both your server and likely your system. It's also worth noting that running such software against a shared host could get your account suspended for essentially bombarding it with requests.
The solution offered to people for this was the single-page cache-delete toolbar, which lets you delete a product page's cache when you view it while logged in as admin. I would advise simply going to the product page, deleting the cache file, and then doing the same for any category/categories it's in.
Kind regards
Jay
Not sure if this will help, but some of the Joomla SEF extensions rebuild all the URLs once a page is called after a full cache flush. Here's where you can download the free version of Artio SEF; I think this rebuilds everything automagically:
http://www.artio.net/downloads/joomla/joomsef
Does this help ?
Here is the Joomla SEF extension list:
http://extensions.joomla.org/extensions ... gement/sef
Hi Scanreg
Thanks for the links. To be honest, something like that is only going to go so far with re-indexing. With OC you can, in theory, have hundreds of links going to the same page(s), so it's pretty much impossible to cache all of them. A prime example of this is a product page; the links you could have could be

/product-name
/category-a/product-name
/category-a/category-b/product-name
/category-a/category-b/category-c/product-name
/category-a/category-b/category-c/category-d/product-name
/manufacturer-name/product-name

which are all cached as individual pages, since they can all have slight variances in content (the breadcrumb is what I'm mainly thinking of here). This mod also caches search queries, which are limitless in what could potentially be saved to the cache.
Hello Jay
I have run into kind of a big problem. I received an email from my hosting company saying I am using too many resources and it's affecting other customers.
Before I got the email I was installing an extension for GA analytics...
The funny thing is I have no traffic on the site because I haven't launched the store yet.
They said something about too many MySQL queries... and they told me to optimize SQL... which I have absolutely no clue how to do. I have no idea what to do.
Will your mod help with this issue, or is it recommended to move the site to a dedicated server to avoid future hassles like these?
Thank you!
I took a look in phpMyAdmin at the status of the database.
Getting a few red flags:

Handler_read_rnd_next (169): the number of requests to read the next row in the data file. This is high if you are doing a lot of table scans; generally it suggests that your tables are not properly indexed or that your queries are not written to take advantage of the indexes you have.

Qcache_lowmem_prunes (25 M): the number of queries that have been removed from the cache to free up memory for caching new queries. This can help you tune the query cache size; the query cache uses a least-recently-used (LRU) strategy to decide which queries to remove.

Slow_launch_threads (2): the number of threads that have taken more than slow_launch_time seconds to create.

Key_reads (103 M): the number of physical reads of a key block from disk. If Key_reads is big, your key_buffer_size value is probably too small. The cache miss rate can be calculated as Key_reads / Key_read_requests.

Table_locks_waited (170 k): the number of times that a table lock could not be acquired immediately and a wait was needed. If this is high and you have performance problems, first optimize your queries, then either split your tables or use replication.
Can these be fixed with your mod ?
Thanks!
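On the Key_reads point: the raw number means little on its own; what matters is the miss rate Key_reads / Key_read_requests. A quick sanity check from the command line, where both counters below are illustrative placeholders (plug in your own values from SHOW STATUS):

```shell
# Key cache miss rate = Key_reads / Key_read_requests.
# 103 M physical reads is only a problem relative to total requests;
# the request count here is a made-up example figure.
key_reads=103000000
key_read_requests=20000000000

awk -v r="$key_reads" -v q="$key_read_requests" \
  'BEGIN { printf "key cache miss rate: %.4f%%\n", 100 * r / q }'
```

A commonly quoted rule of thumb is that a miss rate well under 1% is nothing to worry about, while a high one suggests raising key_buffer_size (if your host allows it).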
The mod doesn't actually change any PHP or MySQL settings whatsoever; it simply caches pages, reducing queries by skipping all DB querying altogether for cached pages. Those settings are likely only editable by your host, something they'll have locked down to stop sites over-using resources.
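The skip-the-database idea can be pictured as an early-exit check before the application ever runs. A rough sketch of the logic only (the real extension does this inside PHP; the function and parameter names here are mine):

```shell
# serve_cached CACHE_FILE MAX_AGE_MINUTES
# If a sufficiently fresh cached copy of the page exists, output it and
# return 0 (hit) -- PHP and MySQL are never touched. Otherwise return 1
# (miss) and the caller falls through to rendering the page normally.
serve_cached() {
  if [ -n "$(find "$1" -mmin -"$2" 2>/dev/null)" ]; then
    cat "$1"       # cache hit: stored HTML goes straight out
    return 0
  fi
  return 1         # cache miss: regenerate, then store a new cache file
}
```

That is why the status counters above improve: every hit is one full page request that issues zero queries.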
Still loving the extension but...
Is there a way to build a cache from a predefined list of URLs? Say you write down the 50 most visited URLs in your store; every time you clear the cache you'd have the option of re-caching those 50 pages by feeding the list to the function that does the caching, or something like that, so you get an instant boost to your page speed.
That would simplify things IMMENSELY for those who are updating every day.
I still don't understand why link crawlers don't work. They should be generating cache files, but I must be missing something.
Anyone got any ideas?
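For what it's worth, the predefined-list idea can be approximated today without touching the extension: since the cache file is created as a side effect of a normal page view, simply requesting each URL regenerates it. A hedged sketch (warm_cache and urls.txt are my own names, not part of the mod):

```shell
# warm_cache FILE: request each URL listed in FILE (one per line) so the
# page-cache extension regenerates those pages after a cache clear.
warm_cache() {
  while IFS= read -r url; do
    # -s silent, -o /dev/null: we only care about the side effect
    # (the cache file being written on the server).
    curl -s -o /dev/null "$url" && echo "warmed: $url"
    sleep 1        # be polite -- especially on shared hosting
  done < "$1"
}
```

Drop your 50 busiest URLs into urls.txt and run warm_cache urls.txt right after hitting Clear Cache (from cron, if you like). Note it warms exactly the URL variants you list, which matters given the breadcrumb-variant issue Jay described earlier.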
The only thing that worked for me was using ScrapBook (a Firefox extension) to download the whole webpage (except for images and scripts). It's FAR from a decent solution, but at least I don't have to click each link individually to get a quick speed boost after clearing the cache.