Post by artaweb » Mon Apr 27, 2015 12:53 am

I have a strange issue with my OpenCart store.

I have about 820 products in the system, and when I browse the product list under Admin > Catalog > Products there are 42 pages.

All of the pages open except page 36, which gives the error below:

Code:

Fatal error: Allowed memory size of 67108864 bytes exhausted (tried to allocate 40320 bytes) in /home/domain.com/public_html/system/library/image.php on line 34
I have no clue why I get this error only on page 36 and not on any other page.

I'm using OpenCart version 1.5.6.4.

I would be grateful if someone can help me to solve this issue.

Thanks


Post by rph » Mon Apr 27, 2015 3:36 am

You have an image that's so large it uses up all the memory when OpenCart tries to resize it. Try to figure out which one it is and replace it with a smaller version.
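
If it isn't obvious which product on page 36 is the culprit, a quick scan of the image directory can point at it. A minimal sketch, assuming a 1.5.x layout where product images live under image/data, and using the rough rule that GD needs about width x height x channels bytes (plus overhead) to decode an image:

Code:

<?php
// find_big_images.php - run from the OpenCart root on the command line.
// Flags images whose decoded size would approach a 64 MB PHP limit.
// The image/data path and the 1.7 overhead factor are assumptions.
$it = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator('image/data')
);

foreach ($it as $file) {
    if (!preg_match('/\.(jpe?g|png|gif)$/i', $file->getFilename())) {
        continue;
    }

    $info = @getimagesize($file->getPathname());

    if (!$info) {
        continue;
    }

    // GD decodes to roughly width * height * channels bytes in RAM.
    $channels = isset($info['channels']) ? $info['channels'] : 4;
    $bytes    = $info[0] * $info[1] * $channels * 1.7;

    if ($bytes > 48 * 1024 * 1024) {
        printf("%s: %dx%d (~%d MB decoded)\n",
            $file->getPathname(), $info[0], $info[1], $bytes / 1048576);
    }
}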

-Ryan



Post by Dhaupin » Tue Apr 28, 2015 1:00 am

It's saying you need roughly 0.04 MB (40,320 bytes) more to process that image. As a suggestion, 64 MB (the 67,108,864 bytes in the error) is way too small to run many database-driven e-commerce apps. OC is pretty lean and idles at very low RAM, but you should still raise its cap to somewhere between 96 MB and 128 MB to absorb spikes like the one you're hitting. With a large store (thousands of items), we actually don't recommend anything under 256 MB. It's rare to eat all 256 MB unless bots are slamming your sitemaps, but it's better to have the headroom.

PS about that sitemap: test your sitemap too, or try generating two at once. It might trigger this same error at the current limit, and if it doesn't already, it probably will as you add more products/categories/etc.

Here is an article describing the ways you can raise the memory limit: http://tutorials.hostucan.net/how-to-in ... ory-limit/
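
In short, the usual options look like this (a sketch; the 128M figure is an arbitrary example, and which method works depends on your host):

Code:

; php.ini - server- or pool-wide cap
memory_limit = 128M

# .htaccess - Apache with mod_php only
php_value memory_limit 128M

// or per script, e.g. near the top of OpenCart's index.php
ini_set('memory_limit', '128M');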

https://creadev.org | support@creadev.org - Opencart Extensions, Integrations, & Development. Made in the USA.



Post by PeteA » Tue Apr 28, 2015 3:28 pm

Dhaupin wrote: It's saying you need roughly 0.04 MB (40,320 bytes) more to process that image. As a suggestion, 64 MB is way too small to run many database-driven e-commerce apps... With a large store (thousands of items), we actually don't recommend anything under 256 MB.
Sorry to disagree with this, but that isn't quite what's happening. PHP limits the amount of memory available to a single script, in this case 64 MB per running script (the 67,108,864 bytes in the error message is 64 MB, not 67, because you divide by 1024 twice). This lets the server run multiple scripts without any one of them consuming all of the available RAM: a 2 GB server can, for example, run 32 scripts each maxed out at 64 MB at the same point in time. This rarely happens in practice, because a script seldom uses anywhere near the limit. As an example, my product page uses ~9 MB to generate, a category page ~8.4 MB, and my sitemap ~7.5 MB.

Where the complication comes in is that image processing is very memory- and processor-intensive, and PHP's GD2 routines are no exception. This means it's not uncommon, as rph points out, for a large image (1 MB+ on disk) to consume 64 MB without breaking a sweat. There are three solutions:
* You can simply raise the PHP limit above 64 MB. This is the quickest fix, but also the one most likely to hurt your site, because you are accepting that ANY script may grab a huge 256 MB slice of RAM (your theoretical cap of 32 concurrent scripts just dropped to 8).
* You can modify the image code to raise the memory limit only while resizing images (see the sketch after this list). This requires little work beyond the code changes, but it isn't a perfect solution, since large images still take longer to generate for the viewer.
* You can reduce the size of the original image (as per rph's post; it's one of the products on page 36 that is causing the problem). This is actually the safest and best-performing option, but it requires the most work from whoever manages the site's images.
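
For the second option, a minimal sketch against system/library/image.php in 1.5.x; the 256M value is an arbitrary assumption, and ini_set() can be ignored on locked-down hosts:

Code:

// system/library/image.php - top of the Image class constructor (1.5.x).
// Raise the per-script cap only for requests that actually do GD work,
// rather than raising it globally for every script on the server.
public function __construct($file) {
    if (function_exists('ini_set')) {
        ini_set('memory_limit', '256M'); // value is an assumption
    }

    // ... existing constructor body continues unchanged ...
}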

New member

Posts

Joined
Wed Jul 30, 2014 5:46 pm

Post by IP_CAM » Tue Apr 28, 2015 7:49 pm

> This is actually the safest and best-performing option, but it requires the most work from whoever manages the site's images. <

That's the point, but try telling that to the fellows using images of 1 MB and more each. In my experience, most just don't know how anything works. For them, OC is just another piece of software: it gets downloaded and used as it comes. And with images it's the same: the bigger and better they look, the better they must be.

If OC did not 'shrink' them by default, and so at least automatically reduce their bulk as well, I assume most OC-driven sites would probably belong to the Slow Food Generation Shops.

The good thing is that competition exists. As a consequence, the BEST will end up being best. And best is whoever knows best how to do things best! That's free enterprise! ;)

Ernie
bigmax.ch/shop/

My Github OC Site: https://github.com/IP-CAM
5'600 + FREE OC Extensions, on the World's largest private Github OC Repository Archive Site.



Post by Dhaupin » Tue Apr 28, 2015 10:23 pm

PeteA wrote: This rarely happens in practice, because a script seldom uses anywhere near the limit. As an example, my product page uses ~9 MB to generate, a category page ~8.4 MB, and my sitemap ~7.5 MB.
Yes, perhaps in a tiny store. But in the real world bots multi-hit, and they hit often:

@Store 04-26-2015 10:25:13 AM Store Name Removed Google Sitemap was generated via IP 66.249.65.12 [Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)] Memory: 89MB Time: 86.91 Sec
@Store 04-26-2015 10:25:50 AM Store Name Removed Google Sitemap was generated via IP 66.249.69.40 [Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)] Memory: 89MB Time: 85.88 Sec
@Store 04-26-2015 10:27:31 AM Store Name Removed Google Sitemap was generated via IP 66.249.75.72 [Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)] Memory: 88.75MB Time: 96.43 Sec

Also, by default in 1.5.x, the Google feed does image processing for every entry (which is insane, I know). You have to remove that image processing or you will see non-stop memory-exhausted errors. Even with it removed, its memory use is comparable to the sitemap's.
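
If you go that route, the change is small. A sketch against catalog/controller/feed/google_base.php in 1.5.x (the resize call and the 500x500 size are from memory, so treat them as assumptions):

Code:

// Before: every feed entry triggers a GD resize.
$image = $this->model_tool_image->resize($product['image'], 500, 500);

// After: link the original file and skip GD entirely.
// HTTP_IMAGE is the store's image base URL constant in 1.5.x config.php.
$image = HTTP_IMAGE . $product['image'];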

https://creadev.org | support@creadev.org - Opencart Extensions, Integrations, & Development. Made in the USA.



Post by PeteA » Wed Apr 29, 2015 4:28 pm

IP_CAM wrote: That's the point, but try telling that to the fellows using images of 1 MB and more each. In my experience, most just don't know how anything works... If OC did not 'shrink' them by default... I assume most OC-driven sites would probably belong to the Slow Food Generation Shops.
If you have no control over what people upload, then increasing the memory limit for image processing is probably the safest option. The alternative, as you say, is to modify the upload routine in the admin backend to automatically resize images above a specific threshold (e.g. larger than 1580 x 1580); a sketch of that follows below. That assumes product images are only uploaded through the backend. Not sure what you mean about Slow Food Generation Shops, though?
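
A minimal sketch of that idea, assuming the hook point is wherever your install finishes handling an upload (on 1.5.x roughly admin/controller/common/filemanager.php; the path, the data/ destination, and the 1580 threshold are all assumptions):

Code:

// Shrink oversized uploads in place using OpenCart's own GD wrapper.
$file = DIR_IMAGE . 'data/' . $filename; // assumed upload destination

list($width, $height) = getimagesize($file);

if ($width > 1580 || $height > 1580) {
    $scale = min(1580 / $width, 1580 / $height);

    $image = new Image($file);
    $image->resize((int)($width * $scale), (int)($height * $scale));
    $image->save($file);
}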

Dhaupin wrote: Yes, perhaps in a tiny store. But in the real world bots multi-hit, and they hit often... Also, by default in 1.5.x, the Google feed does image processing for every entry (which is insane, I know).
Which feed are you referring to? Neither information/sitemap nor feed/google_sitemap does any image resizing out of the box. Also, if the model/tool/image class is working correctly, it should only create a resized image if one doesn't already exist. My sitemap XML with a few thousand products only takes around 8 seconds to generate, and that includes a mod for blog posts.
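
For reference, the caching check in model/tool/image.php on 1.5.x looks roughly like this (paraphrased from memory, so treat the details as approximate): once the resized copy exists and is newer than the source, no GD work happens on later requests.

Code:

// model/tool/image.php - resize() only re-renders when the cached copy
// is missing or older than the source image.
if (!is_file(DIR_IMAGE . $new_image)
    || (filemtime(DIR_IMAGE . $old_image) > filemtime(DIR_IMAGE . $new_image))) {
    $image = new Image(DIR_IMAGE . $old_image);
    $image->resize($width, $height);
    $image->save(DIR_IMAGE . $new_image);
}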
