Post by kabatza1 » Thu Mar 06, 2014 12:45 pm

I have read a few discussions about large stores (MyISAM vs InnoDB) and always wondered how some websites out there can have millions of products, while some people have problems with 80k in OpenCart or one of the other free carts. Are those custom applications really so much better, or is it just the massive amount of hardware thrown at them that makes the difference?

Then I came across an article about the hardware they use, and it was not as massive as I expected. I was imagining at least a small datacenter... but that is not the case, and the whole setup probably fits in a single rack.
I also watched the Pirate Bay documentary, and my jaw dropped when I saw how small the hardware was.

So today I decided to do a few tests...

I thought, why stop at 80k? So I loaded 6,000,000 :choke: dummy products (from an articles database) into my test server hosted on a laptop. Yes, that is 6 million products, making the database over 5 GB in size.
Did it work well? Not really, so I decided to investigate!

Looking at catalog/model/catalog/product.php, I can see that OpenCart uses LIKE in the database queries.
At first I thought this is not the best option, and it does return poor results as well, so I decided to try RLIKE instead.
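To show what I mean by the difference between the two, here is a minimal sketch of the two query styles. I'm using SQLite here as a stand-in for MySQL just so it runs anywhere (SQLite has LIKE built in, and lets you register a REGEXP function to play the part of MySQL's RLIKE); the table and data are made up for the example:

```python
import re
import sqlite3

# Stand-in for the OpenCart product table (MySQL), using SQLite so the
# example is self-contained. Table name and data are made up.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE product (product_id INTEGER, name TEXT)")
conn.executemany(
    "INSERT INTO product VALUES (?, ?)",
    [(1, "red shoes"), (2, "shoemaker kit"), (3, "blue shirt")],
)

# LIKE is a plain substring match: searching "shoe" also hits "shoemaker".
like_hits = conn.execute(
    "SELECT product_id FROM product WHERE name LIKE ?", ("%shoe%",)
).fetchall()

# RLIKE (REGEXP in MySQL) matches a regular expression, so we can require
# whole words. SQLite needs the function registered first.
conn.create_function("REGEXP", 2, lambda pat, val: re.search(pat, val) is not None)
rlike_hits = conn.execute(
    "SELECT product_id FROM product WHERE name REGEXP ?", (r"\bshoes?\b",)
).fetchall()

print(like_hits)   # [(1,), (2,)] -- LIKE also matched "shoemaker kit"
print(rlike_hits)  # [(1,)]       -- the regex matched whole words only
```

The better results from RLIKE come from exactly this: the regex can insist on whole words, while LIKE happily matches substrings. The price, as the timings below show, is that neither can use an ordinary index on the column.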

I opened the terminal and did a few tests.

The absolute horror!!
A SELECT on the product table using LIKE on the description took 2 minutes to complete.
A SELECT using RLIKE did in fact return better results, but needed 10 minutes per query to complete. ::)

Looking at the product table, I discovered that the name is indexed but the description is not!
So I created a FULLTEXT index on the descriptions, and around 1 hour later I tried again.

product description query IN NATURAL LANGUAGE MODE LIMIT 500 took 0.326 seconds.
Not bad at all considering the size of the database.

product description query IN BOOLEAN MODE LIMIT 500 took 0.011 seconds.
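For anyone who wants to see the shape of it: in MySQL the index is added with `ALTER TABLE ... ADD FULLTEXT(description)` and the query becomes `MATCH(description) AGAINST('...' IN BOOLEAN MODE)`. Here is a runnable sketch of the idea using SQLite's FTS5 as a stand-in for MySQL full-text search, with made-up data:

```python
import sqlite3

# Full-text sketch using SQLite's FTS5 as a stand-in for a MySQL FULLTEXT
# index; table name and descriptions are made up.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE VIRTUAL TABLE product_fts USING fts5(description)")
conn.executemany(
    "INSERT INTO product_fts (description) VALUES (?)",
    [("waterproof hiking boots",), ("leather office shoes",), ("wool winter hat",)],
)

# Boolean-style query: both words must appear. This is roughly what
# MATCH(description) AGAINST('+hiking +boots' IN BOOLEAN MODE) does in MySQL.
hits = conn.execute(
    "SELECT rowid FROM product_fts WHERE product_fts MATCH ? LIMIT 500",
    ("hiking AND boots",),
).fetchall()
print(hits)  # [(1,)]
```

The speed comes from the inverted index: the engine looks the words up directly instead of scanning 6 million descriptions. The downside is exactly what I hit next: every INSERT has to update that index too.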
I said wow and was about to start jumping up and down... but then I tried an INSERT and landed back down to earth.
I waited 25 minutes for 1 product to insert before killing the query, so who knows how long it would have taken. The fact is that a store with 6 million products is not possible with standard OpenCart, or in fact with any of the other free carts (and probably the commercial ones too)!

How about InnoDB?
I converted the product table to InnoDB and tried again...
The NATURAL LANGUAGE MODE query took 0.928 seconds.
Not bad considering it searched 6 million products, but it is about 3 times slower than MyISAM.

The BOOLEAN MODE query took 0.352 seconds, again much slower than MyISAM.
So, time to try the INSERT... but guess what? It also takes forever!

In the end, my conclusion is that small stores with just a few thousand products will be faster with a FULLTEXT index on the product description, without affecting INSERTs too much (I guess around 30 seconds per insert). I'm not sure about edits or mass inserts, because I was disappointed with the results and could not be bothered testing the standard setup any more. The tests also show that MyISAM has faster SELECTs than InnoDB, so it is not such a bad choice for shopping carts as some people believe. For me, the speed is more important than the transaction support in InnoDB.

Then I decided to put in a bit more effort and bring in the reinforcements: Sphinx.
Sphinx builds its own special search-optimized index and does not need a MySQL index to work.
The Sphinx index took under 15 minutes to build, which is impressive because it is about 1/4 of the time MySQL needed to build its own index.
A quick test query showed that Sphinx can do the search in 0.018 seconds, which is nearly as good as the BOOLEAN MODE query in MyISAM, but with no MySQL index on the product table, so INSERTs remain fast.

So it is actually possible to have a functional OpenCart store with 6 million products, but it needs work.
To use Sphinx you will need a VPS or dedicated server and somebody who knows how to set it up... it took me 2 days to figure out, but it is more than worth it.
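To give you an idea of what the setup involves, here is a minimal sketch of the sphinx.conf source/index definition. Every name, path and credential below is a placeholder, not my real setup:

```
# Minimal sphinx.conf sketch -- all names, paths and credentials
# are placeholders.
source oc_products
{
    type      = mysql
    sql_host  = localhost
    sql_user  = opencart
    sql_pass  = secret
    sql_db    = opencart
    sql_query = SELECT product_id, name, description FROM oc_product_description
}

index oc_products
{
    source = oc_products
    path   = /var/lib/sphinx/oc_products
}
```

After that, running `indexer --all` builds the index, and searchd answers queries over SphinxQL (the MySQL wire protocol, port 9306 by default), e.g. `SELECT * FROM oc_products WHERE MATCH('boots');`. The real work is wiring OpenCart's search model to query Sphinx instead of MySQL.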

So... for me, Sphinx is the way to go!




Post by oyeaussie » Thu May 01, 2014 12:20 am

Hey Mate,

I know it took you a couple of days to figure this one out, and it may be cheeky of me to ask, but could you share what configuration changes you made in OpenCart and sphinx.conf to get this going?

I am currently testing a store with only 50K products and oh my, the response is just crap.

I would appreciate any help.





Post by Axansh » Wed May 07, 2014 1:24 am

Have a look at Apache Solr.

It is better with millions of products.

And for page caching you can use Varnish as well.

At this scale you need to use the best technologies.

Redis is also a good option for caching.

And use RabbitMQ for queue processing.

Let me know if anybody needs any kind of technical help.

Thanks and Best Regards,
Axansh Sheth


Post by rempong » Sun May 11, 2014 9:08 pm

There is "special treatment" needed if you have a large product catalog:

Searching needs Sphinx / Solr / Elasticsearch etc.
Replace the cache with a memory-based one, like Redis, Memcached, IronCache etc.
The OpenCart tables need more indexes.
Tune some MySQL variables.
Put a load balancer in front (HAProxy / nginx / Varnish).

And yes, if you have such a large catalog you will need serious web hosting (VPS or dedicated).
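On the MySQL tuning point, here is a sketch of the kind of my.cnf changes usually meant. The values are examples only, assuming roughly 8 GB of RAM; tune them to your own server rather than copying blindly:

```
# my.cnf sketch -- example values only, assuming ~8 GB of RAM.
[mysqld]
innodb_buffer_pool_size = 4G    # cache for InnoDB data and indexes
key_buffer_size         = 1G    # cache for MyISAM indexes
tmp_table_size          = 256M  # larger in-memory temp tables for sorting
max_heap_table_size     = 256M  # the smaller of these two is the limit used
```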


Post by kabatza1 » Wed Jun 18, 2014 4:27 am

Sorry for the late reply guys... I've been busy.

I think you have all missed the "laptop" bit... it was just a test, not a live store.

Yes, I did tune the MySQL settings before adding Sphinx, but this is irrelevant, as Sphinx search performance has nothing to do with MySQL.

I did not add any indexes to OpenCart... the opposite: I removed a few, as the job was now handled by Sphinx.

I did have a full-page cache, and I replaced the stock SEF URLs with my own, which work without needing to access the database.
Like this: some-product-title-here-PID20 (PID20 is basically id=20)
Maybe some more changes I cannot remember now... this was not a clean install, and I had played with a lot of settings before finally trying Sphinx.
The tests were run on localhost, so there was no user load on the site.
Nevertheless it was a good test!


