Post by dangkhoaweb » Sat Aug 13, 2011 12:07 am

Hi!
I wrote my own function to import products from a CSV file; the file was exported from another site.
So the process is:
1/ The user uploads the file to my server
2/ Read the file and loop through all rows
3/ Get the image link from the "image" field --> download the image to my server
4/ Check whether the SKU already exists, then insert or update accordingly
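Roughly, the loop looks like this (a simplified sketch only - the table and column names like oc_product, sku and image are placeholders for a typical OpenCart-style schema, and the real code sets many more fields):

<?php
// Simplified sketch of the import loop (steps 2-4 above).
// 'oc_product', 'sku' and 'image' are placeholder names.
$mysqli = new mysqli('localhost', 'db_user', 'db_pass', 'db_name');

$handle = fopen('import.csv', 'r');
$header = fgetcsv($handle); // first row = column names

while (($row = fgetcsv($handle)) !== false) {
    $data     = array_combine($header, $row);
    $sku      = $mysqli->real_escape_string($data['sku']);
    $imageUrl = $data['image'];

    // Step 3: download the remote image to this server (needs allow_url_fopen)
    $fileName = 'data/' . basename($imageUrl);
    copy($imageUrl, 'image/' . $fileName);

    // Step 4: insert or update depending on whether the SKU exists
    $check = $mysqli->query("SELECT product_id FROM oc_product WHERE sku = '$sku'");
    if ($check->num_rows > 0) {
        $mysqli->query("UPDATE oc_product SET image = '$fileName' WHERE sku = '$sku'");
    } else {
        $mysqli->query("INSERT INTO oc_product (sku, image) VALUES ('$sku', '$fileName')");
    }
}
fclose($handle);
$mysqli->close();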

Looping through 5000 products takes a very, very long time and sometimes ends with the database error shown in the attached file :( :(
I'm thinking about closing the DB connection after every 500 products and then opening it again, but I'm not sure how to do that in OpenCart.
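In plain PHP (mysqli) I imagine it would be something like the sketch below, but I don't know the equivalent with OpenCart's $this->db:

<?php
// Idea: close and reopen the connection every 500 rows.
// Plain mysqli shown here, not OpenCart's own DB wrapper.
$mysqli = new mysqli('localhost', 'db_user', 'db_pass', 'db_name');
$handle = fopen('import.csv', 'r');
$count  = 0;

while (($row = fgetcsv($handle)) !== false) {
    // ... insert/update the product here ...

    if (++$count % 500 === 0) {
        $mysqli->close();
        $mysqli = new mysqli('localhost', 'db_user', 'db_pass', 'db_name');
    }
}
fclose($handle);
$mysqli->close();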

Does anyone have an idea or a solution to make this work efficiently?
Thank you

Attachment: errormysql.png (21.47 KiB)



Post by terraGirl » Sat Aug 13, 2011 12:41 am

The error "MySQL server has gone away" means that the MySQL timeout has been reached. This setting lives in my.cnf and is server-wide. If you are on a dedicated server, you can modify it; but if you are on shared hosting, your host may not want to change it, as it does have some serious performance implications.

The relevant part of my.cnf is:
wait_timeout=30

As a first step, you could check with your host what the setting on your server is & whether it can be increased.
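If the host won't tell you, a quick way to check from a PHP script (a minimal sketch, assuming mysqli and your own DB credentials):

<?php
// Show the current MySQL wait_timeout value.
// Credentials are placeholders - use the ones from your config.php.
$mysqli = new mysqli('localhost', 'db_user', 'db_pass', 'db_name');

$result = $mysqli->query("SHOW VARIABLES LIKE 'wait_timeout'");
if ($result) {
    $row = $result->fetch_assoc();
    echo $row['Variable_name'] . ' = ' . $row['Value'] . " seconds\n";
}
$mysqli->close();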

--------

Doing each product as its own MySQL connection seems a radical solution - just splitting the CSV file into smaller parts, e.g. 5 x 1000, may do it.
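A rough sketch of the splitting, assuming the first row is a header that each part should keep:

<?php
// Split a large CSV into parts of 1000 data rows each,
// repeating the header row at the top of every part.
$rowsPerPart = 1000;
$in     = fopen('import.csv', 'r');
$header = fgetcsv($in);

$part = 1;
$row  = 0;
$out  = null;

while (($line = fgetcsv($in)) !== false) {
    if ($row % $rowsPerPart === 0) {
        if ($out) {
            fclose($out);
        }
        $out = fopen('import_part_' . $part++ . '.csv', 'w');
        fputcsv($out, $header);
    }
    fputcsv($out, $line);
    $row++;
}
if ($out) {
    fclose($out);
}
fclose($in);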

A second thought is the image link - if you're downloading images from another location while the MySQL connection is open, this can seriously slow things down. Maybe remove the image function from the import script & run it separately? Accessing off-site content is the most common cause of MySQL timeouts.

UK Web Hosting for Designers & Developers by TerraNetwork (Norwich, UK) for websites of all sizes.



Post by dangkhoaweb » Sat Aug 13, 2011 10:19 am

Hello terraGirl!
Thanks for your feedback, I will try the second solution first :D
Have a nice day


Post by dangkhoaweb » Sat Aug 13, 2011 5:08 pm

Hello!
After trying to download in small batches of 100 products at a time, I still get the database error. I found out the reason: some of the images are TOO BIG (1 MB to 4 MB) ---> it takes a long time to finish downloading them one by one.
Any suggestions please?
Thanks


Post by uksitebuilder » Sat Aug 13, 2011 5:53 pm

Get the images first from x and put them on your server, then you can take that step out of your script?


Post by terraGirl » Sat Aug 13, 2011 6:14 pm

Agree with uksitebuilder: with images that large, even increasing the timeout in MySQL won't help much. MySQL connections are meant to be opened & closed swiftly; keeping them open long enough to download images of up to 4MB isn't advisable.

You will need to split the image download off from the MySQL import script. Either do it before or after, but run it as a separate script.

But also, for images of that size:
* an FTP upload would be much quicker
* the images really should be optimised before adding them to your site - no visitor is going to be happy if the page loads slowly thanks to 4MB images





Post by dangkhoaweb » Sat Aug 13, 2011 6:39 pm

Thanks for your feedback.
But the images are on another server; the CSV file was exported by that server (an affiliate program), so I can't control or optimise their size :(
I have separated the code like this:
1/ Upload the CSV file to the server
2/ Import it into the database without downloading the images (just put the image link into the "image" field)
3/ After the import finishes successfully, I click a button called "Get Image"
4/ Select all image links in the database that have the prefix "http:", with LIMIT 500
5/ Download the images and update the "image" field in the database (I know the database has to wait too :( ) - a sketch of this batch is below
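Simplified, the "Get Image" batch looks roughly like this (table and column names are placeholders):

<?php
// Simplified sketch of the "Get Image" batch (steps 4 and 5 above).
// 'oc_product' and its columns are placeholder names.
$mysqli = new mysqli('localhost', 'db_user', 'db_pass', 'db_name');

$result = $mysqli->query(
    "SELECT product_id, image FROM oc_product WHERE image LIKE 'http%' LIMIT 500"
);

while ($row = $result->fetch_assoc()) {
    $fileName = 'data/' . basename($row['image']);

    // Download the remote image (needs allow_url_fopen)
    if (copy($row['image'], 'image/' . $fileName)) {
        $mysqli->query(
            "UPDATE oc_product SET image = '" . $mysqli->real_escape_string($fileName) . "'
             WHERE product_id = " . (int) $row['product_id']
        );
    }
}
$mysqli->close();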

Please explain in more detail what you mean by "an FTP upload would be much quicker".
Thank you all


Post by terraGirl » Sat Aug 13, 2011 6:53 pm

For FTP, it depends how the affiliate scheme works - for dropshipping, sometimes the provider makes the images available so you just periodically (when new stock comes in, e.g. weekly) upload the images manually via FTP. But you'd need to discuss this with your provider.

For the MySQL import, I'd be inclined to populate the image field with the image file name (not the http link) so you don't need to update the DB twice. Then run through the CSV file & download all the images with http links. But this will lead to a short period when the images aren't yet available on your site.

Or, import the CSV with the image field empty. Run a 2nd script from the CSV to download all the http links (write the script to read the CSV, not the DB). Then run a 3rd script that updates the DB with the actual image names from the CSV.

Either way, I'd drive the image download from the CSV and keep it completely separate from the MySQL update (no MySQL update during the image download).
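The 2nd script could be something like this minimal sketch - it reads only the CSV and never touches the DB (the 'image' column name is an assumption):

<?php
// 2nd script: read the CSV only and download every http image link.
// No database connection is opened at any point.
$handle = fopen('import.csv', 'r');
$header = fgetcsv($handle);
$imgCol = array_search('image', $header); // column holding the image link

while (($row = fgetcsv($handle)) !== false) {
    $url = $row[$imgCol];
    if (strpos($url, 'http') === 0) {
        // Save under the same file name the DB will later point at
        copy($url, 'image/data/' . basename($url));
    }
}
fclose($handle);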




Post by dangkhoaweb » Sat Aug 13, 2011 7:07 pm

Thank you very much, terraGirl.
I will try it and report back :D


Post by terraGirl » Sat Aug 13, 2011 9:06 pm

Yes, if you have time, it would be interesting to hear what worked for you. I've dealt with large mass product upload scripts before, but not ones that also had an image upload going on; it would be good to know how that can work in reality.




Post by uksitebuilder » Sat Aug 13, 2011 9:15 pm

I've done it in the past (in the early days) with about 2500 products; it entailed having no time limit set for PHP scripts.

Open and close the database connection within the loop, rather than leaving it open for the duration.

Forgot to add: I used @copy within the loop to grab the image from the remote host.
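From memory, something along these lines (rough sketch only; the CSV column index for the image link is assumed):

<?php
// Rough sketch of the approach described above.
set_time_limit(0); // no PHP time limit for the whole run

$handle = fopen('import.csv', 'r');
fgetcsv($handle); // skip the header row

while (($row = fgetcsv($handle)) !== false) {
    // Grab the image from the remote host; @ suppresses warnings if one
    // image is missing (column 2 assumed to hold the URL, needs allow_url_fopen).
    @copy($row[2], 'image/data/' . basename($row[2]));

    // Open, use and close the DB connection inside the loop
    $mysqli = new mysqli('localhost', 'db_user', 'db_pass', 'db_name');
    // ... insert/update this product ...
    $mysqli->close();
}
fclose($handle);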


Post by dangkhoaweb » Sun Aug 14, 2011 12:35 am

Hello all!
I have tested with 100 products (100 rows in the CSV file) <-- it takes about 15 minutes (total size of the 100 images: ~150MB):
1/ Save the product info to the database
2/ Download the image from the link in the CSV to a TEMP folder (it gets "unlink"ed after step 3 finishes) --> no database work here
3/ Resize it and copy it to image/data/ --> from the original 1.5MB image down to 20KB (GREAT) - a resize sketch is below
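The resize in step 3 is roughly this (GD, JPEG only in this sketch; the file names are placeholders):

<?php
// Step 3: shrink a downloaded JPEG from the TEMP folder into image/data/.
// GD only; other formats (png, gif) would need their own create/save calls.
function resizeToData($tempFile, $targetFile, $maxWidth = 800)
{
    list($width, $height) = getimagesize($tempFile);
    $ratio     = min(1, $maxWidth / $width);
    $newWidth  = (int) ($width * $ratio);
    $newHeight = (int) ($height * $ratio);

    $src = imagecreatefromjpeg($tempFile);
    $dst = imagecreatetruecolor($newWidth, $newHeight);
    imagecopyresampled($dst, $src, 0, 0, 0, 0, $newWidth, $newHeight, $width, $height);
    imagejpeg($dst, $targetFile, 80); // quality 80 brings the file size right down

    imagedestroy($src);
    imagedestroy($dst);
}

resizeToData('temp/big_photo.jpg', 'image/data/big_photo.jpg');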

It will still be slow for 5000 products :(
I think I can split that CSV file into smaller CSVs --> any suggestions for this approach, or any other help, please?
Thanks you all


Post by uksitebuilder » Sun Aug 14, 2011 12:53 am

The resizing of those images will take some major resources and time.

What I would do:

1. Run a script to get all the images from the remote host downloaded to a temp folder (hope you have lots of space)
Could possibly do this in 2 or 3 simultaneous runs, maybe more depending on your server.

Order a Pizza and eat it

2. Try using ImageMagick to batch resize the whole temp folder (this is the tricky part no matter how you look at it; resizing 5000 x 1.5MB [7.5GB] of images is going to take time) - see the sketch at the end of this post

Have a romantic 3 minutes with the wife

3. Move the resized images to image/data and drop the temp folder

Go have a coffee

4. Do the database insert, including the image name.

Smile and say Job Well Done
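For step 2, if the ImageMagick command line tools are installed and exec() is allowed on your server, the batch resize could be kicked off from PHP something like this (untested sketch; the paths are placeholders):

<?php
// Step 2: batch-resize every JPEG in the temp folder with ImageMagick's
// mogrify, writing the results into image/data/.
set_time_limit(0);

$tempDir = '/path/to/temp';
$outDir  = '/path/to/image/data';

// "-resize 800x800>" only shrinks images larger than 800px,
// "-quality 80" brings the JPEG file size down.
$cmd = 'mogrify -path ' . escapeshellarg($outDir)
     . ' -resize 800x800\> -quality 80 '
     . escapeshellarg($tempDir) . '/*.jpg';

exec($cmd, $output, $status);
echo $status === 0 ? "Batch resize done\n" : "mogrify failed\n";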
