I've mostly been working with Firebug + Google Page Speed.
You can also use http://www.webpagetest.org
to test your site for some more in-depth stats.
This is to improve the user experience, and in anticipation of Google's recent announcement that page-loading time will play more of a role in search results.
These are all pretty simple to do. Always back up ALL files before making any changes.
If you're not sure about any of these changes, or they don't work for you, don't do them. Simple.
By adding caching timestamps (see below) I removed about 4 seconds from page load time.
By implementing a content sub-domain (see below) I cut page load time by another 4-5 seconds according to Google Page Speed.
Obvious to some, but not others!
I'm not sure whether OpenCart does this by default when you upload an image, because I always compress images before uploading, but make sure your images are optimised!
Use PageSpeed to see how much more your images could be compressed. This is quite important for an online store with loads of images.
Or download an optimisation application like this one:
By default your Apache install should gzip/deflate most media types; make sure it gzips CSS and JS too, as this can help.
Code: Select all
<IfModule mod_deflate.c>
    <FilesMatch "\.(js|css)$">
        SetOutputFilter DEFLATE
    </FilesMatch>
</IfModule>
This one's really simple and just involves updating your .htaccess file.
Basically, all the files your site serves (JPGs, GIFs, CSS, JS, etc.) need to tell the browser how long to keep them before checking for a new copy.
If we don't give the browser a "cache time", it will fetch a fresh copy of every file on every visit, which means more bandwidth usage and a 'fresh' version of the site every time - which takes longer to load.
Google (or at least Page Speed) recommends that files are cached for at least a month. To be honest, this is a bit of a long time, but I'm sticking with what Google tells me.
Although I've left JS and CSS at 24 hours, since I'm currently changing the site a lot - if you don't make many changes, set them to at least a month too.
(PageSpeed and other optimisation tools throw a fit about a 'small' 24-hour cache period, but hey...)
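If you're wondering where the "A" numbers in the mod_expires rules further down come from: they're just seconds counted from the time of access. A quick sanity check of the two values used in this post:

```python
# mod_expires "A<n>" values are seconds from the time the file is accessed.
SECONDS_PER_DAY = 24 * 60 * 60  # 86400

media_cache = 3024000    # the value used for images and media below
dynamic_cache = 86400    # the value used for XML/TXT/HTML/JS/CSS below

print(media_cache // SECONDS_PER_DAY)    # 35 days, i.e. 5 weeks
print(dynamic_cache // SECONDS_PER_DAY)  # 1 day
```

So A3024000 is the "5 weeks" and A86400 is the "24 hours" mentioned above.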
Before, on a 'fresh view', my site was transferring 400kb; on a second view it was still transferring 300kb.
After changing the .htaccess, that dropped to 80kb.
So add this to your .htaccess file:
Code: Select all
# Turn on Expires and set default to 0
ExpiresActive On
ExpiresDefault A0

# Set up caching on media files for 5 weeks
<FilesMatch "\.(flv|ico|pdf|avi|mov|ppt|doc|mp3|wmv|wav)$">
    ExpiresDefault A3024000
    Header append Cache-Control "public"
</FilesMatch>

# Set up caching on images for 5 weeks
<FilesMatch "\.(gif|jpg|jpeg|png|swf)$">
    ExpiresDefault A3024000
    Header append Cache-Control "public"
</FilesMatch>

# Set up 1 day caching on commonly updated files
<FilesMatch "\.(xml|txt|html|js|css)$">
    ExpiresDefault A86400
    Header append Cache-Control "proxy-revalidate"
</FilesMatch>

# Force no caching for dynamic files
<FilesMatch "\.(php|cgi|pl|htm)$">
    ExpiresActive Off
    Header set Cache-Control "private, no-cache, no-store, proxy-revalidate, no-transform"
    Header set Pragma "no-cache"
</FilesMatch>
Ever noticed that Amazon, eBay and other sites load resources from places like:
This isn't just good organisation, or load balancing with their server farms/content delivery network - it helps speed up page loading.
If you have everything on one domain, like http://www.LoveMoissanite.com, then the text, images and scripts are all requested from the same domain.
This means the browser can only open a limited number of connections to that one domain, so much of the content is downloaded one after another: your text, then your images, then your scripts (in no particular order).
If you split your content onto different sub-domains, your browser can make more connections at the same time and download images, scripts, text and any other content in parallel, making the page load faster.
So while your text is loading, your images can also be loading and so can your scripts.
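A rough back-of-the-envelope way to see why this helps (assuming, as browsers of this era typically do, only a small fixed number of simultaneous connections per hostname - the exact limit is browser-specific):

```python
import math

def download_rounds(num_files: int, num_hostnames: int, conns_per_host: int = 2) -> int:
    """Rough number of sequential 'rounds' needed to fetch num_files resources,
    assuming each hostname allows conns_per_host parallel connections and the
    files are spread evenly across the hostnames."""
    total_parallel = num_hostnames * conns_per_host
    return math.ceil(num_files / total_parallel)

# 40 images served from one domain vs. split across www + an images subdomain:
print(download_rounds(40, 1))  # 20 rounds
print(download_rounds(40, 2))  # 10 rounds
```

This ignores connection setup costs and DNS lookups for the extra hostname, so the real-world gain is smaller than the raw numbers suggest, but the principle is the same.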
This explains it a bit more:
http://www.askapache.com/htaccess/apach ... mains.html
It's really simple to do: just go into your host's admin panel and create a subdomain like:
Copy the contents of your 'image' folder into the new subdomain's folder in your www root.
Edit the config.php in both your top-level www folder and your admin folder to use the new image location.
Make sure you also update the directory location used to save images to!
In my example, I've created a subdomain called "images.lovemoissanite.com", which creates a new folder called 'images'.
Code: Select all
// HTTP
define('HTTP_SERVER', 'http://www.lovemoissanite.com/');
define('HTTP_IMAGE', 'http://images.lovemoissanite.com/');

// HTTPS
define('HTTPS_SERVER', 'https://www.lovemoissanite.com/');
define('HTTPS_IMAGE', 'https://images.lovemoissanite.com/');

// DIR
define('DIR_IMAGE', '/home2/lovemois/public_html/images/');
Check your site still loads and that it's using the new image subdomain you've created. Try creating a test product with an image to check your admin section works too.
If you use SSL...
Unless you can afford a wildcard certificate (which seem to start at around $500!), you need to make sure that your SSL image location points to a non-sub-domain location.
Otherwise your images will not load when someone is viewing a page using HTTPS!
So in my example, your config.php would look like this:
Code: Select all
// HTTP
define('HTTP_SERVER', 'http://www.lovemoissanite.com/');
define('HTTP_IMAGE', 'http://images.lovemoissanite.com/');

// HTTPS
define('HTTPS_SERVER', 'https://www.lovemoissanite.com/');
define('HTTPS_IMAGE', 'https://www.lovemoissanite.com/images/');

// DIR
define('DIR_IMAGE', '/home2/lovemois/public_html/images/');
Anyone fancy making a few mods...?
It would be helpful if we also had the option to change the directory used for scripts (js and css) so we could create a scripts.mydomain.com subdomain and parallelise those downloads too.
Additionally, if anyone fancies doing a bit more work, being able to use multiple content sub-domains would speed a site up further.
So instead of having one content subdomain like "images.mydomain.com", we could use up to 5, and browsers could download from 5 locations in parallel, making it even faster:
...you get the idea.
This (in theory) could be done fairly easily...
1. Allow users to specify an array of Image directories in the Config.
2. When a product is created, or a product image added, save the image to every image directory - effectively cloning the content of the Image directory to all the new image directories.
3. When a user requests a page, pick an image URL (at random, or start at 0 in the array and loop through each entry).
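Step 3 could be sketched like this (an illustrative sketch, not OpenCart code - the host names are the hypothetical ones from my example config). One design note: picking a host truly at random would give the same image a different URL on each page view, which would hurt browser caching, so hashing the filename keeps each image pinned to one subdomain while still spreading images across all of them:

```python
import hashlib

# Hypothetical pool of image hosts, mirroring the HTTP_IMAGE_0..3 idea below.
IMAGE_HOSTS = [
    "http://images.lovemoissanite.com/",
    "http://images66.lovemoissanite.com/",
    "http://i44.lovemoissanite.com/",
    "http://images87.lovemoissanite.com/",
]

def image_url(filename: str) -> str:
    """Map a filename to one host deterministically, so the same image
    always gets the same URL and stays cacheable in the browser."""
    digest = hashlib.md5(filename.encode("utf-8")).hexdigest()
    index = int(digest, 16) % len(IMAGE_HOSTS)
    return IMAGE_HOSTS[index] + filename

# The same file always maps to the same host:
assert image_url("ring1.jpg") == image_url("ring1.jpg")
```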
So images would load from:
So the config file would end up looking something like:
Code: Select all
// HTTP
define('HTTP_SERVER', 'http://www.lovemoissanite.com/');
define('HTTP_IMAGE_0', 'http://images.lovemoissanite.com/');
define('HTTP_IMAGE_1', 'http://images66.lovemoissanite.com/');
define('HTTP_IMAGE_2', 'http://i44.lovemoissanite.com/');
define('HTTP_IMAGE_3', 'http://images87.lovemoissanite.com/');

// HTTPS
define('HTTPS_SERVER', 'https://www.lovemoissanite.com/');
define('HTTPS_IMAGE_0', 'https://www.lovemoissanite.com/images/');
define('HTTPS_IMAGE_1', 'https://www.lovemoissanite.com/images66/');
define('HTTPS_IMAGE_2', 'https://www.lovemoissanite.com/i44/');
define('HTTPS_IMAGE_3', 'https://www.lovemoissanite.com/images87/');

// DIR
define('DIR_IMAGE_0', '/home2/lovemois/public_html/images/');
define('DIR_IMAGE_1', '/home2/lovemois/public_html/images66/');
define('DIR_IMAGE_2', '/home2/lovemois/public_html/i44/');
define('DIR_IMAGE_3', '/home2/lovemois/public_html/images87/');