I'm working on securing our website at the moment and running into some issues. The OC documentation here: http://docs.opencart.com/display/openca ... +practices says to upload an .htaccess file with a particular configuration and "your ip address here". Obviously my own personal IP wouldn't make sense, as it is a public website. I also tried the server IP, localhost, 127.0.0.1, et al. Yet it still denies access no matter what unless I put my own IP, which blocks the whole world except for me. Great for business, right?
My question is... what folders am I supposed to upload this into? Just /catalog/ and/or /system/, or all sub-directories as well? And what am I supposed to use for "your ip address here", if not my own?
Second question: I found another source who mentioned that blank index.php files should be placed in all "pertinent" folders to prevent directory browsing. Okay... which folders? I see that WP uses this in their base install but is not in all folders--seems to be just the empty ones.
Beyond that, I'm really looking to open a discussion here on how to lock down an OpenCart site like Fort Knox without killing the functionality of the site beyond the basic "Opencart Security 101" stuff. I've installed CMS-specific security modules on other websites that have proven to be very successful at blocking attacks but yet to find anything worth buying made for OC.
all traffic to the site is logged, including referrer and what they try.
suspicious IP addresses are blocked by .htaccess in root by adding an entry "deny from ipaddress", which applies to the whole site, so those IP addresses cannot access it at all.
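As an illustration (not from the post above), such an entry in the root .htaccess looks like this in Apache 2.2 syntax; the address is a documentation placeholder:

```apacheconf
# Block one suspicious address for the whole site (Apache 2.2 syntax).
Order Allow,Deny
Allow from all
Deny from 203.0.113.45
```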
opencart has by default empty html files in all directories that should not be accessed from outside, so that is tackled as well, as it serves the same purpose as empty index.php files.
There are some discussions about the admin, changing the folder name from admin to anything you want, which introduces more security.
I personally believe that if you want Fort Knox, protect it with an htaccess password, as bots will find it in the end.
As far as I know opencart is quite secure; I keep an eye out for bots that try to find exploits and actively block them.
Further, I invested in automatic backup of the site and database, make sure my computer is clean, and hosting login credentials and admin credentials are changed on a regular basis (once a week).
Quote: "suspicious IP addresses are blocked by .htaccess in root by adding an entry 'deny from ipaddress', which applies to the whole site, so those IP addresses cannot access it at all."

The issue with that kind of security policy is that it is reactive rather than proactive. Security modules I've used on other websites maintain a "blacklist" of known malicious user-agent strings, IP addresses, et al. I think I will copy and paste this over to the OpenCart .htaccess file as an added measure of protection. Blocking hacker IPs is a joke, for obvious reasons, the most obvious being that any decent hacker will never use the same IP address more than once. I'm not worried about the script kiddies. I'm worried about highly intelligent hackers who know exactly what they are doing and have been doing it for a very long time.
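A user-agent blacklist of that sort can be sketched in .htaccess with mod_rewrite; the patterns below are illustrative examples only, not a vetted list:

```apacheconf
RewriteEngine On
# Refuse (403) requests whose User-Agent matches common scanner strings.
RewriteCond %{HTTP_USER_AGENT} (sqlmap|nikto|acunetix|libwww-perl) [NC]
RewriteRule .* - [F,L]
```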
Quote: "opencart has by default empty html files in all directories that should not be accessed from outside, so that is tackled as well, as it serves the same purpose as empty index.php files."

There do not appear to be any of these files in the base install. What should I use as a point of reference when uploading my own? WHICH directories should this file be uploaded to? This was one of the original questions I had posed above.
Quote: "There are some discussions about the admin, changing the folder name from admin to anything you want, which introduces more security."

Great advice. This kind of thing is already part of the "Security 101" article provided by OpenCart. My admin password would take close to 1,700 years to crack with brute force and it is changed regularly. It's not the brute-force crackers I'm worried about, though... it's malicious JS/SQL injections, malicious file injections, remote PHP execution, et al.
Something that's worked quite well on other non-Opencart websites is to add the following line of code to my php.ini file on a site-to-site basis:
Code: Select all
But this is really just the TIP of the iceberg!!
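The post above does not show the actual line it refers to; purely as examples of the kind of per-site php.ini hardening being discussed (these are common directives, NOT the author's specific setting):

```ini
; Illustrative hardening directives for a per-site php.ini.
expose_php = Off
display_errors = Off
log_errors = On
allow_url_include = Off
disable_functions = exec,system,shell_exec,passthru,popen,proc_open
```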
Since I can't find a way to make the .htaccess instructions from the OpenCart manual work properly, I'm at a loss. As I stated before, denying access to php files in the /catalog/ directory causes the site to stop working properly unless I allow my own IP address, which completely stops the flow of business, allowing me to have my cake only as long as I'm okay with no longer having a functioning website for the rest of the world.
What am I missing here guys? I think what would be very helpful is an application firewall designed for or tuned for OpenCart specifically. Perhaps a file-change detection system to spot successful attacks before havoc is wrought. I've used OSE Security for other sites, but only because it comes packaged with Joomla & WordPress whitelist rules so it doesn't interrupt the functioning of the website once installed.
Hoping to get a good conversation going here on some hardcore OpenCart security methods that few people would even understand how to implement. Thoughts? Ideas? I want to learn.
(1) One overview of security, with various links, comprises the currently 4+ pages of http://forum.opencart.com/viewtopic.php?f=20&t=98644, on through the more recently explicit means of 64-bit MIME attacks by shifting double and multiple extensions while setting fake text .jpg files into action as fully executable .php files, as at http://forum.opencart.com/viewtopic.php ... 60#p440953 and http://forum.opencart.com/viewtopic.php ... 60#p453513. Hacking consoles are a substantially worse risk yet, and several of those consoles that I have encountered have been supremely malicious. There are several threads on other specific installations, such as the current summary at http://forum.opencart.com/viewtopic.php ... 62#p458962.
(2) If your server offers ModSecurity firewalling you may want to utilize it; it works very well at stopping suspicious code. It is "tuned" for explicit as well as heuristic detection of various kinds of code snippets which for practical purposes cannot be arriving for any good purpose at all. Among other niceties, in looking at inherently suspicious code snippets it sandbags attempts to reach directories and files that do not even exist.
(3) Read, safeguard, and archive your ftp logs, and especially your Apache http traffic logs, which show visits in extreme detail. Hackers will try to ruin those. Rogue robots and benign but berserk robots can be evaluated through them.
(4) Since .htaccess files are read hierarchically from the root downward, the most widely applicable protections for all domains and directories in an account's tree belong in the account's public root (usually public_html/, www/, httpdocs/, htdocs/, or similar, according to how that is "aliased" in the Apache configuration). Whatever is in that file essentially does not need to be repeated deeper in the tree. You can block address ranges (such as known hotbeds of hacking around the world where you would not want any customers anyway, and proxies), but teasing those out of address rosters is tedious.
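Blocking ranges in the public root's .htaccess can be sketched as follows (Apache 2.2 syntax to match the rest of the thread; the ranges are documentation placeholders, not real hotbeds):

```apacheconf
# Deny whole address ranges site-wide; everything else stays allowed.
Order Allow,Deny
Allow from all
Deny from 203.0.113.0/24
Deny from 198.51.100.0/24
```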
(5) You can put zero-byte index.html files in whatever directories do not have normal live index.html files. Normal index.html files already do what they do, in the following critical regards. Access to directories normally tries a default sequence of indices, index.html, index.htm, index.php, and then others. That can be controlled in .htaccess to some extent. When a hollow index.html sits in a directory, it sandbags attempts to see a roster of files in the absence of any index.html file. You may see 44 byte index.html files instead; I prefer that they be zero bytes, then I know at a glance that they are still intact unless the operating system itself has been upended. When other controls are ample, I tend to prefer having nothing present.
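Creating the zero-byte files can be automated; here is a minimal sketch using a stand-in directory tree rather than a real web root:

```shell
# Stand-in tree for the example (a real run would start at the web root).
mkdir -p shop/system/logs shop/image/cache

# Give every directory lacking an index.html a zero-byte one, so a request
# for the bare directory cannot return a file listing.
find shop -type d | while read -r d; do
    [ -e "$d/index.html" ] || touch "$d/index.html"
done
```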
(6) Renaming your /admin/ and /download/ directories is not as effective as simply passwording them so that a server challenge greets anyone before the log-in or any download can even be seen. There is a limit to renaming, as well, since complete avoidance of all words in all languages that may be found in a dictionary somewhere, in favor of utter gibberish, is self-evident absurdity. Passwording effectiveness is markedly increased by allowing as few as 3 tries, or when strict 2 tries, or when outright strict 1 try, before a mandatory lockout and lengthy wait, or, when quite strict, a recognizably personal live request for a new preassigned username and password. That level of effectiveness is usually impractical. In the simply telephonic BBS days, with intricate doors and voluminous downloads available through one-on-one modems at (wow!) even as high as 2,400 baud or on (superwow!) T-1, those fairly extreme measures were practical and essentially invincible means of initial access. Customers would ordinarily balk nowadays.
(7) The essentials for .htaccess are found in the .htaccess.txt file (which upon installation is edited and renamed to make it live). Check .htaccess files from time to time; hackers can reach and gut them. Fairly complete .ht* file rules (there are several kinds of .ht* files) are fairly intricate and voluminous. There is an old adage that reasonably complete security is had only by allowing only one adequately passworded and strictly policy-governed user, and otherwise unplugging a computer and sleeping with it. One does one's best; there is no such thing as utterly invincible, but approximations to that do work. Really the last word you would care to find yourself saying is, "Ulp."
(8) I am satisfied that OC itself, in its recent releases and in all probability 2.0.x in their wake, is extraordinarily secure when installed and maintained properly. Yes, there have been "fixes" for this or that security concern along the way, and due attention needs to be paid to "retrofitting" them into place, but the system itself is not easy to hack into. In complex websites where OC is only one important element, augmented by a blog or forum or both as well as other features, vulnerabilities of OC tend to increase owing to something among those. The idea overall is to secure the entirety, by plural means that are essentially extrinsic to OC itself.
What I was able to determine, based on the above and additional research, is that the most effective method of securing an OpenCart site that includes uploads or use of the download directory will be as follows:
1) Add to .htaccess in public root:
Code: Select all
RewriteRule ^download/(.*) /index.php?route=error/not_found [L]
http://forum.opencart.com/viewtopic.php ... 20#p403255
3) Add an .htaccess file in the /download/ and /system/ directories containing:
Deny from all
4) Disable the ZIP module in Apache and PHP.
5) Ensure the CKEditor setting:
$config['CheckDoubleExtension'] = true;
6) Upload 0-byte index.html files in all directories
7) Enable and configure ModSecurity on the server level. Strangely, I was able to determine that ModSecurity was pre-installed on the server but with a blank config file. I set this to the default settings for now, but also purchased two books on working with ModSecurity to defend against attacks. In the meantime, do you have any "quick reference" or resources with Opencart-specific examples of special configuration that would prove more useful than the base config file?
8) Rename the /admin/ directory. I'm having some difficulty doing this. It seems as though vqmod is throwing some kind of error, which I tried to fix by changing the $admin variable to the new directory, so I had to roll back to the default /admin/ directory. I'm thinking I may want to put some htpasswd protection on this directory as well.
9) Analyze FTP and HTTP traffic logs. My issue here is that these logs are gigantic in size. Am I to understand that as a security-minded administrator, I am to read and analyze these logs in their entirety, or only after a successful attack? Are there any tools available that might eliminate innocuous entries from the logs so that only potentially malicious entries would be included for review?
10) Hide ftp & http traffic logs from attackers so they cannot destroy them after a successful attack. What is the smartest place to put them where a hacker would never find them? Server root? What about naming policy? Any other considerations in relation to log file obscurity I should know here?
11) As mentioned by cleo, Crawlprotect seems like a good tool that I will consider installing as well.
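For the htpasswd protection mentioned in 8), a minimal sketch of Apache basic auth on the admin directory (names and paths here are illustrative; keep the password file outside the web root):

```apacheconf
# .htaccess inside the (renamed) admin directory.
# First create the password file, for example:
#   htpasswd -c /home/account/.htpasswd adminuser
AuthType Basic
AuthName "Restricted area"
AuthUserFile /home/account/.htpasswd
Require valid-user
```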
Lastly, I will mention that a security module I've been using on some other (non-Opencart) websites comes packaged with a file-change / addition / deletion detection system. Before closing all security holes, this tool has been a GREAT help in notifying me of successful attacks. Are you aware of any tools like this built for Opencart or otherwise that may prove useful?
furrywombat wrote: "and 'your ip address here', yet obviously my own personal IP wouldn't make sense as it is a public website..."

It makes perfect sense... this is for the admin site. You only want to allow certain IP addresses access to the admin area, so you would use your personal computer's IP address.
Plural extensions are important well before and well beyond ckeditor's own config, but that is a start.
Renaming /admin/ and /download/ as well as passwording them for additional protection is good. If vqmod is balking, then that is because vqmod knows what the defaults are. Rename /admin/ or /download/ if desired, THEN LEAVE IN PLACE the config.php pair, back up the index.php pair, upload a new virgin index.php pair, then refire /vqmod/install/index.php in order to accommodate changes in the names of /admin/ and /download/.
Traffic logs are usually already put where they are fairly safe. If hackers get in far enough to get to those, it is too late for preventive vigilance till that is mopped up.
The traffic logs can and should be periodically archived and pruned. New ones will be generated. Archived ones can be subdivided. When logs become too large to open, they have become useless. On some servers you can reach them. On some servers you must ask support to supply copies of them (one dedicated server had that asinine arrangement, and the crucial timeframes in the logs had been hacked). There isn't anywhere that a hacker would never find. The idea there is the same as with database backups, minimize unarchived timeframes lost.
You can read traffic logs in the same frame of mind as the daily news and funnies, peruse and when appropriate pay strict attention. What's been going on? What is interesting or suspicious or worse? How are the patterns doing, what are the robots up to? Have rogues arrived? Have even benign robots generated enough visits and sessions to raise concern that they might well alarm the host into shutting down the website?
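On the earlier question of thinning huge logs before review, a rough first pass can be scripted; the sample entries and patterns below are illustrative only, not a complete ruleset:

```shell
# Four sample lines standing in for a real Apache access log.
printf '%s\n' \
  'GET /image/logo.png HTTP/1.1 200' \
  'GET /index.php?route=common/home HTTP/1.1 200' \
  'POST /admin/index.php HTTP/1.1 200' \
  'GET /index.php?route=../../etc/passwd HTTP/1.1 404' \
  > access.log

# Drop static-asset hits, then keep only requests worth a human look:
# POSTs, path traversal, and a few common attack markers.
grep -vE '\.(css|js|png|jpe?g|gif|ico|woff2?)([ ?]|$)' access.log \
  | grep -E 'POST |\.\./|base64|eval\(|wp-login' \
  > suspicious.log

cat suspicious.log
```

This leaves two of the four sample lines for review; real rules need tuning against one's own traffic.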
Crawlprotect serves its purposes; it is but one useful tool and carries overhead that can foster turning it off. To be blunt, my trust in such things is drastically undermined when I read, as at the moment on that website, "CrawlProtect do more" and "they will try to thind an easiest target." That's not in a "post" or "message" on the fly but in a landing page intended to sit there. What, the whole damned thing was done by people who might as well have written Oriental toaster and skillet instructions for English readers? How does that work when the lingo is code and requires exacting terms and syntax -- that might maybe be sloppy? And I neither do nor forgive this sort of nonsense, which all too often is a flat knowing lie (one was done by AVG and took a while to get rid of AVG): "This extension may change your browser's default search and homepage. You can control these settings right after installation."
None built specifically solely FOR OC. Others include ModSecurity. It helps to review filesets locally using means that often cannot readily be executed on public servers. FileZilla Client can compare local and remote filesets, then return likes or unlikes in a compact list. Are all images actually still images? Little barefoot BadPeggy can check that. Are all protective files actually still protective, rather than gutted or given stand-in content for byte counts, no longer actually doing their jobs as .ht* or index.* or whatever else? Look; inside; in person. Generally YOU are your foremost anti-malice scanner, so know thy trees. That means timestamp patterns, proper resident files and directories, permissions, whatever, against which intrusions will attract your attention.
[I'm unsure why the frequency of this has been increasing recently, but when I hit Reply I'm not actually expecting doing so to return, "The requested topic does not exist." Better than 500.]
Quote: "It makes perfect sense... this is for the admin site. You only want to allow certain IP addresses access to the admin area, so you would use your personal computer's IP address."

Yes, for the admin site this does make perfect sense. But straight out of the OpenCart documentation for general security:
Catalog

The catalog can be protected with the traditional .htaccess file. Using file match can be useful for protecting important file types for your store, such as php and txt, rather than all of them. The following code can be used for .htaccess in your catalog folder. This will deny access to all template, php, and txt files.

Code: Select all
<FilesMatch "\.(php|tpl|txt)$">
    Order Deny,Allow
    Deny from all
    Allow from "your ip address"
</FilesMatch>

When I upload this to the /catalog/ folder, the CSS (derived from PHP) does not fire. It would appear that, were the CSS a standard .css file, there would not be an issue.
Quote: "Disabling compression will not work out very well if you need export/import tools or have other reasons for compression."

Correct me if I'm wrong, but the only feature I disabled on the server is the ZIP functions, referenced here: http://www.php.net/manual/en/ref.zip.php. My assumption is that this would not have any effect on the type of compression you are referring to? My thoughts here were originally that a common pattern I've seen in successful attacks is a ZIP file which is extracted post-upload, presumably via remote PHP execution. Not a chance any shell commands could be run as far as I'm aware, so GZIP would not be of concern.
Quote: "The traffic logs can and should be periodically archived and pruned... You can read traffic logs in the same frame of mind as the daily news and funnies, peruse and when appropriate pay strict attention..."

Love this approach. I've been reading more about profiling, essentially grabbing more data about the user's interaction with the site, including POST vars (hidden from HTTP server logs, unlike GET variables, of course) and other data not available in the default server logs. In my reading on this topic, I came across this website:
According to the website copy:
"OSSEC is an Open Source Host-based Intrusion Detection System that performs log analysis, file integrity checking, policy monitoring, rootkit detection, real-time alerting and active response. It runs on most operating systems, including Linux, MacOS, Solaris, HP-UX, AIX and Windows."

File integrity checking essentially being another term for file-change monitoring, I did some additional looking to determine whether this would be considered a replacement for, or an enhancement to, ModSecurity. Looks like they may play well together, according to some WordPress guy: http://holisticinfosec.blogspot.com/200 ... y-and.html. Not really sure what to take away from that. Need to research more.
Quote: "Crawlprotect serves its purposes..."

Haha, I will avoid this tool based on the commentary following that quote. Hadn't even checked out the site until you mentioned it. What a mess!
However, double- and multi-extensions are one avenue in -- thus far as to OC rarely as .*.zip or as .zip content -- and from there the raw simplicity of shell commands is handled quite deftly by slipping further executable files into known positions. Dropping down to full account root as well as doing whatever putty.exe can do is then easy.
Filetype denials normally go in .htaccess in OC root so as to protect trees and branches. I've never even considered putting them in /catalog/ (and in /system/, /admin/, /image/, etc.); the basal position covers trees. Behaviors of various .css are "spotty" and .css is fairly easily read anyway as source.
Whatever logs are available are good reading, and good to read. When they are at their most revealing, as to robotic, malicious, and common visits, alike, they border upon droll (what the Russian Yandex does certain "trusted" robots also do and are plainly berserk for it). Most servers do not provide accounts with the most essential logs. The primary php error logs that can be positioned precisely where wanted, and that spring from php itself as set in php.ini, allow evaluations of file executions in particular directories. What comes out of OC captures is not trivial but is also not as flexible. "Compleat" primary Apache traffic logs don't miss much. Access to primary Apache configurations helps, but most folks do not have that.
Whereas Trend Micro came bundled with several machines down through the years and its performances and summary removals left me with nothing remarkably positive to say about it, I'm not terribly happy to see that "OSSEC is an Open Source Host-based Intrusion Detection System" and is "Backed by Trend Micro's Global Support Team" at http://www.ossec.net/. On the other hand, maybe they finally got it right. They do provide binaries and key(s), but, at any given time, only one for all Windows and one for each of only some Linux distributions (where mismatch would be problematical). That reduces OSSEC to an additional tool, requiring close attention to its ins and outs, which in the realm is probably about as good as it gets anyway. Sigh.
Ya gotta come away from McRee with at least this much: http://holisticinfosec.org/templates/be ... ryFlow.png . . .
I and furrywombat both seem to me to be amply unconcerned for whether crawlprotect has a happy rooting section or vigorous defender.
How on earth you dreamt up "not even using opencart" is not even of interest.
Regarding admin protections (above, 30th), a few further thoughts and means (06th) are at
http://forum.opencart.com/viewtopic.php ... 94#p460383