I'm trying to use a cron job (wget, a PHP script using curl, etc.) to run module code on the front end. Everything works fine when I visit the URL in my browser: the module runs and updates the DB. However, when I run wget or the curl script manually, nothing seems to happen.
Is there a reason for this? Why wouldn't fetching the URL with wget work the same as my browser?
Thanks.
PS: This is actually kind of weird; it's a multilingual site and it tries to store cookies. The output from wget and lynx suggests it is working... but the DB does not update at all.
wget doesn't keep cookies (or sessions) by default, so you can't reach pages in the backend that way.
But you wrote that you are running the script on the frontend, with no login or session required, right?
Copy the URL into an anonymous (private) browser window and watch what it does and doesn't do.
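For reference, if the page ever does depend on cookies or a session, wget can be told to persist them in a cookie jar; a minimal sketch (the domain and route are placeholders, and the jar path is just an example):

```shell
# Save cookies from the response and send them back on later requests.
# --keep-session-cookies also keeps cookies that have no expiry date.
wget --save-cookies /tmp/oc-cookies.txt \
     --load-cookies /tmp/oc-cookies.txt \
     --keep-session-cookies \
     -q -O /dev/null \
     "http://yourdomain.com/index.php?route=feed/feed"
```

Running the same command twice replays the cookies from the first response, which is usually enough to hold a PHP session.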
Mass update product descriptions
Hey,
Try this:
Code: Select all
wget -q -O /dev/null "http://yourdomain.com/index.php?route=feed/feed"
Change feed/feed to the route of your function.
You have to run the script from the front end. For security reasons, I would suggest setting a username and password in the file and passing them in the URL with &username=username&password=password.
Regards,
Joel.
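Once that command works from the terminal, the same call can be scheduled; a sketch, assuming the feed/feed route and a 15-minute interval (domain, route, and interval are all placeholders):

```shell
# Intended to run from cron, e.g. every 15 minutes via `crontab -e`:
#   */15 * * * * /path/to/this_script.sh
# -q suppresses wget's progress output; -O /dev/null discards the HTML,
# since only the side effect (the DB update) matters.
wget -q -O /dev/null "http://yourdomain.com/index.php?route=feed/feed"
```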
Thanks,
No, the code requires no login or session, and yes, it's on the front end. I'm not worried about users or bots hitting it themselves, as it will just initiate tasks, which isn't a bad thing and actually relieves server load by processing tasks more frequently.
I have tried using wget with -O, but when I look at the DB nothing changes; it only changes when I visit from a browser on the front end. I run wget from a Linux terminal, and although the output makes me think it worked, I see no change in the DB, which the script should update.
The feeds stuff, is that in OpenCart already? Maybe I should look at that code and see if I am missing something...
I closely inspected the output of wget, and at a glance it looked like it was working, but looking closer I see a discrepancy. I dumped out a PHP date/time to check: when I use the browser it's current, but when I use wget it's old. How do I get it to run properly and not from cache?
Is it OpenCart cache? vQmod cache? Something third-party? What would cause wget to get an old, stale version or read something out of cache? Like I said, it works fine via a web browser, so with wget it may not be running any code at all, just getting a cached HTML version?
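One way to rule out an HTTP-level cache between wget and the server is to make every request unique and ask caches not to serve a stored copy; a sketch, with the domain and route as placeholders:

```shell
# Build a URL with a unique timestamp so any intermediate cache treats
# each request as new (domain and route are placeholders):
NOCACHE_URL="http://yourdomain.com/index.php?route=feed/feed&nocache=$(date +%s)"
echo "$NOCACHE_URL"

# wget's --no-cache flag additionally sends no-cache request headers:
# wget --no-cache -q -O /dev/null "$NOCACHE_URL"
```

Note this only defeats HTTP caches; if the stale page comes from OpenCart's or vQmod's own file cache, it has to be cleared on the server side.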
I could be more helpful if you could describe what exactly it is you're trying to do. For the most part, if I were setting up a cron job to do anything with OpenCart, I'd just write it to issue a query to the DB directly. Though this may not be appropriate in every case, it can accomplish 99% of the tasks you'd want.
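As a sketch of that direct-query approach: the database name, table, and the UPDATE itself below are placeholder examples, not something from this thread; check your own table prefix and columns before using anything like it.

```shell
# Cron-friendly one-liner: run a query directly instead of hitting a URL.
# Credentials live in a protected option file rather than on the command
# line (the path, DB name, and query are all placeholders):
mysql --defaults-extra-file=/home/user/.my.cnf opencart_db \
      -e "UPDATE oc_product SET date_modified = NOW() WHERE status = 1;"
```

This skips the storefront entirely, so sessions, cookies, and page caches can't get in the way, at the cost of bypassing any PHP-side logic the controller would have run.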
As I have had a similar requirement several times, I put my ideas into a lightweight command-line tool called OCOK, which, among other things, solves the issue of calling OpenCart controllers from the command line, and thus as cron jobs...
All you need to do is create a controller in the admin package and call it with the ocok run command from the terminal.
I don't think you should use world-callable URLs for cron jobs. Hiding them behind the admin means only the command line can call them.
Documentation on installation and usage can be found here. Feel free to contact me if you have problems, feedback or suggestions...
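Assuming OCOK's run command takes a controller route as its argument (a guess; check the OCOK documentation for the real syntax and the tool's actual path), a scheduled call might look like:

```shell
# Hypothetical invocation; the path and route are placeholders.
# To schedule it, e.g. nightly at 02:00 via `crontab -e`:
#   0 2 * * * /path/to/ocok run module/my_cron_task
/path/to/ocok run module/my_cron_task
```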