Cron job keeps failing, what am I doing wrong?

purepixels · New member · Joined: Jul 5, 2019 · Messages: 3
Hi all,

I'm working with DirectAdmin 1.646 on a CentOS 7 VPS. I don't know what to use to start the command. I'm not a programmer and don't have enough knowledge to get this cron command set up right, so the job keeps failing. I've tried to find documentation to learn this kind of thing, but either I'm searching for the wrong topics or the right documentation is hard to find. So if someone can point me in the right direction, I'd be very thankful.

I'm trying to set up a cron job for a client's website. I have two URLs: one is a processing URL, the other is a trigger URL. Both URLs are like this:

trigger

and

processing

The suggestion is to run the trigger job as needed, a few times a day during working hours. The processing URL needs to be called every x minutes; for me, every 15 minutes will be fine. I get the timing part. Setting up a schedule to make this work is not the issue. My issue is with setting up the command correctly. The command part is where I could use some help.

Do I start the command like this: /usr/bin/wget, or is it enough to put just wget as the first part of the command? I read that since it's a URL on the same server, I don't need to use the curl command; if I understand correctly, that is used to import from remote locations. In my case, all the data is located in the same account as the website, under the DirectAdmin user. And then there is a series of options that can alter the cron's output (-O, -s, -L, all sorts of things) which I have a hard time finding out the meaning of. I have no idea whether I need any of them in my command line. In the examples in the documentation I have from the import software, the cron jobs are shown without any additions to the command.
And another thing about the URL to be called by the cron: do I need to put it between "url" or 'url', and does it matter whether I use ' or "?
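For what it's worth, a typical crontab entry for this kind of job looks roughly like the sketch below. The schedule and URL here are placeholders, not your real values; the -q and -O flags are standard wget options:

```shell
# Run every 15 minutes. Quote the URL so the shell does not
# interpret the & characters inside the query string.
# -q            : quiet mode, no progress output in the cron mail
# -O /dev/null  : write the fetched page to /dev/null instead of saving a file
*/15 * * * * /usr/bin/wget -q -O /dev/null "https://example.com/processing-url"
```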

As you can see, a real noob here! I hope someone can point me in the right direction and help me understand what I'm doing wrong, and why. I've tried searching for a document that explains all the parts of a cron command, but I haven't found a guide that helped me so far.

Many thanks in advance for any help!
 
Uhm, not sure why the URL appears twice in my message, and why trigger/processing moved to a new line; those are supposed to be the last part of both URLs.
 
The two separate lines are there because the full lines are recognized as URLs while the bare words aren't, for some reason, so that's probably why they ended up on new lines here on the forum.

I found this:

You are not fetching a file, so I don't see why you'd use wget or curl. Just enter the URLs, that's all, with php in front of them, as @Peter Laws stated.
But are you sure it must be wp-load and not wp-cron? I have the idea that wp-cron.php should be used in those lines instead. But I could be mistaken.
 
Hi Peter and Richard,

Thanks for your fast replies to my request for a bit of help. I tried setting up the job as Peter suggested in his sample, but in that case the URL parameters get cut off. The cron output email suggests only part of the URL was loaded: it truncates the complete URL at the first &. I tried adding "" around the URL, and that seems to do the trick in combination with php as the first part of the command.
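For anyone reading along, a minimal shell sketch of what goes wrong at the first & (the URL here is a made-up placeholder):

```shell
#!/bin/sh
# In the shell, an unquoted & means "run the command so far in the
# background" and start a new command, so everything after the first &
# in the URL is lost. Quoting the whole URL keeps the ampersands as
# literal characters.
url='https://example.com/wp-load.php?import_key=KEY&import_id=1&action=trigger'

# Quoted: the full URL, including everything after the first &, survives.
echo "$url"
```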

@Richard - WPAllImport shows the cron job URLs with wp-load.php in their examples. My knowledge of this stuff is so limited that I don't dare to argue with them or with you. I just trust the documentation and copy-paste their suggestions.

What I don't get is why you add /usr/... in front of the curl and wget lines, but php without /usr/... in front of it. And why the -O option with the wget command but not with the php command. So much to learn about this stuff.
Do you have any idea of a document I can learn from? Or books? Or anything else. I try to understand these commands, but I have a hard time finding good documentation on them.

Thanks again for your help! For now the cron job seems to work. That was the main thing.
 
Hello.

It could be that you don't need to use the URL at all, just the local path to the script. We always use the path, like this:
Code:
/usr/local/php74/bin/php /home/user/domains/domain.com/public_html/wp-load.php?import_key=[MY_SECRET_KEY]&import_id=[MY_IMPORT_ID]&action=processing

That would be the correct way to do it. Of course, replace "user" with the correct name, domain.com with the correct domain, and fill in the key.
And replace php74 with your version of PHP, or drop the path entirely and use just php. Test whether it works.
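To make the trade-off concrete, here is a hedged crontab sketch; the schedule and the script path are made up for illustration:

```shell
# Full path to the interpreter: works even when cron's minimal
# PATH does not include the PHP binary.
0 9 * * * /usr/local/php74/bin/php /home/user/some-script.php

# Bare name: shorter, but only works if php is on cron's PATH.
0 9 * * * php /home/user/some-script.php
```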
 
What I don't get is why do you add the /usr/... to the curl and wget lines, and php without a /usr/... in front of it.
Forgot this one. That is done because wget or php is not in the PATH on all systems, and without the full path the cron job would not work there.
I don't use it like that myself, because I have them in my PATH.

Also be aware that the paths can differ between distros. For example, what is /usr/local/bin/curl on one system is /usr/bin/curl on my CentOS 7 system.
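One way to sidestep those path differences is to set PATH at the top of the crontab itself; variables declared there apply to every job below them. A sketch, with common default directories that you'd adjust to your own system:

```shell
# Environment lines at the top of a crontab apply to all jobs below.
PATH=/usr/local/bin:/usr/bin:/bin

# Now bare command names resolve regardless of where the distro put them.
# -s : silent mode, so the cron mail stays clean.
*/15 * * * * curl -s "https://example.com/processing-url" > /dev/null
```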

Command line options can also be found on the net, like here:

To learn about crons, maybe this is a good place:
 
Hello.

It could be that you do not need to use the url but the relative path. We always use the relative path, so like this:
Code:
/usr/local/php74/bin/php /home/user/domains/domain.com/public_html/wp-load.php?import_key=[MY_SECRET_KEY]&import_id=[MY_IMPORT_ID]&action=processing
This will not work: when a script is invoked via the command line, parameters are passed in the $argv array, just like in C. The $_GET/$_POST associative arrays are only initialised when your script is invoked via a web server.
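To illustrate the point with a hedged sketch (URL, key, and paths are placeholders):

```shell
# Via the web server: PHP parses the query string into $_GET,
# so the import plugin sees its parameters. Quote the URL.
*/15 * * * * curl -s "https://example.com/wp-load.php?import_key=KEY&import_id=1&action=processing" > /dev/null

# Via the PHP CLI: there is no query-string parsing at all. Anything
# after the script name lands in $argv, and $_GET stays empty, so a
# line like this would NOT do what the plugin expects:
# */15 * * * * /usr/local/php74/bin/php /home/user/domains/domain.com/public_html/wp-load.php?import_key=KEY&import_id=1&action=processing
```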
 
are only initialized when your script is invoked via a web server.
Ah, okay, thanks.

Seems the solution was already there, but I had overlooked it:
It truncates the complete URL at the first &. I tried adding "" around the URL, and that seems to do the trick in combination with php as the first part of the command.
 