Remote SSH command execution

Jordz2203

Hey everyone, our server is running CloudLinux and DirectAdmin, so we have a terminal within DA as well as remote SSH via the DirectAdmin username and password of the profile.

We are trying to run some SSH code remotely (we have a PHP script that connects and executes the command), but the thing is that as soon as the SSH command gets executed, it disconnects from the session, cancelling the SSH command. How can we go about running this SSH command without needing to keep the PHP script running? The script times out after 30 seconds and the command takes longer than that to execute. I thought of using screen, but I've never used it with this PHP library, and I'm not even sure DirectAdmin's SSH allows screen.
 
Hi, add "&" at the end of the command and it will execute in the background, so even if you disconnect, it keeps running.

I had a problem like this too with the "pm2" Node.js module. It would disconnect while executing with certain argument parameters.

Code:
php test.php &
 
but the thing is that as soon as the SSH command gets executed, it disconnects from the session, cancelling the SSH command.
I need clarification on exactly what you mean by this.


How can we go about running this SSH command without needing to keep the PHP script running? The script times out after 30 seconds and the command takes longer than that to execute.

Is the script running for 30 seconds and then dying, or is it dying as soon as it's executed?

I don't think this has anything to do with SSH. I think your script is taking a long time to run. The bigger question is: why is it taking so long to execute? For example, if the script is trying to communicate with a remote server and the server you are executing the script from is unable to reach that server (e.g. the outbound port is not open), then the script will always time out.

You can increase the max_execution_time setting when executing the script to see if that helps:

php -d max_execution_time=60 myscripttoexecute.php

But again, if the timeout is happening because of something outside the control of the PHP script, then the script will continue to time out.
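
If the long runtime is expected, you can also lift the limit from inside the script with PHP's set_time_limit(); a minimal sketch (the 120-second value is just an example):

Code:
<?php
// In-script alternative to the -d flag. set_time_limit(0) removes the
// limit entirely; use that with care if the script is web-facing.
set_time_limit(120); // example: allow this run up to 120 seconds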
 
I need clarification on exactly what you mean by this.
So regarding your first question: I have a PHP script that uses a library called phpseclib (Net_SSH2), and it is meant to perform SSH commands. I did this EXACT method on another DirectAdmin server, with a far, far longer command, maybe 30x as long, and it worked. But the difference now is that the earlier one was an internal script, whereas this is essentially a webhook receiving form data; that's the only technical difference.

When I run the command (see the code in this gist; the command is the variable $script), it works but dies at a certain point if I remove the trailing ampersand, yet if I keep the trailing ampersand, it doesn't execute any of the SSH commands. I'm not sure why.
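
For reference, the shape of the gist is roughly this (a trimmed, untested sketch; the variables are placeholders for the gist's real values):

Code:
<?php
// Trimmed sketch of the gist's pattern, using phpseclib 1.x.
include 'Net/SSH2.php';

$ssh = new Net_SSH2($daHostname, 22);
if (!$ssh->login($daUsername, $daPassword)) {
    exit('SSH login failed');
}

$ssh->setTimeout(0);       // 0 = no phpseclib-side timeout on exec()
echo $ssh->exec($script);  // blocks until $script finishes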
 
Update on something I noticed: for some reason, if I change the command slightly, the following happens...
If I replace $script with
Code:
cd public_html && rm index.html &
it works perfectly fine. If I instead run
Code:
cd public_html && rm index.html && wget mydomain.com/site-zip.zip &
then it will perform the rm index.html but NOT the wget. If I put the wget BEFORE rm index.html, then it will perform neither.
 
If you end a command with a single ampersand (&), you're just telling the shell to put that process in the background.

Running:

php test.php &

is the same thing as:

1. running php test.php
2. pressing Ctrl+Z
3. typing bg and hitting Enter

I'm not sure what this has to do with executing your script.

The double ampersand (&&) is a command-chaining operator.

Basically, the difference is: with double ampersands you're telling the chain to only execute the next command if the previous command exited with a status of 0, whereas a semicolon chain is oblivious to the exit status of the previous command; it just executes each command sequentially after the previous one.
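
A quick self-contained illustration (hypothetical one-liners, using PHP's shell_exec() and the shell built-in false, which always exits non-zero):

Code:
<?php
// '&&' short-circuits when the previous command fails; ';' does not.
echo shell_exec('false && echo never-printed');  // prints nothing
echo shell_exec('false ; echo always-printed');  // prints "always-printed"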

I typically always use a semi-colon (;) instead of the double ampersand. Maybe I shouldn't.

Still not entirely sure what this has to do with a timeout in running the script. The gist you posted appears to take arguments from $_POST, which would not be command-line arguments, so I'm a little confused there.

But in line 44 of that gist:

$ssh = new Net_SSH2($daHostname, 22);

What I'm saying is: if the server you are executing this script from (whether from the command line or from a browser) is not able to connect to $daHostname on port 22, then the script is going to time out. Increasing the timeout won't help, because the issue is that the server can't connect to $daHostname on port 22; that would need to be rectified.
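
A quick hypothetical pre-flight check you could drop in above that line (the 5-second timeout is arbitrary):

Code:
<?php
// Can we even reach $daHostname on port 22? If not, Net_SSH2 will just
// sit there until something times out.
$sock = @fsockopen($daHostname, 22, $errno, $errstr, 5);
if (!$sock) {
    exit("Cannot reach $daHostname:22 - $errstr ($errno)");
}
fclose($sock);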

Additionally, the script appears to be downloading a file, unzipping it, importing a database, and doing some more WordPress stuff. How fast the server is able to download the zip file and how quickly the server can unzip the file is going to play a role in the timeout as well. A zip file that is 1MB probably isn't going to take as long as a zip file that is 24GB.
 
F.Y.I. -

1. You may use nohup <some command> & to run scripts in the background; output will be appended to nohup.out. Then you are safe to exit SSH and leave it running.

# nohup /usr/local/php??/bin/php /home/xxx/test.php &

Sometimes I run yum -y update with nohup (to keep it from being killed if the connection drops mid-run).

ref.: https://linux.die.net/man/1/nohup
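
Applied to the phpseclib setup in this thread, that might look roughly like this (an untested sketch; the command and log path are placeholders, and $ssh is a logged-in Net_SSH2 instance as in the earlier sketch):

Code:
<?php
// Hand the whole chain to nohup and detach it, so it survives the SSH
// channel closing; output goes to a log file instead of the channel.
$cmd = 'cd public_html && rm index.html && wget mydomain.com/site-zip.zip';
$ssh->exec('nohup sh -c ' . escapeshellarg($cmd) . ' > deploy.log 2>&1 &');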

---

2. You may check the max_execution_time parameter in php.ini.

---

3. You may also check/add the SSH option ServerAliveInterval to help keep a long-running SSH session alive (on a best-effort basis).
 