Using the SCP command in a shell script to upload files

Time: 2022-5-3

Since SSH can execute commands on a remote machine, FTP clients have gradually fallen out of use.
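
For instance, a one-off command can be run on the remote host straight from the local shell, and a file can be pushed up with scp over the same connection machinery. This is only an illustration, not part of the original script; user@server and the file names are placeholders:

# Run a single command on the remote host over SSH
ssh user@server 'uptime'

# Copy one file to the remote host with scp
scp report.txt user@server:/tmp/report.txt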

Every time I update the code on the server I have to pack it, upload it, log in to the server, fix file permissions, refresh the cache, and so on. It is slow and error prone, so I wrote a script. Once it is running I can light a cigarette and sit in front of the computer in a daze, or play "Plants vs. Zombies" or something. Relax!

PS: Ant and Phing users can ignore this… I just like playing with shell.

The code is as follows:
#!/bin/sh
 
HOME='/cygdrive/d/public_html/myproject'
 
## Pack the local code first
## Exclude the following: .svn, *.bat, upload.sh, cache
## If there are many patterns to exclude, use --exclude-from=FILE instead,
## listing one pattern per line in FILE (a sketch of that variant follows the script)
cd $HOME
tar jcf tmp.tar.bz2       \
    --exclude='*.bat'     \
    --exclude='*.bz2'     \
    --exclude='*.gz'      \
    --exclude='.svn'      \
    --exclude='cache'     \
    --exclude='upload.sh' \
    *
 
## Upload the archive through SCP
## (user@server below is a placeholder for the real account and host)
scp tmp.tar.bz2 user@server:/home/public_html/myproject/tmp.tar.bz2
 
## Execute the remote commands over SSH
## Here another script, load.sh, is run as well
ssh user@server "
    cd /home/public_html/myproject
    tar jxf tmp.tar.bz2
    chown -R web:web *
    chmod -R 755 *
    sh load.sh
    rm -f tmp.tar.bz2
"
 
## Delete the local archive
rm -f tmp.tar.bz2
echo "Everything is done."
 
#Monday, January 11, 2010 by Verdana
# vim: set expandtab tabstop=4 shiftwidth=4:
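
If there really are many patterns to exclude, the --exclude-from variant mentioned in the script comments looks roughly like this. This is only a sketch; exclude.txt is my own file name, not something from the original:

# Write the patterns to a file, one per line
cat > exclude.txt <<'EOF'
*.bat
*.bz2
*.gz
.svn
cache
upload.sh
exclude.txt
EOF

# Then point tar at the list instead of repeating --exclude
tar jcf tmp.tar.bz2 --exclude-from=exclude.txt *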

SSH is configured for automatic (key-based) login, which you can refer to here.
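
Roughly, key-based automatic login is set up along these lines. This is a sketch rather than part of the original post, and user@server is again a placeholder:

# Generate a key pair locally (leave the passphrase empty for fully automatic logins)
ssh-keygen -t rsa

# Install the public key into the server's authorized_keys
ssh-copy-id user@server

# ssh/scp to that host should now work without a password prompt
ssh user@server 'echo login ok'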
