
I normally use a bash script to grab all the files onto my local machine and glob to process them. I'm just wondering what would be the best way to use Python (instead of another bash script) to ssh into each server and process those files.

My current program runs as

    import glob

    for filename in glob.glob('*-err.txt'):
        with open(filename, 'r') as input_open:
            for line in input_open:
                # do something with line

My files all end in -err.txt, and on each remote server they live in a directory with the same name, /documents/err/. I am not able to install third-party libraries as I don't have the permission.

UPDATE

I am trying not to scp the files from the server but to read them on the remote server instead.

I want to use a local Python script, run LOCALLY, to read files on the remote server.

user3669481

2 Answers


The simplest way to do it is to use paramiko with scp to copy files from the remote server over SSH (see How to scp in python?).

If you are not allowed to install any libraries, you can create an SSH key pair so that connecting to the server does not require a password (https://www.debian.org/devel/passwordlessssh). You can then do, for each file:

import os
# Copy one file from the remote machine; both paths are placeholders.
os.system('scp user@host:/path/to/file/on/remote/machine /path/to/local/file')

Note that using system is usually considered less portable than using a library. If you give a script that uses system('scp ...') to someone who does not have an SSH key pair set up, they will run into problems.
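Since the update asks to read the files in place rather than copy them, here is a minimal standard-library sketch of the same idea using subprocess; `user@host` is a placeholder, and it assumes the same passwordless key-based login described above:

```python
import subprocess

def ssh_argv(host, remote_command):
    # Build the argv list that runs remote_command on host via ssh.
    # Assumes key-based (passwordless) login is already set up.
    return ['ssh', host, remote_command]

def read_remote_err_files(host, remote_dir='/documents/err'):
    # List the *-err.txt files on the remote side, then stream each one
    # back over ssh and yield (path, line) pairs; nothing is written to
    # the local disk.
    listing = subprocess.run(ssh_argv(host, 'ls %s/*-err.txt' % remote_dir),
                             capture_output=True, text=True, check=True)
    for path in listing.stdout.split():
        contents = subprocess.run(ssh_argv(host, 'cat ' + path),
                                  capture_output=True, text=True, check=True)
        for line in contents.stdout.splitlines():
            yield path, line
```

Each call opens a separate SSH connection; if there are many files, a single `ssh host 'cat /documents/err/*-err.txt'` would be cheaper, at the cost of losing the per-file boundaries.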

Dmitry Torba

Looks like you want to use a local Python script remotely. This has been answered here.
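If the goal is to ship the script's logic to the server rather than copy the files down, one standard-library way to sketch it (the host name and remote interpreter path are assumptions) is to pipe the local script into a remote Python over ssh:

```python
import subprocess

def remote_python_argv(host, python='python3'):
    # '-' tells the remote interpreter to read the program from stdin.
    return ['ssh', host, python, '-']

def run_local_script_remotely(host, script_path, python='python3'):
    # Feed the local script into the remote interpreter, so it executes
    # on the server where the -err.txt files live.
    with open(script_path, 'rb') as script:
        return subprocess.run(remote_python_argv(host, python),
                              stdin=script, capture_output=True, text=True)
```

The script then runs with the server's filesystem, so the original `glob.glob('*-err.txt')` loop works unchanged from `/documents/err/`.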

Steven Shaw