
I am using Ubuntu with Python 2.7.

I need to take all the files in a folder, count the lines in every file separately, and dump the results to a file.

I found how to do it directly from the terminal, using parallel processing, here:

It crashes when I try:

subprocess.Popen('ls %s* | parallel -k zcat {} | wc -l >%s'%(dir,outputfile), shell=True)

Now I am trying to use that terminal command from Python. It seems it can't take the list of files and treat them as files; it only counts the length of the file list.

p1 = subprocess.Popen(["ls", dest], stdout=subprocess.PIPE)
output = subprocess.check_output(["wc", "-l"], stdin=p1.stdout)

This gives me the number of files in the folder, when what I want is a list of how many lines there are in each file.

How can I use Python to execute a command that gives me the number of lines in each file in a folder, and does it using parallel (or any other good multi-core method)?

Tomos Williams
thebeancounter

4 Answers


You can use stuff in the standard library without having to shell out:

import os

from multiprocessing import Pool

folder = '.'

fnames = [os.path.join(folder, name) for name in os.listdir(folder)
          if os.path.isfile(os.path.join(folder, name))]


def file_wc(fname):
    with open(fname) as f:
        count = sum(1 for line in f)
    return count


pool = Pool()

print(pool.map(file_wc, list(fnames)))

If you want to record the file names as well:

def file_wc(fname):
    with open(fname) as f:
        count = sum(1 for line in f)
    return (fname, count)

print(dict(pool.map(file_wc, list(fnames))))
Jack Evans

Count the files, dirs, and path in a folder:

import os

path, dirs, files = next(os.walk("/home/my_folder"))
file_count = len(files)

Count the lines in a file. (I tried to find a way to count the lines without opening the file, but I couldn't.)

with open(<pathtofile>) as f:
    print(len(f.readlines()))

Now you have a list of the files (the variable files in the first example); you just need to join these two pieces of code to get the number of lines for every file in files.
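Joining the two pieces might look like the sketch below; the temp folder and `sample.txt` here stand in for `/home/my_folder` and its contents:

```python
import os
import tempfile

# Placeholder folder; in the question this would be "/home/my_folder".
folder = tempfile.mkdtemp()
with open(os.path.join(folder, 'sample.txt'), 'w') as f:
    f.write('one\ntwo\n')

# First piece: list the files in the folder.
path, dirs, files = next(os.walk(folder))

# Second piece: count the lines of each file.
line_counts = {}
for name in files:
    with open(os.path.join(path, name)) as f:
        line_counts[name] = len(f.readlines())

print(line_counts)
```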

Joao Vitorino

Actually you don't need external processes for this task; Python can do it for you. Here is a Python 3 snippet:

import os

for x in os.listdir():
    if os.path.isfile(x):
        with open(x, 'rb') as f:
            print('{} lines: {}'.format(x, sum(1 for line in f)))

Here is some additional information about listing files in a directory, getting the number of lines in a file, and counting lines in huge files.
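For the huge-file case, one common trick is to count newline bytes in fixed-size binary chunks instead of iterating line by line; a sketch:

```python
def count_lines_chunked(path, chunk_size=1024 * 1024):
    """Count lines by tallying newline bytes in fixed-size binary chunks."""
    count = 0
    with open(path, 'rb') as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            count += chunk.count(b'\n')
    return count
```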

running.t

You can use multiprocessing together with system calls. (You don't strictly need the queue here; each process could print its result directly instead.)

import multiprocessing as mp
from subprocess import Popen, PIPE


output = mp.Queue()


def count_lines(path, output):
    # Run `wc -l <path>` in a child process and push its stdout onto the queue.
    popen = Popen(["wc", "-l", path], stdout=PIPE, stderr=PIPE)
    res, err = popen.communicate()
    output.put(res.strip())


# List the directory; each line of `ls` output is one file name.
popen = Popen(["ls", "."], stdout=PIPE, stderr=PIPE)
res, err = popen.communicate()


# One worker process per file.
processes = [mp.Process(target=count_lines, args=(path.strip(), output))
             for path in res.split('\n') if path]

# Run processes
for proc in processes:
    proc.start()

for proc in processes:
    proc.join()


results = [output.get() for proc in processes]
non_empty = [result for result in results if result]
print(non_empty)

Reference:

https://sebastianraschka.com/Articles/2014_multiprocessing.html

Oluwafemi Sule