
I am using multiprocessing in Python to run a function for different arguments. While executing, the function prints various outputs. I tried to write that output to a text file, but that doesn't work.

import multiprocessing

def fun(I):
    # calculations
    print(output)
    f = open('test_' + str(I) + '.txt', 'w')
    f.write(str(output))
    f.close()

for n in range(100):
    multiprocessing.Process(target=fun, args=(n,)).start()

With this code the function runs successfully, but it doesn't write the output to the txt file.

How can I solve this issue, and is there a better way to save the output to a file other than txt format, for example using pickle or joblib?
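For example, I imagine something along these lines with pickle (just a rough sketch of what I mean, not code I have tested; the squared value is only a placeholder for my real calculation):

import pickle

def fun(I):
    # calculations (placeholder value just for this sketch)
    output = I * I
    # dump the result to a per-process .pkl file instead of a .txt file
    with open('test_' + str(I) + '.pkl', 'wb') as f:
        pickle.dump(output, f)

    # later the result could be loaded back with:
    # with open('test_' + str(I) + '.pkl', 'rb') as f:
    #     output = pickle.load(f)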

Thanks

Edit

Just wanted to include the code that I tested and that works for me. The comments do not allow proper code formatting, hence this answer.

I have tested your code with a random output and it is working fine on my system.

import multiprocessing
import random


def fun(I):
    #calculations
    output = random.randint(100, 1000)
    print(output)
    f = open('test_' + str(I) + '.txt', 'w')
    f.write(str(output))
    f.close()

for n in range(100):
    multiprocessing.Process(target=fun, args=(n,)).start()

So the issue is in some other part of your code; to find it, you will need to share more of that code.
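One thing worth ruling out, though this is only a guess since you have not said which platform you are on: with the spawn start method (the default on Windows and on recent macOS), the process-starting loop has to sit under an `if __name__ == '__main__':` guard, otherwise the children fail before `fun` ever runs. Joining the workers also makes the parent wait until every file has been written. A sketch of that variant of the test above:

import multiprocessing
import random


def fun(I):
    # calculations (random placeholder output, as in the test above)
    output = random.randint(100, 1000)
    print(output)
    with open('test_' + str(I) + '.txt', 'w') as f:
        f.write(str(output))


if __name__ == '__main__':
    processes = [multiprocessing.Process(target=fun, args=(n,)) for n in range(100)]
    for p in processes:
        p.start()
    for p in processes:
        p.join()  # wait for every worker before the parent exits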

  • Does this answer your question? [Python multiprocessing safely writing to a file](https://stackoverflow.com/questions/13446445/python-multiprocessing-safely-writing-to-a-file) – mkrieger1 Jun 05 '21 at 11:03
  • As for what's "better" – that depends entirely on what you consider better. You're putting `pickle` and `joblib` side by side, but they have very little to do with each other. – Grismar Jun 05 '21 at 11:03
  • @Grismar, but I was writing to different files opened with different filenames, so how does that overwrite anything... – vishak raj Jun 05 '21 at 11:06

0 Answers