
I'm creating a tool that uses P4Python's run_sync. I've noticed that if a file cannot be overwritten, e.g. it is an open .exe, P4Python waits about 2 minutes and makes 10 attempts to overwrite it. This takes too long, and I need some way to shorten this time or interrupt the operation.

I know that from the command line the time could be shortened with:

p4 -r[number of tries] -vnet.maxwait=[seconds of waiting]

But with P4Python, global flags cannot be passed this way, and I cannot find a way to set those parameters.

The other solution would be to send a signal that stops the sync, but I could not find a way to do that either.

What can I do?

Kowalski Paweł
  • You could write a context manager for the run method that runs asynchronously to the main thread and gets killed if it runs too long. You would have to make sure to only instantiate the connection on the asynchronous thread to avoid connection conflicts, though – MaVCArt Jul 28 '17 at 16:44
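A minimal sketch of the idea in that comment (the helper name and parameters are mine, not from P4Python): run the blocking call on a daemon thread and stop waiting after a timeout. Python threads cannot be forcibly killed, so a stuck sync may keep running in the background; this only unblocks the caller.

```python
import threading

def run_with_timeout(fn, timeout=30):
    """Call fn() on a daemon thread; give up waiting after `timeout` seconds.

    The worker thread itself cannot be killed, so a stuck operation keeps
    running in the background -- this only stops the caller from blocking.
    """
    result = {}

    def worker():
        try:
            result['value'] = fn()
        except Exception as exc:
            result['error'] = exc

    t = threading.Thread(target=worker, daemon=True)
    t.start()
    t.join(timeout)
    if t.is_alive():
        raise TimeoutError('operation still running after %s seconds' % timeout)
    if 'error' in result:
        raise result['error']
    return result.get('value')
```

To use this with P4Python, create, connect, and disconnect the `P4` object entirely inside the callable, so the connection lives only on the worker thread (the connection-conflict caveat from the comment above), e.g. `run_with_timeout(lambda: do_sync('//depot/...'), timeout=60)` where `do_sync` is your function that connects and calls `run_sync`.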

1 Answer


There is an open request for this functionality to be exposed via P4Python, to which I have added the details of this post. As a workaround, you could try overriding the 'p4.run_sync' method and changing the default number of times it tries to sync a file.

You could also do a system call to 'p4'.
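A hedged sketch of that second workaround, assuming the 'p4' binary is on PATH and the workspace is already configured (the depot path and function names here are illustrative, not from the answer). Note the later comment reporting that these flags had no effect on locked-file retries in practice; `subprocess.run`'s own `timeout` is an independent safety net that kills the child process regardless.

```python
import subprocess

def build_sync_cmd(depot_path, retries=1, maxwait=5):
    """Build a 'p4 sync' command line.

    Global flags must come before the subcommand: -r caps the number of
    retry attempts and -vnet.maxwait caps the seconds waited per attempt.
    """
    return ['p4', '-r%d' % retries, '-vnet.maxwait=%d' % maxwait,
            'sync', depot_path]

def sync(depot_path):
    # timeout= makes subprocess kill the child if it runs longer than
    # 120 seconds, whether or not the p4 flags are honoured.
    return subprocess.run(build_sync_cmd(depot_path),
                          capture_output=True, text=True, timeout=120)
```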

P4Jen
  • Thank you for the reply. I'll just leave it for now then; I don't want to modify P4Python or change the application to use the command line – Kowalski Paweł Apr 28 '17 at 11:07
  • I know this is old, but this answer doesn't actually appear to work. From the command line, neither the -r nor the -vnet.maxwait switch has any effect on attempts to rename over locked files. These two commands both make 10 attempts and take the same length of time: p4 -r1 -vnet.maxwait=1 sync -f and p4 sync -f – Boon Jul 28 '23 at 01:23