I'm currently experiencing high memory usage with Scapy 2.4.0 on Python 2.7, running on CentOS 7.4.
Originally I thought it was due to https://bitbucket.org/secdev/scapy/issues/5149/rdpcap-and-wrpcap-file-descriptor-leak, but that was fixed some time ago.
import os

import psutil
from guppy import hpy
from scapy.all import rdpcap

hp = hpy()


class SessionBuilder(object):
    def __init__(self):
        pass

    def get_sessions(self, pcap):
        # this heap always reports ~50 MB usage
        hp.heap()
        process = psutil.Process(os.getpid())
        # I expect this memory to be around 50 MB... but that's not always true
        print process.memory_info()
        # rdpcap reads the entire capture into a PacketList in memory
        opened_pcap = rdpcap(pcap)
        sessions = opened_pcap.sessions()
        # this heap always reports ~50 MB usage
        hp.heap()
        # I expect this memory usage to be larger, and it is
        print process.memory_info()
        return sessions
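For what it's worth, sessions() doesn't copy packets: it returns a dict of PacketLists that reference the same Packet objects rdpcap loaded, so as long as the returned dict is alive the whole capture stays in memory. A minimal sketch demonstrating that (small.pcap is a hypothetical capture file):

    from scapy.all import rdpcap

    plist = rdpcap("small.pcap")  # hypothetical small capture
    sessions = plist.sessions()
    flat_ids = set(id(p) for p in plist)
    for key, session in sessions.items():
        for pkt in session:
            # every packet in every session is the same object as in the flat list
            assert id(pkt) in flat_ids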
Inside another Python file:
from session_builder import SessionBuilder  # assuming the first file is session_builder.py


class SessionWorker(object):
    def __init__(self):
        self.sb = SessionBuilder()

    def work(self, pcaps):
        for pcap in pcaps:
            sessions = self.sb.get_sessions(pcap)
            # I then go about doing some things with these sessions
Apologies for any remaining mistakes in the code; it lives on an offline system, so this is a rough reconstruction of what I'm doing.
Every time I go around the loop there is a chance that the memory from the previous iteration stays allocated, and this compounds until the box has no memory left. (Not shown above, but there is logic in place that skips any pcap larger than a quarter of the available memory on the box, to ensure Scapy can open it and extract the sessions.)
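To make that growth visible per iteration, here is a sketch of how I could instrument the loop with psutil (sb is an assumed SessionBuilder instance and pcaps an assumed list of paths; RSS is the OS view, so it includes native allocations, and gc.collect() is forced to rule out lingering reference cycles):

    import gc
    import os

    import psutil

    process = psutil.Process(os.getpid())

    def log_rss(label):
        # resident set size in MB, as seen by the OS
        print "%s: %.1f MB" % (label, process.memory_info().rss / (1024.0 * 1024.0))

    for pcap in pcaps:  # same hypothetical list of paths as above
        log_rss("before " + pcap)
        sessions = sb.get_sessions(pcap)
        del sessions   # drop the only reference to the packets
        gc.collect()   # collect any cycles the packet objects may form
        log_rss("after " + pcap)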
According to "Using heapy to track down memory leaks in Django app", the heap figure reflects only the memory that Python itself is using, not anything allocated by underlying C code. I'm assuming Scapy uses C underneath?
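If that's right, the gap between heapy's figure and the process RSS should be measurable directly; a quick sketch (assuming hp, psutil, and os are already set up as above):

    heap = hp.heap()
    rss = psutil.Process(os.getpid()).memory_info().rss
    # heap.size only counts objects the Python VM knows about; the remainder
    # is native allocations plus allocator overhead that heapy cannot see
    print "python heap: %.1f MB" % (heap.size / (1024.0 * 1024.0))
    print "process rss: %.1f MB" % (rss / (1024.0 * 1024.0))
    print "unaccounted: %.1f MB" % ((rss - heap.size) / (1024.0 * 1024.0))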