Code:

import re

# Pre-compiled pattern type; the incoming list mixes plain strings
# and compiled regexes, so this is used to tell them apart.
type_regex = type(re.compile(""))

def search(self, arg1, arg2):
    if isinstance(arg2, type_regex):
        # arg2 is a compiled regex: return the length of the first match, or 0
        match = re.search(arg2, arg1)
        if match:
            return len(match.group())
        return 0
    # arg2 is a plain string: return its length if it occurs in arg1, else 0
    if arg2 in arg1:
        return len(arg2)
    return 0
I have a Python script in which, to serve one request, this function gets called around 452,900 times (a figure I found by profiling the code with pprofile).
Some notes about the function:
- arg1 is a string and stays constant for the duration of one request.
- arg2 is the i-th entry of a long list of patterns (a mix of plain strings and compiled regexes) that must each be matched against arg1 in order to find the matching length.
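To make the usage concrete, here is a self-contained sketch of the per-request loop as I understand it (the pattern list, sample text, and the helper name match_length are illustrative, not from the real script):

```python
import re

# Same type check the original function uses to tell regexes from strings.
type_regex = type(re.compile(""))

def match_length(text, pattern):
    """Length of pattern's first match in text (regex) or of pattern itself
    if it is a substring of text; 0 when there is no match."""
    if isinstance(pattern, type_regex):
        # Compiled pattern: call .search directly instead of going through
        # module-level re.search, which re-dispatches on every call.
        m = pattern.search(text)
        return len(m.group()) if m else 0
    return len(pattern) if pattern in text else 0

# Illustrative pattern list: mixed compiled regexes and plain strings.
patterns = [re.compile(r"\d+\.\d+\.\d+\.\d+"), "foo", re.compile("ba+r"), "qux"]
text = "host 10.0.0.1 foobaaar"      # arg1, constant for one request
lengths = [match_length(text, p) for p in patterns]
# lengths -> [8, 3, 5, 0]
```

One small win visible here: when arg2 is already a compiled pattern, calling pattern.search(text) skips the per-call dispatch that re.search(arg2, arg1) performs, which adds up over hundreds of thousands of calls.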
I came up with this function, but its time and CPU consumption have become a bottleneck for me: because of the many search operations, CPU usage spikes very quickly.
Ideas I have thought of so far:
- Use a dictionary comprehension instead of the if-else, as suggested here.
- Use some kind of tree-based search, the way tries are used for strings.

Both are only rough ideas at this point; I haven't managed to put either of them into practice.
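For the tree-based idea, here is a minimal sketch (all names and sample patterns are made up): split the pattern list once per request into plain strings and regexes, then test all the plain strings with one combined, escaped alternation instead of N separate substring scans. A real trie/Aho-Corasick automaton (e.g. the pyahocorasick package) would be the stronger version of the same idea:

```python
import re

# Illustrative pattern list: three plain strings and one compiled regex.
patterns = ["alpha", "beta", re.compile(r"\d{4}"), "gamma"]

plain = [p for p in patterns if isinstance(p, str)]
regexes = [p for p in patterns if not isinstance(p, str)]

# One combined literal pattern; longest-first so overlapping literals
# prefer the longest alternative. Built once per request, not per call.
combined = re.compile("|".join(re.escape(s)
                               for s in sorted(plain, key=len, reverse=True)))

def match_lengths(text):
    # Plain strings: a single scan of text finds the literals that occur.
    # Caveat: a non-overlapping scan can miss a literal nested inside a
    # longer matched literal; a true Aho-Corasick automaton avoids that.
    found = set(combined.findall(text))
    lengths = {s: (len(s) if s in found else 0) for s in plain}
    # Regexes still run one by one, exactly as in the original function.
    for rx in regexes:
        m = rx.search(text)
        lengths[rx.pattern] = len(m.group()) if m else 0
    return lengths
```

This keeps the per-pattern result the original function produces, but replaces N substring scans of arg1 with one combined scan for the string patterns.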
I have no control over the regexes in arg2, so optimizing the individual regexes wouldn't help here.
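Even without changing any individual regex, it may be possible to OR them together into one compiled alternation with synthetic named groups, so that one pass over arg1 replaces one search per regex. This is only a sketch under strong assumptions: it breaks if any pattern contains numeric backreferences, its own named groups, or pattern-wide inline flags, and a non-overlapping scan may miss a match of one pattern nested inside a match of another:

```python
import re

# Illustrative raw regexes (in reality these come from elsewhere, unchanged).
raw = [r"\d+\.\d+\.\d+\.\d+", r"[a-f0-9]{32}", r"ERROR-\d+"]

# Wrap each pattern in a synthetic named group p0, p1, ... so we can tell
# which alternative matched. Compiled once, not per call.
merged = re.compile("|".join(f"(?P<p{i}>{r})" for i, r in enumerate(raw)))

def first_hits(text):
    # First-match length per pattern index, 0 if never matched.
    hits = {i: 0 for i in range(len(raw))}
    for m in merged.finditer(text):
        # m.lastgroup is the synthetic group name of the matched alternative;
        # unreliable if the raw regexes define named groups of their own.
        i = int(m.lastgroup[1:])
        if hits[i] == 0:
            hits[i] = len(m.group())
    return hits
```

If the merge is valid for a given pattern set, the regex engine scans arg1 once instead of once per pattern, which is where most of the CPU seems to be going.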
Please suggest ideas for how I can optimize this function.
Thanks