
For Python 3.x, `dateutil.parser` is very slow: I measured 3.25 s to parse 1800 datetime strings. The `datetime.strptime` method (with `"%z"`) was even slower. Both tests used a list comprehension.

Are these my only options?

All other answers I found only cover the conversion itself, without regard to efficiency on large datasets, and I need to convert large numbers of strings to datetime objects.
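For concreteness, here is a sketch of the two approaches I mean. The sample strings are made up (I'm assuming input shaped like `datetime.isoformat()` output); substitute your real data and format string:

```python
from datetime import datetime
from dateutil import parser

# Hypothetical sample input: ISO 8601 strings with a UTC offset.
strings = ["2019-02-26T20:30:00+00:00"] * 1800

# Approach 1: dateutil.parser -- roughly 3.25 s for 1800 strings in my test.
parsed = [parser.parse(s) for s in strings]

# Approach 2: datetime.strptime with "%z" -- even slower in my test.
# Note: "%z" accepts an offset with a colon ("+00:00") on Python 3.7+.
parsed = [datetime.strptime(s, "%Y-%m-%dT%H:%M:%S%z") for s in strings]
```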

laserpython
  • See [How do I translate an ISO 8601 datetime string into a Python datetime object?](https://stackoverflow.com/questions/969285/how-do-i-translate-an-iso-8601-datetime-string-into-a-python-datetime-object) for a bunch of different options. Performance is highly dependent on the nature of your dataset, so the best approach would be to time various methods against a decent size sample of your specific dataset. – benvc Feb 26 '19 at 20:30
  • If your strings are all perfectly regular, you could slice them up manually, convert to `int`s and feed them into the `datetime` constructor with an appropriate timezone instance. You could even look at [fastnumbers](https://pypi.org/project/fastnumbers/) for additional speed, though I have no experience with that library. – Aaron Feb 26 '19 at 20:38
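A rough sketch of the slicing idea from Aaron's comment above, assuming every string has the fixed layout `YYYY-MM-DDTHH:MM:SS+HH:MM` (any other shape would need different slice positions):

```python
from datetime import datetime, timedelta, timezone

def parse_fixed(s):
    # Assumes the fixed layout "YYYY-MM-DDTHH:MM:SS+HH:MM";
    # the slice positions below are hard-coded for that shape.
    sign = 1 if s[19] == "+" else -1
    tz = timezone(sign * timedelta(hours=int(s[20:22]), minutes=int(s[23:25])))
    return datetime(int(s[0:4]), int(s[5:7]), int(s[8:10]),
                    int(s[11:13]), int(s[14:16]), int(s[17:19]), tzinfo=tz)

print(parse_fixed("2019-02-26T20:30:00+00:00"))
# 2019-02-26 20:30:00+00:00
```

Whether this actually beats the library parsers on your data is something to check with `timeit` against a representative sample.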

0 Answers