There are official Python bindings for the CLD3 neural network model, which is what Chrome uses for offline language detection.
sudo apt install -y protobuf-compiler
pip install gcld3
Like all the Python code from Google that I've used, it's unpythonic and just generally sucks to use, but at least it works well:
>>> import gcld3
>>> lang_identifier = gcld3.NNetLanguageIdentifier(0, 1000)  # min_num_bytes, max_num_bytes
>>> lang_identifier.Find<TAB>
lang_identifier.FindLanguage(          lang_identifier.FindTopNMostFreqLangs(
>>> a = lang_identifier.FindLanguage("This is a test")
>>> a
<gcld3.pybind_ext.Result object at 0x7f606e0ec3b0>
>>> a.<TAB>
a.is_reliable   a.language      a.probability   a.proportion
>>> a.language
'en'
>>> a = lang_identifier.FindTopNMostFreqLangs("This piece of text is in English. Този текст е на Български.", 5)
>>> a
[<gcld3.pybind_ext.Result object at 0x7f606e0ec4b0>, <gcld3.pybind_ext.Result object at 0x7f606e0ec570>, <gcld3.pybind_ext.Result object at 0x7f606e0ec470>, <gcld3.pybind_ext.Result object at 0x7f606e0ec5b0>, <gcld3.pybind_ext.Result object at 0x7f606e0ec530>]
>>> [r.language for r in a]
['bg', 'en', 'und', 'und', 'und']
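As the output shows, FindTopNMostFreqLangs pads its result list with 'und' (undetermined) entries when fewer than N languages are found. A small helper to filter those out, along with unreliable or low-confidence guesses, might look like this (a sketch: the Result namedtuple below is a stand-in mirroring the attributes of gcld3's Result objects, and the 0.7 threshold is an arbitrary choice):

```python
from collections import namedtuple

# Stand-in with the same attribute names as gcld3's Result objects;
# in real use you'd pass in the list returned by FindTopNMostFreqLangs().
Result = namedtuple("Result", "language probability is_reliable proportion")

def detected_languages(results, min_probability=0.7):
    """Drop 'und' padding entries and unreliable or low-confidence guesses."""
    return [
        r.language
        for r in results
        if r.language != "und"
        and r.is_reliable
        and r.probability >= min_probability
    ]
```

For the mixed English/Bulgarian example above, this would reduce the five results down to just ['bg', 'en'].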
You can also try the unofficial Python bindings at https://github.com/bsolomon1124/pycld3.