I'm using the pyKML module to extract coordinates from a KML file.
My Python code is as follows:
from pykml import parser
fileobject = parser.fromstring(open('MapSource.kml', 'r').read())
root = parser.parse(fileobject).getroot()
print(root.Document.Placemark.Point.coordinates)
However, on running this, I get the following error:
ValueError: Unicode strings with encoding declaration are not supported. Please use bytes input or XML fragments without declaration.
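Reading the message, it seems lxml wants bytes input when the file carries an XML encoding declaration, so my guess (untested) is that opening the file in binary mode would satisfy it:

from pykml import parser

# Opening in binary mode ('rb') should give fromstring the bytes input the error asks for
with open('MapSource.kml', 'rb') as f:
    root = parser.fromstring(f.read())
print(root.Document.Placemark.Point.coordinates)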
Looking for solutions, I also came across this post (http://twigstechtips.blogspot.in/2013/06/python-lxml-strings-with-encoding.html), based on which I tried the following (which I'm not sure is the correct approach):
from pykml import parser
from lxml import etree
from os import path
kml_file = open('MapSource.kml', 'r')
parser = etree.XMLParser(recover=True)
xml = etree.fromstring(kml_file, parser)
print(xml.Document.Placemark.Point.coordinates)
This gives me ValueError: can only parse strings, which I guess is because etree.fromstring expects a string (or bytes) rather than a file object. What is the correct way to parse the KML and get the coordinates at that structure?
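My best guess from the pykml documentation is that parser.parse can take a file object directly, so something along these lines might be the intended usage, though I'd appreciate confirmation:

from pykml import parser

# parser.parse accepts a file-like object; binary mode sidesteps the encoding-declaration error
with open('MapSource.kml', 'rb') as f:
    root = parser.parse(f).getroot()  # root should be the <kml> element
print(root.Document.Placemark.Point.coordinates)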