Similar questions:
- GeoSpark library using Spark Java
- From ResultSet to Spark dataframe using Java
- GeoSpark using Spark / Java
- Undefined function: 'ST_GeomFromText' Using Spark / Java
I think you haven't followed the GeoSparkSQL-Overview quick start thoroughly:
- As per the quick start, you need to add the GeoSpark core and GeoSparkSQL dependencies to your project's POM.xml or build.sbt:
<!-- GeoSpark lib docs - https://datasystemslab.github.io/GeoSpark/api/sql/GeoSparkSQL-Overview/#quick-start -->
<dependency>
    <groupId>org.datasyslab</groupId>
    <artifactId>geospark-sql_2.3</artifactId>
    <version>1.3.1</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.vividsolutions/jts -->
<dependency>
    <groupId>com.vividsolutions</groupId>
    <artifactId>jts</artifactId>
    <version>1.13</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.datasyslab/geospark-viz -->
<dependency>
    <groupId>org.datasyslab</groupId>
    <artifactId>geospark-viz_2.3</artifactId>
    <version>1.3.1</version>
</dependency>
<dependency>
    <groupId>org.datasyslab</groupId>
    <artifactId>geospark</artifactId>
    <version>1.3.1</version>
</dependency>
- Declare your SparkSession:
SparkSession sparkSession = SparkSession.builder()
        .config("spark.serializer", KryoSerializer.class.getName())
        .config("spark.kryo.registrator", GeoSparkKryoRegistrator.class.getName())
        .master("local[*]")
        .appName("myGeoSparkSQLdemo")
        .getOrCreate();
- Register all the functions from geospark-sql_2.3 with the sparkSession so that they can be used directly in Spark SQL; skipping this step is what produces the Undefined function: 'ST_GeomFromText' error.
// register all functions from geospark-sql_2.3 to sparkSession
GeoSparkSQLRegistrator.registerAll(sparkSession);
Now here is the complete working example (with the imports it needs):
import org.apache.spark.serializer.KryoSerializer;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.datasyslab.geospark.serde.GeoSparkKryoRegistrator;
import org.datasyslab.geosparksql.utils.GeoSparkSQLRegistrator;
import static org.apache.spark.sql.functions.expr;

SparkSession sparkSession = SparkSession.builder()
        .config("spark.serializer", KryoSerializer.class.getName())
        .config("spark.kryo.registrator", GeoSparkKryoRegistrator.class.getName())
        .master("local[*]")
        .appName("myGeoSparkSQLdemo")
        .getOrCreate();
// register all functions from geospark-sql_2.3 to sparkSession
GeoSparkSQLRegistrator.registerAll(sparkSession);
try {
    System.out.println(sparkSession.catalog().getFunction("ST_GeomFromText"));
    // Function[name='ST_GeomFromText', className='org.apache.spark.sql.geosparksql.expressions.ST_GeomFromText$', isTemporary='true']
} catch (Exception e) {
    e.printStackTrace();
}
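// (Optional check, not part of the original answer.) One way to confirm that
// GeoSparkSQLRegistrator.registerAll(...) actually registered the GeoSpark SQL
// functions is to list the catalog functions whose names start with "st_":
sparkSession.catalog().listFunctions()
        .filter((org.apache.spark.api.java.function.FilterFunction<org.apache.spark.sql.catalog.Function>)
                f -> f.name().toLowerCase().startsWith("st_"))
        .show(20, false);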
// https://datasystemslab.github.io/GeoSpark/api/sql/GeoSparkSQL-Function/
Dataset<Row> dataframe = sparkSession.sql("select ST_GeomFromText('POINT(-7.07378166 33.826661)')");
dataframe.show(false);
dataframe.printSchema();
/**
* +---------------------------------------------+
* |st_geomfromtext(POINT(-7.07378166 33.826661))|
* +---------------------------------------------+
* |POINT (-7.07378166 33.826661) |
* +---------------------------------------------+
*/
// using longitude and latitude columns from an existing dataframe
Dataset<Row> df = sparkSession.sql("select -7.07378166 as longitude, 33.826661 as latitude");
df.withColumn("ST_Geomfromtext",
        expr("ST_GeomFromText(CONCAT('POINT(',longitude,' ',latitude,')'))"))
        .show(false);
/**
* +-----------+---------+-----------------------------+
* |longitude |latitude |ST_Geomfromtext |
* +-----------+---------+-----------------------------+
* |-7.07378166|33.826661|POINT (-7.07378166 33.826661)|
* +-----------+---------+-----------------------------+
*/
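If you prefer writing plain SQL over the DataFrame API, the same thing works against a temporary view. A minimal sketch, reusing the df from above (the view name coords is just an illustrative choice):
df.createOrReplaceTempView("coords");
sparkSession.sql(
        "select longitude, latitude, "
                + "ST_GeomFromText(CONCAT('POINT(', longitude, ' ', latitude, ')')) as geom "
                + "from coords")
        .show(false);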