Dear all,

We are trying to use GeoMesa from PySpark. With the code below, SQL functions such as `st_makeBBOX` are not defined at the SQL level. We use the following Python code:

```python
import geomesa_pyspark

conf = geomesa_pyspark.configure(
    jars=['/mnt/geomesa/geomesa_spark/geomesa-accumulo-spark-runtime_2.11-2.0.2.jar',
          '/mnt/geomesa/geomesa_spark/geomesa-spark-core_2.11-2.0.2.jar',
          '/mnt/geomesa/geomesa_spark/geomesa-spark-sql_2.11-2.0.2.jar',
          '/mnt/geomesa/geomesa_spark/geomesa-spark-jts_2.11-2.0.2.jar'],
    packages=['geomesa_pyspark', 'pytz'],  # a list, not two separate arguments
    spark_home='/usr/lib/spark/').setAppName('GeoMesa PySpark')
conf.get('spark.master')

import findspark
findspark.init()

from pyspark.sql import SparkSession

sparkSession = SparkSession.builder.config(conf=conf).enableHiveSupport().getOrCreate()
sqlQuery = "select st_makeBBOX(40.0, 50.0, 50.0, 60.0)"
sparkSession.sql(sqlQuery)
```

It fails with:

```
Py4JJavaError: An error occurred while calling o439.sql.
: org.apache.spark.sql.AnalysisException: Undefined function: 'st_makeBBOX'. This function is neither a registered temporary function nor a permanent function registered in the database 'default'.; line 1 pos 7
```

In Scala we are able to load those functions by doing:

```scala
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
org.apache.spark.sql.SQLTypes.init(sqlContext)
```

What is the equivalent for PySpark? Or what should be imported?
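For context on the error itself: `sparkSession.sql` resolves function names against the session's function catalog, and `st_makeBBOX` is simply absent until an init step (like the Scala `SQLTypes.init` above) registers it. A toy, Spark-free sketch of that registration pattern — the names here are illustrative, not Spark's actual API:

```python
# Toy model of a SQL function registry (illustrative only, not Spark's API).
registry = {}

def register(name, fn):
    # Spark's catalog treats function names case-insensitively; mimic that.
    registry[name.lower()] = fn

def call(name, *args):
    fn = registry.get(name.lower())
    if fn is None:
        # Same failure mode as the AnalysisException in the traceback above.
        raise ValueError(f"Undefined function: '{name}'")
    return fn(*args)

# Before any init step runs, the lookup fails.
try:
    call("st_makeBBOX", 40.0, 50.0, 50.0, 60.0)
except ValueError as e:
    print(e)  # Undefined function: 'st_makeBBOX'

# After an init step registers the function, the same call resolves.
register("st_makeBBOX", lambda xmin, ymin, xmax, ymax: (xmin, ymin, xmax, ymax))
print(call("st_makebbox", 40.0, 50.0, 50.0, 60.0))  # (40.0, 50.0, 50.0, 60.0)
```

So the question boils down to: which PySpark call triggers that registration on the session's catalog?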