Re: [geomesa-users] Geomesa_PySpark: SQL functions are not loaded.

Hello,

You need to replace '/path/to/geomesa-accumulo-spark-runtime_2.11-$VERSION.jar' with the actual path to the jar on your filesystem, e.g. '/tmp/geomesa-accumulo-spark-runtime_2.11-2.0.2.jar'.
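For reference, a minimal sketch of how you might locate the runtime jar without hard-coding the version (the helper name and the search directory are illustrative assumptions, not part of geomesa_pyspark):

```python
import glob
import os

def find_runtime_jar(directory):
    """Return the path of the first geomesa-accumulo-spark-runtime jar
    found in `directory`, or None if no matching jar is present."""
    pattern = os.path.join(directory, 'geomesa-accumulo-spark-runtime_2.11-*.jar')
    matches = sorted(glob.glob(pattern))
    return matches[0] if matches else None

# The resolved path can then be passed to geomesa_pyspark.configure(jars=[...])
# in place of the '/path/to/...' placeholder.
```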

Thanks,

Emilio

On 10/30/18 10:25 AM, Kepin Kumar wrote:
Hello team Geomesa,

How are you doing? I have been trying to use geomesa_pyspark, and when I do:
import geomesa_pyspark
conf = geomesa_pyspark.configure(
    jars=['/path/to/geomesa-accumulo-spark-runtime_2.11-$VERSION.jar'],
    packages=['geomesa_pyspark','pytz'],
    spark_home='/path/to/spark/').\
    setAppName('MyTestApp')

conf.get('spark.master')
# u'yarn'

from pyspark.sql import SparkSession

spark = ( SparkSession
    .builder
    .config(conf=conf)
    .enableHiveSupport()
    .getOrCreate()
)
sqlQuery = "select st_makeBBOX(40.0, 50.0, 50.0, 60.0)"
spark.sql(sqlQuery)

I get this error message:
AnalysisException: "Undefined function: 'st_makeBBOX'. This function is neither a registered temporary function nor a permanent function registered in the database 'default'.; line 1 pos 7"

                
Please help me solve this problem.

Thank you.

Regards,
Kepin

_______________________________________________
geomesa-users mailing list
geomesa-users@xxxxxxxxxxxxxxxx
To change your delivery options, retrieve your password, or unsubscribe from this list, visit
https://dev.locationtech.org/mailman/listinfo/geomesa-users
