I have a pre-installed Hadoop cluster with YARN, Hive, and Spark, and I want to use GeoMesa's geospatial functionality with the FileSystem release. I followed these tutorials:

15.3. Installing GeoMesa FileSystem
15.9. GeoMesa FileSystem Data Store with Spark SQL

I run the ingestion with:

{{/usr/local/geomesa-fs/bin/geomesa-fs ingest \
  --encoding parquet \
  --partition-scheme daily,z2-2bit \
  --path hdfs://node-master:54310/tmp/geomesa/1 \
  -C gdelt \
  -s gdelt \
  --num-reducers 10 \
  /home/hadoop/sgds/data/raw/20200101.export.csv}}

This works mostly fine, and I can access and query the resulting files via plain Spark. However, I am unable to read them through GeoMesa with the following code:

{{val dataFrame = spark.read.format("geomesa")
  .option("fs.path", "hdfs://node-master:54310/tmp/geomesa/1")
  .option("geomesa.feature", "gdelt")
  .load()}}

This command produces the following warning:

{{WARN geotools.factory: Can't load a service for category "CRSAuthorityFactory". Cause is "ServiceConfigurationError: org.opengis.referencing.crs.CRSAuthorityFactory: Provider org.geotools.referencing.epsg.wkt.EPSGCRSAuthorityFactory could not be instantiated"}}

Then:

{{dataFrame.createOrReplaceTempView("gdelt")
spark.sql("SELECT eventCode FROM gdelt").show()}}

fails with this error:

{{java.lang.NoSuchMethodError: org.geotools.data.Query.getHints()Lorg/geotools/util/factory/Hints}}

Could this potentially be a dependency mismatch?
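In case it helps with diagnosing a version conflict: here is a small sketch of how I would list the GeoTools (gt-*) jars that Spark and the GeoMesa FileSystem distribution each ship, to see whether two different GeoTools versions end up on the classpath. The install paths are assumptions for a typical setup and would need adjusting for this cluster.

```shell
# Sketch, assuming default install locations (adjust paths to your cluster):
# list the GeoTools jars shipped by Spark and by the GeoMesa FSDS distribution,
# to spot two different gt-* versions on the classpath.
list_geotools_jars() {
  for dir in "$@"; do
    [ -d "$dir" ] && ls "$dir"/gt-*.jar 2>/dev/null
  done
}

list_geotools_jars "${SPARK_HOME:-/usr/local/spark}/jars" /usr/local/geomesa-fs/lib
```

From the method signature in the error, the missing return type lives in the `org.geotools.util.factory` package, which suggests one component was built against a newer GeoTools than the gt-main jar actually being loaded.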