I just modified some versions in the master pom to match my environment; I may try your solution later. In the end, I decided to use Maven to build the jars instead. However, I'm now not sure how to use the GeoMesa Spark integration from Java, as I haven't found any Java example code in the documentation. I am trying to "translate" this simple Scala code into Java:
// imports needed for this snippet
import org.apache.hadoop.conf.Configuration
import org.apache.spark.{SparkConf, SparkContext}
import org.geotools.data.Query
import org.geotools.filter.text.ecql.ECQL
import org.locationtech.geomesa.spark.{GeoMesaSpark, GeoMesaSparkKryoRegistrator}

// datastore params: dsParams is a Map[String, String] of connection parameters (values omitted here)
// set up the SparkContext with the GeoMesa Kryo registrator
val conf = new SparkConf().setMaster("local[*]").setAppName("testSpark")
conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
conf.set("spark.kryo.registrator", classOf[GeoMesaSparkKryoRegistrator].getName)
val sc = SparkContext.getOrCreate(conf)
// create an RDD from a geospatial query using GeoMesa functions
val spatialRDDProvider = GeoMesaSpark(dsParams)
val filter = ECQL.toFilter("BBOX(coords, 48.815215, 2.249294, 48.904295, 2.419337)")
val query = new Query("history_feature_nodate", filter)
val resultRDD = spatialRDDProvider.rdd(new Configuration, sc, dsParams, query)
resultRDD.count
Is there any useful link or documentation that explains how the GeoMesa Spark Java API works?
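For reference, here is my rough attempt at the Java translation so far. This is only a sketch: I'm assuming that GeoMesaSpark.apply and the provider's rdd method expect Scala immutable maps (hence the manual map conversion and the raw-type cast, since Java doesn't see Scala's covariance), and that rdd returns an RDD of SimpleFeature; I'm not sure these signatures match my GeoMesa version.

import java.util.HashMap;
import java.util.Map;

import org.apache.hadoop.conf.Configuration;
import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;
import org.apache.spark.rdd.RDD;
import org.geotools.data.Query;
import org.geotools.filter.text.ecql.ECQL;
import org.locationtech.geomesa.spark.GeoMesaSpark;
import org.locationtech.geomesa.spark.GeoMesaSparkKryoRegistrator;
import org.locationtech.geomesa.spark.SpatialRDDProvider;
import org.opengis.feature.simple.SimpleFeature;
import org.opengis.filter.Filter;

public class GeoMesaSparkJavaTest {

    public static void main(String[] args) throws Exception {
        // datastore params: fill in the connection parameters for your store
        Map<String, String> dsParams = new HashMap<>();

        // set up the SparkContext with the GeoMesa Kryo registrator
        SparkConf conf = new SparkConf().setMaster("local[*]").setAppName("testSpark");
        conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer");
        conf.set("spark.kryo.registrator", GeoMesaSparkKryoRegistrator.class.getName());
        SparkContext sc = SparkContext.getOrCreate(conf);

        // the GeoMesa API takes scala.collection.immutable.Map, so fold the
        // Java map's entries into an empty immutable Scala map
        scala.collection.immutable.Map<String, String> scalaParams =
                scala.collection.immutable.Map$.MODULE$.<String, String>empty();
        for (Map.Entry<String, String> entry : dsParams.entrySet()) {
            scalaParams = scalaParams.updated(entry.getKey(), entry.getValue());
        }

        // create an RDD from a geospatial query, as in the Scala version; the
        // raw-type cast works around the Map value type Java can't prove
        @SuppressWarnings({"unchecked", "rawtypes"})
        SpatialRDDProvider provider = GeoMesaSpark.apply((scala.collection.immutable.Map) scalaParams);
        Filter filter = ECQL.toFilter("BBOX(coords, 48.815215, 2.249294, 48.904295, 2.419337)");
        Query query = new Query("history_feature_nodate", filter);
        RDD<SimpleFeature> resultRDD = provider.rdd(new Configuration(), sc, scalaParams, query);
        System.out.println(resultRDD.count());
    }
}

In particular, I'd like to know whether there is a cleaner way to pass the datastore parameters from Java than converting the map entry by entry.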