I am running the following query on the gdelt_Ukraine example from Spark:

val ds = DataStoreFinder.getDataStore(params).asInstanceOf[AccumuloDataStore]
val cqlFilter = CQL.toFilter(
  "[[bbox(geom, 34, 46, 35, 45.8)] AND " +
  "[SQLDATE BETWEEN '2012-02-01T00:00:00.000Z' AND '2015-05-02T00:00:00.000Z']]")
val q = new Query("gdelt", cqlFilter)

// Configure Spark
val conf = new Configuration
val sparkConf = new SparkConf(true)
  .setMaster("local")
  .setAppName("testSpark")
  .set("spark.executor.memory", "1g")
val sconf = GeoMesaSpark.init(sparkConf, ds)
val sc = new SparkContext(sconf)

// Create an RDD from a query
val queryRDD = GeoMesaSpark.rdd(conf, sc, params, q)
logger.info("Count queryRDD: " + queryRDD.count())

The resulting RDD count() is 0.
The ERROR that I see on the console is:

[2015-07-30 17:34:54,932] ERROR org.locationtech.geomesa.compute.spark.GeoMesaSpark$: The query being executed requires multiple scans, which is not currently supported by geomesa. Your result set will be partially incomplete. This is most likely due to an OR clause in your query. Query: BBOX(geom, 34.0,45.8,35.0,46.0) AND SQLDATE BETWEEN '2012-02-01T00:00:00.000Z' AND '2015-05-02T00:00:00.000Z'
However, I do not have an OR clause in my query.
If I run it with only CQL.toFilter("[bbox(geom, 34, 46, 35, 45.8)]"), everything works and the filtering is applied.
If I run CQL.toFilter("[SQLDATE BETWEEN '2012-02-01T00:00:00.000Z' AND '2015-05-02T00:00:00.000Z']"), I see the same problem as above.
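Since the date range alone seems to be what triggers the multiple-scan error, one possible workaround (my own sketch, not an official GeoMesa recipe) is to split the wide BETWEEN range into smaller intervals, run one query per interval, and sum the counts. Below is a minimal pure-Scala helper that builds one CQL string per calendar year; the object and method names and the per-year splitting strategy are my own assumptions:

```scala
import java.time.{ZoneOffset, ZonedDateTime}
import java.time.format.DateTimeFormatter

object SplitRange {
  // Format timestamps the same way the original CQL strings do
  val fmt: DateTimeFormatter =
    DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSS'Z'").withZone(ZoneOffset.UTC)

  val start: ZonedDateTime = ZonedDateTime.parse("2012-02-01T00:00:00.000Z")
  val end: ZonedDateTime   = ZonedDateTime.parse("2015-05-02T00:00:00.000Z")

  // Build one CQL filter string per year-long interval in [from, to)
  def yearlyFilters(from: ZonedDateTime, to: ZonedDateTime): List[String] =
    Iterator.iterate(from)(_.plusYears(1))
      .takeWhile(_.isBefore(to))
      .map { s =>
        val e = if (s.plusYears(1).isBefore(to)) s.plusYears(1) else to
        s"[[bbox(geom, 34, 46, 35, 45.8)] AND " +
          s"[SQLDATE BETWEEN '${fmt.format(s)}' AND '${fmt.format(e)}']]"
      }
      .toList

  def main(args: Array[String]): Unit =
    yearlyFilters(start, end).foreach(println)
}
```

Each resulting string could then be passed through CQL.toFilter and GeoMesaSpark.rdd as in the code above, accumulating the per-interval counts; whether a one-year window is narrow enough to avoid the multiple-scan path is an assumption I have not verified.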
Also, everything works using:

geomesa export -u user -p password -c gdelt_Ukraine -fn gdelt -fmt csv -max 50 -q "[[SQLDATE BETWEEN '2014-02-01T00:00:00.000Z' AND '2014-05-02T00:00:00.000Z'] AND [bbox(geom, 34, 46, 35, 45.8)]]"

or from the GeoServer UI.
Please let me know what the problem is.