[geomesa-users] GeomesaSpark

Hello,
I'm running some queries with Spark using your provided class
GeoMesaSpark (from version 1.1.0-rc4).
When monitoring the job, I see only one task in total.
This is my code:

// Request 10 input splits for the RDD
scala.Option<Object> desiredSplits =
    scala.Option$.MODULE$.apply((Object) 10);
JavaRDD<SimpleFeature> eventCodeRDD = GeoMesaSpark
    .rdd(conf, sparkContext,
        AccumuloDataStoreConfiguration.toScalaMap(dsConf), query1, desiredSplits)
    .toJavaRDD();

I would expect 10 partitions, but logging
eventCodeRDD.partitions().size() outputs 1. Even after calling
eventCodeRDD.coalesce(10), the partition count is still 1.
Any ideas? Could something be going wrong in the mapping from Scala to Java?
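One detail worth double-checking on that last step: Spark transformations such as coalesce never modify the RDD in place; they return a new RDD, so the result must be captured and inspected. A minimal sketch of that pattern with a plain Java list (no Spark; regroup is a hypothetical stand-in for a partition-changing transformation) shows why inspecting the original object after the call reveals no change:

```java
import java.util.ArrayList;
import java.util.List;

public class CoalesceSketch {
    // Hypothetical stand-in for an RDD transformation: it returns a NEW
    // grouped structure and leaves the input untouched, just as Spark's
    // coalesce leaves the source RDD's partitioning unchanged.
    static List<List<Integer>> regroup(List<Integer> data, int numGroups) {
        List<List<Integer>> groups = new ArrayList<>();
        for (int i = 0; i < numGroups; i++) {
            groups.add(new ArrayList<>());
        }
        // Round-robin the elements into numGroups buckets
        for (int i = 0; i < data.size(); i++) {
            groups.get(i % numGroups).add(data.get(i));
        }
        return groups;
    }

    public static void main(String[] args) {
        List<Integer> data = new ArrayList<>(List.of(1, 2, 3, 4, 5, 6));
        List<List<Integer>> regrouped = regroup(data, 3);
        // Only the returned value reflects the call; the original is unchanged.
        System.out.println(data.size());       // 6
        System.out.println(regrouped.size());  // 3
    }
}
```

So a check like eventCodeRDD.coalesce(10) on its own line tells you nothing; the returned RDD has to be assigned and its partitions counted. Note also that, as far as I know, coalesce without a shuffle can only reduce the number of partitions, so growing 1 partition into 10 would need a shuffle-based repartition instead.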

Thanks,
Marcel Jacob.



