
Re: [geomesa-users] Instantiate a SpatialRDDProvider from Geomesa HBase (Geomesa-Spark)

Thank you, you are doing great work.

2017-03-07 12:25 GMT+01:00 Emilio Lahr-Vivaz <elahrvivaz@xxxxxxxx>:
We're hoping to release it this week.

Thanks,

Emilio


On 03/07/2017 06:14 AM, Jose Bujalance wrote:
Thanks for your answer.
I think I will try to set up an Accumulo environment. Do you have an idea of when version 1.3.1 will be released?

Thanks,
José

2017-03-07 12:11 GMT+01:00 Emilio Lahr-Vivaz <elahrvivaz@xxxxxxxx>:
Hi José,

The full Spark integration with HBase hasn't been merged yet. There's a PR up here, but we're waiting for Eclipse approval of the new dependencies:

https://github.com/locationtech/geomesa/pull/1363

We're hoping to get this in for 1.3.1, or in the meantime you can check out that branch and build it yourself.
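
Checking out the PR branch and building it locally can be sketched roughly as follows (the local branch name `hbase-spark-pr` is arbitrary, and the Maven flags are assumptions; adjust to the repository layout):

```shell
# Clone GeoMesa and fetch the pull request branch by its number (1363, linked above).
git clone https://github.com/locationtech/geomesa.git
cd geomesa
git fetch origin pull/1363/head:hbase-spark-pr
git checkout hbase-spark-pr

# Build and install into the local Maven repository, skipping tests for speed.
mvn clean install -DskipTests
```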

Thanks,

Emilio


On 03/07/2017 04:43 AM, Jose Bujalance wrote:
Hi,

I am running GeoMesa on top of HBase. I have created some SimpleFeatureTypes in a GeoMesa catalog, and I am now trying to access the features from Spark using the GeoMesa Spark libraries:

import java.io.IOException;
import java.io.Serializable;
import java.util.HashMap;
import java.util.Map;

import org.apache.hadoop.conf.Configuration;
import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;
import org.apache.spark.rdd.RDD;
import org.geotools.data.Query;
import org.geotools.filter.text.cql2.CQLException;
import org.geotools.filter.text.ecql.ECQL;
import org.locationtech.geomesa.spark.GeoMesaSpark;
import org.locationtech.geomesa.spark.SpatialRDDProvider;
import org.opengis.feature.simple.SimpleFeature;

public class Test {

    public static void main(String[] args) throws IOException, CQLException {
        // Spark configuration
        SparkConf conf = new SparkConf().setAppName("MyAppName").setMaster("local[*]");
        conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer");
        conf.set("spark.kryo.registrator", "org.locationtech.geomesa.spark.GeoMesaSparkKryoRegistrator");
        SparkContext sc = new SparkContext(conf);

        // Datastore configuration
        Map<String, Serializable> parameters = new HashMap<>();
        parameters.put("bigtable.table.name", "Geoloc_Praxedo_catalog");
        SpatialRDDProvider provider = GeoMesaSpark.apply(parameters);

        String predicate = "BBOX(coords, 48.815215, 2.249294, 48.904295, 2.419337)";
        Query query = new Query("history_feature_nodate", ECQL.toFilter(predicate));
        // Note: this cast would fail at runtime; a java.util.Map is not a
        // scala.collection.immutable.Map and would need an explicit conversion.
        RDD<SimpleFeature> resultRDD = provider.rdd(new Configuration(), sc, (scala.collection.immutable.Map<String, String>) parameters, query);
        System.out.println("Number of RDDs: " + resultRDD.count());
    }
}

Whether I run this in Java or in Scala, I get the same error at the same line: SpatialRDDProvider provider = GeoMesaSpark.apply(parameters). The error is:

Error in Java:
Exception in thread "main" java.util.ServiceConfigurationError: org.locationtech.geomesa.spark.SpatialRDDProvider: Provider org.locationtech.geomesa.spark.converter.ConverterSpatialRDDProvider org.locationtech.geomesa.spark.geotools.GeoToolsSpatialRDDProvider not found
        at java.util.ServiceLoader.fail(ServiceLoader.java:239)
        at java.util.ServiceLoader.access$300(ServiceLoader.java:185)
        at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:372)
        at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
        at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
        at scala.collection.convert.Wrappers$JIteratorWrapper.next(Wrappers.scala:43)
        at scala.collection.Iterator$class.find(Iterator.scala:944)
        at scala.collection.AbstractIterator.find(Iterator.scala:1336)
        at scala.collection.IterableLike$class.find(IterableLike.scala:79)
        at scala.collection.AbstractIterable.find(Iterable.scala:54)
        at org.locationtech.geomesa.spark.GeoMesaSpark$.apply(GeoMesaSpark.scala:32)
        at org.locationtech.geomesa.spark.GeoMesaSpark.apply(GeoMesaSpark.scala)
        at com.praxedo.geomesa.geomesa_spark.Test.main(Test.java:34)

Error in Scala:
scala> val spatialRDDProvider = GeoMesaSpark(params)
java.lang.RuntimeException: Could not find a SparkGISProvider
  at org.locationtech.geomesa.spark.GeoMesaSpark$$anonfun$apply$2.apply(GeoMesaSpark.scala:47)
  at org.locationtech.geomesa.spark.GeoMesaSpark$$anonfun$apply$2.apply(GeoMesaSpark.scala:47)
  at scala.Option.getOrElse(Option.scala:121)
  at org.locationtech.geomesa.spark.GeoMesaSpark$.apply(GeoMesaSpark.scala:47)
  ... 54 elided
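
For context, GeoMesaSpark.apply locates a SpatialRDDProvider by scanning the classpath with java.util.ServiceLoader, so both stack traces boil down to no matching provider jar being on the classpath. A minimal sketch of that lookup pattern, with a hypothetical DemoProvider interface standing in for SpatialRDDProvider (not the GeoMesa API itself):

```java
import java.util.Collections;
import java.util.Map;
import java.util.ServiceLoader;

// Hypothetical stand-in for org.locationtech.geomesa.spark.SpatialRDDProvider.
interface DemoProvider {
    boolean canProcess(Map<String, ?> params);
}

public class ServiceLoaderDemo {

    // Mirrors the lookup pattern: scan the classpath for registered
    // implementations and return the first one that accepts the parameters.
    static DemoProvider findProvider(Map<String, ?> params) {
        for (DemoProvider p : ServiceLoader.load(DemoProvider.class)) {
            if (p.canProcess(params)) {
                return p;
            }
        }
        return null; // no META-INF/services registration matched
    }

    public static void main(String[] args) {
        // With no provider registered on the classpath the lookup comes back
        // empty, which is the situation both stack traces above describe.
        DemoProvider provider = findProvider(Collections.singletonMap("bigtable.table.name", "demo"));
        System.out.println(provider == null ? "no provider found" : "provider found");
    }
}
```

In the real library, a provider registers itself through a META-INF/services file inside its jar, which is why the fix is usually adding the right module to the classpath rather than changing the calling code.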


I guess I am not accessing the HBase catalog properly.
Can someone give me a hint on how to access an HBase catalog from Spark?

Thank you,
José


_______________________________________________
geomesa-users mailing list
geomesa-users@xxxxxxxxxxxxxxxx
To change your delivery options, retrieve your password, or unsubscribe from this list, visit
https://dev.locationtech.org/mailman/listinfo/geomesa-users

