Re: [geomesa-users] GeomesaSpark

Hello,

I've found the problem. The feature name 'gdelt' was wrong.

Unfortunately, I've run into another problem using Spark with GeoMesa. I'm not quite sure where the error comes from, but I assume it's a problem with Spark itself.
A ClassNotFoundException is thrown with the message "Failed to register classes with Kryo".
Please have a look at https://github.com/apache/spark/pull/4258
A solution is described there, but I'm not sure how to apply this patch.

I'm using Spark version 1.3.0, and it's not possible for me to update it because I use GeoMesa.
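
From reading that pull request, I think the underlying issue is that the jar containing the classes registered with Kryo is not visible to the classloader that constructs the serializer. If that is the case, a possible workaround might be to put that jar on the driver and executor classpath explicitly. This is only a sketch of what I would try; the jar path is a placeholder:

    SparkConf sc = new SparkConf(true);
    sc.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer");
    // Placeholder path: the jar that contains the classes registered with Kryo
    sc.set("spark.driver.extraClassPath", "/path/to/jar-with-registered-classes.jar");
    sc.set("spark.executor.extraClassPath", "/path/to/jar-with-registered-classes.jar");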

Thanks in advance,
Marcel Jacob.

On 22.07.2015 18:38, Jim Hughes wrote:
Hi Marcel,

From a quick look, I'm guessing that your DataStore is null.  I'd suggest adding a quick check to see if 'ds' is null.  You don't need to specify the 'featureName' to get a datastore.  I don't know if that would hurt anything, but I'd suggest removing it.
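
A quick check along these lines (just a sketch) would tell you immediately whether the data store lookup is the problem:

    DataStore ds = DataStoreFinder.getDataStore(map);
    if (ds == null) {
        // DataStoreFinder returns null when no factory accepts the parameter map,
        // e.g. when a required key is missing or misspelled.
        throw new IllegalArgumentException("Could not load a data store with the given parameters");
    }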

Other than that, you can double-check the settings you are passing by using the GeoMesa tools (http://www.geomesa.org/geomesa-tools-features/), such as 'list' and 'describe'. You can also use the Accumulo shell to scan the 'gdelt' table to make sure that sensible metadata is present in that table.

Let us know how getting a DataStore in this code works out for you.  I'll add the idea of a Java GeoMesaSpark tutorial/example project to our list of additions to make.

Cheers,

Jim

On 07/22/2015 09:13 AM, Marcel wrote:
Hey,

I'm trying to retrieve an RDD using the GeoMesaSpark class. Unfortunately, a NullPointerException is thrown during execution of this method:
GeoMesaSpark.rdd(conf, sparkContext, ds, query1);

The stack trace says SimpleFeatureType.encodeType throws this exception. Is something wrong with my data store or my arguments? Here is my code:

    // Connection parameters for the Accumulo-backed GeoMesa data store
    Map<String, String> map = new HashMap<String, String>();
    map.put("instanceId", "accumulo");
    map.put("zookeepers", "node1-scads02:2181");
    map.put("user", "user");
    map.put("password", "password");
    map.put("tableName", "gdelt");
    map.put("featureName", "event");

    AccumuloDataStore ds = (AccumuloDataStore) DataStoreFinder.getDataStore(map);

    // Spark configuration with the Kryo serializer that GeoMesa requires
    SparkConf sc = new SparkConf(true);
    sc.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer");
    sc.set("spark.kryoserializer.buffer.mb", "24");

    Configuration conf = new Configuration();
    SparkConf sc2 = GeoMesaSpark.init(sc, ds);
    SparkContext sparkContext = new SparkContext("spark://node1-scads02:7077", "countryWithMostEvent", sc2);

    // Query two attributes of the feature type
    Filter f = Filter.INCLUDE;
    Query query1 = new Query("gdelt", f, new String[]{"Actor1CountryCode", "Actor2CountryCode"});

    RDD<SimpleFeature> actorResultRDD = GeoMesaSpark.rdd(conf, sparkContext, ds, query1);
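
For context, the kind of transformation I would like to run on the result is roughly the following; this is only a sketch, using the attribute names from the query above:

    // Count events per Actor1CountryCode
    // (Function is org.apache.spark.api.java.function.Function)
    JavaRDD<String> countryCodes = actorResultRDD.toJavaRDD().map(
            new Function<SimpleFeature, String>() {
                public String call(SimpleFeature feature) {
                    return (String) feature.getAttribute("Actor1CountryCode");
                }
            });
    Map<String, Long> countsByCountry = countryCodes.countByValue();
    System.out.println(countsByCountry);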

Thanks again.

PS: It would be great if anybody could post a working GeoMesaSpark example in Java, including an RDD transformation.

Best regards
Marcel Jacob.



_______________________________________________
geomesa-users mailing list
geomesa-users@xxxxxxxxxxxxxxxx
To change your delivery options, retrieve your password, or unsubscribe from this list, visit
http://www.locationtech.org/mailman/listinfo/geomesa-users

