Re: [geomesa-users] Bootstrapping GeoMesa Accumulo and Spark on AWS

Hi Emilio,

The tutorial did use s3a://, but that looked like a typo to me, so I "fixed" it.

Thanks!



-Brooks



--
Brooks Isoldi, Software Developer

Traversed
7164 Columbia Gateway Drive, Suite 120A
Columbia, MD 21046

On Mon, Sep 25, 2017 at 2:07 PM, Emilio Lahr-Vivaz <elahrvivaz@xxxxxxxx> wrote:
Hello,

I believe we don't support direct ingestion with the 's3' prefix. You can ingest using 's3a' or 's3n' (which are supported through Hadoop's API), but you have to set up your AWS credentials as outlined here:

http://www.geomesa.org/documentation/user/accumulo/commandline_tools.html#enabling-s3-ingest

You can alternatively download the files to a local filesystem and ingest them from there.
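For reference, a working version of the original commands would look something like the sketch below: the same file-list pipeline, but with the s3:// prefix swapped for s3a://. This assumes AWS credentials are already configured as described in the linked documentation (e.g. via the standard AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY environment variables):

```shell
# Build the list of GDELT files for the last 7 days, using the s3a://
# prefix that Hadoop's filesystem API understands (plain s3:// is not
# supported for direct ingestion)
FILES=$(seq 7 -1 1 | xargs -n 1 -I{} sh -c "date -d'{} days ago' +%Y%m%d" \
  | xargs -n 1 -I{} echo s3a://gdelt-open-data/events/{}.export.csv | tr '\n' ' ')
echo "$FILES"

# Then ingest as before (requires the geodocker setup from the tutorial,
# with AWS credentials visible inside the container):
# sudo docker exec accumulo-master geomesa ingest -c geomesa.gdelt \
#   -C gdelt -f gdelt -s gdelt -u root -p secret $FILES
```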

I'll open a ticket to fix that tutorial, sorry about that.

Thanks,

Emilio


On 09/25/2017 01:34 PM, Brooks Isoldi wrote:

Hi all,

I'm attempting to follow the Bootstrapping GeoMesa/Accumulo/Spark tutorial located at http://www.geomesa.org/documentation/tutorials/geodocker-geomesa-spark-on-aws.html and am unable to ingest the GDELT data.  The ingest command reports "Ingested 0 features".

Here are the commands I ran (correcting for a couple of typos I found in the tutorial).

$ FILES=$(seq 7 -1 1 | xargs -n 1 -I{} sh -c "date -d'{} days ago' +%Y%m%d" | xargs -n 1 -I{} echo s3://gdelt-open-data/events/{}.export.csv | tr '\n' ' ')
$ sudo docker exec accumulo-master geomesa ingest -c geomesa.gdelt -C gdelt -f gdelt -s gdelt -u root -p secret $FILES
$ sudo docker exec accumulo-master geomesa export -c geomesa.gdelt -f gdelt -u root -p secret -m 100


The first command seems to work just fine: echoing $FILES returns a list of valid s3 paths.

The ingest command completes without errors, but it produces the following output and does not appear to actually ingest anything.

INFO  Creating schema gdelt
INFO  Running ingestion in local mode
INFO  Ingesting 7 files with 1 thread
[                                                            ]   0% complete 0 ingested 0 failed in 00:00:02
INFO  Local ingestion complete in 00:00:02
INFO  Ingested 0 features with no failures.

The export command simply spits out all of the column headers without any rows (features).


If anyone can point to what I'm doing wrong here, I would greatly appreciate it.

Thank you!



--
Brooks Isoldi, Software Developer

Traversed
7164 Columbia Gateway Drive, Suite 120A
Columbia, MD 21046


_______________________________________________
geomesa-users mailing list
geomesa-users@xxxxxxxxxxxxxxxx
To change your delivery options, retrieve your password, or unsubscribe from this list, visit
https://dev.locationtech.org/mailman/listinfo/geomesa-users


