Rob,
Below are a few words from Emilio describing his work.
Cheers,
Jim
There's a PR up here for those interested:
https://github.com/locationtech/geomesa/pull/685
The approach we've taken is to extract CQL from the 'where'
clause of the SQL statement and use that to load the initial result
sets into Spark. We then pass the rest of the SQL statement off to
Spark SQL so that you can do things like joins, etc.
For example, it lets you create queries like:

select myAttr, count(*) as count from mySft
where bbox(mySft.geom, -115, 45, -110, 50)
  AND mySft.dtg during 2015-03-02T10:00:00.000Z/2015-03-02T11:00:00.000Z
group by myAttr
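To make the WHERE-clause extraction concrete, here is a minimal, hypothetical sketch of the splitting idea in Python (GeoMesa's actual implementation is in Scala in the PR above; the predicate patterns and the `split_where` helper here are assumptions for illustration only):

```python
import re

# Hypothetical sketch: pull the spatial/temporal predicates (which map to
# CQL) out of a SQL WHERE clause, leaving the remainder for Spark SQL.
CQL_PREDICATES = re.compile(
    r"bbox\([^)]*\)"          # spatial bounding-box predicate
    r"|\S+\s+during\s+\S+",   # temporal 'during' predicate
    re.IGNORECASE,
)

def split_where(where_clause):
    """Return (cql_filters, remaining_sql) for a WHERE clause."""
    cql = CQL_PREDICATES.findall(where_clause)
    remaining = CQL_PREDICATES.sub("", where_clause)
    # Tidy up any dangling ANDs left behind after removing predicates
    remaining = re.sub(r"\s*\bAND\b\s*", " ",
                       remaining, flags=re.IGNORECASE).strip()
    return cql, remaining

where = ("bbox(mySft.geom, -115, 45, -110, 50) AND mySft.dtg during "
         "2015-03-02T10:00:00.000Z/2015-03-02T11:00:00.000Z")
cql, rest = split_where(where)
# cql holds the two CQL-style predicates; rest is empty, since in this
# example nothing is left over for Spark SQL to evaluate
```

The payoff of this split is that the spatial and temporal predicates can be pushed down to the data store as a CQL query, so Spark only loads the matching subset rather than a full scan.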
We've also created a web service front-end to facilitate kicking off
Spark queries.
Thanks,
Emilio
On 09/16/2015 02:22 PM, Rob Emanuele
wrote:
Nice! Will it support CQL in SparkSQL via UDTs and
UDFs? Would be interested in looking at the code if it's up on
GitHub.
_______________________________________________
technology-pmc mailing list
technology-pmc@xxxxxxxxxxxxxxxx
To change your delivery options, retrieve your password, or unsubscribe from this list, visit
https://locationtech.org/mailman/listinfo/technology-pmc