Simon Schiff edited this page Mar 4, 2017 · 29 revisions

You can compute STARQL queries with one of three available back ends:

  1. PostgreSQL
    You can transform STARQL queries directly into PostgreSQL queries. You need a running instance of a PostgreSQL database, the latest Java 8 release, and Maven. The Preinstallations guide covers these. Then go to First Run.
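The prerequisites above can be sanity-checked from a terminal. This is a minimal sketch, not part of STARQL itself; `check_cmd` is a helper name chosen here for illustration, and it only tests whether the listed tools are on the `PATH`, not their versions.

```shell
# Hedged sketch: check that the PostgreSQL back end prerequisites are installed.
# "check_cmd" is a hypothetical helper, not something the STARQL project ships.
check_cmd() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "found: $1"
  else
    echo "missing: $1"
  fi
}
check_cmd java   # Java 8 runtime
check_cmd mvn    # Maven build tool
check_cmd psql   # PostgreSQL client
```

If any line prints `missing`, work through the Preinstallations guide before continuing.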

  2. Spark SQL
    You can use Apache Spark to compute STARQL queries. This requires a running instance of the Hadoop File System (HDFS) and Spark. To install them, first complete all the Preinstallations, then follow the Hadoop Cluster Setup and the Spark Cluster Setup guides. Then go to First Run.
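Before running queries on this back end, it can help to confirm that the cluster installation directories are set. The sketch below assumes the conventional `HADOOP_HOME` and `SPARK_HOME` environment variables; `check_env` is a helper name invented here, and the check only verifies that each variable points at an existing directory.

```shell
# Hedged sketch: confirm the assumed Hadoop/Spark install locations exist.
# HADOOP_HOME and SPARK_HOME are conventional names, not mandated by STARQL.
check_env() {
  # Print whether the named environment variable is set to an existing directory.
  eval "val=\${$1:-}"
  if [ -n "$val" ] && [ -d "$val" ]; then
    echo "$1 ok: $val"
  else
    echo "$1 not set or not a directory"
  fi
}
check_env HADOOP_HOME
check_env SPARK_HOME
```

Both checks should print `ok` once the Hadoop Cluster Setup and Spark Cluster Setup guides have been completed.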

  3. Spark Streaming
    You can use Apache Spark Streaming to compute STARQL queries continuously. Please note that the streaming application works batch oriented, not incrementally. It requires a running instance of the Hadoop File System (HDFS) and Spark. To install them, first complete all the Preinstallations, then follow the Hadoop Cluster Setup and the Spark Cluster Setup guides. Then go to First Run.
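Both Spark back ends need HDFS and Spark running before a query is submitted. The dry-run sketch below only prints the assumed startup order (HDFS first, then the Spark master and workers); the paths and script names follow standard Hadoop and standalone-Spark layouts and may differ on your cluster.

```shell
# Hedged dry-run sketch of the assumed daemon startup order for the Spark
# back ends. Paths and script names are assumptions based on standard layouts.
HADOOP_HOME=${HADOOP_HOME:-/opt/hadoop}
SPARK_HOME=${SPARK_HOME:-/opt/spark}
order=""
for script in "$HADOOP_HOME/sbin/start-dfs.sh" \
              "$SPARK_HOME/sbin/start-master.sh" \
              "$SPARK_HOME/sbin/start-slaves.sh"; do
  echo "would run: $script"   # replace echo with the real call on your cluster
  order="$order $script"
done
```

On a real cluster you would run these scripts in the printed order, then check with `jps` that the NameNode, DataNode, Master, and Worker processes are up.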
