Flowdock Analytics

A Node.js app that imports Flowdock messages into Elasticsearch, making search and analytics across Flowdock easier. Once messages are indexed, Kibana can be used to search and visualize them in real time.

Install

Everything needed for this app is included, and it all starts with a single command!

If docker-compose works on your command line, you're good to go.

1. Clone the repo

git clone git@github.com:albacoretuna/flowdock-analytics.git

2. Set environment variables

Copy env-sample to .env and add your Flowdock API token and other details. Your API token can be found on Flowdock's user account page.
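As a rough illustration, a filled-in .env could look like the lines below. The variable names here are placeholders; env-sample in the repo is the authoritative list:

# .env -- placeholder names, copy env-sample for the real keys
FLOWDOCK_API_TOKEN=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
ELASTICSEARCH_HOST=http://localhost:9200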

3. Start it!

# in the root directory!
docker-compose up

Development

For development you need Node, npm, and an Elasticsearch instance, which is included via Docker.

1. Clone and install dependencies

git clone git@github.com:omidfi/flowdock-analytics.git
cd flowdock-analytics
npm install

2. Set environment variables

Copy env-sample to .env and add your Flowdock API token and other details (see the example under Install). Your API token can be found on Flowdock's user account page. Remember to set the Elasticsearch host in the .env file.

3. Get yourself Elasticsearch and Kibana

You can use your existing Elasticsearch and Kibana services, use a hosted version, or start one from the Docker images provided:

# start elasticsearch
cd docker-elk
docker-compose up

If it doesn't work, check out the Docker ELK project on GitHub.

4. Start indexing!

# start the app
npm start

This process might take some time, say half an hour, depending on the number of messages that need to be indexed.

There's a crontab file in the setup that updates the index every half an hour. See ./crontab
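For reference, a half-hourly crontab entry would look something like the line below; the path and command are illustrative, so check ./crontab for the real one:

# re-run the importer every 30 minutes (illustrative; see ./crontab)
*/30 * * * * cd /usr/src/app && npm start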

Usage

After the setup is done and indexing has started, you can immediately use Kibana to search and analyze the data. Point your browser to localhost:5601 and you'll be good to go!

You will need to enter "flowdock" as the index pattern name on Kibana's setup page.
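To confirm that documents are actually landing in Elasticsearch, you can also query the index directly. This assumes the default "flowdock" index name and Elasticsearch answering on localhost:9200:

# does the flowdock index exist, and how many documents does it hold?
curl 'localhost:9200/_cat/indices/flowdock?v'

# quick full-text search over the indexed messages
curl 'localhost:9200/flowdock/_search?q=deploy&size=5'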

Frequently asked questions

  1. How long might indexing take?

The first time, with 71 flows and about 600,000 messages, it took about half an hour on my laptop. Subsequent runs took around one minute, as only new messages need to be downloaded.

  2. How do I get a list of the flow names?

There's an npm script for it. Run npm run list-flows.

  3. How do I set up Kibana? What's an index pattern?

The index pattern is simply the index name you used when indexing data into Elasticsearch. The default here is "flowdock".

  4. How do I set up Kibana? What's the timestamp field?

Choose "sentEpoch" as your time stamp field.

  5. I got tons of messages and errors in the console; what's happening?

Try opening Kibana and check whether indexing is working. If it is, you can ignore the errors :D

  6. I get "elasticsearch not found, trying again in 60 seconds". What's that?

Wait 60 seconds; it will probably find it. If not, check your settings in the .env file.

  7. Is indexing incremental? If I run it again, will it start from scratch?

It is incremental. Every time you run it, it only downloads the new messages.
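One way to picture this: before downloading anything, the importer can ask Elasticsearch for the newest message it has already stored, and only fetch messages sent after that. A minimal sketch of such a query, assuming the default "flowdock" index and the sentEpoch field:

# highest sentEpoch already indexed
curl -H 'Content-Type: application/json' 'localhost:9200/flowdock/_search' -d '
{ "size": 0, "aggs": { "latest": { "max": { "field": "sentEpoch" } } } }'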

Developer notes

What are we trying to achieve here? Import all the flows into Elasticsearch.

Why?

  • Flowdock doesn't provide a global search.
  • Flowdock doesn't provide any search in the mobile version.

How?

  • Make a list of interesting flows
  • Make an API call to get all the users
  • Ask Elasticsearch how far each flow has already been downloaded
  • Download the new messages recursively
  • Merge the messages with user information so that each message gets the user's name, etc.
  • Index those into Elasticsearch (see the sketch below)
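To make that pipeline concrete, here is a minimal sketch of the loop in Node.js. It assumes Node 18+ (for the global fetch), Flowdock's REST API with HTTP basic auth (the API token as the username), and a plain Elasticsearch REST interface; the environment variable names and the flowId/nick fields are this sketch's own, not necessarily the app's actual code:

// importSketch.js -- a simplified version of the indexing loop described above
const FLOWDOCK_TOKEN = process.env.FLOWDOCK_API_TOKEN
const ES_HOST = process.env.ELASTICSEARCH_HOST || 'http://localhost:9200'
const auth = 'Basic ' + Buffer.from(FLOWDOCK_TOKEN + ':').toString('base64')

// small helper for authenticated Flowdock API calls
const flowdock = path =>
  fetch('https://api.flowdock.com' + path, { headers: { Authorization: auth } }).then(r => r.json())

// ask Elasticsearch for the newest message already indexed for a flow
const latestMessageId = async flowId => {
  const res = await fetch(ES_HOST + '/flowdock/_search', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ size: 1, query: { term: { flowId } }, sort: [{ sentEpoch: 'desc' }] }),
  }).then(r => r.json())
  const hit = res.hits && res.hits.hits[0]
  return hit ? hit._source.id : 0
}

const run = async () => {
  const users = await flowdock('/users') // used to merge names into messages
  const usersById = new Map(users.map(u => [String(u.id), u]))

  for (const flow of await flowdock('/flows')) {
    let sinceId = await latestMessageId(flow.id)
    while (true) {
      // page through new messages, 100 at a time
      const messages = await flowdock(
        `/flows/${flow.organization.parameterized_name}/${flow.parameterized_name}/messages` +
          `?limit=100&since_id=${sinceId}`
      )
      if (messages.length === 0) break

      // merge user info, then bulk-index (older Elasticsearch versions also want a _type here)
      const body =
        messages
          .flatMap(m => [
            { index: { _index: 'flowdock', _id: `${flow.id}-${m.id}` } },
            { ...m, flowId: flow.id, sentEpoch: m.sent, nick: (usersById.get(String(m.user)) || {}).nick },
          ])
          .map(JSON.stringify)
          .join('\n') + '\n'
      await fetch(ES_HOST + '/_bulk', {
        method: 'POST',
        headers: { 'Content-Type': 'application/x-ndjson' },
        body,
      })
      sinceId = messages[messages.length - 1].id
    }
  }
}

run().catch(console.error)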
