
varken to nightly #40

Merged 90 commits on Dec 6, 2018

Commits
1759aa4
PEP8 Cleanup
Nov 28, 2018
bcb5f58
sonarr.py overhaul
Nov 28, 2018
5e878c7
formatting, and some notation
dirtycajunrice Nov 29, 2018
d8a48bd
formatting, and some notation
dirtycajunrice Nov 29, 2018
8e2ef07
added example config.ini
dirtycajunrice Nov 29, 2018
7d7a161
minor change in example
dirtycajunrice Nov 29, 2018
e788db2
added default values to helper classes
dirtycajunrice Nov 29, 2018
e807a88
added missing keys to example.ini
dirtycajunrice Nov 29, 2018
7e3db8a
Created INIParser.py to read config file
dirtycajunrice Nov 29, 2018
08fc6fb
the copy pasta was too strong
dirtycajunrice Nov 29, 2018
c18ed69
split off sonarrserver as a class
dirtycajunrice Nov 29, 2018
2efdb76
var separation and args added to sonarr
dirtycajunrice Nov 29, 2018
56db264
leaving template
dirtycajunrice Nov 29, 2018
dbece19
migrated sonarr.py to new style
dirtycajunrice Nov 29, 2018
6c3ec96
added args to ini
dirtycajunrice Nov 29, 2018
cbbc3c5
initial scheduling test
dirtycajunrice Nov 29, 2018
4a38e2d
passed influx_seraver to sonarrapi class
dirtycajunrice Nov 29, 2018
62520ea
bunch of tweaks. Scheduling is working
dirtycajunrice Nov 29, 2018
4dee66f
added config minutes setting
dirtycajunrice Nov 29, 2018
bca833e
created systemd config example
dirtycajunrice Nov 29, 2018
a0456f0
changed to seconds instead of minutes
Nov 29, 2018
fa69fdb
folder restructure, dbmanager placeholder, iniparser file fullpath, a…
Nov 29, 2018
eaba2fa
folder restructure, dbmanager placeholder, iniparser file fullpath, a…
Nov 29, 2018
1604b11
Migrated tautulli.py and allowed for multiple servers
dirtycajunrice Dec 1, 2018
95fbb33
fixed time
dirtycajunrice Dec 1, 2018
f46de05
deleted extras and fixed downloaded
dirtycajunrice Dec 1, 2018
c93a526
moved files
dirtycajunrice Dec 1, 2018
c8ceb52
moved files
dirtycajunrice Dec 1, 2018
b4ed9fb
moved files
dirtycajunrice Dec 1, 2018
c201d1e
moved files
dirtycajunrice Dec 1, 2018
6c51399
forgot to delete period
dirtycajunrice Dec 1, 2018
48f458c
forgot to delete period again
dirtycajunrice Dec 1, 2018
4c439bb
migrated radarr
dirtycajunrice Dec 2, 2018
07ef8a8
reworked scheduler to pass server to instance to remove duplication
dirtycajunrice Dec 2, 2018
5241cfc
truncate roku product version
dirtycajunrice Dec 2, 2018
3c6cb14
ported ombi
dirtycajunrice Dec 2, 2018
0ca76be
created and assigned basic dbmanager
dirtycajunrice Dec 2, 2018
0eb3702
cleared extra imports
dirtycajunrice Dec 2, 2018
a77d521
ammended servicefile
dirtycajunrice Dec 2, 2018
651bb26
accept dbchange and add to readme
dirtycajunrice Dec 2, 2018
7fb3907
readme update
dirtycajunrice Dec 2, 2018
e5e2d5b
temporary root in systemd file
dirtycajunrice Dec 2, 2018
e677d24
added min requirements and split friendly name with username
dirtycajunrice Dec 2, 2018
fa15eca
added forced package imports
dirtycajunrice Dec 2, 2018
47538f5
removed typing from requirements
dirtycajunrice Dec 2, 2018
17b825c
Revert "removed typing from requirements"
dirtycajunrice Dec 2, 2018
3fac8ff
Revert "added forced package imports"
dirtycajunrice Dec 2, 2018
b462e7b
modified for venv
dirtycajunrice Dec 2, 2018
1dbc6ca
fixed file path search
dirtycajunrice Dec 3, 2018
1e09867
testing hashit
dirtycajunrice Dec 3, 2018
d6b35b4
added hashing to radarr
dirtycajunrice Dec 3, 2018
2f2e288
added hashing to tautulli
dirtycajunrice Dec 3, 2018
c6df3a9
since init uses only one server, moved headers/params to init
Dec 3, 2018
487a8a7
changed default verify to false
Dec 3, 2018
2e466b9
added initial run
Dec 3, 2018
2608e0f
added docker config to readme
Dec 3, 2018
9977dc8
added docker config to readme
Dec 3, 2018
6cd2a52
added connection_handler for bad requests
dirtycajunrice Dec 4, 2018
57616c3
added ability to define data folder
dirtycajunrice Dec 4, 2018
ce6e52d
split helpers and structures
dirtycajunrice Dec 4, 2018
6dfa32a
Update README to simplify instructions
samwiseg0 Dec 4, 2018
e29cf21
Rename and update systemd to have more info
samwiseg0 Dec 4, 2018
e8f25bb
Change User/group and restart interval
samwiseg0 Dec 4, 2018
56ded31
swapped capitalization
Dec 4, 2018
8c6ddb8
test push
dirtycajunrice Dec 4, 2018
968be2f
Update systemd with uppercase
samwiseg0 Dec 4, 2018
bcda450
Update gitignore to exclude varken-venv
samwiseg0 Dec 4, 2018
50f5bca
Fix parsing for disabled servers
samwiseg0 Dec 4, 2018
7e6c5e6
Make 0 explicit
samwiseg0 Dec 4, 2018
bf0cb06
Add new logger
samwiseg0 Dec 5, 2018
1814619
Changes to logging tautulli
samwiseg0 Dec 5, 2018
95c6f82
Removing old logging
samwiseg0 Dec 5, 2018
c3f081e
Fix what I broke when converting to single quotes
samwiseg0 Dec 5, 2018
78f22c0
Add more logging to intparser
samwiseg0 Dec 5, 2018
d60a13e
Add platform info and python info
samwiseg0 Dec 5, 2018
325aa2e
Add distro package
samwiseg0 Dec 5, 2018
b473c00
fixed clean_check of server_ids, fixed under-indented radarr get_movie,
dirtycajunrice Dec 5, 2018
ea825a8
kill script if no services enabled
dirtycajunrice Dec 5, 2018
6d1813d
removed all traces of get_sessions and consolidated to get_activity
dirtycajunrice Dec 5, 2018
244e9a3
Thou shall not lie...
samwiseg0 Dec 5, 2018
542eaef
Remove username and pass from default
samwiseg0 Dec 5, 2018
0f882f0
Update note
samwiseg0 Dec 5, 2018
4333841
Update logger
samwiseg0 Dec 5, 2018
eeecad4
Fix logging and add new make dir function
samwiseg0 Dec 5, 2018
b9dae7a
Move logger and add --debug
samwiseg0 Dec 5, 2018
2a91f4d
Verify SSL by default
samwiseg0 Dec 5, 2018
7245744
Fix links and typos
samwiseg0 Dec 5, 2018
3819e62
Add sub_type to tautulli tuple
samwiseg0 Dec 5, 2018
9983b67
Add exception logging for tautulli
samwiseg0 Dec 5, 2018
7a74487
Update README and log messgaes
samwiseg0 Dec 5, 2018
5 changes: 4 additions & 1 deletion .gitignore
@@ -5,7 +5,10 @@
.Trashes
ehthumbs.db
Thumbs.db
configuration.py
__pycache__
GeoLite2-City.mmdb
GeoLite2-City.tar.gz
data/varken.ini
.idea/
Legacy/configuration.py
varken-venv/
2 changes: 1 addition & 1 deletion cisco_asa.py → Legacy/cisco_asa.py
@@ -3,7 +3,7 @@
from datetime import datetime, timezone
from influxdb import InfluxDBClient

import configuration
from Legacy import configuration

current_time = datetime.now(timezone.utc).astimezone().isoformat()

File renamed without changes.
File renamed without changes.
118 changes: 23 additions & 95 deletions README.md
@@ -1,29 +1,33 @@
# Grafana Scripts
Repo for api scripts written (both pushing and pulling) to aggregate data into influxdb for grafana
# Varken
Dutch for PIG. PIG is an Acronym for Plex/InfluxDB/Grafana

Varken is a standalone command-line utility to aggregate data
from the Plex ecosystem into InfluxDB. Examples use Grafana for a
frontend.

Requirements w/ install links: [Grafana](http://docs.grafana.org/installation/), [Python3](https://www.python.org/downloads/), [InfluxDB](https://docs.influxdata.com/influxdb/v1.5/introduction/installation/)

<center><img width="800" src="https://i.imgur.com/av8e0HP.png"></center>
<p align="center">
<img width="800" src="https://i.imgur.com/av8e0HP.png">
</p>

## Quick Setup
1. Install requirements `pip3 install -r requirements.txt`
1. Make a copy of `configuration.example.py` to `configuration.py`
2. Make the appropriate changes to `configuration.py`
1. Create your plex database in influx
```sh
user@server: ~$ influx
> CREATE DATABASE plex
> quit
```
1. After completing the [getting started](http://docs.grafana.org/guides/getting_started/) portion of grafana, create your datasource for influxdb. At a minimum, you will need the plex database.
## Quick Setup (Varken Alpha)
1. Clone the repository `sudo git clone https://github.com/Boerderij/Varken.git /opt/Varken`
1. Follow the systemd install instructions located in `varken.systemd`
1. Create venv in project `cd /opt/Varken && /usr/bin/python3 -m venv varken-venv`
1. Install requirements `/opt/Varken/varken-venv/bin/python -m pip install -r requirements.txt`
1. Make a copy of `varken.example.ini` to `varken.ini` in the `data` folder
`cp /opt/Varken/data/varken.example.ini /opt/Varken/data/varken.ini`
1. Make the appropriate changes to `varken.ini`
i.e. `nano /opt/Varken/data/varken.ini`
1. Make sure all the files have the appropriate permissions `sudo chown varken:varken -R /opt/Varken`
1. After completing the [getting started](http://docs.grafana.org/guides/getting_started/) portion of grafana, create your datasource for influxdb.
1. Install `grafana-cli plugins install grafana-worldmap-panel`
1. Click the + on your menu and click import. Using the .json provided in this repo, paste it in and customize as you like.


1. TODO:: Click the + on your menu and click import. Using the .json provided in this repo, paste it in and customize as you like.
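The `varken.ini` edited in the steps above is read by this PR's new `INIParser` (`varken/iniparser.py`). As a rough sketch of that pattern using the standard-library `configparser`: the section and key names below are illustrative assumptions modeled on `varken.example.ini`, not its authoritative schema.

```python
import configparser
import tempfile
from pathlib import Path

# Hypothetical config text; Varken's real varken.example.ini defines
# the authoritative sections and keys.
EXAMPLE_INI = """
[influxdb]
url = influxdb.domain.tld
port = 8086

[sonarr-1]
url = sonarr.domain.tld
queue = true
queue_run_seconds = 300
"""

# Stand-in for the data folder that INIParser is pointed at.
data_folder = Path(tempfile.mkdtemp())
(data_folder / "varken.ini").write_text(EXAMPLE_INI)

config = configparser.ConfigParser()
config.read(data_folder / "varken.ini")

# getboolean/getint handle the string-to-type conversion that a
# parser like INIParser needs before building its server objects.
queue_enabled = config.getboolean("sonarr-1", "queue")
run_seconds = config.getint("sonarr-1", "queue_run_seconds")
print(queue_enabled, run_seconds)  # -> True 300
```

The numbered-section style (`sonarr-1`, `sonarr-2`, ...) is what makes the PR's "multiple servers" commits possible: each server gets its own section under a shared prefix.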

### Docker

Repo is included in [si0972/grafana-scripts](https://github.com/si0972/grafana-scripts-docker)
Repo is included in [si0972/grafana-scripts-docker](https://github.com/si0972/grafana-scripts-docker/tree/varken)

<details><summary>Example</summary>
<p>
@@ -32,84 +36,8 @@ Repo is included in [si0972/grafana-s
docker create \
--name=grafana-scripts \
-v <path to data>:/Scripts \
-e plex=true \
-e PGID=<gid> -e PUID=<uid> \
si0972/grafana-scripts:latest
si0972/grafana-scripts:varken
```
</p>
</details>




## Scripts
### `sonarr.py`
Gathers data from Sonarr and pushes it to influxdb.

```
Script to aid in data gathering from Sonarr

optional arguments:
-h, --help show this help message and exit
--missing Get all missing TV shows
--missing_days MISSING_DAYS
Get missing TV shows in past X days
--upcoming Get upcoming TV shows
--future FUTURE Get TV shows on X days into the future. Includes today.
i.e. --future 2 is Today and Tomorrow
--queue Get TV shows in queue
```
- Notes:
- You cannot stack the arguments. ie. `sonarr.py --missing --queue`
- One argument must be supplied

### `radarr.py`
Gathers data from Radarr and pushes it to influxdb

```
Script to aid in data gathering from Radarr

optional arguments:
-h, --help show this help message and exit
--missing Get missing movies
--missing_avl Get missing available movies
--queue Get movies in queue
```
- Notes:
- You cannot stack the arguments. ie. `radarr.py --missing --queue`
- One argument must be supplied
- `--missing_avl` Refers to how Radarr has determined if the movie should be available to download. The easy way to determine if the movie will appear on this list is if the movie has a <span style="color:red">RED "Missing"</span> tag associated with that movie. <span style="color:blue">BLUE "Missing"</span> tag refers to a movie that is missing but is not available for download yet. These tags are determined by your "Minimum Availability" settings for that movie.

### `ombi.py`
Gathers data from Ombi and pushes it to influxdb

```
Script to aid in data gathering from Ombi

optional arguments:
-h, --help show this help message and exit
--total Get the total count of all requests
--counts Get the count of pending, approved, and available requests
```
- Notes:
- You cannot stack the arguments. ie. `ombi.py --total --counts`
- One argument must be supplied

### `tautulli.py`
Gathers data from Tautulli and pushes it to influxdb. On initial run it will download the geoip2 DB and use it for locations.

## Notes
To run the python scripts crontab is currently leveraged. Examples:
```sh
### Modify paths as appropriate. python3 is located in different places for different users. (`which python3` will give you the path)
### to edit your crontab entry, do not modify /var/spool/cron/crontabs/<user> directly, use `crontab -e`
### Crontabs require an empty line at the end or they WILL not run. Make sure to have 2 lines to be safe
### It is bad practice to run any cronjob more than once a minute. For timing help: https://crontab.guru/
* * * * * /usr/bin/python3 /path-to-grafana-scripts/ombi.py --total
* * * * * /usr/bin/python3 /path-to-grafana-scripts/tautulli.py
* * * * * /usr/bin/python3 /path-to-grafana-scripts/radarr.py --queue
* * * * * /usr/bin/python3 /path-to-grafana-scripts/sonarr.py --queue
*/30 * * * * /usr/bin/python3 /path-to-grafana-scripts/radarr.py --missing
*/30 * * * * /usr/bin/python3 /path-to-grafana-scripts/sonarr.py --missing
*/30 * * * * /usr/bin/python3 /path-to-grafana-scripts/sickrage.py
```
104 changes: 104 additions & 0 deletions Varken.py
@@ -0,0 +1,104 @@
import schedule
import threading
import sys
import platform
import distro

from sys import exit
from time import sleep
from os import access, R_OK
from os.path import isdir, abspath, dirname, join
from argparse import ArgumentParser, RawTextHelpFormatter

from varken.iniparser import INIParser
from varken.sonarr import SonarrAPI
from varken.tautulli import TautulliAPI
from varken.radarr import RadarrAPI
from varken.ombi import OmbiAPI
from varken.dbmanager import DBManager
from varken.varkenlogger import VarkenLogger

PLATFORM_LINUX_DISTRO = ' '.join(x for x in distro.linux_distribution() if x)


def threaded(job):
    thread = threading.Thread(target=job)
    thread.start()


if __name__ == "__main__":
    parser = ArgumentParser(prog='varken',
                            description='Command-line utility to aggregate data from the plex ecosystem into InfluxDB',
                            formatter_class=RawTextHelpFormatter)

    parser.add_argument("-d", "--data-folder", help='Define an alternate data folder location')
    parser.add_argument("-D", "--debug", action='store_true', help='Use to enable DEBUG logging')

    opts = parser.parse_args()

    DATA_FOLDER = abspath(join(dirname(__file__), 'data'))

    if opts.data_folder:
        ARG_FOLDER = opts.data_folder

        if isdir(ARG_FOLDER):
            DATA_FOLDER = ARG_FOLDER
            if not access(ARG_FOLDER, R_OK):
                exit("Read permission error for {}".format(ARG_FOLDER))
        else:
            exit("{} does not exist".format(ARG_FOLDER))

    # Initiate the logger
    vl = VarkenLogger(data_folder=DATA_FOLDER, debug=opts.debug)
    vl.logger.info('Starting Varken...')

    vl.logger.info(u"{} {} ({}{})".format(
        platform.system(), platform.release(), platform.version(),
        ' - {}'.format(PLATFORM_LINUX_DISTRO) if PLATFORM_LINUX_DISTRO else ''
    ))
    vl.logger.info(u"Python {}".format(sys.version))

    CONFIG = INIParser(DATA_FOLDER)
    DBMANAGER = DBManager(CONFIG.influx_server)

    if CONFIG.sonarr_enabled:
        for server in CONFIG.sonarr_servers:
            SONARR = SonarrAPI(server, DBMANAGER)
            if server.queue:
                schedule.every(server.queue_run_seconds).seconds.do(threaded, SONARR.get_queue)
            if server.missing_days > 0:
                schedule.every(server.missing_days_run_seconds).seconds.do(threaded, SONARR.get_missing)
            if server.future_days > 0:
                schedule.every(server.future_days_run_seconds).seconds.do(threaded, SONARR.get_future)

    if CONFIG.tautulli_enabled:
        for server in CONFIG.tautulli_servers:
            TAUTULLI = TautulliAPI(server, DBMANAGER)
            if server.get_activity:
                schedule.every(server.get_activity_run_seconds).seconds.do(threaded, TAUTULLI.get_activity)

    if CONFIG.radarr_enabled:
        for server in CONFIG.radarr_servers:
            RADARR = RadarrAPI(server, DBMANAGER)
            if server.get_missing:
                schedule.every(server.get_missing_run_seconds).seconds.do(threaded, RADARR.get_missing)
            if server.queue:
                schedule.every(server.queue_run_seconds).seconds.do(threaded, RADARR.get_queue)

    if CONFIG.ombi_enabled:
        for server in CONFIG.ombi_servers:
            OMBI = OmbiAPI(server, DBMANAGER)
            if server.request_type_counts:
                schedule.every(server.request_type_run_seconds).seconds.do(threaded, OMBI.get_request_counts)
            if server.request_total_counts:
                schedule.every(server.request_total_run_seconds).seconds.do(threaded, OMBI.get_total_requests)

    # Run all on startup
    SERVICES_ENABLED = [CONFIG.ombi_enabled, CONFIG.radarr_enabled, CONFIG.tautulli_enabled, CONFIG.sonarr_enabled]
    if not [enabled for enabled in SERVICES_ENABLED if enabled]:
        exit("All services disabled. Exiting")
    schedule.run_all()

    while True:
        schedule.run_pending()
        sleep(1)
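Varken.py's main loop above registers interval jobs, wraps each run in a thread, fires everything once at startup, then polls `schedule.run_pending()` every second. The same pattern can be sketched with only the standard library; this is an illustrative reduction for readers unfamiliar with the `schedule` package, not Varken's actual scheduler:

```python
import threading
import time


def threaded(job):
    # Run the job in its own thread so a slow task (e.g. an API call)
    # never blocks the polling loop; mirrors Varken.py's helper.
    t = threading.Thread(target=job)
    t.start()
    return t


class Job:
    """A fixed-interval job, analogous to schedule.every(n).seconds.do(...)."""

    def __init__(self, interval_seconds, func):
        self.interval = interval_seconds
        self.func = func
        self.next_run = time.monotonic()  # due immediately, like run_all()

    def run_pending(self):
        # Start the job if its interval has elapsed; otherwise do nothing.
        if time.monotonic() >= self.next_run:
            self.next_run = time.monotonic() + self.interval
            return threaded(self.func)
        return None


results = []
jobs = [Job(0.05, lambda: results.append("tick"))]

threads = []
deadline = time.monotonic() + 0.25  # bounded loop for the demo; Varken loops forever
while time.monotonic() < deadline:
    for job in jobs:
        started = job.run_pending()
        if started:
            threads.append(started)
    time.sleep(0.01)

for t in threads:
    t.join()
print(len(results))  # several ticks; at least 2 in 0.25 s at a 0.05 s interval
```

The thread-per-run design is why a stalled Sonarr or Tautulli endpoint delays only its own metric, not the whole collection cycle.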
49 changes: 0 additions & 49 deletions configuration.example.py

This file was deleted.

11 changes: 0 additions & 11 deletions crontabs

This file was deleted.
