Google+ Archive Viewer
- Python 3.7+
- If you use macOS, make sure you use the Homebrew version of Python 3
- Postgres
DATABASE_URL
- Mandatory Postgres URL. For development, it should be postgres://admin:password@localhost:5432/gpav
SECRET_KEY
- Mandatory Django secret key
DEBUG
- Optional Django debug flag. By default false
SHOW_PRIVATE_POSTS
- Optional. Whether to show private (non-Public in Google+) posts. By default false
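For local development, a .env along these lines should cover the variables above. The DATABASE_URL matches the development value given earlier; the SECRET_KEY is a placeholder (generate your own long random string), and DEBUG=true is simply a convenient choice while developing:
DATABASE_URL=postgres://admin:password@localhost:5432/gpav
SECRET_KEY=<long random string>
DEBUG=true
SHOW_PRIVATE_POSTS=false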
- Initialize virtualenv and install dependencies
python3 -m venv venv
./venv/bin/pip install -r requirements.txt
- Create the required Postgres database
createdb gpav
psql gpav
# create user admin password 'password';
# grant all privileges on database gpav to admin;
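To verify the database and credentials before continuing, you can connect with the same URL the app will use (this assumes the development DATABASE_URL shown above):
psql postgres://admin:password@localhost:5432/gpav -c '\conninfo'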
- Copy a development environment file
cp .example.env .env
- Perform the first migration
python manage.py migrate
- Create an admin user for Django admin
python manage.py createsuperuser
- Enter virtualenv
source venv/bin/activate
- Run server
python manage.py runserver
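By default the development server listens on http://127.0.0.1:8000/, and the Django admin (log in with the superuser created above) is at http://127.0.0.1:8000/admin/.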
- Set up S3 bucket
- Create a bucket, don't block public access, and use the following policy:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "PublicRead",
            "Effect": "Allow",
            "Principal": "*",
            "Action": [
                "s3:GetObject",
                "s3:GetObjectVersion"
            ],
            "Resource": "arn:aws:s3:::<bucket name>/*"
        }
    ]
}
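If you prefer the AWS CLI to the console, a sketch like the following creates the bucket and applies the policy above. It assumes the policy is saved as bucket-policy.json and the bucket lives in us-east-1; other regions also need --create-bucket-configuration LocationConstraint=<region> on create-bucket:
aws s3api create-bucket --bucket <bucket name>
aws s3api put-public-access-block --bucket <bucket name> --public-access-block-configuration BlockPublicAcls=false,IgnorePublicAcls=false,BlockPublicPolicy=false,RestrictPublicBuckets=false
aws s3api put-bucket-policy --bucket <bucket name> --policy file://bucket-policy.json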
- Create an IAM user with Programmatic Access enabled, and grant it permission with the following policy:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObjectAcl",
                "s3:GetObject",
                "s3:ListBucket",
                "s3:DeleteObject",
                "s3:PutObjectAcl"
            ],
            "Resource": [
                "arn:aws:s3:::<bucket name>/*",
                "arn:aws:s3:::<bucket name>"
            ]
        }
    ]
}
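The IAM side can be scripted the same way; this is a sketch assuming the policy above is saved as iam-policy.json, and the user name gpav-uploader and policy name gpav-s3 are placeholders:
aws iam create-user --user-name gpav-uploader
aws iam put-user-policy --user-name gpav-uploader --policy-name gpav-s3 --policy-document file://iam-policy.json
aws iam create-access-key --user-name gpav-uploader
The last command prints the AccessKeyId and SecretAccessKey to use for the AWS_* env vars.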
- Fill in the env vars in .env (copied from .example.env earlier)
- Delete all data in the database
python manage.py delete_all
- Perform migrations
python manage.py migrate
- Import data
python manage.py import <path/to/archive/directory>
- Set up an S3 bucket (same as the S3 bucket setup in the import steps above)
- Add the following config vars to Heroku:
AWS_STORAGE_BUCKET_NAME
AWS_SECRET_ACCESS_KEY
AWS_S3_REGION_NAME
AWS_ACCESS_KEY_ID
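These can be set in one command with the Heroku CLI; the values are placeholders for your bucket, its region, and the IAM access key created earlier:
heroku config:set AWS_STORAGE_BUCKET_NAME=<bucket name> AWS_S3_REGION_NAME=<region> AWS_ACCESS_KEY_ID=<access key id> AWS_SECRET_ACCESS_KEY=<secret access key>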