add dockerized wiki configuration #117

Draft · wants to merge 28 commits into base: master
3 changes: 3 additions & 0 deletions .gitignore
@@ -33,3 +33,6 @@ nubis/terraform/.terraform
nubis/terraform/terraform.tfvars
nubis/builder/artifacts/*-dev/
nubis/builder/artifacts/AMIs.json

# Database dump for local development
wiki.sql*
37 changes: 37 additions & 0 deletions gcp/Dockerfile
@@ -0,0 +1,37 @@
# Ensure the version numbers after the "1." match here (1.39 and MWIKI_VER=39).
FROM docker.io/mediawiki:1.39
ENV MWIKI_VER=39
WORKDIR /var/www/html/

ARG UID=10001
ARG GID=10001

RUN apt-get update && apt-get install -y --no-install-recommends ffmpeg unzip

# Prepare for nonroot user
RUN groupadd -g $GID app; \
useradd -g $GID -u $UID -m -s /usr/sbin/nologin app; \
chown -R app:app /var/www/html/

USER app

RUN git clone --depth 1 --single-branch --branch REL1_${MWIKI_VER} https://gerrit.wikimedia.org/r/mediawiki/extensions/ConfirmAccount /var/www/html/extensions/ConfirmAccount
RUN git clone --depth 1 --single-branch --branch REL1_${MWIKI_VER} https://gerrit.wikimedia.org/r/mediawiki/extensions/LabeledSectionTransclusion /var/www/html/extensions/LabeledSectionTransclusion
RUN git clone --depth 1 --single-branch --branch REL1_${MWIKI_VER} https://gerrit.wikimedia.org/r/mediawiki/extensions/TimedMediaHandler /var/www/html/extensions/TimedMediaHandler
RUN git clone --depth 1 --single-branch --branch REL1_${MWIKI_VER} https://gerrit.wikimedia.org/r/mediawiki/extensions/RSS /var/www/html/extensions/RSS
RUN git clone --depth 1 --single-branch --branch REL1_${MWIKI_VER} https://gerrit.wikimedia.org/r/mediawiki/extensions/PageForms /var/www/html/extensions/PageForms
RUN git clone --depth 1 --single-branch --branch REL1_${MWIKI_VER} https://gerrit.wikimedia.org/r/mediawiki/extensions/UrlGetParameters /var/www/html/extensions/UrlGetParameters
RUN git clone --depth 1 --single-branch --branch REL1_${MWIKI_VER} https://gerrit.wikimedia.org/r/mediawiki/extensions/NoTitle /var/www/html/extensions/NoTitle
RUN git clone --depth 1 --single-branch --branch REL1_${MWIKI_VER} https://gerrit.wikimedia.org/r/mediawiki/extensions/Widgets /var/www/html/extensions/Widgets
RUN git clone --depth 1 --single-branch --branch REL1_${MWIKI_VER} https://gerrit.wikimedia.org/r/mediawiki/extensions/MobileFrontend /var/www/html/extensions/MobileFrontend
RUN git clone --depth 1 --single-branch --branch main https://github.com/mozilla/mediawiki-bugzilla.git /var/www/html/extensions/Bugzilla

RUN php -r "copy('https://getcomposer.org/installer', 'composer-setup.php');" && \
php -r "if (hash_file('sha384', 'composer-setup.php') === 'dac665fdc30fdd8ec78b38b9800061b4150413ff2e3b6f88543c636f7cd84f6db9189d43a81e5503cda447da73c7e5b6') { echo 'Installer verified'; } else { echo 'Installer corrupt'; unlink('composer-setup.php'); } echo PHP_EOL;" && \
php composer-setup.php && \
php -r "unlink('composer-setup.php');"

COPY composer.local.json /var/www/html/composer.local.json
COPY ports.conf /etc/apache2/ports.conf
COPY LocalSettings.php /var/www/html/LocalSettings.php
RUN php composer.phar update --no-dev
562 changes: 562 additions & 0 deletions gcp/LocalSettings.php

Large diffs are not rendered by default.

45 changes: 45 additions & 0 deletions gcp/README.md
@@ -0,0 +1,45 @@
# wikimo test

## "Runbook"

Use `docker compose up --build mediawiki` after changing the version in `Dockerfile`.

To migrate the DB dump to the latest version, follow this path:

- Switch the version in `Dockerfile` to `mediawiki:1.35` and update `MWIKI_VER` to the matching minor version (`35`).
- In `composer.local.json`, change `"wikimedia/at-ease": "v2.1.0"` to `"wikimedia/at-ease": "v2.0.0"`.
- Run `docker compose exec mediawiki php maintenance/update.php`.

- Switch the version in `Dockerfile` back to `mediawiki:1.39` and update `MWIKI_VER` to the matching minor version (`39`).
- Change `"wikimedia/at-ease": "v2.0.0"` back to `"wikimedia/at-ease": "v2.1.0"`.
- Run `docker compose exec mediawiki php maintenance/update.php`.
- Run `docker compose exec mediawiki php extensions/SemanticMediaWiki/maintenance/populateHashField.php`.
- Run `docker compose exec mediawiki php extensions/SemanticMediaWiki/maintenance/rebuildData.php -v --with-maintenance-log`. This takes a while and may not be needed; we should clean up spam before running it.

### Notes
- Switch the version in `Dockerfile` to `mediawiki:1.xx` and update `MWIKI_VER` to the matching minor version.
- Run `docker compose exec mediawiki php maintenance/run.php update.php`

## Cleanup Scripts
Scripts that can be run on MediaWiki >= 1.27:
```shell
docker compose exec mediawiki php maintenance/deleteArchivedRevisions.php --delete
docker compose exec mediawiki php maintenance/removeUnusedAccounts.php --delete
```

## Migration Plan
1. Put the AWS Wiki into maintenance.
2. Dump the DB.
3. Import DB and images into GCP environment.
4. Deploy a 1.35 build to GCP.
5. Run `php maintenance/update.php` with 1.35.
6. Deploy a 1.39 build to GCP.
7. Run `php maintenance/update.php` with 1.39.
8. Run `php maintenance/deleteArchivedRevisions.php --delete`.
9. Run `php maintenance/removeUnusedAccounts.php --delete`.
10. Run `php extensions/SemanticMediaWiki/maintenance/populateHashField.php`.
11. Run `php extensions/SemanticMediaWiki/maintenance/rebuildData.php -v --with-maintenance-log`.
12. Check GCP version of wiki.
13. Update DNS to point to GCP Wiki.
14. Remove maintenance mode on GCP Wiki.
15. Clean up AWS Wiki (and optionally Nubis).
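The numbered plan above could be scripted roughly as follows. This is a hedged sketch, not the actual migration script: the service name `mediawiki` and dump filename `wiki.sql` come from the compose files in this PR, while `OLD_DB_HOST` and the credentials are placeholders.

```shell
#!/usr/bin/env bash
# Sketch of the migration steps; adjust hosts, names, and paths before use.
set -euo pipefail

# Step 2: dump the DB on the AWS side (host and credentials are placeholders).
mysqldump -h OLD_DB_HOST -u user -p db > wiki.sql

# Steps 5/7: run the schema update inside the running mediawiki container
# (once per deployed version, 1.35 then 1.39).
docker compose exec mediawiki php maintenance/update.php

# Steps 8-9: cleanup.
docker compose exec mediawiki php maintenance/deleteArchivedRevisions.php --delete
docker compose exec mediawiki php maintenance/removeUnusedAccounts.php --delete

# Steps 10-11: Semantic MediaWiki data rebuild.
docker compose exec mediawiki php extensions/SemanticMediaWiki/maintenance/populateHashField.php
docker compose exec mediawiki php extensions/SemanticMediaWiki/maintenance/rebuildData.php -v --with-maintenance-log
```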
21 changes: 21 additions & 0 deletions gcp/composer.local.json
@@ -0,0 +1,21 @@
{
"config": {
"allow-plugins": true
},
"require": {
"mediawiki/sub-page-list": "~3.0",
"mediawiki/semantic-media-wiki": "~4.1",
"mediawiki/semantic-result-formats": "~4.2",
"mediawiki/semantic-watchlist": "~1.0",
"wikimedia/normalized-exception": "v1.0.1",
"wikimedia/at-ease": "v2.1.0"
},
"extra": {
"merge-plugin": {
"include": [
"extensions/TimedMediaHandler/composer.json",
"extensions/Widgets/composer.json"
]
}
}
}
24 changes: 24 additions & 0 deletions gcp/docker-compose.yaml
@@ -0,0 +1,24 @@
services:
mediawiki:
build: .
ports:
- 8080:80
volumes:
- ./images:/var/www/html/images
- ./LocalSettings.php:/var/www/html/LocalSettings.php
- ../mediawiki-bugzilla/:/var/www/html/extensions/Bugzilla # can be removed once our changes are upstream
db:
image: docker.io/mariadb:11
restart: always
environment:
MYSQL_DATABASE: 'db'
MYSQL_USER: 'user'
MYSQL_PASSWORD: 'password'
MYSQL_ROOT_PASSWORD: 'password'
ports:
- '127.0.0.1:3306:3306'
volumes:
- data:/var/lib/mysql
- ./wiki.sql:/docker-entrypoint-initdb.d/wiki.sql
volumes:
data:
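For local development, the compose file above can be exercised like this (a sketch assuming a `wiki.sql` dump sits next to the compose file, per the volume mount):

```shell
# Start the stack. MariaDB loads ./wiki.sql only on first initialization of
# the "data" volume; docker-entrypoint-initdb.d scripts are skipped when the
# datadir already exists, so run `docker compose down -v` to re-import.
docker compose up --build -d

# Follow the DB logs until the import finishes.
docker compose logs -f db

# The wiki should then answer on http://localhost:8080/.
curl -I http://localhost:8080/
```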
23 changes: 23 additions & 0 deletions gcp/podman-compose.yaml
@@ -0,0 +1,23 @@
services:
mediawiki:
image: docker.io/mediawiki:1.39
ports:
- 8080:80
volumes:
- /var/www/html/images
- ./LocalSettings.php:/var/www/html/LocalSettings.php
db:
image: docker.io/mariadb:11
restart: always
environment:
MYSQL_DATABASE: 'db'
MYSQL_USER: 'user'
MYSQL_PASSWORD: 'password'
MYSQL_ROOT_PASSWORD: 'password'
ports:
- '127.0.0.1:3306:3306'
volumes:
- data:/var/lib/mysql
- ./wiki.sql:/docker-entrypoint-initdb.d/wiki.sql
volumes:
data:
13 changes: 13 additions & 0 deletions gcp/ports.conf
@@ -0,0 +1,13 @@
# If you just change the port or add more ports here, you will likely also
# have to change the VirtualHost statement in
# /etc/apache2/sites-enabled/000-default.conf

Listen 8000

<IfModule ssl_module>
Listen 443
</IfModule>

<IfModule mod_gnutls.c>
Listen 443
</IfModule>
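As the comment at the top of this file notes, changing the port also means updating the VirtualHost. A minimal sketch of the matching stanza in `/etc/apache2/sites-enabled/000-default.conf` — the `DocumentRoot` and log directives here are the Debian image defaults, not taken from this PR:

```
<VirtualHost *:8000>
    DocumentRoot /var/www/html

    ErrorLog ${APACHE_LOG_DIR}/error.log
    CustomLog ${APACHE_LOG_DIR}/access.log combined
</VirtualHost>
```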
1 change: 1 addition & 0 deletions gcp/test/.ruby-version
@@ -0,0 +1 @@
3.3.1
7 changes: 7 additions & 0 deletions gcp/test/Gemfile
@@ -0,0 +1,7 @@
# frozen_string_literal: true

source 'https://rubygems.org'

gem 'nokogiri', '~> 1.16'

gem 'typhoeus', '~> 1.4'
41 changes: 41 additions & 0 deletions gcp/test/Gemfile.lock
@@ -0,0 +1,41 @@
GEM
remote: https://rubygems.org/
specs:
ethon (0.16.0)
ffi (>= 1.15.0)
ffi (1.17.0-aarch64-linux-gnu)
ffi (1.17.0-arm-linux-gnu)
ffi (1.17.0-arm64-darwin)
ffi (1.17.0-x86-linux-gnu)
ffi (1.17.0-x86_64-darwin)
ffi (1.17.0-x86_64-linux-gnu)
nokogiri (1.16.7-aarch64-linux)
racc (~> 1.4)
nokogiri (1.16.7-arm-linux)
racc (~> 1.4)
nokogiri (1.16.7-arm64-darwin)
racc (~> 1.4)
nokogiri (1.16.7-x86-linux)
racc (~> 1.4)
nokogiri (1.16.7-x86_64-darwin)
racc (~> 1.4)
nokogiri (1.16.7-x86_64-linux)
racc (~> 1.4)
racc (1.8.1)
typhoeus (1.4.1)
ethon (>= 0.9.0)

PLATFORMS
aarch64-linux
arm-linux
arm64-darwin
x86-linux
x86_64-darwin
x86_64-linux

DEPENDENCIES
nokogiri (~> 1.16)
typhoeus (~> 1.4)

BUNDLED WITH
2.5.9
50 changes: 50 additions & 0 deletions gcp/test/test.rb
@@ -0,0 +1,50 @@
# frozen_string_literal: true

require 'typhoeus'
require 'nokogiri'

BASE_URL_PROD = 'https://wiki.mozilla.org'
BASE_URL_TEST = 'http://localhost:8080/index.php'

INSPECT_ELEMENT = '#content'

TEST_PAGES_COUNT = 1000
TEST_KEYWORDS = ['Warning: ', 'Error: '].freeze

def fetch_page(url)
response = Typhoeus.get(url, followlocation: true)
url = response.effective_url
html = response.response_body
code = response.response_code

puts "WARN: #{url} Status code is #{code}" unless code == 200

[html, url]
end

def parse_html(html)
parsed_data = Nokogiri::HTML.parse(html)

parsed_data.css(INSPECT_ELEMENT).inner_text
end

def pages_equal?(prod_page, test_page, keyword)
  prod_has_keyword = prod_page.include?(keyword)
  test_has_keyword = test_page.include?(keyword)

  # `url` is not in scope here, so report the mismatched keyword instead.
  match = prod_has_keyword == test_has_keyword
  puts "NOK: keyword #{keyword.inspect} differs (prod: #{prod_has_keyword})" unless match
  match
end

TEST_PAGES_COUNT.times do |i|
html, url = fetch_page("#{BASE_URL_PROD}/Special:Random")
prod_page = parse_html(html)

html, = fetch_page(url.gsub(BASE_URL_PROD, BASE_URL_TEST))
test_page = parse_html(html)

print "\r#{i}"

TEST_KEYWORDS.each do |keyword|
pages_equal?(prod_page, test_page, keyword)
end
end