This is the standard Plone buildout of the company Starzel.de.
Contents
- It extends config and version files on GitHub that are shared by all projects using the same version of Plone.
- It allows you to update a project simply by changing the version it extends.
- It allows you to update all projects of one version by changing the remote files (very useful for hotfixes).
- It takes minimal work to set up a new project.
- It has presets for development, testing, staging and production.
- It has all the nice development helpers we use.
$ git clone https://github.com/starzel/buildout SOME_PROJECT
$ cd SOME_PROJECT
Remove all files that are not needed for a project but are only used for the buildout itself.
$ rm -rf linkto README.rst README.txt .travis.yml secret.cfg_tmpl VERSION.txt local_coredev.cfg CHANGES.rst
If you're not developing the buildout itself, you will want to create a new git repo.
$ rm -rf .git && git init
Add a file that contains a password. Do not use admin
as the password in production!
$ echo -e "[buildout]\nlogin = admin\npassword = admin" > secret.cfg
Symlink to the file that best fits your local environment. At first that is usually development; later you can use production or test. This buildout only uses local.cfg and ignores all local_*.cfg files.
$ ln -s local_develop.cfg local.cfg
Create a virtualenv in Python 2.7 or Python 3.7 (Plone 5.2 only).
$ virtualenv . # for Python 2.7
$ python3.7 -m venv . # for Python 3 (Plone 5.2 only)
Install and configure Plone
$ ./bin/pip install -r requirements.txt
$ ./bin/buildout
Install git pre-commit hooks
$ ./bin/pre-commit install
buildout.cfg
- This contains the project settings (name, addons, checkouts etc.).
local.cfg
- For each environment (development, production, test) there is a separate local_*.cfg file. You create a symlink called local.cfg to one of these files depending on your environment. Each of these files includes the remote base.cfg that is hosted on GitHub like this:

  extends = https://raw.githubusercontent.com/starzel/buildout/5.1/linkto/base.cfg

  This example refers to the tag 5.1 of this buildout, which uses Plone 5.1. To use a different Plone version, simply change that to point to a different tag.
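For illustration, a local_*.cfg typically begins with such an extends line, and upgrading Plone then only means changing the tag (a sketch; the comment lines and the 5.2 example are assumptions, the URL pattern is taken from the text above):

```ini
[buildout]
# pin this project to the 5.1 tag of the Starzel buildout (Plone 5.1)
extends = https://raw.githubusercontent.com/starzel/buildout/5.1/linkto/base.cfg

# to upgrade, point to a different tag instead, e.g.:
# extends = https://raw.githubusercontent.com/starzel/buildout/5.2/linkto/base.cfg
```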
base.cfg
This remote file contains most of the commonly used logic for production. It also includes two version files that are also hosted on GitHub:
- pinned_versions.cfg: Pins the Plone version using http://dist.plone.org/release/5.1/versions.cfg
- floating_versions.cfg: Pins all commonly used addons of this buildout.
pinned_versions_project.cfg
- Here you pin versions to overwrite or extend the hosted pinned_versions.cfg. These eggs are usually pinned for a reason and are usually not safe to upgrade.

floating_versions_project.cfg
- Here you overwrite and extend the hosted floating_versions.cfg. These eggs should usually be safe to upgrade. ./bin/checkversions floating_versions_project.cfg will check PyPI for newer releases of your pinned eggs.
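For example, floating_versions_project.cfg is a plain [versions] section; the egg names and version numbers below are purely illustrative:

```ini
[versions]
# illustrative pins; replace with the addons your project actually uses
collective.easyform = 2.1.0
plone.restapi = 7.0.0
```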
We support the following versions of Plone:
- 5.2.3 (Python 2 and 3)
- 5.2.2 (Python 2 and 3)
- 5.2.1 (Python 2 and 3)
- 5.2 (Python 2 and 3)
- 5.1.7
- 5.1.6
- 5.1.5
- 5.1.4
- 5.1.3
- 5.1.2
- 5.1.1
- 5.1
- 5.0.10
- 5.0.9
- 5.0.8
- 5.0.7
- 5.0.6
- 5.0.5
- 5.0.4
- 5.0.3
- 5.0.2
- 5.0
- 4.3.20
- 4.3.19
- 4.3.18
- 4.3.17
- 4.3.15
- 4.3.14
- 4.3.11
- 4.3.10
- 4.3.9
- 4.3.8
- 4.3.7
- 4.3.6
- 4.3.4
- 4.3.3
- 4.3.2
- 4.3.1
- 4.3
- 4.2.7
- 4.2.5
- 4.2.3
- 4.2.2
To use this buildout with a version of Plone that is currently in development (a.k.a. the coredev), use local_coredev.cfg for Plone 6.0.x (https://github.com/plone/buildout.coredev/tree/6.0).
Please note that new features are not always introduced to old versions.
- Tags for development versions (alpha, beta and rc) will exist but will be removed after the final release of that version.
buildout.cfg contains the general project settings. Here you configure the name of the project, the eggs, the source checkouts and the languages Plone will use.
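As a sketch, the project settings could look like this. The eggs, auto-checkout and [sources] options are standard buildout/mr.developer syntax; the egg names and the repository URL are assumptions, so check the shipped buildout.cfg for the real option names:

```ini
[buildout]
extends = local.cfg

eggs +=
    collective.easyform

# mr.developer source checkouts
auto-checkout +=
    my.addon

[sources]
# hypothetical addon developed in src/
my.addon = git https://github.com/example/my.addon.git
```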
Symlink to the development-config:
$ ln -s local_develop.cfg local.cfg
The development setup will build a simple instance with some useful tools (see below). The setup assumes that ZEO, Varnish and load balancing are only configured on production.
Install git pre-commit hooks using the pre-commit tool that was installed via requirements.txt:
$ ./bin/pre-commit install
Symlink to the production-config:
$ ln -s local_production.cfg local.cfg
An average project could use this stack:
nginx > varnish > nginx (for load-balancing) > at least 2 zeoclients > zeoserver
In local_production.cfg, select the parts you really need:
parts +=
${buildout:zeo-multi-parts}
${buildout:varnish}
${buildout:supervisor-parts}
${buildout:cron-parts}
logrotate
precompiler
Also modify templates/supervisord.conf
to have supervisor manage the parts you want to use.
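A program entry in templates/supervisord.conf might look like this (a sketch; the program name and the path are assumptions and depend on the parts you selected):

```ini
[program:zeoclient1]
command = /path/to/buildout/bin/zeoclient1 console
autostart = true
autorestart = true
```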
Frontend webserver (Nginx)
The first Nginx manages the virtualhost, does url rewrites if needed and terminates the SSL (needs to be done before varnish).
A full demo-config with ssl and redirects can be found in this repo in /templates/demo_nginx.conf.
A minimal config without ssl can be found in the demo.plone.de project.
More information can also be found in the PloneDocs
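A minimal sketch of such a frontend vhost, assuming Varnish listens on port 6081 and the Plone site is called Plone (the domain, port and site id are all assumptions):

```nginx
server {
    listen 80;
    server_name example.com;

    location / {
        # rewrite into the Plone site via the VirtualHostMonster, then pass to Varnish
        rewrite ^(.*)$ /VirtualHostBase/http/example.com:80/Plone/VirtualHostRoot$1 break;
        proxy_pass http://127.0.0.1:6081;
    }
}
```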
Cache (Varnish)
After nginx we use varnish to cache the site. You can activate it like this:
# comment out what you need
parts +=
    # Choose one!
    ...
    varnish4-config
    ...
Take a look at the varnish4-config part in linkto/base.cfg; there are several switches to configure. It is best practice to install Varnish from your distribution's repository. If this is not possible you can build it; see the section below.

If you use the system Varnish you only need the [varnish4-config] part, which will generate the config (VCL) for you. In /etc/varnish/default.vcl include the generated VCL:

vcl 4.0;
include "<path to your buildout>/etc/varnish4.vcl";
A systemctl restart varnish should activate the new config. To use one Varnish installation with several vhosts, see the "Varnish with multiple sites" section below.

Loadbalancer (Nginx)
Another Nginx spreads the requests over several zeoclients. Here is a minimal config; for production you can look at the demo.plone.de project:
# starzel (zeoclients)
upstream starzel_zeoclients {
    ip_hash;
    server 127.0.0.1:8082;
    server 127.0.0.1:8083;
    server 127.0.0.1:8084;
}
The ip_hash option is needed for multiple zeoclients; more information can be found in this issue. The IPs and ports have to match the settings for the zeoclients in the parts [bindips] and [ports].
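For illustration only (the real option names are defined in linkto/base.cfg and may differ), the zeoclient IPs and ports could be set like this, matching the three upstream servers above:

```ini
[bindips]
# hypothetical option names; check linkto/base.cfg for the real ones
zeoclient1 = 127.0.0.1
zeoclient2 = 127.0.0.1
zeoclient3 = 127.0.0.1

[ports]
zeoclient1 = 8082
zeoclient2 = 8083
zeoclient3 = 8084
```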
The generated Varnish config works with a single vhost; for multiple sites/domains we need a custom Varnish config. This configuration is not yet built into the buildout script/template, so we need to make the changes manually in a copy of a Varnish config file (just copy the varnish4.vcl over to /etc/varnish and include it in default.vcl).
In varnish4.vcl we need to add the additional backend; note the different loadbalancer port.
backend backend_001 {
    .host = "localhost";
    .port = "8091";
    .connect_timeout = 0.4s;
    .first_byte_timeout = 300s;
    .between_bytes_timeout = 60s;
}
backend backend_002 {
    .host = "localhost";
    .port = "8081";
    .connect_timeout = 0.4s;
    .first_byte_timeout = 300s;
    .between_bytes_timeout = 60s;
}

(Note that VCL identifiers must not start with a digit, so the backends are named backend_001 and backend_002, matching the generated backend_000.)

In sub vcl_recv we remove the default backend assignment (set req.backend_hint = backend_000;) and add this switch:

if (req.http.host == "my_host") {
    set req.backend_hint = backend_001;
}
else {
    set req.backend_hint = backend_002;
}
This does the vhost routing to the different backends. "my_host" is the hostname under which the site is requested (see the config of the demo.plone.de project). The Varnish config can be tested with this command: varnishd -C -f /etc/varnish/default.vcl
If you need to build Varnish yourself (e.g. because your system does not ship with version 4), you need to add varnish-build:
# comment out what you need
parts +=
[...]
${buildout:varnish}
varnish-build
[...]
[varnish-build]
recipe = plone.recipe.varnish:build
url = https://varnish-cache.org/_downloads/varnish-4.0.5.tgz
varnish_version = 4.0
The varnish4 part generates a start script; this can be used together with supervisord.
Create a copy of local_production.cfg called local_test.cfg and modify it according to your needs.
Warning
If test runs on the same server as production, you need a different name for the project on test; otherwise one will overwrite the database of the other. Because of this the name of the project must not be set in buildout.cfg but in the local_*.cfg files.
packages
- All eggs of your buildout will be symlinked to in parts/packages.

zopepy
- Run ./bin/zopepy to get a Python prompt with all eggs of your buildout on its python path.

checkversions
- Run ./bin/checkversions floating_versions_project.cfg to check if your pinned eggs are up to date.

codeintel
- This part uses corneti.recipes.codeintel to prepare for codeintel integration (useful for users of Sublime Text).

stacktrace
- The part stacktrace-script adds a bash script ./bin/stack.sh that prints the current stacktrace to stdout. Useful to find out what Plone is doing when it's busy. This was removed in 5.2.2 because it only works with ZServer (i.e. in Python 2). Use https://pypi.org/project/py-spy/ instead.
pre-commit
- This installs a pre-commit hook that runs several code-analysis checks, including black.

mrbob
- This part adds bobtemplates.plone to simplify the creation of new addons.

test
- Run tests for your test eggs.

coverage-test
- Generate coverage reports for your test eggs in parts/test/.
Restrict loaded languages
- By default only German ('de') is loaded on startup. In your buildout.cfg you can override the loaded languages using language = de en fr. This setting also affects the languages used in the i18nize-xxx part (see http://maurits.vanrees.org/weblog/archive/2010/10/i18n-plone-4#restrict-the-loaded-languages).

i18nize-diff
- Show the differences of the po files against what is currently in git. This script uses podiff, which filters out a lot of noise like creation dates and line numbers, so the output is much more usable. Use this script in Jenkins together with i18nize-all to make sure that your po files are up to date.

i18nize-xxx
- Modify the commented-out part i18nize-xxx to get a script that runs i18ndude for an egg. Here is an example for the egg dynajet.site adding a script ./bin/i18nize-site:

  [i18nize-site]
  recipe = collective.recipe.template
  input = ${buildout:directory}/i18nize.in
  output = ${buildout:bin-directory}/i18nize-site
  mode = 775
  dollar = $
  domain = dynajet.site
  packagepath = ${buildout:directory}/src/dynajet.site/src/dynajet/site
  languages = ${buildout:languages}
i18nize-all
- This runs all i18nize commands for a package.

Setup for gitlab-ci and jenkins
- Configure your CI system to run the script ./bootstrap_ci.sh. This will configure and run the whole buildout.

monitoring
- Change the settings for maxram to have memmon restart an instance when it uses too much memory.

Sentry logging
- Configure the zeoclients to send tracebacks to Sentry in local_production.cfg by uncommenting the section and adding a DSN. You also need to enable the egg raven. Repeat for each zeoclient.
This buildout automatically includes the correct hotfixes for the version of Plone you use. E.g. the extends file for Plone 5.0.6 (https://raw.githubusercontent.com/starzel/buildout/5.0.6/linkto/base.cfg) pulls in the file https://raw.githubusercontent.com/starzel/buildout/master/linkto/hotfixes/5.0.6.cfg, which in turn contains the pins and eggs for all hotfixes for that version.
By having the hotfixes files in the master branch we can easily update the hotfixes for each version without having to move any tags. The same day a hotfix is published, the corresponding extends files will be updated. You simply have to rerun buildout and restart your site to include them.
local.cfg and secret.cfg must never be versioned. The file .gitignore in this buildout already prevents this.
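The relevant lines of the shipped .gitignore are simply the two filenames (a sketch; the actual file contains more entries):

```gitignore
local.cfg
secret.cfg
```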
It might feel weird that buildout.cfg loads local.cfg, but this avoids some weird behavior of buildout's extends feature.
You can have a different supervisor configuration for test servers by adding a file templates/supervisord-test.conf and referencing it in local_test.cfg:
[supervisor-conf]
input = ${buildout:directory}/templates/supervisord-test.conf