Correspondence Tools - Staff

An application to allow internal staff users to answer correspondence.

Development

Working on the Code

Work should be branched from, and PRed to, the main branch. We use the GitHub PR approval process: once your PR is ready you'll need one approval and passing CI tests before it can be merged.

Basic Setup

Cloning This Repository

Clone this repository, then cd into the new directory:

$ git clone git@github.com:ministryofjustice/correspondence_tool_staff.git
$ cd correspondence_tool_staff

Installing the app for development

Latest Version of Ruby

If you don't have rbenv already installed, install it as follows:

$ brew install rbenv ruby-build
$ rbenv init

Follow the instructions printed out from the rbenv init command and update your ~/.bash_profile or equivalent file accordingly, then start a new terminal and navigate to the repo directory.

Use rbenv to install the version of Ruby defined in .ruby-version (make sure you are in the repo directory):

$ rbenv install

Dependencies

Node.js

$ brew install node

Yarn

$ brew install yarn

PostgreSQL

$ brew install postgresql

Setup

Use the following commands to install gems and JavaScript packages, then create the database:

$ bin/setup
$ bin/yarn install

Seeds

Seeds can be loaded into the database via a rake task. The user accounts' password is set with the DEV_PASSWORD env var.

$ DEV_PASSWORD=correspondence bin/rake db:seed:dev

Running locally:

To just run the web server without any background jobs (usually sufficient):

$ bin/rails server

If you need any of the background jobs running then start with:

$ bin/dev

The site will be accessible at http://localhost:3000. You can log in as one of the users created during the seeding process, such as correspondence-staff-dev+brian.rix@digital.justice.gov.uk or correspondence-staff-dev+david.attenborough@digital.justice.gov.uk, with the password you set in DEV_PASSWORD.

Sidekiq

When the server is running, you can view the Sidekiq queues by going to http://localhost:3000/sidekiq. This path can also be used on the live site when you are logged in as an admin.

Testing

This project can produce code coverage data (excluding JS and views) using the simplecov gem. Set COVERAGE=1 (or any value) to generate a coverage report. Parallel tests are supposed to be supported; however, the coverage output from simplecov is a little strange (the total line count for the project differs between coverage runs).
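
For example, to run the whole suite with coverage enabled (any non-empty value for COVERAGE will do):

$ COVERAGE=1 bin/rspec

simplecov normally writes its HTML report to coverage/index.html.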

This project includes the parallel_tests gem, which uses multiple CPU cores during testing to speed up execution; without it, running the tests takes an unacceptably long time.

The default parallelism is 8 (override it by setting PARALLEL_TEST_PROCESSORS; see the example after the commands below), which seems to be about right for a typical MacBook Pro (10,1: single processor with 4 cores).

To set up parallel testing

Create the required number of extra test databases:

rails parallel:create

Load the schema into all of the extra test databases:

rails parallel:load_structure

To run all the tests in parallel:

rails parallel:spec

To run only feature tests in parallel:

rails parallel:spec:features

To run only the non-feature tests in parallel:

rails parallel:spec:non_features
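
For example, to override the number of processes for a full run (parallel_tests reads PARALLEL_TEST_PROCESSORS from the environment):

PARALLEL_TEST_PROCESSORS=4 rails parallel:spec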

Browser testing

We use chromedriver for Capybara tests that require JavaScript; it is managed by selenium-webdriver.

If you have an old version of chromedriver on your PATH this may cause issues, so remove it from your PATH or uninstall it.
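
To check whether an old chromedriver binary is lurking on your PATH:

$ which -a chromedriver

Remove or uninstall anything this reports; selenium-webdriver manages its own copy as described above.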

Where we don't need JavaScript to test a feature, we use Capybara's default driver, RackTest, which is Ruby-based and much faster as it does not require a server to be started.

Debugging:

To debug a spec that requires JavaScript, you need to set an environment variable called CHROME_DEBUG. It can be set to any value you like.

Example:

$ CHROME_DEBUG=1 bin/rspec

When CHROME_DEBUG is set, you should see Chrome start up and appear in your taskbar. You can then click on Chrome and watch it run through your tests. If you have a debugger statement in your tests, the browser will pause at that point.

Emails

Emails are sent using the GOV.UK Notify service. Configuration relies on an API key which is not stored with the project, as even the test API key can be used to access account information. For local testing you need an account that is attached to the "Track a query" service, and a "Team and whitelist" API key generated from the GOV.UK Notify website. See the instructions in the .env.example file for how to set up the correct environment variable to override the govuk_notify_api_key setting.
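
As a sketch, the key can be supplied using the same settings-override pattern shown below for the email URL; the exact variable name is documented in .env.example, so treat the name used here as an assumption:

$ export SETTINGS__GOVUK_NOTIFY_API_KEY=your-team-and-whitelist-key # assumed name - check .env.example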

The URLs generated in the emails use the cts_email_url and cts_email_port configuration variables from settings.yml. These can be overridden by setting the appropriate environment variables, e.g.

$ export SETTINGS__CTS_EMAIL_URL=localhost
$ export SETTINGS__CTS_EMAIL_PORT=5000

Devise OmniAuth - Azure Active Directory

In addition to signing in with email and password, there is an integration with Azure Active Directory through Devise OmniAuth.

For this to work on your local machine, you will need to set three environment variables. See the instructions in the .env.example file.

A colleague can provide the values to you. Usually the tenant and client will be the same for all local/dev environments, but the secret should be unique to your machine, as this makes it easier to revoke in case of a leak.
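
As an illustration only, using hypothetical variable names (the real names are listed in .env.example):

$ export AZURE_TENANT_ID=your-tenant-id         # hypothetical name - see .env.example
$ export AZURE_CLIENT_ID=your-client-id         # hypothetical name - see .env.example
$ export AZURE_CLIENT_SECRET=your-client-secret # hypothetical name - see .env.example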

This feature can be enabled/disabled through the enabled_features mechanism configured in config/settings.yml.

Uploads

Responses and other case attachments are uploaded directly to S3 before being submitted to the application to be added to the case. Each deployed environment has the permissions it needs to access the uploads bucket for that environment.

In local development, uploads are made to the local filesystem.

Dumping the database

We have functionality to create an anonymised copy of the production or staging database. This feature should only be used as a very last resort. If a copy of the database is needed for debugging, please consider the following options first:

  • seeing if the issue is covered in the feature tests
  • trying to track the issue through Kibana
  • recreating the issue locally

If the options above do not solve the issue, you can create an anonymised dump of the database by connecting to a pod and running:

rake db:dump:delete_s3_dumps[latest,false] && rake db:dump:local

By default this will upload the dump file to an S3 bucket, from which it can be loaded into an environment by connecting and running:

rake db:dump:copy_s3_dumps && rake db:restore:local[,false,]

For more help with the data dump tasks run:

rake db:dump:help

Papertrail

The papertrail gem is used as an auditing tool, keeping old copies of records every time they are changed. There are a couple of complexities in using this tool, which are described below.

JSONB fields on the database

The default serializer does not de-serialize the properties column correctly because internally it is held as JSON, and papertrail serializes the object in YAML. The custom serializer CTSPapertrailSerializer takes care of this and reconstitutes the JSON fields correctly. See /spec/lib/papertrail_spec.rb for examples of how to reify a previous version, or get a hash of field values for the previous version.
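
To see those examples in action, the spec can be run on its own (using bin/rspec as elsewhere in this README):

$ bin/rspec spec/lib/papertrail_spec.rb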

Data Migrations

The app uses the rails-data-migrations gem (https://github.com/anjlab/rails-data-migrations).

Data migrations work like regular migrations but for data; they're found in db/data_migrations.

To create a data migration you need to run:

rails generate data_migration migration_name

and this will create a migration_name.rb file in the db/data_migrations folder with the following content:

class MigrationName < DataMigration
  def up
    # put your code here
  end
end

Finally, at release time, you need to run:

rake data:migrate

This will run all pending data migrations and store the migration history in the data_migrations table.

Letter templates and synchronising data

The app has templated correspondence for generating case-related letters for the Offender SAR case type.

The template body for each letter is maintained in the letter_templates table in the database, and populated from information in the /db/seeders/letter_template_seeder.rb script.

Whenever changes to the letter templates are required, DO NOT EDIT THE DATABASE; instead, amend the seeder and then, on each environment, run rails db:seed:dev:letter_templates to delete and re-populate the table.

This is required whenever any new template is added; should someone have edited the versions in the database directly, those changes will be overwritten the next time the seeder is run.
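
The typical workflow is therefore to amend the seeder locally, then run the seed task on each environment:

$ rails db:seed:dev:letter_templates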

Site prism page manifest file

The tests use the Site Prism gem to manage page objects, which act as an abstract description of the pages in the application; they're used in feature tests for finding elements, describing the URL path for a given page, and defining useful methods, e.g. for completing particular form fields on the page in question.

If you add new Site Prism page objects, it's easy to follow the existing structure. However, there is one gotcha: in order to refer to them in your tests, you also need to add the new objects to a manifest file, which maps an instantiated object to the new page object class you've defined.

See spec/site_prism/page_objects/pages/application.rb

Localisation keys checking

As part of the test suite, we check whether any translation keys are missing from the localised YAML files.

There is also a command-line tool to check for these manually, i18n-tasks missing; you can see example output from it below.

$ i18n-tasks missing
Missing translations (1) | i18n-tasks v0.9.29
+--------+------------------------------------+--------------------------------------------------+
| Locale | Key                                | Value in other locales or source                 |
+--------+------------------------------------+--------------------------------------------------+
|  all   | offender_sars.case_details.heading | app/views/offender_sars/case_details.html.slim:5 |
+--------+------------------------------------+--------------------------------------------------+

...fixing happens...

$ i18n-tasks missing
✓ Good job! No translations are missing.
$

There's also a similar task called i18n-tasks unused

$ i18n-tasks unused
Unused keys (1) | i18n-tasks v0.9.29
+--------+-----------------------+---------------+
| Locale | Key                   | Value         |
+--------+-----------------------+---------------+
|   en   | steps.new.sub_heading | Create a case |
+--------+-----------------------+---------------+
$ i18n-tasks unused
✓ Well done! Every translation is in use.
$

Keeping secrets and sensitive information secure

There should be absolutely no secure credentials committed to this repo. Information about secret management can be found in the related Confluence pages.

Case Journey

  1. unassigned A new case entered by a DACU user is created in this state. It is in this state very briefly before the user assigns it to a team on the next screen.

  2. awaiting_responder The new case has been assigned to a business unit for response.

  3. drafting A kilo in the responding business unit has accepted the case.

  4. pending_dacu_clearance For cases that have an approver assignment with DACU Disclosure, as soon as a response file is uploaded the case will transition to pending_dacu_clearance. The DACU Disclosure team can either clear the case, in which case it goes forward to awaiting_dispatch, or request changes, in which case it goes back to drafting.

  5. awaiting_dispatch The kilo has uploaded at least one response document.

  6. responded The kilo has marked the response as sent.

  7. closed The kilo has marked the case as closed.

Exceptions

Any exceptions raised in any deployed environment will be sent to Sentry.