
Goals and metrics


What does success look like for NRRD?

Goals and metrics

Access/visibility

1. The site makes DOI public data available at the right scope and depth to reduce the number of data requests that Data Retrieval or DOI public affairs field for this data. (Business/user experience)

  • Number of Data Retrieval requests
  • Number of FOIA requests

2. Users come to the site when looking for federal or Native American energy and mineral resources and revenue information.

  • Number of NRRD users
  • Entry and exit pages (in addition to user counts)

3. Zero Section 508 (accessibility) errors (Technical/user experience)

User experience/customer service

4. Users can find the data or information on the site that they are looking for. (User experience)

  • Usability testing/user interviews
  • General user sentiment about the website
  • Path analysis - referrals from target audiences
  • Analysis of NRRD search terms
  • Number of emails from confused users to Data Display or Data Retrieval
  • Ask on page "Did you find what you were looking for?"
  • Screener for user research

5. Users can understand the data and information presented on the site. (Comprehension)

  • Target reading grade level: grade 10

Timely/efficient business and technical processes

6. We have efficient, documented, and replicable processes to update and maintain data and content on the website. (Business/technical)

  • Amount of Data Display/Data Retrieval time used to update data and perform routine maintenance on the site
    • Number of issues closed related to bugs vs. maintenance/data updates vs. enhancements
    • Time Data Retrieval spends compiling data for us
  • Number of updates made on schedule (timeline from when data is available to when it appears on the website)
  • Ability for more than one member of the NRRD team to update data (with ONRR-furnished equipment)
  • Number of manual steps vs. automated steps in update process
  • Amount of manually curated content vs. data-driven content
  • Load time (Technical/user experience)

What does success look like for our blog?

The purpose of our innovation design blog is to share our user-centered design and development process at the Office of Natural Resources Revenue. Strategically, the blog exists to:

1. Provide a venue for long-form publishing about our process, decision making, and workflow

  • Metric: publish at least 2 blog posts per month
  • Metric: each member of the Data Display team has written or contributed to at least one blog post each year, providing broad subject-matter relevance and team representation

2. Write compelling content that has demonstrable value to others

  • Metric: average at least 200 unique pageviews each month (for all blog posts)
  • Metric: observe an average time on page of at least 2 minutes each month (for all blog posts)

3. Nurture and support the research and/or adoption of our process, or elements of our process, in other federal agencies

  • Metric: we observe referrals from at least 3 other agencies linking to one of our posts in a given year.
  • Metric: we receive some form of feedback, including presentation or collaboration requests, from at least 1 other agency every six months.

4. Introduce a tool for ONRR staff collaboration and sharing

  • Metric: at least 2 ONRR staff from teams other than Data Display write or contribute to a blog post each year.