This repository was archived by the owner on Jan 23, 2024. It is now read-only.

DORA metrics for teams working on microservice-architecture-enabled systems #472

@gintautassulskus-elsevier

Description

Hi, I am unsure where best to ask the question below, so I am posting it here.

Is it correct that the fourkeys implementation assumes a single-team-to-single-codebase setup?

How do you compute DORA metrics for teams working on microservice-architecture-enabled systems? These systems typically introduce a mesh of teams and codebases, and several factors can then skew the deployment frequency metric. First, a team's contributions may vary from one codebase to another, giving the impression that the team's throughput has decreased. Second, multiple teams contributing to the same codebase would inflate each other's deployment frequency figures.

Does the notion of teams have to be introduced into the computation for accuracy? Is there a simpler alternative to that?
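To make the idea concrete, here is a minimal sketch of what introducing teams into the computation might look like. It assumes a list of deployment events already annotated with the team that authored the deployed changes (that "team" attribute is my assumption for illustration; Four Keys itself does not record it) and counts distinct deploy days per team per ISO week:

```python
from collections import defaultdict
from datetime import date

# Hypothetical deployment events: (deploy date, codebase, authoring team).
# The "team" field is an assumption added for illustration; it is not part
# of the Four Keys deployments data.
deployments = [
    (date(2023, 5, 1), "svc-orders", "team-a"),
    (date(2023, 5, 1), "svc-orders", "team-b"),
    (date(2023, 5, 3), "svc-billing", "team-a"),
    (date(2023, 5, 8), "svc-orders", "team-a"),
]

def weekly_deploy_frequency(events):
    """Count distinct deploy days per (team, ISO week)."""
    buckets = defaultdict(set)
    for day, _repo, team in events:
        buckets[(team, day.isocalendar()[1])].add(day)
    return {key: len(days) for key, days in buckets.items()}

freq = weekly_deploy_frequency(deployments)
```

Grouping by team rather than by repository means a team deploying to several codebases is counted once per deploy day, and co-located teams no longer pollute each other's numbers.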

One suggested approach is to track "issue" completion frequency at the issue tracker level, where the definition of done is "in production". Issue trackers like Jira are well suited for this purpose because they organise items on team-level sprint boards. On the one hand, I am concerned that this tracks a fundamentally different metric - the frequency of requirements delivery, not of code deployment as per the DORA definition. On the other hand, such a metric would still correlate with engineering practice maturity and would surface deficiencies in it. What do you think?
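For comparison, the issue-tracker variant could be sketched as follows. The input shape is hypothetical (field names are assumptions, not a real Jira API); each record carries the date an issue reached "in production", and the count is bucketed per team per ISO week:

```python
from collections import Counter
from datetime import date

# Hypothetical issue-tracker export: (issue key, team, date the issue
# reached "in production"). Field names are assumptions for illustration.
done_issues = [
    ("PROJ-1", "team-a", date(2023, 5, 2)),
    ("PROJ-2", "team-a", date(2023, 5, 4)),
    ("PROJ-3", "team-b", date(2023, 5, 4)),
]

def weekly_completion_frequency(issues):
    """Count issues reaching 'in production' per (team, ISO week)."""
    return Counter(
        (team, done.isocalendar()[1]) for _key, team, done in issues
    )

freq = weekly_completion_frequency(done_issues)
```

The structure is identical to a per-team deployment count; the difference is purely in what the unit of delivery is (an issue versus a deploy), which is exactly the definitional concern raised above.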
