
What do DORA metrics tell us?


If you have been around software engineering in the last few years, you have almost certainly come across terms like Deployment Frequency, Mean Time to Restore, Change Failure Rate, and Lead Time for Changes - if not the exact words, then some derivative of them. These four came from the DevOps Research and Assessment (DORA) initiative - a collaboration between Dr. Nicole Forsgren, Jez Humble, and Gene Kim that started in 2014 and culminated in 2018 with the book "Accelerate". In this post, we will go through the purpose of DORA and why it was path-breaking.


A Brief History

The goal of their research was to identify the key performance indicators (KPIs) that drive successful software delivery and organizational performance. Their work was heavily influenced by the growing DevOps movement, which aimed to bridge the gap between development and operations teams and improve the efficiency of software delivery.

In 2014, Dr. Forsgren, along with Jez Humble and Gene Kim, began working on the State of DevOps Report, an annual survey of the software development industry. The report gathered data from thousands of professionals worldwide, helping the researchers identify trends and best practices in the DevOps space.

The DORA metrics emerged from the analysis of this data and were first introduced in the 2015 State of DevOps Report. The four key metrics – Deployment Frequency, Lead Time for Changes, Mean Time to Restore (MTTR), and Change Failure Rate (CFR) – were found to be strong predictors of software delivery performance and overall organizational success. These metrics were chosen because they offered a balanced view of the speed and stability of software delivery processes.

In 2018, Dr. Forsgren, Jez Humble, and Gene Kim co-authored the book "Accelerate: The Science of Lean Software and DevOps: Building and Scaling High Performing Technology Organizations," which provided a comprehensive overview of their research findings and solidified the importance of DORA metrics in the DevOps community.

Google acquired DORA shortly after, and the team behind the State of DevOps Report joined Google Cloud. Since then, DORA metrics have become widely recognized and adopted as key performance indicators in the software development industry. They have been integrated into various tools and platforms, helping organizations measure and improve their software delivery performance.


The Significance

As mentioned earlier, DevOps had truly become a movement in the software community. People were beginning to see the advantage of automating all deployment-related work to reduce manual errors and save a great deal of time. For those of us who had been around the block well before that, it felt as though all the time spent manually deploying software in years past had been a criminal waste.

At the same time, mobile and cloud suddenly made software ubiquitous! Deployment processes, tooling, and monitoring became more sophisticated, and DevOps emerged as a new specialization. The DORA initiative took a deep dive into this world, giving the field the much-needed research it richly deserved.

The research offered a fascinating, data-driven view of what successful organizations do when developing and releasing their software. The report was full of great insights, but its most significant contribution lies in the way it translated engineering metrics into a framework for senior management.

It is uncommon for senior leaders in large organizations to take a deep dive into the nitty-gritty of engineering. Even when they have the background and expertise, they rarely have the luxury of time to get into the weeds. DORA condensed many different signals from across the engineering process into just a few key metrics. By tracking these, leaders can keep a pulse on the various aspects of a team's output.


DORA Metrics for leaders

Leaders can use DORA metrics as early indicators to plan, budget, and take corrective actions to improve throughput or performance. Moreover, the metrics create a shared vocabulary that everyone across the organization can use in place of technical or business jargon.

Here are some examples (a small computation sketch follows the list):

  1. Speed of delivery: Deployment Frequency and Lead Time for Changes offer insights into the speed at which a team is delivering new features, updates, or bug fixes. Higher Deployment Frequency and shorter Lead Times indicate that the team is able to deliver value to customers quickly and respond to market changes more effectively.

  2. Stability and reliability: MTTR and CFR help leaders understand the stability and reliability of their software delivery process. A low CFR shows that the team is producing high-quality, reliable code with fewer incidents requiring immediate attention. A short MTTR means that the team is effective at resolving issues quickly, minimizing downtime and customer impact.

  3. Continuous improvement: By tracking DORA metrics over time, leaders can identify trends and areas for improvement in their software delivery process. They can use this information to guide investments in tooling, training, or process improvements that will have the most significant impact on delivery performance.

  4. DevOps maturity: High-performing teams with strong DevOps practices typically exhibit better DORA metrics. By benchmarking their teams against industry standards or other organizations, leaders can gauge their DevOps maturity and identify areas where they can further improve their practices.
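To make these concrete, here is a minimal sketch in Python of how the four metrics can be computed from raw delivery events. The data and field layout are purely illustrative assumptions; in a real setup these events would come from your CI/CD pipeline and incident tracker, and definitions (e.g., what counts as a "failure") would need to match your own conventions.

from datetime import datetime, timedelta
from statistics import median

# Hypothetical delivery events; real ones would come from CI/CD and incident tooling.
deployments = [
    # (commit_time, deploy_time, caused_failure)
    (datetime(2024, 3, 1, 9, 0),  datetime(2024, 3, 1, 15, 0), False),
    (datetime(2024, 3, 3, 10, 0), datetime(2024, 3, 4, 11, 0), True),
    (datetime(2024, 3, 6, 8, 0),  datetime(2024, 3, 6, 9, 30), False),
]
incidents = [
    # (failure_start, service_restored)
    (datetime(2024, 3, 4, 11, 0), datetime(2024, 3, 4, 13, 0)),
]
period_days = 7

# 1. Deployment Frequency: deployments per day over the observed period
deployment_frequency = len(deployments) / period_days

# 2. Lead Time for Changes: median time from commit to running in production
lead_time = median(deploy - commit for commit, deploy, _ in deployments)

# 3. Mean Time to Restore: average time from failure to restoration
mttr = sum((restored - start for start, restored in incidents), timedelta()) / len(incidents)

# 4. Change Failure Rate: share of deployments that caused a failure
cfr = sum(failed for *_, failed in deployments) / len(deployments)

print(f"Deployment Frequency: {deployment_frequency:.2f}/day")
print(f"Lead Time for Changes: {lead_time}")
print(f"MTTR: {mttr}")
print(f"Change Failure Rate: {cfr:.0%}")

Even this toy version shows why the metrics pair up the way they do: Deployment Frequency and Lead Time capture speed, while MTTR and CFR capture stability.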

In addition, DORA has made a conscious effort to identify the cause-and-effect relationships between different parts of the software development lifecycle and eventual outcomes such as organizational performance and employee burnout.

The DORA report is published annually with the latest research results.

But... DORA metrics are not enough!

In the last few years, DORA has gained a large following, and for good reason. Today there are several tools that show you DORA metrics with great visualizations. Great engineering leaders, however, understand that metrics alone are not enough; DORA metrics should seldom be used on their own. Here are some limitations to keep in mind:

  1. Incomplete picture of software quality: DORA metrics focus primarily on delivery speed and stability. While these are important aspects of software development, they do not account for other crucial elements like code quality, security, maintainability, and user experience. A team could have high DORA scores but still produce software with poor overall quality, leading to user dissatisfaction or security vulnerabilities.

  2. Possible gaming of metrics: DORA metrics are susceptible to manipulation or "gaming": as Goodhart's Law puts it, when a measure becomes a target, it ceases to be a good measure. Teams may prioritize short-term gains in the metrics at the expense of long-term software quality or sustainability. For example, they may rush deployments to inflate Deployment Frequency or artificially shorten Lead Times by cutting corners in the development process, leading to increased technical debt and trouble down the road.

  3. Lack of context: DORA metrics are most useful when considered alongside other contextual factors, such as team size, project complexity, and technology stack. Comparing raw DORA metrics across different teams or organizations without considering these factors can lead to misleading conclusions. Moreover, DORA metrics may not be directly applicable to certain types of projects, like those with long release cycles or strict regulatory requirements.

Despite these issues, DORA metrics can still be valuable for identifying trends and areas for improvement within a software development team. It is essential to use them as a starting point rather than a be-all and end-all, and to consider additional factors and metrics that provide a more comprehensive view of software quality and team performance.

An organization doesn't necessarily have to aim for the "Elite" category. It is the engineering leader's job to glean the most important areas to improve upon.


We at Praximax are working on offering an unvarnished, clear view of metrics across the organization. Instead of fancy charts and confusing tables, we surface the most relevant, actionable insights for engineering leaders to act upon. Leaders should spend their time and effort finding solutions, not digging through heaps of data.

Interested in knowing more? Apply for early access today.



