Ken Muse

Modeling the Hidden Costs of Development


While it’s healthy to focus on the costs of the services and software you use, those aren’t your only expenses. Many companies make the mistake of modeling developers as a “sunk” cost. The salary is treated as a fixed expense that is irrelevant to the ongoing costs. They focus on the cost of tools and services while ignoring the underlying expenses behind them. As a manager, this never made sense to me. Would you want your developers spending 40 hours a week eating pizza, or would you consider that a waste of money and time?

If you’re trying to make a responsible business decision, it’s important not to ignore the actual costs of the decisions you’re making. Believe it or not, your development process has a cost. That includes the inefficient parts. This post will focus on some ways to model and expose these costs.

Understanding time

Before we start, it’s important to decide on your time basis – the number of hours in a typical work year. To keep my examples simple, I will assume 2,080 hours per year (52 weeks x 40 hours per week). This shows the best-case scenario with the lowest cost per hour. This value tends to underestimate the true expenses, since it treats work and paid time off as a single block of working time. As a result, the cost per hour is spread over more hours than are actually worked.

Because we don’t hire people expecting them to work on holidays, we can improve the time value. One common time basis is 1,920 hours. This assumes 20 days of PTO and holidays, which we remove from our earlier number: 2,080 - (20 days x 8 hr/day) = 1,920. It yields a slightly higher cost per hour by spreading the salary over the hours actually available for work. From an accounting perspective, this is more accurate since it reflects the idea that you’re paying for the days worked. If your company offers 10 holidays and 4 weeks of PTO (240 hours of paid time off), then a better number is 1,840. Similarly, 1,760 assumes 10 days of holidays and 6 weeks of PTO, sick days, and personal leave. This is more common with senior US staff.

Whatever number you choose, use it consistently in your calculations.

The next part of the math uses the employee salary. If you have a mixture of employees, you can use an average to approximate the costs. Dividing the salary by the selected time basis provides an hourly rate for the employee. We can then use this number to calculate costs.
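If you prefer to script the calculation, here is a minimal Python sketch of the hourly rate across a few common time bases (the $125,000 salary is just an example; substitute your own numbers):

```python
# Hourly rate = annual salary / time basis (hours per year).
def hourly_rate(annual_salary: float, hours_per_year: float) -> float:
    return annual_salary / hours_per_year

# Example: a $125,000 salary across the common time bases.
for basis in (2_080, 1_920, 1_840, 1_760):
    print(f"{basis} hrs/yr -> ${hourly_rate(125_000, basis):.2f}/hr")
# 2080 -> $60.10, 1920 -> $65.10, 1840 -> $67.93, 1760 -> $71.02
```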

One of the most common hidden expenses is lost time. Lost time is the portion of the work hours spent on activities other than creating new code and features (bringing value to customers). In The Developer Coefficient, Stripe determined that the average developer:

  • Works 41.1 hours each week
  • Spends 13.5 hours addressing issues related to technical debt (32.85%)
  • Spends 3.8 hours dealing with bad code (9.25%)

The total, 42.1%, represents the percentage of a developer’s cost that is consumed by bad code, technical debt, and rework. It does not include the reduced velocity for delivering new features. If you have a separate QA team that can pull developers from their current development cycle to repair code from a previous release before deployment, this number typically rises to 60-70%. A portion of that time is lost to context switching, while the rest is used for triaging, debugging, and reworking code. While you can use these numbers, I’d recommend polling your developers or asking them to track a typical week and report the numbers back to you. That will give you a clearer understanding of your actual costs.

Knowing these numbers, we can create a simple model using 42.1% lost time (0.421) and 2,080 hours per year:

Annual Salary | Hourly Rate | Number of Developers | Lost Time (Per Dev) | Cost Per Week
- | = Salary / HoursPerYear | - | = 40 x PercentLostTime | = LostTime x #Dev x Hourly
$125,000 | $60.10 | 10 | 16.84 | $10,120

This simple formula allows you to see the cost of the lost time in a typical week. In this particular example, the annual cost would be equivalent to hiring 4.2 developers! This also shows you the ongoing cost of not reducing the amount of lost time.
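Here’s the same model as a short Python sketch, using the numbers from the table (treat it as an illustration, not precise accounting):

```python
# Weekly cost of lost time = hours lost per developer x developers x hourly rate.
annual_salary = 125_000
hours_per_year = 2_080
developers = 10
percent_lost_time = 0.421                         # Stripe's 42.1%

hourly = annual_salary / hours_per_year           # ~$60.10/hr
lost_hours_per_dev = 40 * percent_lost_time       # 16.84 hrs/week
cost_per_week = lost_hours_per_dev * developers * hourly
print(f"${cost_per_week:,.0f} per week")          # ~$10,120
print(f"${cost_per_week * 52:,.0f} per year")     # ~$526,250 (~4.2 salaries)
```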

Improving the accuracy

There are two ways we can improve the model and make it more accurate. First, benefits and company-paid taxes should be added to the annual salary. This is called the “loaded salary” or “loaded cost” and represents your company’s actual cost of employing the developer. If you don’t know your loaded costs, they are typically 1.25 - 1.4 times the salary (per the SBA). PTO and vacation time are often added to the loaded rate. They are an expense the company takes on to retain the developer, so we add the cost of that benefit. Modeling our $125,000 developer, the loaded cost could be $125K x 1.4 = $175,000, or $84/hr. If we model this developer with 20 days of time off (1,920 hours), the rate becomes $91/hr.
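A quick sketch of the loaded-rate math, assuming the 1.4 load factor from the example:

```python
# Loaded salary = salary x load factor; loaded hourly = loaded salary / time basis.
salary = 125_000
load_factor = 1.4                                  # SBA's upper estimate
loaded_salary = salary * load_factor               # $175,000
print(f"${loaded_salary / 2_080:.2f}/hr at 2,080 hours")   # ~$84.13
print(f"${loaded_salary / 1_920:.2f}/hr at 1,920 hours")   # ~$91.15
```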

The second improvement is to realize that hours over 40 have diminishing returns and indicate inefficiencies. To model this added inefficiency, add those hours directly to the lost time. For example, 50 working hours with 42.1% lost time would be modeled as (40 x .421) + 10. Modeling this time shows the costs related to supporting the current situation.

As an example, assume a load factor of 1.4, 50 hours per week, 1,920 hours per year, 42.1% lost time, and PTO and vacation modeled as $0 (assumed to be included in the load factor):

Salary | Loaded Salary | Hourly Rate | # of Devs | Lost Time (Per Dev) | Cost Per Week
- | = Salary x LoadFactor | = LoadedSalary / HoursPerYear | - | = (40 x PercentLostTime) + Overtime | = LostTime x #Dev x Hourly
$125,000 | $175,000 | $91.15 | 10 | 26.84 | $24,464

If that’s modeled with holidays as an overhead cost, the number increases even more!
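Putting both improvements together, a minimal Python sketch of the table above (holidays still assumed to be part of the load factor) looks like this:

```python
# Improved model: loaded hourly rate, with overtime hours added to lost time.
salary = 125_000
load_factor = 1.4
hours_per_year = 1_920
developers = 10
percent_lost_time = 0.421
hours_worked_per_week = 50                          # 10 hours of overtime

loaded_hourly = (salary * load_factor) / hours_per_year     # ~$91.15/hr
overtime = hours_worked_per_week - 40
lost_hours = (40 * percent_lost_time) + overtime            # 26.84 hrs/week
cost_per_week = lost_hours * developers * loaded_hourly
print(f"${cost_per_week:,.0f} per week")                    # ~$24,464
```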

Deployment Costs

To understand deployment costs, we need to consider the items that contribute to them:

  • How frequently do you release per year?
  • What’s the lead time required to verify and release a new deployment to production after code-complete (and how many people are needed)?
  • How much time is required to deploy the application?
  • How much time is spent in a typical deployment to remediate issues?

With those answers, we can model those expenses based on the amount of time spent between the completion of the code and its deployment. We just multiply the number of people and the time by the hourly salary rate.

As an example, let’s assume a company completes the code every two weeks. The code is released 26 times per year. They have a QA team of 3 people that validates the system for 1 week, then deploys it. The process is automated, but that automation requires 1 hour to deploy and smoke-test, during which time the QA team must wait. The team then spends 1 day validating the release in production to ensure it is stable. They have also found that the QA team averages 8 hours of issue remediation per release, with a developer spending another 8 hours helping.

Assuming a QA team that averages $48/hr and a Dev at $60/hr:

Task | People | Hours | Total Cost
- | - | - | = People x Hours x Hourly
Initial validation | 3 | 40 | $5,760
Deployment wait | 3 | 1 | $144
Validation | 3 | 8 | $1,152
QA Remediation | 3 | 8 | $1,152
Dev Remediation | 1 | 8 | $480
Cost Per Release | - | - | $8,688
Annual Cost | 26 releases | - | $225,888
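The same per-release math as a Python sketch (the task names and rates mirror the example above):

```python
# Deployment cost = sum of (people x hours x hourly rate) for each task,
# multiplied by the number of releases per year.
QA_RATE, DEV_RATE = 48, 60
tasks = [
    # (task, people, hours, hourly rate)
    ("Initial validation", 3, 40, QA_RATE),
    ("Deployment wait",    3,  1, QA_RATE),
    ("Validation",         3,  8, QA_RATE),
    ("QA remediation",     3,  8, QA_RATE),
    ("Dev remediation",    1,  8, DEV_RATE),
]
cost_per_release = sum(people * hours * rate for _, people, hours, rate in tasks)
releases_per_year = 26
print(f"${cost_per_release:,} per release")                    # $8,688
print(f"${cost_per_release * releases_per_year:,} per year")   # $225,888
```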

If the team releases every 2 weeks using automated unit tests (and any required “smoke tests”) to replace the QA team, and reduces remediation to an average of 1 hour of developer time per cycle, the cost drops to $480 per release ($12,480 per year). This is part of why good automated testing and release processes are generally recommended over a dedicated QA team and manual release cycles!

In 2020, Harness published a survey indicating an average deployment cost of $1,450 ($109K per year), with an average deployment frequency of 4 days, 8 person-hours of lead time, 25 person-hours of production effort, and 1 person-hour for remediation. They assumed $56.81/hr ($100,000 salary, 1,760 working hours), with 75 production releases each year (and no waiting during the lead time). You can use these numbers to understand how your release process compares to the industry average.

Cost of Delay

In the model above, it takes just over 3 weeks for any planned change or new feature to go into production. The cycle requires 2 weeks of development (80 hours), 6 days of validation (48 hours), 1 hour for deployment, and an average of 8 hours to fix issues. That totals 137 hours per release. There are approximately 173 working hours per month (2,080 hours / 12 months).

This creates an associated cost of delay, since any new feature that is essential to gaining (or retaining) customers requires at least 3 weeks to be created and pushed to production. To model this, estimate the dollar value per month of the new customer or feature. The formula we need is this:

Cost of Delay = ExpectedRevenue x HoursPerRelease / HoursPerMonth

For example, assume a new feature will generate $25,000 per month in new revenue. It will need 3 release cycles to create an MVP, with 137 hours per release cycle.

Expected Revenue | Hours Per Release | Hours Per Month | Cost of Delay
$25,000 | 137 | 2,080 / 12 = 173 | $19,798

Each release cycle without that feature will cost nearly $20K. Over 26 release cycles, that’s over $500K.
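For reference, the cost of delay formula as a small Python sketch using the example numbers:

```python
# Cost of delay = expected monthly revenue x hours per release / working hours per month.
expected_revenue = 25_000          # new revenue per month from the feature
hours_per_release = 137
hours_per_month = 173              # 2,080 / 12, rounded

cost_of_delay = expected_revenue * hours_per_release / hours_per_month
print(f"${cost_of_delay:,.0f} per release cycle of delay")     # ~$19,798
```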

There’s an additional way to estimate cost of delay – the average time it takes a feature to move from being captured as a requirement to being released. Multiply the number of months it takes to release the feature by the additional monthly sales it enables. For example, a feature took 4 months to release and enabled $5,000 per month of new sales. As a result, it had a cost of delay of $20,000. The same calculation can be used if Sales reports that they lose $5,000 of business each month because of a missing feature.

There are more complex financial models that take into account peak profits and declining value over time. This simplified approach is often more than enough for most businesses. Once you know the cost of a delay (or an average cost), that number can be used until the time required for releases is changed.

Downtime cost

This is the cost of system issues that create downtime. Modeling it is simple:

  • Incident Cost = ( Number Of Engineers x Avg Time to Restore x Hourly Cost ) + Contract Costs
  • Annual Cost = Incident Cost x Incidents Per Year

For example, if a typical incident requires 4 developers working 8 hours to restore service at $60/hr, the average cost for an incident is $1,920. If you have 5 incidents per year, the annual cost is $9,600. If each incident also averages $5,000 in costs related to not meeting contractual obligations (or credits for the downtime), the cost jumps to $6,920 per incident and $34,600 per year. The delays these incidents cause in development and deployment cycles are already included in the earlier calculations.
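As a sketch, the downtime model with the example numbers:

```python
# Incident cost = (engineers x hours to restore x hourly rate) + contract costs.
# Annual cost = incident cost x incidents per year.
engineers = 4
hours_to_restore = 8
hourly_rate = 60
contract_costs = 5_000             # SLA credits or penalties per incident
incidents_per_year = 5

incident_cost = engineers * hours_to_restore * hourly_rate + contract_costs
print(f"${incident_cost:,} per incident")                      # $6,920
print(f"${incident_cost * incidents_per_year:,} per year")     # $34,600
```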

If you want to increase accuracy, add in customer attrition in the 3-6 months following an outage. The most impacted customers will usually leave in the first three months, while larger customers may need an additional quarter before they can leave.

Seeing the bigger picture

Through these various costs, you can see the expense of the current processes and approach. Understanding how to model them makes it easier for you to expose the true costs of your process. It will also help you to make more informed decisions about the development costs for your project.

Until next time … Happy DevOp’ing!