What do you consider an appropriate "Error Rate"? I'm working with a client's IT department to develop a framework to improve its web and mobile digital properties. A recurring topic is the metrics that the business areas (mainly marketing) measure; one of them is "Error Rate," in the context of a customer-facing web and mobile application. "Error Rate" can be a broad metric, but in this context it generally refers to the number of valid HTTP requests that return an error, divided by the total number of requests. The client's IT department wants to define an initial, achievable target for error rate, and they want to know whether there are any benchmarks or references for what an "acceptable" error rate range is.
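For reference, the metric as defined above can be sketched in a few lines. This is a minimal illustration, not a standard; the function name and the choice of which status codes count as "errors" (5xx only by default, 4xx optionally) are assumptions the business would need to pin down.

```python
def error_rate(status_codes, error_statuses=range(500, 600)):
    """Fraction of valid requests whose HTTP status counts as an error.

    status_codes: iterable of HTTP status codes from valid requests.
    error_statuses: which codes count as errors. 5xx by default;
        whether 4xx (e.g. 404) counts is a business decision.
    """
    codes = list(status_codes)
    if not codes:
        return 0.0
    errors = sum(1 for code in codes if code in error_statuses)
    return errors / len(codes)

# Example: 2 server errors (500, 503) out of 8 requests -> 0.25
rate = error_rate([200, 200, 301, 404, 500, 503, 200, 204])
```

Even this tiny sketch surfaces the definitional questions the thread keeps returning to: whether client errors count, and what denominator ("valid" requests) means.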

3.2k views · 1 Upvote · 8 Comments
Director of Operations2 years ago

Thanks for sharing your advice and experience. I understand that there is no benchmark or standard for an "acceptable error rate," and that the best approach depends on the specific property and the enterprise's goals. The enterprise can strive for a 0% error rate, or lower the bar if that aligns with its overall strategy. Another key point is that the initial target error rate should be aligned with the business, and both parties should agree on what is realistic, achievable, and sustainable.

CTO in Transportation2 years ago

It depends on how critical those properties are; even within the same application, certain areas could accept a higher error rate than others.
Another consideration is to understand the total (real) number of affected users.
0.5% can be totally acceptable in certain cases and very high in others.

C-PIO in Software2 years ago

An acceptable error rate is entirely dependent on the industry. 

Financial institutions require an error rate as close to zero as possible. 

Whereas casual blogs or personal sites may tolerate more. 

Ask the question of what type of error is permissible, then work backwards. 

No reasonable organization can tolerate errors, one-offs and system outages excepted. 

Volume of web traffic can cause the biggest issues, but these are scalable if planned correctly. Orphan pages or bad code are harder to trace but must be addressed. 

Aim for 100% error free. You should come close to that; accepting less only demeans your business. 

Global CIO in Consumer Goods2 years ago

Percentage of requests is not the right way to look at it, as it depends entirely on what the failure is. If you have one failure that exposes customer credit card information out of a million requests, that is utterly unacceptable. On the other hand, 10% of calls resulting in a failure to show the correctly updated time is probably OK.
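One way to capture this point, that not all failures weigh the same, is a severity-weighted score instead of a flat percentage. The error categories and weights below are purely hypothetical, for illustration only:

```python
# Hypothetical severity weights per error class; a real scheme would be
# agreed between IT and the business, not hard-coded like this.
SEVERITY = {
    "data_breach": 1000.0,    # e.g. exposed credit card info
    "payment_failure": 50.0,  # direct revenue impact
    "stale_content": 0.1,     # e.g. a wrong timestamp on a page
}

def weighted_error_score(errors, total_requests):
    """Severity-weighted error score per request, instead of a flat rate.

    errors: iterable of error-class labels observed in the period.
    total_requests: total valid requests in the same period.
    Unknown error classes default to weight 1.0.
    """
    if total_requests == 0:
        return 0.0
    return sum(SEVERITY.get(kind, 1.0) for kind in errors) / total_requests
```

Under this scheme, one data breach in a million requests scores far worse than thousands of stale-content glitches, matching the commenter's intuition.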

Sr. Director, Head of Global Omnichannel Capabilities Delivery Center in Manufacturing2 years ago

This is a hard one because not every website and mobile app is the same.  For validated/GxP applications, the acceptable error rate is 0% for us.  However, for more marketing/sales-related apps and sites (non-direct revenue impacting), I think the bar can be lower.  I typically manage to 99.9% uptime for these types of apps.  Again, every scenario is different, and the business needs some comfort level knowing that in a high-demand, high-change-request environment (which most sales/marketing applications are), there is room for error when strict change control practices are not followed the way they are with validated applications.  So the more agile and quick you are, the more errors are prone to happen.  In the end, both IT and the business need to agree on and be comfortable with the allowable error rate.
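As a rough illustration of what a target like 99.9% implies in practice (my arithmetic, not the commenter's), an availability target translates directly into a downtime or error budget:

```python
def allowed_downtime_minutes(availability, period_days=30):
    """Minutes of downtime permitted per period at a given availability target.

    availability: target as a fraction, e.g. 0.999 for "three nines".
    period_days: length of the measurement window (30-day month by default).
    """
    return (1 - availability) * period_days * 24 * 60

# 99.9% over a 30-day month -> about 43.2 minutes of allowed downtime
budget = allowed_downtime_minutes(0.999)
```

Framing the target as a concrete budget like this can make the IT/business negotiation the commenter describes much more tangible.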

