The Monthly Metric: Overall Equipment Effectiveness

December 18, 2020
By Dan Zeiger

Year 4 of The Monthly Metric closes with one of the more interesting analytics examined in this space: So much time can be spent discussing its shortcomings that one wonders if the benefits are a worthy trade-off.

That’s the nature of overall equipment effectiveness (OEE), which can be miscalculated, misinterpreted and — as a result — often misused. “It’s hyped. It’s abused. There’s lots of water to pour into the wine,” explains Simon Jacobson, vice president with the supply chain practice at Gartner, the Stamford, Connecticut-based global business research and advisory firm.

OEE, a key gauge in the total productive maintenance (TPM) manufacturing process and other continuous-improvement programs, is based on three rates: equipment availability, production performance and first-time quality of output. After those three rates are multiplied, the resulting percentage is designed to reveal the room for improvement in manufacturing efficiency. However, in recent years, Jacobson says, OEE has become a metric best used as part of a wider analytics dashboard.

“I wouldn’t say it’s polarizing,” Jacobson says. “It’s a measure of actual production to desired or ideal production rate, a concept that everyone can understand and a calculation that is widely accepted. It’s a useful measure of relative efficiency and whether an asset is running at the desired rate … but the moral of the story is that, while (the metric) has value, companies must use it wisely.”

Meaning of the Metric

OEE is calculated by multiplying (1) machine availability rate (as a percentage of total time), (2) production performance rate (as a percentage of target) and (3) quality rate (percentage of non-defective parts produced). For example, an availability rate of 90 percent, performance rate of 90 percent and quality rate of 98 percent generate an OEE of about 79 percent (0.90 × 0.90 × 0.98 ≈ 0.79).
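For readers who want to check the arithmetic themselves, the calculation fits in a few lines of Python. This is a minimal illustrative sketch, not code from any particular OEE tool; the function name and inputs are our own.

    def oee(availability: float, performance: float, quality: float) -> float:
        """Overall equipment effectiveness: the product of the three rates.

        Each argument is a fraction between 0 and 1 (e.g., 0.90 for 90 percent).
        """
        for name, rate in (("availability", availability),
                           ("performance", performance),
                           ("quality", quality)):
            if not 0.0 <= rate <= 1.0:
                raise ValueError(f"{name} must be between 0 and 1, got {rate}")
        return availability * performance * quality

    # The example from the text: 90% availability, 90% performance, 98% quality.
    print(f"OEE: {oee(0.90, 0.90, 0.98):.1%}")  # prints "OEE: 79.4%"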

It is a valuable metric for gauging site-specific processes, costs and efficiency, Jacobson says, and a closer examination of the three elements can reveal where a breakdown is occurring. But OEE fails to capture and account for many of today’s trade-offs, he says: “A lot of factors can be hidden. Offering a wider set of options to customers can shift emphasis from asset utilization in favor of inventory buffers. The focus on service levels and profitability can come at the expense of OEE.”

Though an OEE of 75 percent to 85 percent is generally considered world class, that range shouldn’t be treated as a universal benchmark, because every facility is different. Machinery age and maintenance practices can vary by location, impacting availability. A production process that is automated at one facility can be more manual at another, so comparing OEE across sites can be misleading. “Companies that try to make such an apples-to-apples comparison with OEE get nowhere,” Jacobson says.

When Jacobson discussed capacity utilization with The Monthly Metric last year, he advised companies looking to improve overall production efficiency not to overemphasize OEE. “A lot of organizations will gauge the overall health of their production based on OEE,” he said then. “(But) the reality is that OEE hides so much and is limited beyond the factory. I don’t think I’ve ever come across a company that has attributed quarterly or yearly profitability to OEE.”

Maximizing OEE, he says, involves three steps:

  • Understand where the metric delivers maximum value: measuring performance at the site or unit level.
  • If it must be used across sites, do so on an achievement-of-target basis (a brief illustration follows this list). Ensure that OEE calculations are standard and consistent.
  • Do not overemphasize OEE in isolation. Recognize its interdependencies with other measurements of manufacturing’s performance.
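As a hypothetical illustration of the achievement-of-target approach in the second step, the Python sketch below (with invented site names and numbers) compares two sites on how close each comes to its own OEE target rather than on raw OEE:

    # Hypothetical numbers: compare sites on attainment (actual / target)
    # rather than raw OEE, so differently equipped plants are judged
    # against their own realistic targets.
    sites = {
        "Plant A": {"actual": 0.72, "target": 0.75},  # older, more manual line
        "Plant B": {"actual": 0.80, "target": 0.88},  # newer, automated line
    }
    for name, s in sites.items():
        attainment = s["actual"] / s["target"]
        print(f"{name}: OEE {s['actual']:.0%}, attainment {attainment:.0%}")
    # Plant A: OEE 72%, attainment 96%
    # Plant B: OEE 80%, attainment 91%

Raw OEE favors Plant B, but the attainment view shows Plant A running closer to what its own equipment can realistically deliver.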

Case Study and Digital’s Impact

Because multiple factors go into the OEE calculation, the metric can be misinterpreted or gamed, Jacobson says: “Some companies have rules stipulating that prolonged downtime can be classified as ‘planned,’ which means unplanned downtime can be masked so OEE isn’t impacted.” That is one reason it’s important for OEE to be part of a broader suite of KPIs, he adds, using an analogy that has been made often in this space: baseball.
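To make that gaming mechanism concrete: availability is commonly computed as run time divided by planned production time, with planned downtime excluded from the denominator. The Python sketch below, with invented numbers, shows how reclassifying one hour of unplanned downtime as “planned” turns an 87.5-percent availability rate into a perfect score.

    SHIFT_MINUTES = 480   # one 8-hour shift (hypothetical)
    breakdown = 60        # a one-hour stoppage

    # Honest bookkeeping: the stoppage is unplanned downtime, so it counts
    # against availability (run time / planned production time).
    honest = (SHIFT_MINUTES - breakdown) / SHIFT_MINUTES

    # Gamed bookkeeping: the same hour is reclassified as "planned" downtime,
    # which removes it from planned production time entirely.
    gamed = (SHIFT_MINUTES - breakdown) / (SHIFT_MINUTES - breakdown)

    print(f"honest availability: {honest:.1%}")  # 87.5%
    print(f"gamed availability:  {gamed:.1%}")   # 100.0%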

“Organizations need to have a more balanced portfolio of KPIs,” Jacobson says. “It needs to be treated like a baseball game. OEE is a batting average (hits divided by number of at-bats), but it doesn’t tell the whole story. A player could get walked a lot, which means a higher on-base percentage (OBP), or drive in a lot of runs (RBIs), which means he’s responsible for a lot of scoring. Neither a high OBP nor a lot of RBIs translates to a high batting average.”

Metrics to use in concert with OEE include capacity utilization, cycle time, inventory turnover ratio and on-time delivery.

One Tech, a Dallas-based artificial intelligence (AI)-driven technology and software company, announced it has helped automaker clients achieve a 15-percent improvement in assembly-line OEE. While not referring to a specific case study, Jacobson says an accurate OEE measurement might take time. “If OEE goes up X percent in three months, that might be a reflection of the newness of a system, or that the data might be measured (accurately or differently),” he says. “See what happens in six months, when things will likely level off.”

Finally, the coronavirus (COVID-19) pandemic has not stopped companies from investing in digital technologies, Jacobson says, and that will benefit OEE evaluations.

“I have yet to see a company that has hit the pause button on smart manufacturing,” he says. “Companies are still going digital. Improved data access and analytics will lead to a better analysis of OEE and other measures, which will lead to better approaches to continuous improvement. We’re still far off from OEE-based machine learning algorithms, but that’s only more water in the wine with respect to tempering the hype.”

Have a Metric Christmas!

As has become custom around the holidays, The Monthly Metric counts its blessings: the experts who have shared analytics and insights with us. Due to COVID-19, this space didn’t present a new metric for three months, as entries were devoted to reviewing risk and inventory metrics that suddenly became especially critical. (Also, the pandemic made analytics more about survival than strategy at many organizations.)

However, Jacobson and this year’s other contributors formed a sizeable and powerful roster, and we thank them:

  • Lisa M. Ellram, Ph.D., MBA, C.P.M., Rees Distinguished Professor of Supply Chain Management at Miami University in Oxford, Ohio, and CAPS Research, the Tempe, Arizona-based program in strategic partnership with Arizona State University and Institute for Supply Management® (ISM®)
  • Jim Fleming, CPSM, CPSD, ISM Program Manager, Certification
  • Jim Hess, director of warehouse business development at Greenville, North Carolina-based Yale Materials Handling Corporation
  • Chris Jones, executive vice president, marketing and services at Descartes Systems Group in Waterloo, Ontario
  • Chris Sawchuk, principal and global procurement advisory practice leader for The Hackett Group, the Miami-based business consultancy
  • Tracey Smith, MBA, MAS, CPSM, president of Numerical Insights LLC, a boutique analytics firm in Charlotte, North Carolina.

This space is nothing without the wisdom of our guest experts and the support of our readers, who provide suggestions and social-media word of mouth. Happy holidays!

To suggest a metric to be covered in the future, email me at dzeiger@ismworld.org.

About the Author

Dan Zeiger is Senior Copy Editor/Writer for Inside Supply Management® magazine, covering topics, trends and issues relating to supply chain management.