Benchmarking—The Measure of Success

September/October 1999

Benchmarking helps companies discover their strengths and weaknesses while pursuing an industry’s best practices.

By Robert L. Reid

Robert L. Reid is managing editor of Scrap.

Want to be the best?

“Of course,” you answer quickly.

But then the questions get more specific: The best at what? Compared to what or whom? And once you determine what it takes to be the best, how do you get there and stay there?

Benchmarking is one way to answer these questions.

Benchmarking, part of the total quality and continuous improvement movement, is a business tool that helps you identify your company’s strengths and weaknesses, then compare those results against the best practices in your industry or other industries. The process has been used extensively by Fortune 500 corporations such as Federal Express, Motorola Inc., and Digital Equipment Corp., but it’s also applicable to small, family-owned firms. The key requirement is a willingness to gather information about yourself and share it with others.

“Benchmarking is being smart enough to realize that you don’t do something as well as other people do and humble enough to go and learn how they do it,” explains Ron Webb, practice area manager for the American Productivity & Quality Center (APQC) (Houston), which operates both nonprofit and for-profit benchmarking efforts.

Evan Koplin, vice president of Macon Iron & Paper Stock Co. Inc. (Macon, Ga.), who has participated in small-group benchmarking studies, adds that benchmarking “can make you a better manager.” For instance, if you find that your cost of shearing scrap is 40 percent higher than the industry average, you can ask yourself, “What am I doing wrong?” and try to become more competitive. Conversely, if some other processor comes up with a radically different bid for material but you know, through benchmarking, that your bid is in line with the industry, “you can know that you’re right and stick by your guns,” Koplin explains.
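
The arithmetic behind Koplin’s example is simple. A minimal sketch in Python, using hypothetical dollar figures, shows how a firm might express its unit cost as a deviation from the industry-average benchmark:

```python
# A minimal sketch of comparing your unit cost with an industry-average
# benchmark. All figures are hypothetical, for illustration only.

def cost_gap_pct(your_cost_per_ton: float, industry_avg_per_ton: float) -> float:
    """Return your cost as a percentage above (+) or below (-) the average."""
    return (your_cost_per_ton - industry_avg_per_ton) / industry_avg_per_ton * 100

# e.g., shearing at $35/ton when the (made-up) industry average is $25/ton
print(f"Shearing cost vs. industry average: {cost_gap_pct(35.0, 25.0):+.0f}%")
# -> Shearing cost vs. industry average: +40%
```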

Identifying Best Practices

At their heart, all benchmarking efforts try to identify “best practices,” which APQC defines as “practices that have been shown to produce superior results; selected by a systematic process; and judged as exemplary, good, or successfully demonstrated.” But APQC reminds potential benchmarkers that “different missions, cultures, environments, and technologies” mean that best practices aren’t always “best” for everyone and must be adapted to fit each organization.

Thus, no single approach to benchmarking exists. Instead, benchmarking can be as informal as two people agreeing to visit each other’s facilities to study how certain operations are performed. Or it can be highly structured, using a lengthy, detailed survey administered by an outside consultant. Such surveys are usually “blind,” which means they promise anonymity to participants. But site visits by consultants and even face-to-face meetings between participants can also play a role in more in-depth benchmarking studies.

APQC also draws a distinction between benchmarks and benchmarking. A benchmark is a metric—a measure of performance—that tells you, for instance, how many tons of ferrous scrap you processed this month or how long it took your drivers to make their pickups. Benchmarking, on the other hand, involves “the actual how and why” of achieving those benchmarks, Webb says.

The first step, he adds, is to know your own processes, which means gathering data on your facility’s production rates, manpower needs, costs—whatever it is you want to benchmark. After all, it’s hard to learn anything from someone else’s experience if you aren’t sure what your own numbers are, Webb notes.

Next, you have to decide what you want to benchmark, says Mark Czarnecki, president of the Benchmarking Network Inc. (Houston), which conducts benchmarking studies for companies and trade associations. It can be a process unique to your industry or one that extends across many industries, such as billing, payroll, or customer service. A process that you use repeatedly is the easiest to benchmark, Czarnecki adds, but it should also be something critical to your company’s success—something that closely matches your organization’s goals or strategic plan.

At the Steel Service Center Institute (SSCI) (Cleveland), a series of benchmarking surveys—one large annual study and four smaller quarterly versions—collects income and performance information that enables SSCI members like Henry Oliner, president of both General Steel Co. (Macon, Ga.) and Macon Iron & Paper Stock, to compare data on their costs of operations, man-hours per ton, and sales, among other items. And while these SSCI surveys don’t explore the whys behind those numbers, they do give companies a general idea of where they stand within their industry, broken out by region, company size, and line of business, Oliner says. The surveys also focus on a select group of the most profitable steel service centers, so other firms can compare their numbers against those top performers.

But profitability is only part of the equation. Several experienced benchmarkers recommend taking a “holistic” approach to identifying best practices. After all, one company may process twice as much scrap as you do each day, but its approach is hardly the best if it also has twice your accident rate. Thus, companies must look at the area being benchmarked from multiple dimensions, Czarnecki explains.

Launching a Benchmarking Program

A good benchmarking program should include four key steps: planning, collecting, analyzing, and adapting, says APQC’s Webb.

During the planning phase, you need to develop the specific questions you want to ask other companies about their processes, focusing not just on what they do but on how and why they follow certain procedures. You’ll also have to find those other companies to survey, and that can require some creative thinking.

For instance, if you want to benchmark “scrap intake,” you can certainly look to other scrap processors. But don’t limit yourself, Webb says. If you only look at competitors, “you’ll only be as good as your best competitor and you’ll never be able to beat that competitor. But if you go outside your industry and can adapt something, that may be the thing that helps you catch up.”

So re-examine what “scrap intake” means. Doesn’t it also involve a broader concept, such as materials intake? Doesn’t it involve interacting with a supplier? Many companies in many industries take in material and interact with suppliers. So your benchmarking could focus on companies that handle at least 1,000 orders a month, regardless of whether they’re for widgets or scrap metal, and look for the best practices in fulfilling those orders, Webb explains.

The collecting phase involves sending your survey questions to the companies you want to benchmark and even conducting site visits—tasks often handled by a third party, especially when the results will be kept anonymous. APQC, for instance, conducts site visits to gather in-depth process information, examines secondary literature such as articles and databases, and consults sources such as trade associations and speaker lists from industry conferences.

At SSCI, benchmarking surveys are distributed by the association, but the forms are returned to the accounting firm of Arthur Andersen for compilation and analysis, notes Craig Schulz, vice president, research. SSCI’s annual survey is “fairly lengthy,” he says, averaging 12 to 16 pages, while the quarterly surveys are essentially a one-page version of the annual questionnaire.

The Benchmarking Network uses a long survey for its clients—about 20 to 25 pages, involving several hundred questions—and then tries to gather participants at a roundtable meeting for face-to-face discussions. “If we can’t get you in front of someone—a peer company, a major corporation—then we know you’re not going to get the full value out of the process,” Czarnecki stresses.

Analyzing and Adapting for Success

Analyzing the data collected forces participants to prioritize, to decide which processes they want to change, Webb says. If, for instance, the data reveal a “key gap” in production rates—say, you process 6 tons of scrap in the same time other firms process 20 tons—you can compare your processes with theirs and determine why the difference exists. Perhaps some other company has dedicated more employees to that task, or the employees are better trained, or they use machinery you don’t have.
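
As a rough sketch (not APQC’s own method), the gap analysis Webb describes amounts to ranking benchmarked processes by how far your rates trail the best observed, then digging into the biggest gaps first. The production figures below are hypothetical:

```python
# A hypothetical gap analysis: rank processes by how far your production
# rate trails the best observed rate, so the biggest gaps get attention first.

processes = {
    # process: (your tons per shift, best observed tons per shift)
    "shearing": (6.0, 20.0),
    "baling": (15.0, 18.0),
    "sorting": (9.0, 10.0),
}

for name, (yours, best) in sorted(
    processes.items(), key=lambda kv: kv[1][1] / kv[1][0], reverse=True
):
    print(f"{name}: {yours:.0f} vs. {best:.0f} tons/shift "
          f"-> best performer is {best / yours:.1f}x your rate")
```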

At General Steel, Oliner says he’s used the SSCI benchmarking reports to help develop a productivity-based bonus system and set sales goals. The survey data “gives you the numbers to work with,” he says.

But it’s also possible to undergo a benchmarking study and not change anything at all.

You might, for instance, discover that your current processes work best for you—even if they’re not a “best practice” per se—because of some factor unique to your business. “If that’s what you have to do to satisfy your customers, you’re not going to change,” stresses Schulz.

At APQC, the final phase—adapting the benchmarking data to your company’s processes—is left completely up to the participating firms, Webb notes. Otherwise, the outside researchers might bias the survey if they knew they’d also have to implement the findings. But Webb does offer this advice: Make sure you use the information. The greatest pitfall in benchmarking is gathering information and then not applying it, he says.

There are many reasons for such inaction. Some people might fear that they’ll benchmark themselves right out of a job by improving efficiencies to the point where the organization no longer needs as many employees. Or the benchmarking effort might have overlooked the organization’s internal politics and culture. Top management support is always critical, Webb explains, but so is involving all the stakeholders who might be affected. This includes people from departments not even being benchmarked because the changes proposed for the benchmarked area might require changes by other departments as well.

Problems and Provisos

Though benchmarking is essentially procompetitive—the more companies that use best practices, the more level the playing field—there’s an inherent danger of violating antitrust laws whenever businesses share information. It’s an ironic problem because, Czarnecki notes, the federal government is of two minds when it comes to benchmarking—heavily promoting the practice in one sense, especially in benchmarking its own agencies against private-sector companies, while also worrying about business collusion.

To avoid antitrust problems, APQC has established a benchmarking code of conduct that stresses: “Avoid discussions or actions that could lead to or imply an interest in restraint of trade, market and/or customer allocation schemes, price fixing, dealing arrangements, bid rigging, or bribery.” Moreover, APQC warns: “When cost is closely linked to price, sharing cost data can be considered to be the same as price sharing.”

It’s probably all right to benchmark historic prices—say, the fact that you got $75 a ton for a certain material six months ago—or what it cost you to run your equipment last year, advises Paul Green, president of Paul Green Enterprises (Montgomery Village, Md.), a scrap consultant. After all, the historic prices were probably published in various industry sources. But don’t discuss any current or projected numbers.

Also, keep a realistic view of what benchmarking can—and can’t—do for your company. Green echoes other benchmarkers in emphasizing that benchmarking is no panacea. If benchmarking participants “think they’re going to get this information and all their problems are going to go away, that’s not the case,” he says. “What benchmarking does is serve as a guide, or a road map, of whether your company is on the right path to success. But it’s not a quick fix.”

Indeed, time is one of the most important commitments a company must make to benchmarking, experts note. That’s because the information usually must be gathered from scratch—rarely is there any useful benchmarking data sitting on a shelf somewhere in some previous report. So a good benchmarking study often takes four to six months or more to develop, with as much as another six months to a year needed for companies to implement whatever changes are necessary.

And in many ways, benchmarking should truly be a continuous improvement effort. Individual benchmarking projects can be completed, of course, but there are always other processes to examine.

“If you benchmarked billing processes this year, you probably don’t want to do billing processes again next year,” Czarnecki notes. “Instead, rotate in some other areas. Next year, look at accounts payable or information systems.” And then, two or three years down the line, revisit a process you’ve already benchmarked to make sure you’re still following the best practice.

At SSCI, for instance, benchmarking surveys are actively reviewed to make sure questions and market segments remain relevant. “Benchmarking is not a one-time thing,” Schulz stresses. “It should be ongoing to make sure you keep on top of the ways industry is changing.” 

Benchmarking the Scrap Industry

It helps to have a thick skin when benchmarking.

“Sometimes benchmarking gives you information you don’t want to face or deal with,” notes Paul Green, president of Paul Green Enterprises (Montgomery Village, Md.), a scrap consultant and former assistant executive director of ISRI. “It does show you your weaknesses, and many times people don’t like to see that.”

Green knows all about some firms’ reluctance to benchmark. During 17 years with ISRI and one of its predecessors, the Institute of Scrap Iron and Steel (ISIS), he fought a losing battle to get scrap processors interested in benchmarking. Even today, ISRI is still just in the most preliminary stages of discussing the topic.

Green’s efforts at benchmarking the scrap industry began in the late 1970s with a program designed to get scrap firms to collect their own cost data. Conducted with assistance from Arthur Andersen, the program focused strictly on internal cost accounting. But Green had hoped it could be the first step toward real benchmarking: “Companies would share cost information on an anonymous basis to allow them to see how they were doing compared with the rest of the industry,” he says.

Though the program and its manual helped many companies do a better job of collecting their own costs, Green says, the effort never came close to that information-sharing aspect of real benchmarking.

The principal reason: fear of disclosing “proprietary information.”

Ironically, Green notes, scrap processors were eager to learn about other scrap companies, but they didn’t want to share their own information. He even tried to gather information using percentages rather than actual dollars but still convinced only about 15 companies to submit data.
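
That percentage approach is straightforward in principle: each firm converts its dollar costs into shares of total cost, so cost structures can be compared without any actual dollar figures leaving the firm. A minimal sketch, with hypothetical categories and amounts:

```python
# Green's percentage idea in miniature: report each cost category as a share
# of total cost rather than in dollars. Categories and amounts are hypothetical.

costs = {"labor": 420_000, "equipment": 310_000, "transport": 180_000, "admin": 90_000}

total = sum(costs.values())
for category, amount in costs.items():
    # Only the percentages would be shared; the dollar figures stay private.
    print(f"{category}: {amount / total * 100:.1f}% of total cost")
```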

Evan Koplin, vice president of Macon Iron & Paper Stock Co. Inc. (Macon, Ga.), worked with Green on that initial benchmarking effort and doesn’t think the industry’s reluctance to divulge information has changed much since then. That’s unfortunate, he says, because he’s seen the benefits of benchmarking at Macon’s sister firm, General Steel Co., a steel service center that receives periodic benchmarking data through its trade association, the Steel Service Center Institute.

“It’s been very helpful and made General Steel a better company because it knows where it stands” in terms of gross margins per employee, dollar sales per employee, and other metrics for firms of a similar size with similar sales, explains Koplin.

He concedes, however, that the first benchmarking effort for scrap processors might have tried to go “too far, too fast.” After all, there are so many product lines and other variables in the scrap industry that it can be difficult to ensure that everybody is comparing identical data. If you want to benchmark ferrous sales as a percentage of total sales, for instance, does that include brokerage sales? he asks. If some firms say yes and others say no, the data won’t be comparable.
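
A pair of made-up figures shows how much the definition matters: the same firm reports noticeably different ferrous percentages depending on whether brokerage sales are counted.

```python
# Hypothetical illustration of the definitions problem: "ferrous sales as a
# percentage of total sales" shifts depending on whether brokerage counts.

ferrous_processed = 2_400_000    # made-up annual sales figures, in dollars
ferrous_brokerage = 1_100_000
nonferrous_and_other = 1_500_000

incl = (ferrous_processed + ferrous_brokerage) / (
    ferrous_processed + ferrous_brokerage + nonferrous_and_other) * 100
excl = ferrous_processed / (ferrous_processed + nonferrous_and_other) * 100

print(f"Including brokerage: {incl:.0f}% ferrous")  # -> 70%
print(f"Excluding brokerage: {excl:.0f}% ferrous")  # -> 62%
```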

Even in a small benchmarking study that Koplin joined, “we spent half the morning talking about definitions,” he notes. In fact, the group never did completely settle the definitions question before market conditions turned sour and everybody dropped out to concentrate on running their businesses, he adds.

All those variables also make it difficult for scrap processors to gain much from benchmarking other industries, Koplin states, though he concedes it might be useful to compare certain transportation numbers—such as costs per mile—with the waste industry. But overall, he doesn’t think it would be a tremendous help for scrap processors to benchmark beyond their own industry.

Green agrees, noting that scrap company executives “become much more engaged when the discussion is specific to their industry—they can relate to that.”

Despite the many obstacles, though, Green and Koplin remain true believers in scrap benchmarking. Green, in fact, has run some three or four small benchmarking projects within the past few years that did convince both ferrous and nonferrous participants to sit down together and discuss information in areas such as shearing, baling, torching, and administrative costs.

He credits “enlightened managers” for making these small group studies successful and adds, “We proved that it really isn’t hard to do benchmarking, but you have to develop trust to share the information.” •
