In the action thriller movie Shooter, character actor Ned Beatty, playing the highly corrupt and power-mad Senator Charles F. Meachum, delivered this way-too-close-to-reality line of dialogue: "The truth is what I say it is!"
In our world, the way technical performance data, or "specs," is developed and presented is, more often than I care to admit, created under a similar mindset.
Now let's be clear: some manufacturers actually do publish accurate and technically correct data on their products, often at a competitive disadvantage. However, it is apparent from a totally unscientific and random survey of product data sheets that the adage oft quoted by Dr. Gene Patronis of Georgia Tech's Physics Department, "The data you have is one of two things: the only data they ever took, or the best data they ever got," is more often the case.
Why, you may ask? Good question! The simple answer: marketing.
When there are dozens of similar products in a category, the game of "specsmanship" is often viewed as the only way to gain a competitive edge.
How do you tell whether you might be looking at data that doesn't correlate with real-world performance? Look for terms like smoothing, averaged, typical, normalized, representative, and similar qualifiers (use your thesaurus to find dozens of others).
Examine the data to determine whether the actual test conditions are given. For example, did they measure just one cabinet of a line array in an anechoic chamber and then present that as the performance of a typical group of cabinets? Is the amplifier data generated from a laboratory-type "load" under ideal conditions, or is it real data measured with realistic loudspeaker-type loads on the device?
These are simple, basic steps to take with any product data you examine. But the best information is going to come from the field. Talk to other users, ask the manufacturer for information on completed installs, and then check with the facility and those who worked on the project. Ask your colleagues; even ask your competitors.
How any given device functions out in the real world is going to be considerably more useful information than how it performs under ideal, controlled test conditions. After all, your project is not a controlled test environment.
If the product has been around a while, make sure to ask about reliability and maintenance issues, service-call levels, and factory support when something goes astray. If it's new to the market, ask for an evaluation sample and the names of others who have "tested" the device in the field. Find out what they think and cross-check that with your own evaluations.
Remember, the most abused word in marketing is "new," followed by "improved," "updated," "enhanced," "redesigned," and other similar terms you can imagine.
Just because it's new and improved in marketing terms does not mean it's better, more reliable, or even the same as the version you are currently using. Things change. Parts go out of stock or out of fashion, or are just plain discontinued. Designs change to accommodate these component-sourcing realities. It may well be better, or it may not.
Did your supplier suddenly shift to an offshore vendor or change OEM sources? That should be a flag prompting you to check what changed and how it affects the application you have in mind.
I can think of many instances where the offshore OEM vendor made changes to a product to "improve" it (really, what they were doing in most cases was improving their own profitability) and never bothered to tell their customer. Often the first indication of such "improvements" comes from users in the field. Do you want to be the one who finds out that what you thought you were buying is not what you actually got?
Now isn't the time to get totally paranoid. Just use common sense when evaluating product data, and be sure to network with your colleagues. After all, we here in the real world need to support each other to ensure the viability and profitability of, and satisfaction with, what we deliver.