Re: NewsSpeak and "We expect it to win on real data"
- To: <michael@xanadu>, <us@xanadu>
- Subject: Re: NewsSpeak and "We expect it to win on real data"
- From: Nick Whyte <nick@grand-central>
- Date: Wed, 30 Aug 89 10:09:21 PDT
The notion of "We expect it to win on real data" was a very tricky one
at my last company. Sales of Elxsi systems were very dependent on
benchmark results. We designed the Elxsi to be a high throughput machine,
i.e. more "real" bang for the buck. However, most published benchmarks
are designed to measure a narrow concept of performance and do not reflect
real data (and, unfortunately for Elxsi, they ignore the fact that real
computers can use multiple CPUs). The result was that we sometimes lost sales due
to bad benchmark results even though the system was better for the target
application. It is hard to write benchmarks that predict an application's
performance, so most customers don't bother.
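To make the distinction concrete, here is a minimal sketch (hypothetical code and
workload, not anything we ran at Elxsi) of the gap between a narrow single-stream
benchmark and a throughput benchmark that lets the machine use all of its CPUs:

    # Hypothetical illustration: a "narrow" benchmark times one job on one CPU,
    # while a throughput benchmark counts jobs completed per second machine-wide.
    import time
    from multiprocessing import Pool, cpu_count

    def job(n=200_000):
        # stand-in workload: sum of squares
        return sum(i * i for i in range(n))

    def narrow_benchmark():
        start = time.perf_counter()
        job()
        return time.perf_counter() - start       # seconds for one job, one CPU

    def throughput_benchmark(jobs=32):
        start = time.perf_counter()
        with Pool(cpu_count()) as pool:          # spread jobs over every CPU
            pool.map(job, [200_000] * jobs)
        return jobs / (time.perf_counter() - start)   # jobs per second

    if __name__ == "__main__":
        print("single-job time :", narrow_benchmark())
        print("jobs per second :", throughput_benchmark())

A machine can post a mediocre single-job time and still win decisively on jobs
per second, which is exactly the case the narrow benchmarks never show.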
So what does this rambling have to do with Xanadu, Amix, or anyone else?
1. If you think your product provides the "real solution" to the "real problem",
you need to publish and promote a set of benchmarks so that customers will
use them to compare you with your competitors. Otherwise, someone who wants
to get a quickie article published will invent a silly benchmark, run it on
a few systems, and before you know it there is a de facto standard benchmark.
2. As disgusting as it may be, it can be worth the engineering effort to make
sure the product performs well on well-known stupid benchmarks. In other
words, it may be cheaper to make stupid things run well than to have your
sales and marketing people spend time convincing people that stupid things
are stupid. (You also have more credibility convincing someone that a
benchmark is stupid when you don't lose big on that benchmark.)