It was a great success, with over a million being sold. The BBC, happy that it had done its best to turn the UK into a nation of computer literates ready for the 21st century -- or 'anoraks', as most people called them -- retired from the scene. Acorn was left to its own devices. The basic Beeb architecture had an immoderate amount of expansion capability, but even with the network, second processors, expanded memory and other bolt-ons it remained an 8-bit computer in a world that was rapidly transforming into a 16-bit, IBM PC-minded place.
Acorn had the fixation, common among Cambridge computer companies of the time, that unless you did things your way there was no point in doing them at all. Rather than just buy in whole processors, Acorn approached Intel and asked to use the 80286 design in its own chips: Intel would have none of it. So Acorn started from scratch. The new processor had to be simple, because it had a design team of seven and no budget, and it had to be fast. Other processors took around 200 man-years to get going; Acorn didn't have that option. It also didn't have anyone who'd designed a microprocessor before: no matter.
Work started in October 1983 and on 26 April 1985 -- a mere five man-years from the tiny team -- first silicon was plugged in and worked first time. Oh, and there was the little matter of the support chipset; video, memory and input/output controllers, each designed by one person. By any standards, this was heroic: the fact that the design not only worked but worked spectacularly well raises the endeavour to the mythical. Not only did the chip run with extraordinary efficiency both in terms of work done and electricity used, but the instruction set -- the basic language it spoke -- proved a joy to write for. The Acorn RISC Machine was born.
AnchorDesk UK: Commentary Box The little Acorn inside Intel
Acorn was still in the business of making computers, not selling chips. By 1987, the first ARM-based Archimedes computers were rolled out; desktop machines that over the next few years proved to be enormously faster and much more elegant than the IBM PC standard. And, of course, vastly incompatible: the company never really found a niche capable of supporting it and in 1999 got out of hardware altogether and renamed itself Element 14.
By then, the ARM had broken free. The chip had quickly gained a reputation for fabulousness, but Acorn theorised that no rival manufacturer would buy the chip and put it in a computer while Acorn itself was still making competing products. So, Advanced RISC Machines was created in 1990, backed by Apple (who used the ARM in the Newton PDA three years later), Acorn, and chipmaker VLSI.
Then came the second act of palpable genius. The chief executive, Robin Saxby, had to decide how to sell the ARM. After much boardroom discussion, he got his way: ARM would be a chip company that neither made nor sold chips. Such apparent madness came from two key observations -- first, the chip factories themselves, the so-called fabs, were insanely expensive to build and keep up to date. Second, because fabs were so costly people who already owned them had come up with the idea of ASICs, application-specific integrated circuits. A design company could come up with a chip design that worked according to certain rules and use someone else's fab to make it.
In 1990, more and more companies were making custom chips this way. Saxby reasoned that if ARM became a library component -- a standard part of an ASIC design that anyone could just drop into their own chips -- then other companies would license the design rather than reinvent that particular wheel. Because the ARM was small and very low power, it would impose few limitations on the rest of the chip; because it was established and highly regarded, there was an existing pool of expertise, software and development tools that would be very attractive to companies wanting to ship products rather than learn all about microprocessors.
The rest is history and hard work. Over the years, ARM has aggressively developed the design, adding core software and hardware components for digital signal processing, Java, streaming media, wireless and anything else that looks likely to turn a buck. The licensing model took a little while to take hold, but when Texas Instruments got on board in 1993, the gates opened. It even got inside Intel, first by stealth when Intel acquired Digital's semiconductor operations and with them the StrongARM, a chip Digital had designed under an ARM licence, then when Intel bowed to the inevitable and made the architecture the heart of its new XScale range.
So now it's 2002, and our gadget designer finds that no matter where he looks, ARM is there. You probably own several: ADSL modems, Game Boy Advance, mobile phones, set top boxes, PDAs, printers, cameras -- there's no end to the list. Saxby is now bold Sir Robin; he employs nearly 800 people and his company made around £150m last year. That last quarter of 2001, when all around were crying into their accountants, saw ARM's revenue up 35 percent on 2000.
Right at the centre of it all, the basic ideas dreamed up by a handful of Cambridge engineers building their first ever processor live on. And to think it could all have been Intel, all along.