Origin of ARM processor and architecture  [ 11 posts ]

Joined: Wed Jan 09, 2013 6:54 pm
Posts: 1780
(See also the topic at http://forum.6502.org/viewtopic.php?t=1892 which addresses from several angles the question of ARM's inheritance from the 6502)

Acorn was seeking a microprocessor to take over from the 6502 they'd used in the System 1, Atom, and BBC:
Quote:
Steve Furber and Sophie Wilson examined the microprocessors on the market – such as the
Motorola 6800 and the National Semiconductor 32016 – but they weren’t impressed, especially
in terms of speed. Andy Hopper, who had strong university networks, was aware of the Berkeley
RISC (Reduced Instruction Set Computing) research project, where a postgraduate class had built
a reasonable microprocessor in a year using a RISC design philosophy. RISC is an architectural
approach to microprocessor design that cuts down the complexity of operations by executing
fewer instructions more rapidly. Hopper mentioned the RISC projects to Hermann Hauser, and the
Acorn team were impressed with the results, so Wilson started doodling an instruction set.

In October 1983, Furber and Wilson visited the Western Design Centre in Phoenix, Arizona,
as they were interested in the development and extended capabilities of the near-ubiquitous
6502 microprocessor. Steve Furber recalled, “We held the place in awe and respect because it
was American and made microprocessors, but we discovered that they were operating out of a
bungalow somewhere, they were employing school kids to do some of the silicon design on Apple
IIs, and we thought, ‘if they can design a microprocessor, so can we!’”

The Acorn team felt that designing microprocessors was a black art. They never expected to
succeed, but they did hope to learn something in the process. The entire industry was very
sceptical about the RISC approach, believing it was philosophically wrong to make a processor
simpler. RISC did however show that if you made the compiler do more work, and the processor
simpler, you would get better performance.

The entire ARM (Acorn RISC Machine) processor was designed in just 18 months, and taken to the
Munich offices of VLSI (an Integrated Circuit manufacturer) in early 1985. The first silicon chips
arrived on 26 April 1985, and by three o’clock in the afternoon it was running BBC BASIC.

- from the document THE LEGACY OF THE BBC MICRO: effecting change in the UK’s cultures of computing
at http://www.nesta.org.uk/sites/default/f ... _micro.pdf

Steve Furber is in possession of the 700-line model of the CPU, written in BBC BASIC, but ARM's lawyers have declined permission to distribute it. A very small snippet can be seen at http://www.bbcbasic.co.uk/bbcbasic/birt ... model.html


Thu May 22, 2014 8:38 pm

Joined: Tue Dec 11, 2012 8:03 am
Posts: 285
Location: California
The .pdf seems to have a problem. I get the first page just briefly, and then it goes blank before it gets very far in the download and stops.

Although I have no interest in video programming (which requires far more processing power than most other areas of computing), I do want to look into another line of microcontrollers for the coming decade or two of projects, and ARM would definitely be something to consider. However, I remember the RISC-versus-CISC wars of the 1980's, and the RISC enthusiasts saying that although RISC's reduced instruction set required more instructions to do a job, the simpler instruction decoding could go so much faster that it more than compensated, giving greater performance at lower cost. It made sense. As the wars heated up, it gradually went from "greater performance at lower cost" to "maximum performance at any cost," and I kind of lost interest. Then the 68040 came out which was a nightmare to the RISC enthusiasts, because this CISC definitely outperformed the RISCs. Since then of course the lines have been blurred, and CISCs break the instructions down in the pipeline.

In the quote from above,
Quote:
Steve Furber and Sophie Wilson examined the microprocessors on the market – such as the
Motorola 6800 and the National Semiconductor 32016 – but they weren’t impressed, especially
in terms of speed

I wonder if "6800" should have been "68000." The 6800 would definitely not have been one to look to as a possible upgrade in performance from the 6502. The 32016, although 32-bit, did not outperform the 6502 either, so it's no surprise they would move on. (Let it never be said that more bits automatically guarantees greater performance. This one did not deliver.)

Also from above:
Quote:
RISC did however show that if you made the compiler do more work, and the processor simpler, you would get better performance.

Although I mostly use Forth on the workbench for the ability to get things going so quickly, the commercial applications I do have been written in assembly, which is working better and better as I improve my structure, modularity, abstraction, etc., with things like the structure macros. It would be absurd for me to imply that higher-level compiled languages don't have their place, but I would comment (and this is a copy-and-paste from something I wrote to a nephew) that some of the highest-performing microprocessors have assembly languages that are totally impractical for a casual user to learn well; so people who can dedicate a large amount of time to understanding them write compilers so their customers (who are programmers) can use a common language like C and not bother with assembly. It seems to partly defeat the purpose: assembly is the way to get maximum performance, yet to get maximum performance they have designed a processor you almost can't program in assembly! Still, good compilers can bring out some pretty good performance. (That's not to say all compilers do. For some processors, I've read of performance ratios of 5:1 or more from the best compiler to the worst, all for the same processor!)

_________________
http://WilsonMinesCo.com/ lots of 6502 resources


Thu May 22, 2014 10:11 pm

Joined: Wed Jan 09, 2013 6:54 pm
Posts: 1780
Quite right and good catch - that's certainly meant to be 68000.

I'm pretty sure I read somewhere that they decided memory bandwidth was the limiting factor for performance at that time - so you need a 32-bit memory and to use it usefully on as many cycles as you can. At that time on-chip cache was prohibitive. Possibly they were looking at the 68008 which of course is severely limited with a byte-wide bus. But the 68000 itself, and similarly the 32016 had only a 16-bit wide bus, and they'd had a lot of experience with the latter as a second processor for the Beeb.

The way CISC caught up on performance was to have huge complex chips which use a lot of power - which worked very well, until low power became very important, and it's taken some time for CISC to work its way back down again to hit the power/performance of ARM. Arguably ARM isn't very pure RISC but then we should accept that all categories are fuzzy and have subtleties.

As for compilers: it's refreshingly easy to program in assembly for ARM and presumably for other RISCs, with the one caveat that it's difficult for a human to make good use of a large register file. The instruction sets are generally simple and symmetrical.

My 6502 emulator in ARM assembly can be found at https://github.com/BigEd/a6502 although it's by no means a paragon of programming! Possibly a better codebase to examine would be the baremetal for Raspberry Pi code at https://github.com/dwelch67/raspberrypi

Cheers
Ed


Fri May 23, 2014 8:41 am

Joined: Tue Dec 11, 2012 8:03 am
Posts: 285
Location: California
Garth wrote:
Although I have no interest in video programming (which requires far more processing power than most other areas of computing), I do want to look into another line of microcontrollers for the coming decade or two of projects, and ARM would definitely be something to consider.

I started a new topic for that, at viewtopic.php?f=3&t=132.

_________________
http://WilsonMinesCo.com/ lots of 6502 resources


Sat May 24, 2014 5:33 am

Joined: Wed Jan 16, 2013 2:33 am
Posts: 165
Quote:
The British computer manufacturer Acorn Computers first developed ARM in the 1980s to use in its personal computers. Its first ARM-based products were coprocessor modules for the BBC Micro series of computers. After the successful BBC Micro computer, Acorn Computers considered how to move on from the relatively simple MOS Technology 6502 processor to address business markets like the one that was soon dominated by the IBM PC, launched in 1981. The Acorn Business Computer (ABC) plan required that a number of second processors be made to work with the BBC Micro platform, but processors such as the Motorola 68000 and National Semiconductor 32016 were considered unsuitable, and the 6502 was not powerful enough for a graphics based user interface.[16]

After testing all available processors and finding them lacking, Acorn decided it needed a new architecture. Inspired by white papers on the Berkeley RISC project, Acorn considered designing its own processor.[17] A visit to the Western Design Center in Phoenix, where the 6502 was being updated by what was effectively a single-person company, showed Acorn engineers Steve Furber and Sophie Wilson they did not need massive resources and state-of-the-art research and development facilities.[18]

Wilson developed the instruction set, writing a simulation of the processor in BBC BASIC that ran on a BBC Micro with a second 6502 processor. This convinced Acorn engineers they were on the right track. Wilson approached Acorn's CEO, Hermann Hauser, and requested more resources. Once he had approval, he assembled a small team to implement Wilson's model in hardware.
Acorn RISC Machine: ARM2

The official Acorn RISC Machine project started in October 1983. They chose VLSI Technology as the silicon partner, as they were a source of ROMs and custom chips for Acorn. Wilson and Furber led the design. They implemented it with a similar efficiency ethos as the 6502.[19] A key design goal was achieving low-latency input/output (interrupt) handling like the 6502. The 6502's memory access architecture had let developers produce fast machines without costly direct memory access hardware.
The first samples of ARM silicon worked properly when first received and tested on 26 April 1985.[3]

The first ARM application was as a second processor for the BBC Micro, where it helped in developing simulation software to finish development of the support chips (VIDC, IOC, MEMC), and sped up the CAD software used in ARM2 development. Wilson subsequently rewrote BBC BASIC in ARM assembly language. The in-depth knowledge gained from designing the instruction set enabled the code to be very dense, making ARM BBC BASIC an extremely good test for any ARM emulator. The original aim of a principally ARM-based computer was achieved in 1987 with the release of the Acorn Archimedes.[20] In 1992, Acorn once more won the Queen's Award for Technology for the ARM.

The ARM2 featured a 32-bit data bus, 26-bit address space and 27 32-bit registers. Eight bits from the program counter register were available for other purposes; the top six bits (available because of the 26-bit address space), served as status flags, and the bottom two bits (available because the program counter was always word-aligned), were used for setting modes. The address bus was extended to 32 bits in the ARM6, but program code still had to lie within the first 64 MB of memory in 26-bit compatibility mode, due to the reserved bits for the status flags.[21] The ARM2 had a transistor count of just 30,000, compared to Motorola's six-year-older 68000 model with around 40,000.[22] Much of this simplicity came from the lack of microcode (which represents about one-quarter to one-third of the 68000) and from (like most CPUs of the day) not including any cache. This simplicity enabled low power consumption, yet better performance than the Intel 80286. A successor, ARM3, was produced with a 4 KB cache, which further improved performance.[23]
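That R15 layout lends itself to a quick illustration. Below is a minimal Python sketch of packing and unpacking the ARM2's combined PC/status register, based only on the description above (26-bit word-aligned PC in bits 2–25, top six bits for flags, bottom two for mode); the constant and function names are mine, not from any ARM toolchain:

```python
# Sketch of the ARM2's use of R15, per the description above:
# bits 26..31 hold the status flags (N, Z, C, V plus IRQ/FIQ disable),
# bits 2..25 hold the word-aligned 26-bit program counter,
# bits 0..1 hold the processor mode. Names here are illustrative.

FLAGS_SHIFT = 26          # flags occupy the top 6 bits
MODE_MASK   = 0x3         # mode occupies the bottom 2 bits
PC_MASK     = 0x03FFFFFC  # bits 2..25: the word-aligned program counter

def pack_r15(flags, pc, mode):
    """Combine the three fields into one 32-bit register value."""
    return ((flags & 0x3F) << FLAGS_SHIFT) | (pc & PC_MASK) | (mode & MODE_MASK)

def unpack_r15(r15):
    """Split a 32-bit R15 value back into (flags, pc, mode)."""
    flags = (r15 >> FLAGS_SHIFT) & 0x3F
    pc    = r15 & PC_MASK
    mode  = r15 & MODE_MASK
    return flags, pc, mode
```

This also makes concrete why the ARM6's move to a 32-bit address bus needed a compatibility mode: once the flags moved out of R15, the top six bits became address bits, but 26-bit code still expects them to be free for status.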


http://en.wikipedia.org/wiki/ARM_architecture


Wed Dec 31, 2014 8:29 pm

Joined: Wed Jan 09, 2013 6:54 pm
Posts: 1780
Furber and Wilson have several times told their stories about reviewing available micros and finding them wanting and proceeding to invent the ARM. I found some choice quotes from some interviews and transcripts and thought them worth sharing:

"... we’d visited Nat Semi in Haifa as I mentioned and they were on Rev H of the 32016, so they’d gone through about a dozen iterations and it still wasn’t quite working right"
- https://www.bl.uk/voices-of-science/int ... d-no-money

"""
The Motorola 68000, the NatSemi 32016 and the many others they'd evaluated all had one flaw in common: not being able to make anything like full use of the memory bandwidth that was becoming available.

“If we were going to use 16- or 32-bit wide memory, we could build a system that would run up to 16, 20 or even 30Mb per second," says Wilson. But the processor was the bottleneck.
"
"""
- http://www.theregister.co.uk/2012/05/03 ... ve_furber/

"The anecdote I always remember is that the National Semiconductor 32016 had a memory-to-memory divide instruction that took 360 clock cycles to complete; it was running at 6 megahertz, so 360 clock cycles was 60 microseconds; it was not interruptible while the instruction was executing. Single density floppies, if you handled them chiefly with interrupts, give an interrupt every 64 microseconds, double density every 32. Hence you couldn't handle double density floppy disks. The complex instruction sets gave them very poor real-time response."
http://cacm.acm.org/magazines/2011/5/10 ... r/fulltext
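The latency arithmetic in that anecdote is easy to verify; here is a quick sketch using only the figures quoted above:

```python
# Reproducing the 32016 latency arithmetic from the anecdote above.
clock_hz = 6_000_000              # 32016 running at 6 MHz
divide_cycles = 360               # uninterruptible memory-to-memory divide

# Worst-case interrupt latency imposed by the divide instruction:
divide_us = divide_cycles * 1_000_000 / clock_hz   # 60.0 microseconds

single_density_us = 64            # floppy interrupt period, single density
double_density_us = 32            # floppy interrupt period, double density

# 60 us fits inside the 64 us single-density window, but overruns the
# 32 us double-density window, so double-density interrupts would be missed.
assert double_density_us < divide_us < single_density_us
```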

"There were two things we found somewhat frustrating with the standard offerings; these were microprocessors such as the Motorola 68000 and the National Semiconductor 32016, both of which had very rich complex instruction sets. The two problems we found were firstly their real-time performance was not as good as the 6502 - they couldn’t handle interrupts at the rate that we got used to running the 6502 with - and secondly, these microprocessors couldn’t really keep up with commodity memory. The major cost component in a small computer then was the memory, and it seemed to us obvious that the most important thing for the processor to do was to make the most of that memory. Neither of these processors would exploit the full bandwidth capability of the memory. So we were wondering which processors to adopt, not being very happy with any of the options available to us, when Hermann brought us a couple of papers describing the Berkeley RISC and the Stanford MIPS processors."
- http://archive.computerhistory.org/reso ... 01-acc.pdf


Sat Jul 16, 2016 4:34 am

Joined: Tue Jan 15, 2013 10:11 am
Posts: 114
Location: Norway/Japan
A sad day - Softbank buys ARM Holdings.


Mon Jul 18, 2016 4:27 pm

Joined: Wed Jan 09, 2013 6:54 pm
Posts: 1780
Hermann Hauser agrees with you:
Quote:
ARM is the proudest achievement of my life. The proposed sale to SoftBank is a sad day for me and for technology in Britain.


Mon Jul 18, 2016 4:54 pm

Joined: Tue Jan 15, 2013 10:11 am
Posts: 114
Location: Norway/Japan
.. and he mentioned that 15 billion ARM chips were sold in 2015. That's a lot.


Mon Jul 18, 2016 5:18 pm

Joined: Wed Jan 09, 2013 6:54 pm
Posts: 1780
It is a lot... made up of very small pieces - by my reckoning they make an average of 5 cents per ARM! That's just royalties - they make nearly as much in licensing. It's an interesting model - I linked to an explanation here:
viewtopic.php?f=3&t=258&p=1702#p1702
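That reckoning is simple division. Sketched below with the chip count from the post above; the annual royalty-revenue figure is an assumed round number I've chosen for illustration, not a number from this thread:

```python
# Back-of-envelope check of the "about 5 cents per ARM" royalty reckoning.
# The chip count (15 billion in 2015) is from the thread; the royalty
# revenue figure is an assumption for illustration only.

chips_shipped = 15_000_000_000        # ARM chips sold in 2015, per the thread
royalty_revenue_usd = 750_000_000     # assumed annual royalty revenue

royalty_per_chip = royalty_revenue_usd / chips_shipped   # 0.05 dollars
```

At that volume, even a nickel per chip adds up; the interesting part of the model is that licensing fees contribute nearly as much again on top.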

(Looks like a CPU is under 1/20th of a square mm these days - small pieces indeed. See
http://www.lowrisc.org/blog/2016/07/not ... -workshop/
for example)


Mon Jul 18, 2016 5:40 pm

Joined: Wed Jan 09, 2013 6:54 pm
Posts: 1780
A few more words on ARM from Steve Furber here:
Quote:
The ARM was conceived as a processor for a tethered desktop computer, where ultimate low power was not a requirement. We wanted to keep it low cost, however, and at that time, keeping the chip low cost meant ensuring it would go in low-cost packaging, which meant plastic. In order to use plastic packaging, we had to keep the power dissipation below a watt—that was a hard limit. Anything above a watt would make the plastic packaging unsuitable, and the package would cost more than the chip itself.

We didn't have particularly good or dependable power-analysis tools in those days; they were all a bit approximate. We applied Victorian engineering margins, and in designing to ensure it came out under a watt, we missed, and it came out under a tenth of a watt—really low power.

Of course, all the previous arguments about keeping it very simple also push in this direction. The first ARM chip had only about 25,000 transistors. It was a tenth the complexity by transistor count of some of the processors at the time.


There are many more words from Steve Furber in this thread over on Stardot.

Also noteworthy, a 1985 Errata sheet from NatSemi on the 32016, showing that revision N has 9 notable problems, as well as some "clarifications" on the datasheet.


Wed May 09, 2018 1:21 pm