??? 03/05/09 18:34
#163110 - On that we can agree
Responding to: ???'s previous message
Per Westermark said:
First off, I think every project should be started without bias when looking for a processor to use. And in the same way, the requirements of the project should decide whether assembler or C (or possibly something else) is the best choice. ...
I think the 8051 will be heavily used 10 years from now too, but I do not think anyone should select a processor just because it belongs to family "x". I've also used the 8051, Z80 (though not the AVR), PIC16 (back when it was a chip from General Instruments, not Microchip), ARM, x86, 68xx, and 68xxx, as well as 6502 types, the 1802, the OLD PIC, and, of course, the classic microcontroller, the 8048, along with the 8x300, which was probably the first RISC controller.

Back in '78, Scientific American carried an article on the 8748, which named it the first complete microcomputer on a chip. By that classic definition, they pointed out, it contained I/O, memory, and the processor logic all on a single IC. Almost concurrently with their release of the MC68000, in 1979, Motorola rolled out their 68705 and 68701 (EPROM versions of the 6805 and 6801), both of which had the same characteristics. The MC6801 had more or less the same characteristics as the Intel 805x, and both of them had on-board UARTs, which their predecessors lacked.

That group, the ones with ROM, RAM, and I/O on-chip, were the first "single-chip microcomputers," and they almost immediately began appearing in household appliances, mass-storage devices, and a host of other things, since they were cheaper to use than the logic they replaced. Those chips were designed to replace logic. They were dubbed "microcontrollers" to distinguish them from "microprocessors," which relied on extensive external resources.

Today's chips are designed to address a different need, namely the need to reduce firmware cost. That means the chip has to be made easy for the "sneaker-wearing, Pepsi-drinking, unshaven weirdo who seldom bathes and most often doesn't even comb his neglected hair" that many managers envision when someone mentions SOFTWARE. Once the chips can incorporate terabytes of code and gigabytes of data space, and execute their instructions in a few tens of femtoseconds, the chips will be "at least plenty," just as those programmers want. I just have to wonder how the chip designers are going to address the need for documentation.

RE