??? 09/17/09 14:10
#168936 - It's not a replacement for the target
Responding to: ???'s previous message
Per Westermark said:

You are forgetting yourself. You ask: "What AVR/805x board can provide all that for comparable cost?" Irrelevant if my goal is to create a keyboard controller. I want a final product. An evaluation board with a gazillion features is of no use if the target hardware will never use any of them.

True enough, but ... why not use a keyboard encoder rather than an MCU?

Per said:

Among other things, I work with CAN. What use, then, is that evaluation board if it doesn't have CAN?

Yes, there are things that aren't on THIS board, but, generally speaking, it's not likely that a board with CAN, even after you've designed it and omitted the GLCD, will cost less than this one.

Per said:

Testing is what you do all the way through the development of your firmware or hardware. It is not just something you do when you have a release candidate of hardware and/or firmware.

Not quite ... that may be trial, but it's certainly not testing.

Per said:

About the board you suggested: throw away the LCD and the beeper, add in CAN and some other features, and you will have something similar to one product I have developed the software for. Yes, that product does run Linux. But no, I'm not interested in running any development tools on it. No need to. TCP or BDM can be used if I want to debug on the hardware. Much of the software can be tested directly on a PC.

Firmware issues can be worked, to a large extent, on a PC. Hardware and hardware/firmware integration issues, well, not so much. Please understand that I don't disagree with much of what you say, Per, but there's a distinct limit to the relative usefulness of PC-based tools. They may work fine for debugging your 'C' code, and for debugging your ASM, but they unfortunately lack the ability to exercise firmware in an environment where actual external signals occur at timings not readily simulated on a PC. I wish it weren't so, but the simulators are just too primitive, and the debugging tools are too limited in scope. The simulations are typically based on instruction-cycle timings rather than hardware timings, and I've seen no simulators that do nanosecond-level timing analysis on MCU simulations. Just give ModelSim a try sometime, with an MCU core in an FPGA and external peripheral logic generating asynchronous signals to the core. There will be problems, not all of which are with the design of your system, that you'd never encounter in a PC-based debugging environment, yet you'd encounter them quickly on real hardware. The sketch a little further down shows the kind of thing I mean.

Per said:

Coming from an 8051, that development board may look like "wow". It normally isn't. The development board is only useful if I am going to use a processor from that specific family, and will use that processor with a reasonable number of the peripherals that the board has. If not, then I may just as well get a reference design for the real processor, drop in the peripherals I need, and order a couple of prototypes. When debugging, I would prefer to have gdb running on a PC, and not on the target. When compiling, I would prefer to have gcc compile on the PC, and either copy the binary to the target system or let the target system export or import a file system.

If you want to give your code a trial on the "real" hardware, e.g. if you want to use the SEEPROM, or any other feature specifically included on the development/trial board, you can do it on that board, patch it on that board if need be, and stimulate it with external signals. You can't do any of that on a PC in simulation. I wish that weren't the case, but, sadly, the folks who provide simulators generally aren't very thorough.
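To make that concrete, here's the kind of bug I have in mind. It's a minimal sketch, assuming an 8051-style free-running timer read a byte at a time; the register names are illustrative, not from any particular project:

    #include <stdint.h>

    /* TH0/TL0 stand in for the two halves of a free-running 16-bit
     * timer that the hardware increments asynchronously. */
    extern volatile uint8_t TH0, TL0;

    /* Buggy: looks fine in an instruction-level simulator, which
     * rarely ticks the timer between the two reads.  On real
     * hardware, if TL0 wraps 0xFF -> 0x00 between them, the result
     * is off by 256. */
    uint16_t read_timer_naive(void)
    {
        uint8_t hi = TH0;
        uint8_t lo = TL0;    /* rollover can land right here */
        return ((uint16_t)hi << 8) | lo;
    }

    /* The usual repair: re-read the high byte until it is stable
     * across the low-byte read. */
    uint16_t read_timer_safe(void)
    {
        uint8_t hi, lo;
        do {
            hi = TH0;
            lo = TL0;
        } while (hi != TH0);
        return ((uint16_t)hi << 8) | lo;
    }

An instruction-cycle simulator with no model of the timer clock will happily report the naive version correct every time; the hardware won't.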
PC-based debugging tools are pretty limited in scope; that is, they don't really support debugging the hardware. How can you build confidence in hardware that has only been debugged in a simulator on a PC? I submit that a low-cost board like this one, and I'm sure there are others just as capable, allows you to do that. If you examine the available 805x, AVR, or PIC boards of the same sort, you'll find them wanting, and more costly, too.
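As for the gdb-on-a-PC workflow Per describes, it certainly works; something like the sketch below, assuming an ARM/Linux target running gdbserver (the toolchain prefix, host name, and port number are my assumptions, not anything Per specified). Note, though, that it debugs your code, not the hardware around it:

    /* The PC-hosted cross-development cycle, roughly.  All names
     * here are illustrative:
     *
     *   arm-linux-gnueabi-gcc -g -O0 -o app app.c   # compile on the PC
     *   scp app root@target:/tmp/app                # copy binary over
     *   ssh root@target gdbserver :2345 /tmp/app    # stub on the target
     *   arm-linux-gnueabi-gdb app                   # gdb runs on the PC
     *     (gdb) target remote target:2345           # debug over TCP
     */
    #include <stdio.h>

    int main(void)
    {
        puts("built on the PC, running on the target");
        return 0;
    }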
Richard said:

Now, I don't sell these, nor do I even hold stock in the company, but it looks like a pretty inexpensive way to "get something out the door."

Per said:

It is SOP to start from a reference design if the resulting product is intended to run Linux. And in that case, this board is just one out of hundreds of boards. For very small series, you may ship this kind of board in your product. For larger quantities, you want your own custom-designed hardware. And you want prototypes of that hardware as soon as possible. Quite often, these reference designs contain peripheral components that have become obsolete, so if a product is intended to ship for a number of years, you want 100% control.

The majority of the published "reference designs" I've tried didn't even work properly. Often they're designed to demonstrate one specific feature and completely neglect the remaining functions. That was one of the problems I encountered with early ARM circuits.

Per said:

One important thing is that the market is way smaller for high-end solutions. For each embedded system running Linux, you may find 10 or 100 or 1000 systems that are too small. There are a lot of situations where the total manufacturing cost has to be $10. This is doable with an ARM chip, but obviously not by basing the design on such a board.

I think you're missing my point. I agree that an OS like Linux is inappropriate for the majority of embedded tasks. However, it's not inappropriate for hardware/software integration trials. I'm not convinced that this particular board is the ideal one, and I'm still not convinced that it will replace all the benefits one can derive from PC-based tools, but it allows one to do things no PC environment can do. In that sense, it's just a means to an end, and Linux, because it's open-source, provides not only the means to try out the various hardware features and the underlying code, to ensure their suitability for a given task, but also the means to develop confidence in exactly those hardware/firmware interactions that no PC simulator can emulate.

Per said:

That board may be a lot of fun for a hobbyist, or for doing school projects, but a large percentage of commercial projects have completely different needs. And using cross-compilers really is no problem. It is normally an advantage.

Likewise, it enables one to do things no PC tool set alone can do. Cross-compilers do work, but the debugging/simulation tools are generally provided as an afterthought, doing crude simulation of a core based on instruction-cycle timing rather than actual hardware timing constraints.

Per said:

I don't really get why you, in some threads, can spend hours bashing C, because you are so convinced that a program written in C will require a one-step-larger processor, adding cost and leaving your customer thinking you are incompetent. In a completely different thread (discussing the slowdown of this forum), you ignore comments that ARM9 or ARM11 or whatever are nice, but that a lot of the ARM work is with smaller systems. Now you seem very busy explaining why complete embedded Linux systems are a good way to develop our products. A huge percentage of ARM chips sold are not capable of running Linux. And they are selected for the simple reason that they are cheaper. Way cheaper. That is money on the bottom line.

I understand why you find this curious. I'm not advocating that one should deploy the larger ARM in a target application.
The board to which I referred is interesting to me because I use boards like it for proof-of-concept work, which is something I often do. My emphasis on this sort of board rather than PC-based tools is not because a PC-based simulator couldn't do what's necessary to simulate the hardware/software interactions, but because the simulators the various tool vendors actually provide don't do it, since their developers don't approach the problem from the proper perspective. They're building an add-on for a cross-compiler, not an MCU-specific simulation suite that actually models the low-level behavior of the hardware. That would require modifying the simulator for each and every MCU version, and they're just not that interested in providing such a tool. The developers provide a tool capable of doing some simulation because the marketing guys told them they have to have that feature. The marketing guys aren't smart and sophisticated enough to realize what a difference a real, low-level simulation of the target core would make. For companies like KEIL, simulation and debugging are just a means of putting a check in a box on the feature list. They see it as a way to get past the purchasing department, and couldn't care less whether it really helps get the job done. Once it's deployed, management won't let the programmers "fix" it, since it's already "out there" and generating sales. The tech support guys, then, are doomed to spend the rest of the product's life either apologizing for, or lying about, the product.

RE