??? 06/01/13 14:24 Msg Score: +1 Informative
#189843 - Reality Check...... Responding to: ???'s previous message
José A. Ruiz said:
Hello all,
All advice is welcome, thank you for your thoughts!

There is such a lack of good dialog here at this site now that it is in a slow die-off. Maybe this can get a good discussion going...

I would like to veer off the direct topic of your question and ask why you feel it is productive to put energy into placing an 8051 core into an FPGA. I have found it rare that such a usage model makes a lot of sense. My experience has shown that it can make a lot more sense to use a well-validated commercial part to contain the code, and then have this MCU interface to an FPGA for the special high-performance circuitry that must be done there for either performance or density reasons.

Even though FPGA cost per glob of gates keeps coming down, it is still more expensive in terms of silicon cost to embed a microcontroller core into an FPGA than to use an external commercial chip. The wide range of available commercial chips, in both cost and capabilities, lets you easily select the appropriate part to connect to the FPGA. You can interface via slower serial links (SPI, I2C, etc.) or via an actual bus structure. For example, many of the higher-end SiLabs parts support an optional external bus interface that can map your XRAM data space directly into the FPGA in a glueless manner (a rough sketch of this is at the end of this post).

With commercial MCU parts you also retain the best selection of development tools, debugger capabilities, and choice of free/commercial compiler tool sets. You completely eliminate the pain-in-the-a$$ issue that you are dealing with right now.

One argument often put forth for putting the MCU core into an FPGA is that the eventual goal is to advance to a custom chip. That is all well and good, but might I suggest developing a separate firmware development platform using a well-supported MCU mated with an FPGA, where you retain the best of the tools needed to develop good code. Once the code is validated in that environment, it can be moved over in object form to the custom-chip environment, where it can be tested and validated. For this mode of testing you rarely need the ability to look at MCU registers and step code after breakpoints! If the custom-chip environment needs validation visibility, then design that into the firmware and its supporting hardware as a validation-specific tool/interface. This is much simpler than going through the pain of what you are asking about here.
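To illustrate what "XRAM mapped gluelessly into the FPGA" looks like from the firmware side, here is a minimal sketch. It assumes a SiLabs part with an external memory interface (e.g. the C8051F12x family) and the Keil C51 toolchain; the FPGA register addresses and bit meanings are hypothetical, and the EMIF itself (EMI0CF, EMI0CN, EMI0TC and the port setup) still has to be configured per the datasheet before any of this works.

/* Sketch: FPGA registers mapped into the 8051 XDATA space via the
 * external memory interface. Addresses and bit assignments below are
 * made up for illustration; EMIF setup is assumed to be done elsewhere.
 * Keil C51 syntax (xdata memory type, XBYTE macro) is assumed.
 */
#include <absacc.h>                   /* Keil XBYTE/XWORD access macros */

#define FPGA_BASE    0x8000U          /* hypothetical XDATA window       */
#define FPGA_STATUS  (FPGA_BASE + 0x00)
#define FPGA_CTRL    (FPGA_BASE + 0x01)
#define FPGA_DATA    (FPGA_BASE + 0x02)

/* Kick off an operation in the FPGA, poll its ready flag, read a byte. */
unsigned char fpga_read_sample(void)
{
    XBYTE[FPGA_CTRL] = 0x01;                  /* hypothetical 'start' bit */
    while ((XBYTE[FPGA_STATUS] & 0x01) == 0)  /* wait for 'ready' bit     */
        ;
    return XBYTE[FPGA_DATA];
}

/* Equivalent access through an xdata pointer, if you prefer pointers. */
volatile unsigned char xdata *fpga_regs = (unsigned char xdata *)FPGA_BASE;

The point is simply that FPGA-side logic shows up as ordinary MOVX accesses, so the commercial MCU keeps its normal compiler, debugger, and breakpoint support while the FPGA holds only the custom high-performance logic.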