??? 04/17/06 23:54 Read: times
#114424 - It's just a transfer mechanism Responding to: ???'s previous message
I'd never expect Windows to keep up with a steady stream of anything.
You're right, I was thinking of the old v1.x low speed, which was 1.5 Mbps (bits). Full speed is, indeed, 12 Mbps, but that carries protocol overhead as well as Windows overhead, so the guesstimate of 1 MBps (bytes) was probably not far off.

If I were building a logic analyzer today, I'd never try to sample on a continuous basis. First of all, of course, that's because I know how to start and stop the things. What I've advocated all along is sampling into a local buffer, starting with a trigger signal of some sort that says at least one "event of interest" has occurred, and then sampling data into that buffer until it is filled. I find it useful to sample both before and after the "trigger" event, so that a bit of history and a bit of subsequent information are available. Sometimes one needs lots of data from before, and sometimes one wants data after, but the capture always has to be centered on some trigger.

If the data is buffered at the sampling circuit, then it matters little whether the link speed is 1 Mbps or 500 Mbps. It matters some, but only a little, since you set up the instrument, take the sample, and then spend weeks figuring out what it all means. That might not be quite so bad with only four signals under study, but in the general case it can take quite a long time to make sense of a sample. The preparation makes a BIG difference. The link performance between sampler and PC probably doesn't. RE
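The pre/post-trigger capture described above can be sketched in a few lines. This is only an illustration of the idea, not anyone's actual analyzer firmware; the class name, the `pre`/`post` split, and the trigger predicate are all assumptions made up for the example. A rolling window holds the most recent pre-trigger samples, and once the trigger fires, capture continues until the remainder of the buffer is filled.

```python
from collections import deque

class TriggeredSampler:
    """Illustrative sketch (not from the original post): keep `pre` samples
    of rolling history; after the trigger fires, take `post` more samples,
    so the capture is centered on the event of interest."""

    def __init__(self, pre=4, post=4):
        self.pre = pre
        self.post = post
        # Rolling pre-trigger window; old samples fall off automatically.
        self.history = deque(maxlen=pre)

    def capture(self, stream, trigger):
        """Consume samples from `stream` until the buffer is full.
        `trigger` is a predicate marking the event of interest."""
        post_left = None  # None means we are still waiting for the trigger
        out = []
        for sample in stream:
            if post_left is None:
                self.history.append(sample)
                if trigger(sample):
                    # Freeze the history (trigger sample included) and
                    # switch to counting down the post-trigger samples.
                    out = list(self.history)
                    post_left = self.post
            else:
                out.append(sample)
                post_left -= 1
                if post_left == 0:
                    break  # buffer full: stop sampling
        return out

# Example: trigger on the value 50, keeping 3 samples of history
# (plus the trigger itself) and 4 samples after it.
sampler = TriggeredSampler(pre=4, post=4)
window = sampler.capture(range(100), lambda s: s == 50)
print(window)  # → [47, 48, 49, 50, 51, 52, 53, 54]
```

The point the post makes falls out of the structure: the loop runs at the sampler's speed and only the small, finished buffer ever crosses the slow link to the PC, so the link rate barely matters.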