I'm glad you added this, midifidler. I was only describing the difference between USB and MIDI cables as far as protocols and wiring are concerned. Overall system speed and processing is ENTIRELY separate, since it depends on the software and hardware being used. So yes, my numbers only estimate the time it takes to go from USB output to USB input... nothing else is included. It would be too complex to calculate for every setup; sorry for any confusion.
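For anyone curious about the classic 5-pin MIDI side of that comparison, here's a quick back-of-envelope calculation using the standard MIDI DIN numbers (31,250 baud, 10 bits per byte on the wire). These are figures from the MIDI spec itself, not measurements of any particular device:

```python
# Back-of-envelope: how long a classic 5-pin MIDI cable takes to carry
# one 3-byte message (e.g. a Note On).
BAUD = 31_250            # MIDI DIN serial rate, bits per second
BITS_PER_BYTE = 10       # 8 data bits + 1 start bit + 1 stop bit
MESSAGE_BYTES = 3        # status byte + two data bytes

seconds = MESSAGE_BYTES * BITS_PER_BYTE / BAUD
print(f"{seconds * 1000:.2f} ms")   # roughly 0.96 ms per message
```

That ~1 ms per message is the hard floor for a DIN cable, before any of the software overhead discussed above.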
As far as A/D conversion and MIDI bit-lengths are concerned, that goes beyond the scope of my blog post, since I'll admit I don't know enough about the subject to write on it. You COULD have an analog input that sends an analog signal to a separate A/D converter... but many modern devices simply sample the analog input on the control itself... much like your camcorder samples real life at 30 frames per second (i.e. 30 pictures a second).
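To make the sampling idea concrete, here's a toy sketch of what an A/D converter does: it quantizes a voltage into one of 2^n discrete steps. The 5 V reference and 10-bit resolution below are made-up example numbers, not specs from any particular controller:

```python
# Toy A/D converter: quantize an analog voltage into an n-bit integer.
def adc_read(voltage, vref=5.0, bits=10):
    """Quantize 0..vref volts into an n-bit integer code (0..2^n - 1)."""
    voltage = min(max(voltage, 0.0), vref)        # clamp to the input range
    return round(voltage / vref * ((1 << bits) - 1))

print(adc_read(0.0))    # bottom of range -> 0
print(adc_read(5.0))    # top of range   -> 1023
print(adc_read(2.5))    # mid-scale      -> about half of 1023
```

A real controller just runs something like this many times per second on each knob or fader, exactly like the camcorder analogy.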
Your analog control input can be converted into a digital output by an encoder... I've seen them with up to around 20 bits or so. The encoder passes the data to the MIDI microchip processor, which must be capable of handling at least 14-bit messages. The messages are queued alongside the other controls' MIDI messages and sent down the USB line. The time it takes for the actual USB interface to send and receive data is very quick. The time for the data to be separated and processed is another story.
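Since MIDI data bytes only carry 7 bits each, a 14-bit controller value is actually sent as two bytes (an MSB and an LSB). This is standard MIDI; the 20-bit encoder scaling step is just an illustration of how a higher-resolution reading might be fitted in:

```python
# Split a 14-bit controller value (0-16383) into the two 7-bit data
# bytes MIDI actually transmits, and recombine them on the other end.
def split_14bit(value):
    """Split a 14-bit value into MIDI MSB/LSB data bytes (7 bits each)."""
    if not 0 <= value <= 0x3FFF:
        raise ValueError("value must fit in 14 bits (0-16383)")
    msb = (value >> 7) & 0x7F    # upper 7 bits
    lsb = value & 0x7F           # lower 7 bits
    return msb, lsb

def join_14bit(msb, lsb):
    """Recombine the two data bytes back into the 14-bit value."""
    return ((msb & 0x7F) << 7) | (lsb & 0x7F)

# A hypothetical 20-bit encoder reading would first be scaled down:
encoder_reading = 0xFFFFF             # max value of a 20-bit encoder
scaled = encoder_reading >> 6         # drop 6 bits -> fits in 14 bits
print(split_14bit(scaled))            # (127, 127)
print(join_14bit(*split_14bit(scaled)) == scaled)   # True
```

The round trip is lossless at 14 bits; anything above that (like the 20-bit encoder reading) has to be thrown away before transmission.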
BTW thanks for all the reads and comments guys.
It's always hard when writing an article like this to walk the line between a simplified technical explanation and too much information!
Well done!