Implications of iOS 7 on the DAW landscape

I wonder whether, at present, the processing power of the iPad itself is the key limitation here, rather than the technical issue of the audio routing, etc. I’m on a 3rd gen iPad (and I appreciate that the 4th gen models have more horsepower) but, for example, running Cubasis, Audiobus, a few audio tracks and then trying to drive a single instance of something like Thor, Nave or iMS-20 gets the CPU meter really cranked.

While I suspect Cubasis could technically already send MIDI data out to several synth/drum apps to control them all at the same time, whether my iPad could keep up is another matter. Thankfully, I also suspect this is a limitation that will gradually recede as Apple continues to drive the iPad spec forward (and my bank balance downwards).
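For what it’s worth, the plumbing for that multi-app control is already there in Core MIDI: synth apps publish virtual destinations, and a sequencer just enumerates them and sends the same events to each one. A very rough sketch of the idea in Swift (the client/port names are placeholders, and this obviously isn’t how Cubasis itself is written):

```swift
import CoreMIDI

// Rough illustration only: broadcast a single note-on to every Core MIDI
// destination currently published on the device. Inter-app MIDI on iOS works
// by apps exposing virtual destinations, so "driving several synth apps at
// once" is really just sending the same events to more than one endpoint.
// Names below are placeholders, not anything from Cubasis.

var client = MIDIClientRef()
var outPort = MIDIPortRef()
_ = MIDIClientCreate("SketchClient" as CFString, nil, nil, &client)
_ = MIDIOutputPortCreate(client, "SketchOut" as CFString, &outPort)

// One packet: note-on, channel 1, middle C (60), velocity 100.
let noteOn: [UInt8] = [0x90, 60, 100]
var packetList = MIDIPacketList()
let packet = MIDIPacketListInit(&packetList)
_ = MIDIPacketListAdd(&packetList, MemoryLayout<MIDIPacketList>.size,
                      packet, 0, noteOn.count, noteOn)

// Send the same packet to every destination, i.e. every app listening for MIDI.
for i in 0..<MIDIGetNumberOfDestinations() {
    _ = MIDISend(outPort, MIDIGetDestination(i), &packetList)
}
```

So the routing itself is cheap; it’s the synth apps on the receiving end that eat the CPU.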

Not sure about others but, at present, I find working one MIDI track at a time provides the best workaround: creating the part, rendering it as audio, and then muting the MIDI (I can go back, edit it and re-render if required) while I move on to the next thing. Nope, not as slick as a desktop where you can have multiple VSTi instances all going at once, but it gets the job done. I guess this CPU/processing bottleneck is also why the original Cubasis sample-based instruments and the new Micrologue synth are streamlined to consume as little CPU as possible (and are therefore not as fully featured as some dedicated synth/sampler apps), so that multiple instruments can run simultaneously within a Cubasis session?

… roll on iPad 5 :slight_smile:

Cheers

John