The real importance of hard drive cache?

Oddly, cache is something I’ve never paid much attention to when buying hard drives for building my DAWs. I’ve searched all over the net, and as I understand it, cache is a hard drive’s built-in memory, from which it can deliver data (but only data that has recently been read) faster than from the drive platters themselves.

But what does this really mean for music machines? I mean, even the higher-cache drives with 64 MB aren’t holding all that much memory. So we have the typical three hard drives in a DAW system: (#1) OS/Programs, (#2) Audio, (#3) Sample content. Is cache more important for any one of these drives? I’d THINK the sample content drive would be the least important, since samples are mostly loaded into actual system memory. Next would be the audio drive, still fairly low but more important, as audio is a lot more MB than the HDD’s cache can deliver. The MOST important, I’d THINK, would be the OS/Programs HDD, as it takes care of the “now” factor… just how much cache would actually be used on that drive at any given time, I don’t know…?
I see a lot of HDDs with caches of 8 MB, 16 MB, 32 MB and 64 MB. It ‘seems’ that the larger HDDs (say 1 TB) typically have larger caches for the most part, while the smaller HDDs (say 300 GB) have smaller caches. A brand new HDD with a very low cache can be bought much cheaper than the higher-cache HDDs. I’m assuming a lot of people buy the high-cache HDDs ‘just because’ it sounds like they’d be better, but really… does it make THAT much difference in a real-world music DAW machine? When it comes to loading a project for all it’s worth, I don’t feel ‘loading times’ are that important to me; actually ‘running’ things smoothly matters more.

NOTE: I’m primarily talking about standard hard drives here, not SSDs.

Yes, the OS drive is probably the more important.
However, samples are even more so (real, streamed ones).
The more instances you run at once, the more important it is,
and you really should be using an SSD for that.

On the other hand, a 500 GB drive with 16 MB cache is fine for an OS.

Thanks Scott, I don’t even know what the caches on any of my DAWs’ HDDs are. As far as I know, none of my VSTis stream off the hard drive; they’re only used after being loaded into actual system memory… with that in mind, does the number of VSTi instances even apply?

I typically use a 750 GB to 1 TB drive for my sample drives, which mostly seem to come with larger caches.

My systems are older; I don’t believe I could even use SSDs if I wanted to. My reason for asking is simply that I want to have some spare hard drives on hand. I’ve been looking around and thought I should explore the importance of drive cache this time. I certainly haven’t noticed any problems with whatever cache I’ve unknowingly ended up with.

HDD cache is only useful for data that is most frequently used, or re-used in quick succession (that is, before it has been overwritten). New data typically overwrites the oldest data.

The OS will cache data it considers important in reserved sections of RAM. For example, Windows buffers the data to be written to the page file, so that there are not a whole lot of little writes that would bring an HDD to its knees. Drives cannot know what is important to the OS, and the OS cannot tell them either. Basically, the OS tells the drive where it wants the data, and the drive just treats it all the same way, using its cache even-handedly.
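
To illustrate that write-coalescing idea in miniature, here’s a conceptual Python sketch (not how Windows actually implements the page file, just the general technique): small writes accumulate in RAM and get flushed to disk as one larger sequential write.

```python
class CoalescingWriter:
    """Collect many small writes in RAM and flush them to disk as one
    large sequential write, instead of hammering the drive with tiny ones.
    (Conceptual sketch only; the real OS page-file logic is far more involved.)"""

    def __init__(self, path, flush_threshold=1024 * 1024):
        self.path = path
        self.flush_threshold = flush_threshold  # flush once ~1 MB has accumulated
        self.pending = []
        self.pending_bytes = 0

    def write(self, data: bytes):
        self.pending.append(data)
        self.pending_bytes += len(data)
        if self.pending_bytes >= self.flush_threshold:
            self.flush()

    def flush(self):
        if not self.pending:
            return
        with open(self.path, "ab") as f:
            f.write(b"".join(self.pending))  # one big write instead of thousands
        self.pending.clear()
        self.pending_bytes = 0


# Usage: 10,000 tiny 256-byte writes end up hitting the disk as a few ~1 MB writes.
writer = CoalescingWriter("pagefile_demo.bin")
for _ in range(10_000):
    writer.write(b"x" * 256)
writer.flush()
```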

If the amount of new data to read off the drive is continually greater than the cache (no chance for the old data to be re-used), then the cache is providing no throughput benefit, but just buffering the data, which may even out the flow depending upon its size.

Basically, for large heavily-used libraries, a larger cache will probably not provide much benefit, as the multiple simultaneous streams will almost always be new data (the next block of each playing note) and will swamp any cache that isn’t hundreds of MB. If you are really pushing an HDD that hard, you may be better off using an SSD.
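
To put rough numbers on that, here’s a back-of-envelope Python sketch. The figures (44.1 kHz, 24-bit stereo samples, every voice streaming continuously) are assumptions, not measurements, but they show how quickly multiple streams outgrow even a 64 MB cache:

```python
# Back-of-envelope: how much fresh data do N streaming sample voices pull per
# second, and how long would it take them to blow through a 64 MB drive cache?
# All figures are assumptions (44.1 kHz, 24-bit, stereo, continuous streaming).

SAMPLE_RATE = 44_100      # samples per second
BYTES_PER_SAMPLE = 3      # 24-bit
CHANNELS = 2              # stereo sample content
CACHE_MB = 64             # a "large" HDD cache

bytes_per_voice = SAMPLE_RATE * BYTES_PER_SAMPLE * CHANNELS   # ~0.26 MB/s per voice

for voices in (16, 64, 128, 256):
    mb_per_sec = voices * bytes_per_voice / 1_000_000
    seconds_of_cache = CACHE_MB / mb_per_sec
    print(f"{voices:4d} voices -> ~{mb_per_sec:5.1f} MB/s of fresh data; "
          f"a {CACHE_MB} MB cache holds only ~{seconds_of_cache:4.1f} s of it")
```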

Thanks for all that information on how things work Pat,

I’m not sure I’m pushing my HDDs hard; what would be the signs? Things seem to run pretty smoothly on my ordinary SATA HDDs, no missed notes etc., or whatever bad thing is supposed to happen (?)

Yes.

Basically, if you don’t hear it, nothing’s wrong! Of course, you may be just short of ‘breaking the camel’s back’.
In particular, you would see Cubase’s ‘Disk’ bar peaking.

Actually Pat, my ‘disk performance meter’ as well as my Win Task Manager activity levels have ALL been greatly reduced since I began offloading all VSTis to a dedicated machine. Now neither machine is being taxed heavily, unlike my main machine was on my largest projects after I started increasing VSTi usage. Back then, when my overall usage started exceeding the machine’s capability, my audio tracks and my MIDI tracks running 3rd-party VSTis began to sound ‘out of alignment’ and ‘drifted’… a very strange thing, but again, it only happened on the heaviest project I’ve ever done, while my other projects played fine.
Keep in mind, I’m talking about a Win XP 32-bit machine with a quad-core 2.5 GHz CPU and 4 GB of installed RAM, with my latency set near the lowest setting. I don’t THINK this had anything to do with HDD cache per se; it was an overall limitation of the machine as a whole. That said, I believe my ‘disk meter’ and Win Task Manager indicators were very high at the time… and raising the latency helped when that happened. My VSTi-only machine is exactly the same spec, and I now bounce to my audio machine after I’m done working out all my MIDI tracks and VSTi assignments.
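
As a side note on that latency setting: the driver’s ‘latency’ is essentially the audio buffer size, so here’s a quick back-of-envelope sketch (assuming a 44.1 kHz sample rate, since the actual rate isn’t stated) of how buffer size maps to time per buffer, and why raising it gives the CPU and disks more breathing room per callback.

```python
# Rough buffer-size -> latency arithmetic (44.1 kHz assumed; adjust to taste).
SAMPLE_RATE = 44_100  # Hz

for buffer_samples in (64, 128, 256, 512, 1024, 2048):
    ms_per_buffer = buffer_samples / SAMPLE_RATE * 1000
    print(f"{buffer_samples:5d}-sample buffer -> ~{ms_per_buffer:5.1f} ms per buffer "
          f"(~{2 * ms_per_buffer:5.1f} ms round trip)")
```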

OK, if I’m absorbing this correctly… I take it (my mobo supports up to standard SATA 2 HDDs) that for an OS drive it’s best to have at least 16 MB of HDD cache, and for a samples HDD, at least 32 MB.

Again, I’m not looking to ‘replace’ my current HDDs; they’re working… I’m planning on buying ‘spares’ to have on hand. That goes for everything else too; I’ve been collecting spare ‘everythings’, actually.

I suspect that DAWs and samplers are designed to buffer everything anyway, as they cannot rely on the drives delivering exactly what they require, when they require it, in which case HDD cache size is probably not a big issue.

As long as you keep every data channel – to/from CPU, RAM and drives – with a healthy margin below bandwidth saturation, you will be safe.

Yeah, as far as I can tell from all the monitoring methods mentioned above, I seem to be within a healthy margin all across the board, by all indications and by performance. No doubt in my mind this is due to the load being spread out among all the available resources of a hybrid system… including relieving ordinary SATA HDDs of overly strenuous duties.

The way I look at it: two big strong men can carry a large, heavy crate, but they will feel the strain… while four average men carrying that same crate will feel less strained. In the case of a DAW, its components and HDDs are, for the most part, being used well below their full potential.

There are two consequences of running channels close to saturation:

a) Everything is delayed. This is basic queuing theory. Just look at a bank queue: the more people in it, the longer the average wait. Longer data queues require larger buffer settings to ensure that delivery latencies are covered (see the sketch after this list).

b) Not much room for incidental non-audio data, like OS or protocol overheads.
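
To make that queuing point concrete, here’s a small Python sketch using the textbook M/M/1 result (average time in the system is 1 / (service rate - arrival rate)). The service rate is an arbitrary illustrative figure, not a measurement of any real drive or bus; the point is how sharply the wait grows as utilisation approaches saturation:

```python
# Textbook M/M/1 queue: average time a request spends in the system is
#   W = 1 / (service_rate - arrival_rate)
# Watch W blow up as utilisation (arrival_rate / service_rate) nears 1.

SERVICE_RATE = 100.0  # requests the channel can complete per second (illustrative)

for utilisation in (0.50, 0.70, 0.90, 0.95, 0.99):
    arrival_rate = utilisation * SERVICE_RATE
    avg_wait_ms = 1000.0 / (SERVICE_RATE - arrival_rate)
    print(f"utilisation {utilisation:4.0%} -> average time in system ~{avg_wait_ms:6.1f} ms")
```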

The plot thickens… It makes me want to get a much deeper understanding than I have of how computers work and how their components interact.

That makes perfect sense; I always thought of it as general system bottlenecking, like sand waiting to pass through an hourglass.

And like I said, quite a while back, when I pushed my main DAW to the edge of space with never-before-seen demands (for that machine) of excessive VSTis and audio tracks, I had to raise my latency to what I thought were extremely high levels (unusable for recording live tracks), though at least manageable for a mixdown… this is when I began hearing those tracks play slightly out of time with each other. Realizing I was outgrowing my machine, I began offloading/externalizing as my solution. Now my CPU and RAM are typically around 50% +/-… as for disk usage, I’ve seen it from 50% to 65% or so on the meter of my loaded-up VSTi machine on actual projects. In my initial testing to find its limits, I had the disk meter at a higher level than any real-world project (maybe 85% or so), and I believe I only had to raise my latency slightly above its lowest setting.