A quick follow-up to this thread.
~ What contributes to file size? Is it a combination of factors such as track count, VSTi use, etc.?
Short answer: everything.
Long answer: sometimes it can even be project corruption, but it is usually hitpoint detection and time-stretch processes. We also found that this happens more often, and much more noticeably, in projects using plug-ins that cannot properly handle multi-core support.
The Pool also plays a huge role. If you record and delete several takes, the files are no longer in the project, but their names, paths, etc. are still listed in the Pool. Most of you know how risky cleaning the Pool can be if the project folders are "shared" between projects.
~ What can be done to keep the file size down, and consequently the save times?
Disabling automatic hitpoint detection, rendering time-stretched events, cleaning the Pool, consolidating tracks made up of hundreds of sliced events, and removing unused media (again, safe only if each and every project resides in its own folder). Sometimes backing up the project while removing unused media and minimising the files is the way to go... if the stage of the production allows you to do so.
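Before cleaning anything, it can help to see where the bulk of a project folder's disk usage actually sits. This is a generic sketch, not a Cubase feature: the folder path and file names are made up for the demo, and on a real system you would just point `du` at your own project's audio folder to spot oversized takes before touching the Pool.

```shell
# Hypothetical demo project folder -- replace with your own project path.
mkdir -p /tmp/demo_project/Audio

# Create two dummy "takes" of different sizes (1 MiB and 2 MiB of zeros).
head -c 1048576 /dev/zero > /tmp/demo_project/Audio/take_01.wav
head -c 2097152 /dev/zero > /tmp/demo_project/Audio/take_02.wav

# List the audio files by disk usage, largest first, so the heavy
# leftovers are easy to spot before any Pool cleanup.
du -a /tmp/demo_project/Audio/*.wav | sort -rn | head -n 5
```

Nothing here deletes anything; it only reports sizes, which is a safe first step given how risky Pool cleanup can be with shared folders.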
I dug quite deep into this a while back and remember having other data... but I will need to check my backlog... too much stuff to remember everything :oops:
[EDIT] I forgot to add: long saving times and big project size can be two unrelated problems, so one may experience either or both.
[EDIT 2] I also forgot to add that on a Mac, long saving times might be due to an encrypted file system and automatic Time Machine backups. The same goes for Windows with automatic backups, of course. Some laptops also still ship with a 5400 RPM HDD, which does not really help.