This module alters the Drupal core aggregation mechanism. It greatly reduces I/O and the number of aggregated files, and improves the chances of a client cache hit: although it produces bigger aggregated files, it greatly reduces bandwidth usage while users browse.
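The trade-off can be pictured with a rough model (hypothetical file names and sizes, not the module's actual code): with per-page aggregates, every distinct file set yields its own aggregated file, so a visitor downloads a new aggregate on almost every page; with one bigger learnt group shared site-wide, the first page costs more bytes but every following page is a cache hit.

```python
# Hypothetical model: per-page aggregates vs. one site-wide aggregate.
# File names and sizes (in KB) are invented for illustration.
pages = {
    "front":   ["jquery.js", "drupal.js", "front.js"],
    "node":    ["jquery.js", "drupal.js", "node.js"],
    "contact": ["jquery.js", "drupal.js", "form.js"],
}
size = {"jquery.js": 90, "drupal.js": 20, "front.js": 5, "node.js": 5, "form.js": 5}

def browse(aggregates_for_page):
    """Sum the KB a visitor downloads over all pages, counting each
    aggregate only the first time it is seen (a client-side cache)."""
    cache, total = set(), 0
    for page in pages:
        for agg in aggregates_for_page(page):
            if agg not in cache:
                cache.add(agg)
                total += sum(size[f] for f in agg)
    return total

# Core's default behaviour: one aggregate per distinct file set.
per_page = browse(lambda p: [tuple(pages[p])])

# This module's idea: one bigger, shared group covering every page.
union = tuple(sorted({f for fs in pages.values() for f in fs}))
grouped = browse(lambda p: [union])

print(per_page, grouped)  # → 345 125
```

The shared aggregate is bigger (125 KB vs. 115 KB per page), yet total transfer over a three-page visit drops from 345 KB to 125 KB because it is fetched once.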
Core Library: http://drupal.org/project/core_library
What is required before the module changes something?
It doesn't switch on by itself; you have to select an aggregation mode manually. Start in one of the learning modes and navigate through as many different pages as you have (basically every page where tricky JS or CSS files live), then switch to one of the full bypass modes if you want to.
Learning mode is effective as soon as you switch to either the Anonymous only, the All, no admin, or the All mode.
While in one of those modes, the aggregated files can change over time, since the module learns new files as soon as it discovers them.
I'd definitely recommend not using the All mode; use All, no admin instead, and let the admin section use conditional inclusion: some admin JS can conflict with the live frontend site's scripts.
Files learnt while browsing are stored in the core_library_aggregation_orphans variable. Once you reach a stable state, i.e. once more or less every page has been browsed by someone, the variable won't be modified again (even in learning mode).
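The learning behaviour can be sketched like this (an illustrative model, not the module's code; core_library_aggregation_orphans is stood in for by a plain set, and the page/file names are invented):

```python
# Illustrative sketch of learning mode: each request adds any file not
# yet known to the "orphans" variable; once every page has been visited,
# further browsing leaves it unchanged — the stable state.
pages = {
    "front": {"jquery.js", "front.js"},
    "admin": {"jquery.js", "admin.js"},
}

orphans = set()   # stands in for core_library_aggregation_orphans
writes = 0        # how many times the variable had to be persisted

for _ in range(3):                # three full crawls of the site
    for files in pages.values():
        new = files - orphans
        if new:                   # only persist when something new was learnt
            orphans |= new
            writes += 1

print(sorted(orphans), writes)    # stabilises after the first crawl
```

After the first crawl the set stops growing, which is why, past the stable state, the variable is never written again.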
Does core still do the aggregation itself?
As long as you're not in full bypass mode, yes. The module mainly regroups CSS and JS files in order to produce fewer groups: the final aggregation is still done by core.
Minification is done on the fly, even before core builds the JS headers, so core naturally aggregates the files, but the minified ones instead of the originals.
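The ordering can be sketched with a toy pipeline (illustrative only: the minifier below just strips // comments and collapses whitespace, and core's aggregation is stood in for by plain concatenation):

```python
import re

def minify(src):
    """Toy JS minifier: drop // comments, collapse whitespace."""
    src = re.sub(r"//[^\n]*", "", src)
    return re.sub(r"\s+", " ", src).strip()

def core_aggregate(files):
    """Stand-in for core's aggregation: concatenate the sources."""
    return ";".join(files)

originals = [
    "function a() {\n  return 1; // one\n}",
    "function b() {\n  return 2;\n}",
]

# Minification runs first, so what "core" aggregates is already the
# minified source rather than the originals.
bundle = core_aggregate(minify(f) for f in originals)
print(bundle)
```

The point is the order of the two steps: because minification happens before the aggregation stage, no comment or original whitespace survives into the bundle.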
Where is the collected information stored?
The learnt files are stored in the core_library_aggregation_orphans variable, while the statistics are stored in the core_library_stat database table. That's why the statistics collection mode is really slow: never use it outside of a development or testing box.
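Why statistics collection is slow can be sketched as follows (an illustrative model: the table name is borrowed from the module, but the columns and the SQLite backend are invented): every request turns into one extra database write per asset.

```python
import sqlite3

# Illustrative model of statistics collection: one row is written to
# core_library_stat per asset per request (columns invented), which is
# why the mode is slow and only suited to development or testing boxes.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE core_library_stat (page TEXT, file TEXT)")

requests = [("front", ["jquery.js", "front.js"]),
            ("node",  ["jquery.js", "node.js"])]

writes = 0
for page, files in requests:
    for f in files:
        db.execute("INSERT INTO core_library_stat VALUES (?, ?)", (page, f))
        writes += 1

rows = db.execute("SELECT COUNT(*) FROM core_library_stat").fetchone()[0]
print(writes, rows)  # a handful of requests already cost several INSERTs
```

Two requests already cost four INSERTs; on a busy production site that per-request write amplification is what makes the mode unusable.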