Matt --
I DO try to "think deeper" about detecting; I'm a scientist by profession, so thinking a lot about things is the way I'm wired. But, since the field I work in is not related to electronics, I can't speak to many of the technicalities regarding detector design, electromagnetic waves and waveform analysis, etc. etc.
Having said that, I will answer your questions based on what my understanding/thoughts are...
1 -- Why did Minelab set the default sensitivity at 16? My understanding is that Minelab chose a relatively LOW sensitivity setting as the default: a level "low enough" that the machine would run "quiet" and "stable" in the majority of soil types found in most areas, but "high enough" to still find coins at respectable depth.
To turn the question back to you -- IF my understanding is wrong and yours is correct, i.e. that "16" is the appropriate sensitivity for moist, acidic soil with low iron content, then why does my machine, running in dry Oklahoma red clay, display a recommended sensitivity level well above "16" at almost all times? Since the "red" color is due to iron oxides present in the soil, your understanding would predict the opposite: if 16 is the "right setting" for "low ferrous-contaminated soil," as you noted, then my red clay should push the suggested sensitivity LOWER than 16, right?
2 -- What is my understanding of auto-sensitivity numbers, relative to soil mineralization? Well, this gets pretty technical, with a need to understand things like reactive and resistive (X and R) signal components, current induction, time decay of that induced current, etc. You probably know most of this already, and some here will be bored stiff and couldn't care less; so, I apologize in advance to anyone who isn't interested in these technicalities -- feel free to skip what follows. Also, as a disclaimer, I don't have a FULL understanding of all this, as -- again -- my scientific knowledge lies outside of electronics. But, to answer your question, here's an attempt at a simplified version that hopefully isn't too flawed by the areas where I lack full understanding...
Sensitivity control runs from 0-30 on the CTX, 0 being "minimum" and 30 being "maximum;" on this I think we agree. Now, a detector detects a buried metal object by inducing electrical current in that object and then analyzing the characteristics of that induced current. The issue, of course, is that electrical current is ALSO being induced within the ferrous component of the soil. And since the soil signal (emanating from the induced current in the soil) is much stronger than the target signal (emanating from the induced current in the target), there needs to be a way to both "subtract out" the soil signal so that you are left with only the "target signal," AND a way to "amplify" that remaining -- but weak -- target signal so that the machine can process it.
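For anyone who wants to see that "subtract, then amplify" idea in numbers, here's a tiny Python sketch of it. To be clear, this is NOT Minelab's actual signal chain -- every number and variable name in it is made up purely to illustrate the concept:

```python
# Toy model of the "subtract, then amplify" idea described above.
# NOT Minelab's actual processing -- just an illustration: the coil
# sees ground response plus target response, the detector estimates
# and subtracts the ground component, and whatever remains is
# amplified before the target-ID stage.

ground_signal = 50.0   # strong response induced in the ferrous soil (arbitrary units)
target_signal = 0.4    # weak response from a deep coin (arbitrary units)

received = ground_signal + target_signal   # what the coil actually picks up

ground_estimate = 49.8   # the detector's (imperfect) model of the ground response

residual = received - ground_estimate   # ideally just the target, plus any "bleed-through"

gain = 20                # stands in for the 0-30 sensitivity control
amplified = residual * gain

print(f"received: {received}, residual after ground subtraction: {residual:.2f}")
print(f"amplified signal handed to the target-ID stage: {amplified:.2f}")
```

Notice that the residual isn't purely target signal -- it also contains whatever part of the ground response the subtraction missed. That leftover piece is exactly the "bleed-through" problem I get into next.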
The way a user controls how much the detector "amplifies" received signals is through the sensitivity setting. IF the soil is "mild," and/or IF the detector is doing a good job of dealing with the soil's ferrous content (i.e. accurately subtracting it from the received signal), then increasing sensitivity should allow the machine to "see" a "good" target deeper into the soil. HOWEVER, IF the soil is highly mineralized, OR if the machine is struggling to "subtract out" the soil's ferrous effects, then the machine's ability to accurately ID non-ferrous targets within the highly ferrous ground breaks down. In other words, some soil signal is mis-characterized by the machine as "target signal." In that situation, increasing sensitivity amplifies NOT ONLY the weak target signal, but ALSO the weak ground signals that have "bled through" -- BOTH get amplified, and you end up with noisy, unstable machine operation. "Falses" increase...ground noise, in other words...as the machine struggles to figure out what is "target signal" and what is "ground signal."
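Here's the same toy model extended one step, to show why cranking the sensitivity doesn't help once ground signal is bleeding through. Again, made-up numbers, not anything from Minelab:

```python
# Raising sensitivity (gain) scales the target signal AND the ground
# "bleed-through" by the same factor, so the ratio between them never
# improves. Hypothetical numbers for illustration only.

target_residual = 0.4   # weak deep-target signal after ground subtraction
ground_bleed = 0.2      # soil signal the subtraction failed to remove

for gain in (10, 20, 30):
    t = target_residual * gain
    g = ground_bleed * gain
    print(f"gain {gain:2d}: target {t:4.1f}, ground noise {g:4.1f}, ratio {t / g:.1f}")

# The ratio stays 2.0 at every gain: more sensitivity makes BOTH
# louder. In mild ground the bleed-through is tiny, so more gain
# mostly reveals deeper targets; in hot ground it mostly adds chatter.
```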
So, bottom line: the less "mineralized" the soil is, the better the machine can separate soil signal from target signal, and thus, by raising your sensitivity setting to "amplify" the return signal the unit processes, you can increase its ability to "see" a target deeper into the dirt. Conversely, the more "mineralized" the soil is, the less effectively the machine can separate ground signal from target signal, and both get reported to the user at times as blips and chirps and whispers of "non-ferrous" ID...such that amplifying these signals (by increasing your sensitivity setting) only leads to a noisier, less "stable" audible output. Obviously, in this kind of situation, lowering the sensitivity will reduce the "chatter," by instructing the machine to "ignore" weak signals. The GOOD is that you get a quieter, more "stable" audio presentation; the BAD, of course, is that you are ignoring not only weak SOIL signals that have "bled through," but ALSO weak signals emanating from deep "good" targets. THUS you lose "depth," and thus the tradeoff: high sensitivity means better depth BUT more "chatter" in mineralized dirt; low sensitivity means less "chatter" and more "stability," but also less ability to report deep targets to the user.
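One more toy sketch, this time treating the sensitivity setting as a simple reporting threshold, just to make the depth-versus-chatter tradeoff concrete. The threshold mapping here is completely invented -- it is NOT how the CTX actually scales its control:

```python
# Treat sensitivity as a reporting threshold: signals below it stay
# silent. All numbers and the threshold formula are hypothetical.

signals = [
    ("shallow coin", 8.0),
    ("deep coin", 0.5),
    ("ground bleed-through", 0.3),
]

def audible(strength, sensitivity):
    # Higher sensitivity -> lower threshold -> weaker signals reported.
    threshold = 5.0 / sensitivity  # made-up mapping for illustration
    return strength >= threshold

for sens in (5, 25):
    reported = [name for name, s in signals if audible(s, sens)]
    print(f"sensitivity {sens}: reports {reported}")

# At sensitivity 5 the deep coin AND the ground noise both vanish
# (quiet but shallow); at 25 you hear the deep coin -- and the chatter.
```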
FINALLY, what Minelab tries to do with auto sensitivity is adjust the setting up and down depending upon the amount of "chatter" the machine is receiving. In other words, if too much ground mineral is affecting the machine's ability to discriminate/identify, it "dials down" sensitivity; if very little ground signal is "bleeding through," it "ramps up" sensitivity so as to give the user a better chance of seeing deep targets. Obviously, Minelab ALSO allows a more experienced user to adjust this for themselves, thus the "manual" sensitivity setting. But for a less sophisticated user, who would likely "have more success" with a quieter, more "stable" machine, Minelab uses a conservative auto-sensitivity setting as the default, to give those users a better chance of "success" (albeit with some implicit loss of depth).
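If you wanted to sketch that auto-adjust behavior as code, it might look roughly like the little loop below. I'm only guessing at the general shape of such a control loop; the noise metric, thresholds, and step sizes are all invented, and Minelab's actual algorithm is surely far more sophisticated:

```python
# A minimal sketch of the auto-sensitivity idea: nudge the gain down
# when ground noise is high, up when things are quiet. A guess at the
# general control-loop shape, NOT Minelab's code.

def adjust_sensitivity(current, noise_level,
                       noisy=0.6, quiet=0.2, lo=1, hi=30):
    """Return a new 0-30 sensitivity given a measured ground-noise level.

    noise_level: fraction of recent samples flagged as ground chatter
    (a made-up metric for illustration).
    """
    if noise_level > noisy:        # too much bleed-through: back off
        return max(lo, current - 1)
    if noise_level < quiet:        # stable ground: reach for depth
        return min(hi, current + 1)
    return current                 # in between: hold steady

sens = 16  # start at the conservative default discussed above
for noise in (0.1, 0.1, 0.7, 0.8, 0.3):
    sens = adjust_sensitivity(sens, noise)
    print(f"noise {noise:.1f} -> sensitivity {sens}")
```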
Now, I may not have SPECIFICALLY, EXPLICITLY answered the question of what EXACTLY the auto-sensitivity numerical values correspond to in terms of soil make-up/components, but I hope I have at least answered it in an indirect, implicit way.
Steve