Originally Posted By: tacit
My understanding is that the single-atom transistor is either on or off, no in-between; it can't be used for analog signal amplification.

Though it may be that conventional analog transistors work the same way.

True enough. When it comes to finding the "line" between analog and digital, we essentially wind up at a point where there is no perfectly linear/analog world. Go deep enough and it all boils down to discrete levels of something, and we're back to quanta (literally, "how much"). Planck's constant is, in effect, the LSB (least significant bit) of action in the universe. And my vague memory of studying transistors is that Boltzmann's constant came up quite often (it appears in the thermal voltage kT/q that governs a transistor's exponential behavior, and in the thermal noise floor).
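Just to make the "discrete levels" idea concrete, here is a toy Python sketch (nothing physical about it, and the 8-bit resolution is an arbitrary choice) showing how any finite step size, an LSB, turns a smooth curve into a staircase with a rounding error that never quite goes away:

import math

# Assumed toy parameters: 8-bit resolution over a -1..+1 signal range.
BITS = 8
LEVELS = 2 ** BITS          # 256 discrete levels
LSB = 2.0 / LEVELS          # step size (the "least significant bit")

def quantize(x):
    # Snap a sample in [-1, 1] to the nearest discrete level.
    return round(x / LSB) * LSB

samples = [math.sin(2 * math.pi * n / 100) for n in range(100)]
staircase = [quantize(s) for s in samples]

# Worst-case rounding error is half an LSB; the granularity never vanishes.
print(max(abs(a - b) for a, b in zip(samples, staircase)))  # ~LSB/2

Shrinking the LSB only pushes the staircase finer; it never disappears, which is the point about there being no perfectly linear world underneath.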

So yes, perfect linearity does not exist. But for human perception it doesn't need to be perfect. Our hearing and vision are also based on sampling (and limited bandwidth) as our brains process the information. E.g., motion at roughly 30 frames per second or above appears continuous... and I believe the decibel scale was chosen around what our ears can detect: changes in level of less than about one decibel go largely unnoticed.
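For anyone curious how small a one-decibel change actually is, here is a quick Python check using the standard dB definitions (this is just the textbook math, nothing specific to hearing):

import math

# dB = 10 * log10(P2 / P1) for power, dB = 20 * log10(A2 / A1) for amplitude.
power_ratio_1db = 10 ** (1 / 10)      # ~1.26 -> about 26% more power
amplitude_ratio_1db = 10 ** (1 / 20)  # ~1.12 -> about 12% more amplitude

def level_change_db(a1, a2):
    # Level difference in dB between two amplitudes.
    return 20 * math.log10(a2 / a1)

print(round(power_ratio_1db, 3), round(amplitude_ratio_1db, 3))
print(round(level_change_db(1.0, 1.122), 2))  # ~1 dB

So a "barely noticeable" 1 dB step is already a change of roughly a quarter in power, which says a lot about how coarse (logarithmic) our loudness perception is.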


Last edited by Hal Itosis; 02/22/12 06:21 PM.