Understanding and specifying analogue inputs

You have an application that needs to use an analogue sensor and the hardware designer keeps asking questions about the specifications of an ADC input.
You don’t know about bits, so how can you even begin to answer? In short, you can’t, and you shouldn’t be expected to.
You know the application, and it is in those terms that you should specify the interface.

An alternative that may have occurred to you is to ask for the best ADC possible (reflecting the engineer’s term back at them). The best obviously can’t be bettered and must, therefore, be good enough for the application.
The downside of this approach is that the product will be more expensive than necessary, which may cost you sales.

The two terms needed to specify an analogue input are resolution and accuracy.
These terms are not always kept separate or fully understood.

Resolution is the smallest change in the measurement that you need to detect.
For instance, on a metric rule the smallest marks, and therefore the resolution, are 1mm.
On a digital speedometer the smallest increment displayed is likely to be 1mph (or 1kph).
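Once you know the resolution the application needs, the hardware designer can translate it into a bit count. A minimal sketch, assuming a hypothetical 0–5 V sensor input and a required resolution of 1 mV (both figures are illustrative, not from the text):

```python
import math

full_scale_v = 5.0    # assumed input range of the sensor, in volts
resolution_v = 0.001  # smallest step the application needs (1 mV)

# The ADC must distinguish this many steps across the full range,
# so it needs enough bits to provide at least that many codes.
steps = full_scale_v / resolution_v     # 5000 steps
bits = math.ceil(math.log2(steps))      # 13 bits (2**13 = 8192 codes)

print(bits)
```

This is the translation the engineer performs: your application-level requirement ("steps of 1 mV over 0–5 V") becomes their bit count (here, a 13-bit converter or better).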

Accuracy is how far from the true value a measurement is allowed to be.
For our metric rule, if it reads 33cm for a true length of 30cm, we cannot be more accurate than 10%, no matter how carefully we measure.
Is that acceptable?
This is where your knowledge of the application is key.
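The 10% figure above is just the relative error of the reading against the true value, which can be sketched as:

```python
true_cm = 30.0
measured_cm = 33.0

# Relative error: how far the reading is from the true value,
# as a fraction of the true value, expressed as a percentage.
error_pct = abs(measured_cm - true_cm) / true_cm * 100

print(error_pct)  # approximately 10%
```

Whether 10% is acceptable depends entirely on what the application does with the reading.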

Accuracy is often confused with resolution, but they need not be the same.
One thing to be aware of is that your overall accuracy can never be finer than the resolution, because any reading that falls between two steps has to be rounded to the nearest one.
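That rounding limit can be sketched directly: any true value between two ADC steps is quantised to the nearest step, so the reading is always up to half a step away from the truth. (The 1 mV step and the sample reading below are illustrative assumptions.)

```python
step = 0.001  # assumed resolution: 1 mV per ADC step

def quantise(value, step):
    """Round a reading to the nearest ADC step, as a converter must."""
    return round(value / step) * step

reading = 0.01234  # hypothetical true input, in volts
coded = quantise(reading, step)

# The quantisation error is bounded by half a step, so overall
# accuracy can never be better than half the resolution.
error = abs(coded - reading)
assert error <= step / 2
```

This is why, when specifying the input, the resolution you ask for must be at least as fine as the accuracy you need.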