An analogue sensor has a bandwidth which extends from very low frequencies up to a maximum of 14.5 kHz. Using the Sampling Theorem, what is the minimum sampling rate (number of samples per second) required to convert the sensor signal into a digital representation? If each sample is now quantised into 2048 levels, what will be the resulting transmitted bitrate in kbps?
Give your answer in scientific notation to 1 decimal place.
Hint: first determine the number of bits per sample that produces 2048 quantisation levels.
By Nyquist's theorem, the minimum sampling rate is 2 * 14.5 kHz = 29 kHz, i.e. 29,000 samples per second. Since 2048 = 2^11, each sample requires 11 bits. The transmitted bitrate is therefore 29,000 * 11 = 319,000 bits per second (bps) = 319 kbps, which is 3.2 * 10^5 bps in scientific notation to 1 decimal place.
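As a quick sanity check, here is a minimal Python sketch of the same calculation (the variable names are illustrative and not part of the original problem):

import math

bandwidth_hz = 14.5e3   # maximum signal frequency: 14.5 kHz
levels = 2048           # number of quantisation levels

# Nyquist: sample at least twice the highest frequency in the signal.
sample_rate = 2 * bandwidth_hz              # 29,000 samples per second

# Bits per sample: 2^n = 2048  =>  n = log2(2048) = 11.
bits_per_sample = int(math.log2(levels))    # 11

bitrate_bps = sample_rate * bits_per_sample  # 319,000 bps
print(f"Sampling rate: {sample_rate:.0f} samples/s")
print(f"Bits per sample: {bits_per_sample}")
print(f"Bitrate: {bitrate_bps:.1e} bps ({bitrate_bps / 1e3:.0f} kbps)")

Running this prints a sampling rate of 29000 samples/s, 11 bits per sample, and a bitrate of 3.2e+05 bps (319 kbps), matching the worked answer above.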