Information Theory – Exercise – 1

1. Channel capacity is exactly equal to:

(a) Bandwidth of the channel
(b) Amount of information per second
(c) Noise rate in the channel
(d) None of the above

Explanation
Answer: (b). Channel capacity is the amount of information a channel can carry per second (bits per second). It depends on the bandwidth and the noise level, but it is not equal to either of them.

2. The capacity of a channel is:

(a) Number of digits used in coding
(b) Volume of information it can take
(c) Maximum rate of information transmission
(d) Bandwidth required for information

Explanation
Answer: (c). The capacity of a channel is the maximum rate at which information can be transmitted over it reliably.

3. Entropy is basically a measure of:

(a) Rate of information
(b) Average of information
(c) Probability of information
(d) Disorder of information

Explanation
Answer: (b). Entropy is the average information content per source symbol: H = -sum(p_i log2 p_i) bits per symbol.

4. The Hartley-Shannon theorem sets a limit on the:

(a) highest frequency that may be sent over a given channel
(b) maximum capacity of a channel with a given noise level
(c) maximum number of coding levels in a channel with a given noise level
(d) maximum number of quantizing levels in a channel of a given bandwidth

Explanation
Answer: (b). The Hartley-Shannon (Shannon-Hartley) theorem bounds the maximum capacity of a channel of bandwidth B with a given noise level: C = B log2(1 + S/N) bits per second.
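The Shannon-Hartley limit can be checked numerically. A minimal sketch, assuming an example 3 kHz channel with a 30 dB signal-to-noise ratio (both values are arbitrary illustrations, not from the question):

```python
import math

def channel_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley capacity in bits per second: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Example: a 3 kHz telephone-style channel with a 30 dB SNR.
snr_db = 30
snr_linear = 10 ** (snr_db / 10)          # 30 dB corresponds to S/N = 1000
print(channel_capacity(3000, snr_linear))  # roughly 29.9 kbit/s
```

Note that capacity grows only logarithmically with the signal-to-noise ratio, but linearly with bandwidth.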

5. The maximum value of entropy is:

(a) 1
(b) 2
(c) 3
(d) 4

Explanation
Answer: (a). For a binary source, H = -p log2(p) - (1 - p) log2(1 - p) is maximized when both symbols are equally likely (p = 1/2), giving H = 1 bit per symbol.
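The binary-source maximum can be verified directly. A minimal sketch (the probability values are illustrative):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero terms."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# For a binary source, entropy peaks at 1 bit when both symbols are equiprobable.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))  # less than 1.0
print(entropy([1.0, 0.0]))  # 0.0 (a certain outcome carries no information)
```

For an M-symbol source the maximum is log2(M) bits, again attained when all symbols are equiprobable.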