Information Theory – 1

1. Channel capacity is exactly equal to –

(a) Bandwidth of the channel
(b) Amount of information per second
(c) Noise rate in the channel
(d) None of the above

ANSWER: (b) Amount of information per second

EXPLANATION: Channel capacity is the maximum rate at which information can be transmitted reliably over the channel; it is measured in bits per second, i.e., the amount of information per second.

2. The capacity of a channel is :

(a) Number of digits used in coding
(b) Volume of information it can take
(c) Maximum rate of information transmission
(d) Bandwidth required for information

ANSWER: (c) Maximum rate of information transmission

EXPLANATION: In Shannon's sense, the capacity of a channel is the maximum rate at which information can be sent through it with arbitrarily small probability of error.

3. Entropy is basically a measure of :

(a) Rate of information
(b) Average of information
(c) Probability of information
(d) Disorder of information

ANSWER: (b) Average of information

EXPLANATION: Entropy H(X) = -Σ p(x) log2 p(x) is the average (expected) information content per source symbol; equivalently, it quantifies the uncertainty of the source.
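Since entropy is the probability-weighted average of each symbol's information content, -log2 p, a short illustrative sketch (plain Python, example probabilities assumed) makes the definition concrete:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: the average information per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Each symbol carries -log2(p) bits; entropy averages this over
# the distribution. For probabilities 0.5, 0.25, 0.25:
print(entropy([0.5, 0.25, 0.25]))  # 1.5 bits
```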

4. The Hartley-Shannon theorem sets a limit on the :

(a) highest frequency that may be sent over a given channel
(b) maximum capacity of a channel with a given noise level
(c) maximum number of coding levels in a channel with a given noise level
(d) maximum number of quantizing levels in a channel of a given bandwidth

ANSWER: (b) maximum capacity of a channel with a given noise level

EXPLANATION: The Shannon–Hartley theorem states C = B log2(1 + S/N): for a channel of bandwidth B and signal-to-noise ratio S/N, it sets the maximum rate at which error-free transmission is possible.
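The limit named in this question can be checked numerically. A minimal sketch of the Shannon–Hartley formula C = B log2(1 + S/N), using an assumed example channel (3 kHz bandwidth, 30 dB SNR, i.e. S/N = 1000):

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon–Hartley capacity C = B * log2(1 + S/N), in bits/s."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# A 3 kHz telephone-grade channel at 30 dB SNR tops out near 30 kbit/s,
# regardless of how the signal is coded.
print(shannon_capacity(3000, 1000))  # ≈ 29,900 bits/s
```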

5. The maximum value of entropy is :

(a) 1
(b) 2
(c) 3
(d) 4

ANSWER: (a) 1

EXPLANATION: For a binary source, entropy is maximum when both symbols are equally likely (p = 0.5), giving H = 1 bit per symbol. In general, a source with M symbols has maximum entropy log2 M, attained when all symbols are equiprobable; the binary case is intended here.
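A quick illustrative sketch (plain Python) showing that the entropy of a binary source peaks at exactly 1 bit when the two symbols are equiprobable:

```python
import math

def binary_entropy(p):
    """Entropy of a binary source with P(1) = p, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Sweep p from 0 to 1; the maximum occurs at p = 0.5.
values = [binary_entropy(p / 10) for p in range(11)]
print(max(values))  # 1.0, at p = 0.5
```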

