
Information Theory – 1 – 1

1. Channel capacity is exactly equal to :

(a) Bandwidth of the channel
(b) Amount of information per second
(c) Noise rate in the channel
(d) None of the above

Answer : (b)

Explanation
Channel capacity is the amount of information a channel can carry per second, i.e. the maximum rate at which information can be transmitted reliably over it. It is measured in bits per second, not in hertz, so it is not simply the bandwidth.

2. The capacity of a channel is :

(a) Number of digits used in coding
(b) Volume of information it can take
(c) Maximum rate of information transmission
(d) Bandwidth required for information

Answer : (c)

Explanation
By definition, the capacity of a channel is the maximum rate at which information can be transmitted over it with an arbitrarily small probability of error. Bandwidth and coding alphabet size influence capacity but do not define it.

3. Entropy is basically a measure of :

(a) Rate of information
(b) Average of information
(c) Probability of information
(d) Disorder of information

Answer : (b)

Explanation
Entropy is the average information content per symbol of a source: H = −Σ pᵢ log₂ pᵢ bits/symbol, where pᵢ is the probability of the i-th symbol. It is the expected value of the self-information, not the rate or the probability itself.
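The definition above can be checked numerically. A minimal sketch (the function name `entropy` is just an illustrative choice):

```python
import math

def entropy(probs):
    """Shannon entropy: the average information (in bits) per symbol,
    H = -sum(p * log2(p)) taken over all symbols with p > 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A source emitting 4 equiprobable symbols carries log2(4) = 2 bits on average:
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0

# A certain (deterministic) source carries no information:
print(entropy([1.0]))  # 0.0
```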

4. The Hartley-Shannon theorem sets a limit on the :

(a) highest frequency that may be sent over a given channel
(b) maximum capacity of a channel with a given noise level
(c) maximum number of coding levels in a channel with a given noise level
(d) maximum number of quantizing levels in a channel of a given bandwidth

Answer : (b)

Explanation
The Hartley-Shannon (Shannon-Hartley) theorem states C = B log₂(1 + S/N), where C is the channel capacity in bits/s, B the bandwidth in Hz, and S/N the signal-to-noise power ratio. It therefore sets an upper limit on the capacity of a channel with a given bandwidth and noise level, not on the frequencies or coding levels themselves.
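The capacity formula is easy to evaluate directly. A minimal sketch (the function name `channel_capacity` and the telephone-channel figures are illustrative assumptions):

```python
import math

def channel_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley theorem: C = B * log2(1 + S/N).
    Returns the maximum error-free information rate in bits/s
    for a channel of the given bandwidth and linear SNR."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Example: a 3 kHz voice channel with 30 dB SNR (S/N = 1000 in linear terms)
print(channel_capacity(3000, 1000))  # ≈ 29,902 bits/s
```

Note that increasing the SNR only grows capacity logarithmically, while bandwidth grows it linearly.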

5. The maximum value of entropy is :

(a) 1
(b) 2
(c) 3
(d) 4

Answer : (a)

Explanation
For a binary source, H(p) = −p log₂ p − (1 − p) log₂(1 − p) is maximized when both symbols are equiprobable (p = 0.5), giving H = 1 bit/symbol. (In general, the maximum entropy of an M-symbol source is log₂ M, attained when all symbols are equally likely; the question assumes the binary case.)
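The binary-source case can be verified numerically. A minimal sketch (the function name `binary_entropy` is an illustrative choice):

```python
import math

def binary_entropy(p):
    """Binary entropy function H(p) = -p*log2(p) - (1-p)*log2(1-p).
    Peaks at 1 bit when p = 0.5; is 0 when the outcome is certain."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))  # 1.0 — the maximum
print(binary_entropy(0.9))  # ≈ 0.469 — a biased source carries less information
```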
