Calculate the mean length of an optimal code for probabilities (0.3, 0.2, 0.2, 0.1, 0.1, 0.05, 0.05)
a) 2.6
Indicate a DMC (discrete memoryless channel)
a)
Define a discrete memoryless channel
a)
Find a DMC (discrete memoryless channel)
a)
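Whichever candidate matrices these three questions offered, the test they rely on is the same: a DMC is specified by a transition matrix P(Y|X) with nonnegative entries in which every row sums to 1. A minimal sketch of that check in Python (the function name and example matrices are ours):

```python
def is_dmc_matrix(rows, tol=1e-9):
    """Return True if `rows` is row-stochastic, i.e. a valid DMC transition matrix."""
    return all(
        all(p >= 0 for p in row) and abs(sum(row) - 1.0) < tol
        for row in rows
    )

print(is_dmc_matrix([[0.5, 0.25, 0.25], [0.25, 0.25, 0.5]]))  # True
print(is_dmc_matrix([[0.7, 0.7], [0.5, 0.5]]))                # False: first row sums to 1.4
```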
Decode an output vector with the (7,4) Hamming code.
a) 1110000
Let a signal be defined on the interval [2, 8] with quantization step 0.2. Compute the number of quantization levels.
a) 30
Let a signal be defined on the interval [-3, 6] with quantization step 0.3. Compute the number of quantization levels.
a) 30
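Both quantization questions use the same rule: the number of levels is the interval length divided by the step, N = (b - a)/Δ. A quick check (the interval [-3, 6] is the only reading consistent with the stated answers here and in the later question with step 0.1):

```python
def quantization_levels(a, b, step):
    # Number of uniform quantization levels covering [a, b] with the given step
    return round((b - a) / step)

print(quantization_levels(2, 8, 0.2))   # 30
print(quantization_levels(-3, 6, 0.3))  # 30
print(quantization_levels(-3, 6, 0.1))  # 90
```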
Let X be a discrete random variable with probability mass function p(x) = (0.25, 0.25, 0.25, 0.125, 0.125). Find H(X).
a) 2.25
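This and several later questions reduce to evaluating H(X) = -Σ p_i log₂ p_i. A small helper (ours) reproduces the stated answers:

```python
from math import log2

def entropy(probs):
    # Shannon entropy in bits; terms with p = 0 contribute nothing
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([0.25, 0.25, 0.25, 0.125, 0.125]))  # 2.25
print(entropy([0.5, 0.25, 0.125, 0.125]))         # 1.75
print(entropy([1/3, 1/3, 1/3]))                   # 1.585 = log2(3)
```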
Let p(x,y) be given by
X\Y | 1 | 2 |
1 | 1/3 | 1/3 |
2 | 1/3 | 0 |
Find H(X|Y).
a) 2/3
Let p(x,y) be given by
X\Y | 1 | 2 | 3 |
1 | 1/8 | 1/8 | 1/8 |
2 | 1/8 | 1/8 | 0 |
3 | 1/8 | 1/8 | 1/8 |
Find H(X,Y)
a) 3
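Both joint-table questions follow from two facts: H(X,Y) is the entropy of the cell probabilities, and H(X|Y) = H(X,Y) - H(Y). A sketch using the tables as reconstructed above:

```python
from math import log2

def H(ps):
    # Shannon entropy in bits of a list of probabilities
    return -sum(p * log2(p) for p in ps if p > 0)

def joint_stats(joint):
    cells = [p for row in joint for p in row]
    H_XY = H(cells)
    H_Y = H([sum(col) for col in zip(*joint)])  # marginal of Y = column sums
    return H_XY, H_XY - H_Y                     # H(X,Y) and H(X|Y)

print(joint_stats([[1/3, 1/3], [1/3, 0]]))             # H(X|Y) = 2/3
print(joint_stats([[1/8]*3, [1/8, 1/8, 0], [1/8]*3]))  # H(X,Y) = 3
```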
Find the capacity of the discrete memoryless channel given by
P(Y|X) | A | B | C | D | E | F |
A | 1/6 | 1/6 | 1/6 | 1/6 | 1/6 | 1/6 |
B | 1/6 | 1/6 | 1/6 | 1/6 | 1/6 | 1/6 |
C | 1/6 | 1/6 | 1/6 | 1/6 | 1/6 | 1/6 |
D | 1/6 | 1/6 | 1/6 | 1/6 | 1/6 | 1/6 |
E | 1/6 | 1/6 | 1/6 | 1/6 | 1/6 | 1/6 |
F | 1/6 | 1/6 | 1/6 | 1/6 | 1/6 | 1/6 |
a) 0
Find the capacity of a DMC
P(Y|X) | A | B | C | D | E | F |
A | 1/2 | 1/4 | 1/4 | 0 | 0 | 0 |
B | 0 | 1/2 | 1/4 | 1/4 | 0 | 0 |
C | 0 | 0 | 1/2 | 1/4 | 1/4 | 0 |
D | 0 | 0 | 0 | 1/2 | 1/4 | 1/4 |
E | 1/4 | 0 | 0 | 0 | 1/2 | 1/4 |
F | 1/4 | 1/4 | 0 | 0 | 0 | 1/2 |
a) log 3 - 1/2
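Both answers follow from standard results. When every row is identical (the all-1/6 matrix), the output is independent of the input, so I(X,Y) = 0 for every input distribution and C = 0. When the channel is weakly symmetric (rows are permutations of one another and all column sums are equal, as in the circulant matrix above, whose zero pattern is our reconstruction), C = log|Y| - H(row):

```latex
C = \log_2 6 - H\!\left(\tfrac{1}{2}, \tfrac{1}{4}, \tfrac{1}{4}\right)
  = \log_2 6 - \tfrac{3}{2}
  = \log_2 3 - \tfrac{1}{2} \approx 1.085 \text{ bits}
```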
Channel capacity is defined by
a) max_{p(x)} I(X,Y)
State the formula for channel capacity.
a) max_{p(x)} {H(X) - H(X|Y)}
Decode 0111110 using the (7,4) Hamming code.
a) 0111100
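All the Hamming answers in this set are consistent with the textbook (7,4) layout that places parity bits at positions 1, 2 and 4, so the syndrome spells out the position of a single-bit error. A decoding sketch under that assumption:

```python
def hamming74_decode(word):
    """Correct a single-bit error in a 7-bit word; parity bits at positions 1, 2, 4."""
    r = [0] + [int(b) for b in word]  # pad so the list is 1-indexed
    syndrome = 0
    for p in (1, 2, 4):
        # Parity bit p covers every position whose binary expansion contains p
        if sum(r[i] for i in range(1, 8) if i & p) % 2:
            syndrome += p
    if syndrome:
        r[syndrome] ^= 1  # the syndrome equals the error position
    return "".join(map(str, r[1:]))

print(hamming74_decode("0111110"))  # 0111100
print(hamming74_decode("1100000"))  # 1110000
print(hamming74_decode("0000001"))  # 0000000
```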
Calculate the mean length of an optimal code for probabilities (0.3, 0.2, 0.2, 0.1, 0.1, 0.05, 0.05)
a) 2.6
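An "optimal code" here is a Huffman code: repeatedly merge the two least likely symbols, and the mean length is Σ p_i · depth_i = 0.3·2 + 0.2·2 + 0.2·3 + 0.1·3 + 0.1·3 + 0.05·4 + 0.05·4 = 2.6 (the tree is not unique, but the mean length of every optimal code is). A minimal sketch:

```python
import heapq

def huffman_mean_length(probs):
    # Heap items are (probability, tiebreaker, {symbol: depth}) triples
    heap = [(p, i, {i: 0}) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, d1 = heapq.heappop(heap)  # two least likely subtrees
        p2, _, d2 = heapq.heappop(heap)
        merged = {s: d + 1 for s, d in {**d1, **d2}.items()}  # one level deeper
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    _, _, depths = heap[0]
    return sum(probs[s] * d for s, d in depths.items())

print(huffman_mean_length([0.3, 0.2, 0.2, 0.1, 0.1, 0.05, 0.05]))  # 2.6
```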
Find H(X) for a discrete random variable with probability mass function p(x) = (0.5, 0.25, 0.125, 0.125)
a) 1.75
Find the entropy of the toss of a fair die.
a) log 6
A signal is defined on the interval [-3, 6]. The quantization step is 0.1. Then the number of quantization levels is
a) 90
For probability distribution (1,0,0,0) find its entropy.
a) 0
Let (X,Y) have the following joint distribution. Find I(X,Y).
a) 0
For probability distribution (1/3, 1/3, 1/3, 0, 0) find its entropy.
a) 1.585
Let the entropy H(X) of the channel input be given, with P(X|Y) = I (the identity matrix).
Find the conditional entropy H(X|Y).
a) 0
Let the entropy H(X) of the channel input be given, with P(Y|X) = I (the identity matrix).
Find the mutual information I(X,Y).
a) H(X)
Let the entropy H(X) of the channel input be known, with P(Y|X) = I (the identity matrix).
Calculate H(Y).
a) H(X)
Let the entropy H(X) of the channel input be given, with P(Y|X) = I (the identity matrix).
Find the conditional entropy H(Y|X).
a) 0
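The four questions above all describe the same noiseless situation; the identity matrix shown in them is our reconstruction, being the reading consistent with all four answers. With it, the output reproduces the input exactly:

```latex
P(Y \mid X) = I \;\Rightarrow\; Y = X \;\Rightarrow\;
H(X \mid Y) = H(Y \mid X) = 0, \qquad H(Y) = H(X), \qquad I(X,Y) = H(X)
```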
A channel uses (7,4) Hamming coding. Decode the channel output 1100000.
a) 1110000
Decode the output vector 0000001 of a channel with the (7,4) Hamming code.
a) 0000000
The entropy of tossing a fair coin is equal to
c) 1
Let the entropy H(X) of the input of a noiseless channel with P(Y|X) = I (the identity matrix) be given. Find H(Y|X).
a) 0
Which of the following matrices defines a discrete memoryless channel?
a)
Point out a matrix of a discrete memoryless channel.
a)
b)
Find H(X) for a discrete random variable with probability mass function p(x) = (0.5, 0.25, 0.125, 0.125)
b) 1.75
Let p(x,y) be given by
X\Y | 1 | 2 | 3 |
1 | 1/3 | 0 | 0 |
2 | 0 | 1/3 | 0 |
3 | 0 | 0 | 1/3 |
Find H(Y) - H(Y|X).
b) log 3
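H(Y) - H(Y|X) is precisely the mutual information I(X,Y). For the diagonal table above, X determines Y and both are uniform over three values, so the difference is log 3. A numeric check in the style of the earlier entropy helper:

```python
from math import log2

def H(ps):
    return -sum(p * log2(p) for p in ps if p > 0)

joint = [[1/3, 0, 0], [0, 1/3, 0], [0, 0, 1/3]]
cells = [p for row in joint for p in row]
H_Y = H([sum(col) for col in zip(*joint)])
H_Y_given_X = H(cells) - H([sum(row) for row in joint])  # H(X,Y) - H(X)
print(H_Y - H_Y_given_X)  # 1.585 = log2(3)
```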
Let p(x,y) be given by
X\Y | 1 | 2 | 3 |
1 | 1/4 | 1/4 | 0 |
2 | 1/4 | 0 | 0 |
3 | 1/4 | 0 | 0 |
Find H(X,Y)
d) 2
Compute the capacity of a binary channel
P(Y|X) | A | B |
A | 1 | 0 |
B | 0 | 1 |
a) 1
Determine the channel capacity for
P(Y|X) | A | B | C | D |
A | 1/2 | 1/2 | 0 | 0 |
B | 1/2 | 1/2 | 0 | 0 |
C | 0 | 0 | 1/2 | 1/2 |
D | 0 | 0 | 1/2 | 1/2 |
a) 1
Evaluate the channel capacity of
P(Y|X) | A | B | C | D |
A | 1/2 | 1/4 | 1/4 | 0 |
B | 1/4 | 1/4 | 0 | 1/2 |
C | 1/4 | 0 | 1/2 | 1/4 |
D | 0 | 1/2 | 1/4 | 1/4 |
a) 1/2
Estimate the capacity of
P(Y|X) | A | B | C | D |
A | 1/4 | 1/4 | 1/4 | 1/4 |
B | 1/4 | 1/4 | 1/4 | 1/4 |
C | 1/4 | 1/4 | 1/4 | 1/4 |
D | 1/4 | 1/4 | 1/4 | 1/4 |
a) 0
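Every capacity question in this set involves a symmetric or weakly symmetric channel, so the answers drop out of C = log|Y| - H(row) (or C = 0 for identical rows). For an arbitrary DMC the capacity can be checked numerically with the Blahut-Arimoto algorithm; a compact sketch of our own:

```python
from math import log2

def capacity(W, iters=200):
    """Blahut-Arimoto estimate of DMC capacity in bits; W[x][y] = p(y|x)."""
    n, m = len(W), len(W[0])
    r = [1.0 / n] * n  # input distribution, initialized uniform

    def divergences(r):
        q = [sum(r[x] * W[x][y] for x in range(n)) for y in range(m)]  # output dist
        # D[x] = relative entropy between row x of W and the output distribution
        return [sum(w * log2(w / q[y]) for y, w in enumerate(W[x]) if w > 0)
                for x in range(n)]

    for _ in range(iters):
        D = divergences(r)
        Z = sum(r[x] * 2 ** D[x] for x in range(n))
        r = [r[x] * 2 ** D[x] / Z for x in range(n)]  # reweight toward better inputs
    D = divergences(r)
    return sum(r[x] * D[x] for x in range(n))

# The weakly symmetric 4x4 channel above: C = 1/2
W = [[1/2, 1/4, 1/4, 0],
     [1/4, 1/4, 0, 1/2],
     [1/4, 0, 1/2, 1/4],
     [0, 1/2, 1/4, 1/4]]
print(round(capacity(W), 6))            # 0.5
print(round(capacity([[1/4]*4]*4), 6))  # 0.0: identical rows carry no information
```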
Calculate the mean length of an optimal code for probabilities (0.3, 0.2, 0.2, 0.1, 0.1, 0.05, 0.05)
b) 2.6
Let (X,Y) have the following joint distribution. Find H(X|Y).
b) 1
The entropy of a discrete random variable with probability distribution (p_1, p_2, …, p_n) is equal to
a) -Σ p_i log₂ p_i