ECE 515
Information Theory


Assignment Submission


Assignments should be submitted on Brightspace: https://bright.uvic.ca/d2l/home

Assignments

Assignment 1 - Due September 26, 2025 (Solutions)
  1. Consider two binary random variables X and Y with joint probability distribution p(x,y) given by
    p(0,0) = 1/2, p(0,1) = 1/4, p(1,0) = 0, and p(1,1) = 1/4.
    Find the value of
    (a) H(X)
    (b) H(Y)
    (c) H(X|Y)
    (d) H(Y|X)
    (e) H(XY)
    (f) I(X;Y)
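
All six quantities in problem 1 follow from the joint distribution, its marginals, and the chain rule H(XY) = H(X) + H(Y|X). As a quick numerical check (not part of the assignment), a short Python sketch:

```python
from math import log2

def H(dist):
    """Entropy in bits of a distribution given as an iterable of probabilities."""
    return -sum(p * log2(p) for p in dist if p > 0)

# Joint distribution p(x, y) from problem 1.
joint = {(0, 0): 1/2, (0, 1): 1/4, (1, 0): 0, (1, 1): 1/4}

# Marginals of X and Y.
pX = [sum(v for (x, _), v in joint.items() if x == i) for i in (0, 1)]
pY = [sum(v for (_, y), v in joint.items() if y == j) for j in (0, 1)]

HX, HY, HXY = H(pX), H(pY), H(joint.values())
print(f"H(X)   = {HX:.4f}")        # ≈ 0.8113
print(f"H(Y)   = {HY:.4f}")        # = 1.0000
print(f"H(XY)  = {HXY:.4f}")       # = 1.5000
print(f"H(X|Y) = {HXY - HY:.4f}")  # chain rule: H(X|Y) = H(XY) - H(Y)
print(f"H(Y|X) = {HXY - HX:.4f}")
print(f"I(X;Y) = {HX + HY - HXY:.4f}")
```

The same `H` helper applies to problems 2, 3, 6, and 7 once the relevant distribution is written down.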
  2. Consider a discrete random variable X with 2n+1 symbols xi, i = 1, 2, …, 2n+1. Determine the upper and lower bounds on the entropy and the corresponding symbol probabilities when
    (a) p(x1)=0
    (b) p(x1)=1/2
  3. A jar contains 5 black balls and 10 white balls. Experiment X involves randomly drawing a ball out of the jar. Next, experiment Y involves randomly drawing a ball with the ball drawn in experiment X not replaced in the jar. One is interested in the colour of the drawn ball.
    (a) How much uncertainty does experiment X contain?
    (b) What is the uncertainty in experiment Y given that the first ball is black?
    (c) What is the uncertainty in experiment Y given that the first ball is white?
    (d) How much uncertainty does experiment Y contain?
  4. Let X be a random variable whose entropy H(X) is 8 bits. Suppose that Y(X) is a deterministic function that takes on a different value for each value of X.
    (a) What is H(Y)?
    (b) What is H(Y|X)?
    (c) What is H(X|Y)?
    (d) What is I(X;Y)?
    (e) Suppose now that the deterministic function Y(X) is not invertible, so that different values of X may correspond to the same value of Y(X). In this case, what can be said about H(Y), and also about H(X|Y)?
  5. The Stanley Cup Final is a seven-game hockey series that ends as soon as either team wins 4 games. Let X be the random variable that represents the outcome of the Stanley Cup Final between teams A and B. For example, some possible values of X are AAAA, BABABAB, and BBBAAAA. Let Y be the random variable for the number of games played, which ranges from 4 to 7. Assuming A and B are equally matched and the games are independent, determine
    (a) H(X)
    (b) H(Y)
    (c) H(Y|X)
    (d) H(X|Y)
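
The outcome space in problem 5 is small enough to enumerate exhaustively. A numerical sketch (assuming fair, independent games as stated; each length-n series then has probability 2^-n):

```python
from itertools import product
from math import log2

# Enumerate all possible series: play at most 7 games, stop at 4 wins.
outcomes = set()
for games in product("AB", repeat=7):
    seq = ""
    for g in games:
        seq += g
        if seq.count("A") == 4 or seq.count("B") == 4:
            break
    outcomes.add(seq)

# Each length-n stopping sequence has probability 2**-n.
pX = {s: 2.0 ** -len(s) for s in outcomes}
HX = -sum(p * log2(p) for p in pX.values())

# Y = number of games played, a deterministic function of X.
pY = {}
for s, p in pX.items():
    pY[len(s)] = pY.get(len(s), 0) + p
HY = -sum(p * log2(p) for p in pY.values())

print(len(outcomes))        # 70 possible series
print(f"H(X) = {HX:.4f}")   # = 5.8125 bits
print(f"H(Y) = {HY:.4f}")   # ≈ 1.9238 bits
# Since Y is a function of X: H(Y|X) = 0 and H(X|Y) = H(X) - H(Y).
```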
  6. In a country, 25% of the people are blond and 75% of all blond people have blue eyes. In addition, 50% of the people have blue eyes. How much information is received in each of the following cases
    (a) if we know that a person is blond and we are told the colour (blue/not blue) of their eyes
    (b) if we know that a person has blue eyes and we are told the colour (blond/not blond) of their hair
    (c) if we are told both the colour of their hair and that of their eyes.
  7. A roulette wheel is subdivided into 38 numbered compartments of various colours. The distribution of the compartments according to colour is 2 green, 18 red, and 18 black.

    The experiment consists of throwing a small ball onto the rotating roulette wheel. The ball is equally likely to come to rest in any of the 38 compartments.
    (a) How much information is received if the colour is revealed?
    (b) How much information is received if both the colour and number are revealed?
    (c) If the colour is known, what is the remaining uncertainty about the number?

Assignment 2 - Due October 10, 2025
  1. Let X and Y be random variables which are identically distributed but not necessarily independent. Define the correlation between X and Y as ρ = 1 - H(Y|X)/H(X).
    (a) Show that ρ = I(X;Y)/H(X).
    (b) Show that 0 ≤ ρ ≤ 1.
    (c) When is ρ = 0?
    (d) When is ρ = 1?
  2. Let F, G and D be three discrete random variables. Prove the validity of the following inequalities and, where they hold, find the conditions for equality. Venn diagrams are not sufficient.
    (a) I(GD;F) ≥ I(D;F).
    (b) H(GD|F) ≥ H(D|F).
    (c) I(G;F|D) ≥ I(D;F|G) + I(F;G) - I(D;F).
    (d) H(FDG) - H(DF) ≤ H(GD) - H(D).
  3. The following equiprobable messages are used to send data over a binary symmetric channel with crossover probability p:
    M1 = 0000 M2 = 0011 M3 = 0101 M4 = 0110
    M5 = 1001 M6 = 1010 M7 = 1100 M8 = 1111
    If the sequence Y=0000 is received
    (a) how much information is provided about M1 by the first digit received?
    (b) how much additional information about M1 is provided by the second, then the third, and finally the fourth received digit?
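
For problem 3, the information each received digit provides about M1 is the log of the ratio of the posterior P(M1 | y1…yk) to the prior P(M1) = 1/8. A Python sketch of the posterior update (the crossover probability p stays symbolic in the assignment; p = 0.1 below is only an illustrative value):

```python
from math import log2

# The eight equiprobable messages (the even-weight codewords of length 4).
msgs = ["0000", "0011", "0101", "0110", "1001", "1010", "1100", "1111"]
received = "0000"

def posterior(prefix_len, p):
    """P(M = m | first prefix_len received digits) for a BSC with crossover p."""
    likes = []
    for m in msgs:
        like = 1.0
        for i in range(prefix_len):
            like *= (1 - p) if m[i] == received[i] else p
        likes.append(like)
    total = sum(likes)
    return [l / total for l in likes]

p = 0.1       # illustrative crossover probability
prev = 1 / 8  # prior P(M1)
for k in range(1, 5):
    post = posterior(k, p)[0]  # P(M1 | y1..yk)
    print(f"digit {k}: additional info about M1 = {log2(post / prev):.4f} bits")
    prev = post
```

For example, after the first digit the posterior is (1-p)/4, so the information gained is log2(2(1-p)) bits.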
  4. (a) Construct a decision tree for job acceptance prediction using the following training data.

    The output is Accept Job: {Yes, No}.

    Instance  Salary  Commute  Worklife Balance  Remote Working  Accept Job
    1         High    Short    Good              Yes             Yes
    2         High    Medium   Good              No              Yes
    3         High    Long     Good              Yes             Yes
    4         High    Long     Poor              No              No
    5         High    Short    Poor              Yes             Yes
    6         High    Medium   Average           No              Yes
    7         High    Long     Average           Yes             Yes
    8         High    Short    Average           No              Yes
    9         Medium  Short    Good              Yes             Yes
    10        Medium  Medium   Good              Yes             Yes
    11        Medium  Long     Average           Yes             No
    12        Medium  Medium   Poor              No              No
    13        Medium  Short    Average           No              Yes
    14        Medium  Long     Good              No              No
    15        Medium  Short    Poor              Yes             No
    16        Medium  Medium   Average           Yes             Yes
    17        Low     Short    Good              Yes             Yes
    18        Low     Medium   Good              No              No
    19        Low     Medium   Poor              Yes             No
    20        Low     Long     Good              Yes             No
    21        Low     Short    Average           No              No
    22        Low     Long     Poor              No              No

    (b) Predict the job acceptance for the following instances

    Instance  Salary  Commute  Worklife Balance  Remote Working  Accept Job
    23        High    Medium   Average           Yes             ?
    24        Medium  Medium   Average           Yes             ?
    25        Medium  Long     Poor              No              ?
    26        Low     Short    Poor              No               ?
    27        Low     Medium   Average           Yes             ?
    28        High    Medium   Poor              No              ?
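
A standard way to build the tree in problem 4 is to split on the attribute with the largest information gain. The sketch below (assuming the usual ID3-style gain criterion; the assignment does not prescribe a specific algorithm) scores the four attributes at the root from the training data:

```python
from math import log2

def H(labels):
    """Entropy in bits of a list of class labels."""
    n = len(labels)
    counts = {l: labels.count(l) for l in set(labels)}
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Training rows: (Salary, Commute, Worklife Balance, Remote Working, Accept Job).
rows = [
    ("High","Short","Good","Yes","Yes"),   ("High","Medium","Good","No","Yes"),
    ("High","Long","Good","Yes","Yes"),    ("High","Long","Poor","No","No"),
    ("High","Short","Poor","Yes","Yes"),   ("High","Medium","Average","No","Yes"),
    ("High","Long","Average","Yes","Yes"), ("High","Short","Average","No","Yes"),
    ("Medium","Short","Good","Yes","Yes"), ("Medium","Medium","Good","Yes","Yes"),
    ("Medium","Long","Average","Yes","No"),("Medium","Medium","Poor","No","No"),
    ("Medium","Short","Average","No","Yes"),("Medium","Long","Good","No","No"),
    ("Medium","Short","Poor","Yes","No"),  ("Medium","Medium","Average","Yes","Yes"),
    ("Low","Short","Good","Yes","Yes"),    ("Low","Medium","Good","No","No"),
    ("Low","Medium","Poor","Yes","No"),    ("Low","Long","Good","Yes","No"),
    ("Low","Short","Average","No","No"),   ("Low","Long","Poor","No","No"),
]
attrs = ["Salary", "Commute", "Worklife", "Remote"]
labels = [r[-1] for r in rows]

gains = {}
for i, name in enumerate(attrs):
    rem = 0.0  # expected entropy remaining after splitting on this attribute
    for v in set(r[i] for r in rows):
        sub = [r[-1] for r in rows if r[i] == v]
        rem += len(sub) / len(rows) * H(sub)
    gains[name] = H(labels) - rem
    print(f"Gain({name}) = {gains[name]:.4f} bits")
```

Recursing the same computation on each branch yields the full tree for part (a), which can then be applied to instances 23-28 in part (b).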

  5. Consider the binary entropy function h(p) = -p log2(p) - (1-p) log2(1-p). Calculate the average entropy when the probability p is uniformly distributed in the range 0 ≤ p ≤ 1.
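
The average in problem 5 is the integral of h(p) over [0, 1], which has the closed form 1/(2 ln 2) ≈ 0.7213 bits. A numerical check of that value by a midpoint Riemann sum:

```python
from math import log2

def h(p):
    """Binary entropy in bits; h(0) = h(1) = 0 by continuity."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

# Midpoint Riemann sum for the average of h(p) with p uniform on [0, 1].
N = 100_000
avg = sum(h((k + 0.5) / N) for k in range(N)) / N
print(f"average entropy ≈ {avg:.6f} bits")   # ≈ 0.721348 = 1/(2 ln 2)
```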
  6. The binary erasure channel is a channel with a simple type of noise in that transmitted bits can be erased but cannot be received in error. The channel input is a binary random variable X and the channel output is a ternary random variable Y. Let the probability of x=0 be w so the probability of x=1 is 1-w. The three possible channel outputs are {0,e,1} where e denotes an erasure. The probability of an erasure is p so the probability of correct reception of a symbol is 1-p.
    (a) Calculate I(X;Y) in terms of w and p.
    (b) Find the value of w that maximizes I(X;Y).
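
For problem 6, I(X;Y) = H(Y) - H(Y|X) can be evaluated directly from the output distribution of the erasure channel. A numerical sketch (p = 0.2 is only an illustrative value; the assignment asks for the answer in terms of w and p):

```python
from math import log2

def h(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def I(w, p):
    """I(X;Y) for the binary erasure channel with P(x=0) = w, erasure prob p."""
    # Output probabilities: P(Y=0) = w(1-p), P(Y=e) = p, P(Y=1) = (1-w)(1-p).
    HY = -sum(q * log2(q) for q in [w * (1 - p), p, (1 - w) * (1 - p)] if q > 0)
    # H(Y|X) = h(p) for either input value, so I(X;Y) = H(Y) - h(p).
    return HY - h(p)

p = 0.2  # illustrative erasure probability
best_w = max((k / 1000 for k in range(1001)), key=lambda w: I(w, p))
print(f"I(X;Y) maximized near w = {best_w}, giving I = {I(best_w, p):.4f}")
# for p = 0.2 this prints w = 0.5, I = 0.8000
```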
  7. Relative entropy
    (a) Consider a random variable X with four possible outcomes {a, b, c, d}, and two distributions on X:

    x  p(x)  q(x)
    a  5/8   3/8
    b  1/8   1/4
    c  1/8   1/4
    d  1/8   1/8

    Calculate H(p), H(q), D(p||q), and D(q||p), and verify in this case that D(p||q) ≠ D(q||p).
    (b) Part (a) shows that D(p||q) ≠ D(q||p) in general, but there could be distributions for which equality holds. Provide an example of distributions p(x) and q(x) on a ternary alphabet {a, b, c}, such that D(p||q) = D(q||p) (other than the trivial case p(x) = q(x) for which D(p||q) = D(q||p) = 0).
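
The asymmetry asked about in part (a) is easy to confirm numerically from the definition D(p||q) = Σ p(x) log2(p(x)/q(x)). A short check with the distributions from the table:

```python
from math import log2

def D(p, q):
    """Relative entropy D(p||q) in bits; assumes q(x) > 0 wherever p(x) > 0."""
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [5/8, 1/8, 1/8, 1/8]   # p(x) on {a, b, c, d}
q = [3/8, 1/4, 1/4, 1/8]   # q(x)

print(f"D(p||q) = {D(p, q):.4f}")  # ≈ 0.2106
print(f"D(q||p) = {D(q, p):.4f}")  # ≈ 0.2236
```

The same function can be used to search for the symmetric (but non-identical) pair requested in part (b).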

Assignment 3 - Due
Assignment 4 - Due
Assignment 5 - Due