y = 1x + 2 (you don't have to write the 1 in front of the x, so you can write it as y = x + 2)
Answer:
(a) $E[X+Y]=E[X]+E[Y]$
(b) $Var(X+Y)=Var(X)+Var(Y)$, when X and Y are independent
Step-by-step explanation:
Let X and Y be discrete random variables, and let E(X) and Var(X) denote the expected value and variance of X, respectively.
(a) We want to show that $E[X+Y]=E[X]+E[Y]$.
When we have two random variables instead of one, we consider their joint distribution function.
For a function f(X,Y) of discrete variables X and Y, we can define
$$E[f(X,Y)]=\sum_{x,y}f(x,y)\cdot P(X=x,\,Y=y).$$
Since f(X,Y) = X + Y,
$$E[X+Y]=\sum_{x,y}(x+y)P(X=x,Y=y)=\sum_{x,y}xP(X=x,Y=y)+\sum_{x,y}yP(X=x,Y=y).$$
Let us look at the first of these sums.
$$\sum_{x,y}xP(X=x,Y=y)=\sum_{x}x\sum_{y}P(X=x,Y=y)=\sum_{x}xP(X=x)=E[X],$$
where summing the joint probabilities over y gives the marginal distribution of X.
Similarly,
$$\sum_{x,y}yP(X=x,Y=y)=\sum_{y}y\sum_{x}P(X=x,Y=y)=\sum_{y}yP(Y=y)=E[Y],$$
using the marginal distribution of Y.
Combining these two sums gives the formula. Therefore:
$$E[X+Y]=E[X]+E[Y] \text{ as required.}$$
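As a quick numerical sanity check, here is a minimal Python sketch with a made-up joint distribution (the table below is hypothetical, not from the problem). Note that linearity holds even though this X and Y are not independent:

```python
# A small, hypothetical joint distribution P(X=x, Y=y);
# any non-negative table summing to 1 works.
joint = {
    (0, 0): 0.1, (0, 1): 0.2,
    (1, 0): 0.3, (1, 1): 0.4,
}

# E[X+Y] computed directly from the joint distribution.
E_sum = sum((x + y) * p for (x, y), p in joint.items())

# E[X] and E[Y] via the marginal distributions, as in the proof.
E_X = sum(x * p for (x, y), p in joint.items())
E_Y = sum(y * p for (x, y), p in joint.items())

print(round(E_sum, 9), round(E_X + E_Y, 9))  # 1.3 1.3 -- equal,
# even though P(X=1, Y=1) = 0.4 != P(X=1)*P(Y=1) = 0.42 (not independent)
```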
(b) We want to show that if X and Y are independent random variables, then:
$$Var(X+Y)=Var(X)+Var(Y).$$
By the definition of variance, and writing $\mu_X=E[X]$ and $\mu_Y=E[Y]$ so that $E[X+Y]=\mu_X+\mu_Y$ by part (a), we have:
$$Var(X+Y)=E\big[(X+Y-E[X+Y])^2\big]$$
$$=E\big[(X-\mu_X+Y-\mu_Y)^2\big]=E\big[(X-\mu_X)^2+(Y-\mu_Y)^2+2(X-\mu_X)(Y-\mu_Y)\big]$$
Since we have shown that expectation is linear,
$$=E\big[(X-\mu_X)^2\big]+E\big[(Y-\mu_Y)^2\big]+2E\big[(X-\mu_X)(Y-\mu_Y)\big]=Var(X)+Var(Y)+2\,Cov(X,Y).$$
Since X and Y are independent, Cov(X,Y) = 0. Therefore, as required:
$$Var(X+Y)=Var(X)+Var(Y).$$
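A quick Monte Carlo sanity check of this identity, using arbitrary hypothetical normal distributions for X and Y:

```python
import random

random.seed(0)  # reproducible draws

# Independent samples: Var(X) = 2**2 = 4, Var(Y) = 3**2 = 9.
N = 100_000
xs = [random.gauss(0, 2) for _ in range(N)]
ys = [random.gauss(1, 3) for _ in range(N)]

def var(v):
    """Population variance of a list of samples."""
    m = sum(v) / len(v)
    return sum((t - m) ** 2 for t in v) / len(v)

# Both quantities should be close to 4 + 9 = 13.
print(var([a + b for a, b in zip(xs, ys)]))
print(var(xs) + var(ys))
```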

Answer:

Step-by-step explanation:
Normally, when I do this, I differentiate each term first with respect to x then with respect to y. In this solution, I differentiated the entire expression with respect to x, then with respect to y. That makes separating the dx and dy coefficients much easier, so the solution almost falls into your lap.
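As an illustration of that approach, here is a minimal sympy sketch; the expression x²y + y³ = 7 is purely hypothetical, since the original problem isn't shown here:

```python
import sympy as sp

x, y = sp.symbols('x y')
# Hypothetical expression standing in for the original problem: x**2*y + y**3 = 7.
F = x**2 * y + y**3 - 7

# Differentiate the entire expression with respect to x, then y;
# these partials are exactly the dx and dy coefficients.
Fx = sp.diff(F, x)  # coefficient of dx: 2*x*y
Fy = sp.diff(F, y)  # coefficient of dy: x**2 + 3*y**2

# Total differential Fx*dx + Fy*dy = 0  =>  dy/dx = -Fx/Fy.
print(sp.simplify(-Fx / Fy))  # -2*x*y/(x**2 + 3*y**2)
```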

This can be solved using the law of cosines, which is:
c² = a² + b² - 2ab cos θ
Using the values given in the problem:
6² = b² + b² - 2(b)(b) cos 112.62°
Solving for b:
36 = 2b² - 2b² cos 112.62° = 2b²(1 - cos 112.62°)
b² = 36 / (2(1 - cos 112.62°)) ≈ 13.0
b ≈ 3.6
The answer is the 3rd option.
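The arithmetic can be double-checked in a few lines of Python, reproducing the computation above:

```python
import math

# 36 = 2*b**2 * (1 - cos(112.62 degrees))  =>  b = sqrt(36 / (2*(1 - cos(theta))))
theta = math.radians(112.62)
b = math.sqrt(36 / (2 * (1 - math.cos(theta))))
print(round(b, 1))  # 3.6
```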