The complete question is:
Do oddsmakers believe that teams who play at home will have home field advantage? Specifically, do oddsmakers give higher point spreads when the favored team plays home games as compared to when the favored team plays away games?
Two samples were randomly selected from three complete National Football League seasons (1989, 1990, and 1991). The first sample consisted of 50 games in which the favored team played at home, while the second sample consisted of 50 games in which the favored team played away. The oddsmakers’ point spreads (the number of points by which the favored team is predicted to beat the weaker team) were then collected.
If µ1 and µ2 represent the mean point spread for home games and away games, respectively, which of the following gives the appropriate null and alternative hypotheses?
A) H0: μ1 - μ2 = 0
Ha: μ1 - μ2 < 0
B) H0: μ1 - μ2 = 0
Ha: μ1 < μ2
C) H0: μ1 - μ2 > 0
Ha: μ1 - μ2 = 0
D) H0: μ1 - μ2 = 0
Ha: μ1 - μ2 > 0
E) None of the above
Answer:
D) H0: μ1 - μ2 = 0
Ha: μ1 - μ2 > 0
Step-by-step explanation:
We want to determine whether oddsmakers give higher point spreads when the favored team plays at home than when it plays away.
Since µ1 and µ2 represent the mean point spreads for home games and away games, respectively, the claim being tested is µ1 > µ2, and this becomes the alternative hypothesis:
Ha: µ1 - µ2 > 0
Meanwhile, the null hypothesis assumes that the point spreads are equal whether the favored team plays at home or away:
H0: μ1 - μ2 = 0
Thus, the correct option is D.
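To see how this one-tailed test would actually be carried out, here is a minimal sketch that computes a pooled two-sample t statistic for H0: µ1 - µ2 = 0 against Ha: µ1 - µ2 > 0. The point-spread values below are made up for illustration; they are not the data from the study described in the question.

```python
import statistics

def pooled_t(sample1, sample2):
    """Two-sample t statistic with pooled variance for H0: mu1 - mu2 = 0."""
    n1, n2 = len(sample1), len(sample2)
    v1 = statistics.variance(sample1)  # sample variance of group 1
    v2 = statistics.variance(sample2)  # sample variance of group 2
    # Pooled variance estimate with n1 + n2 - 2 degrees of freedom
    sp2 = ((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2)
    se = (sp2 * (1 / n1 + 1 / n2)) ** 0.5
    return (statistics.mean(sample1) - statistics.mean(sample2)) / se

# Hypothetical point spreads (NOT the actual data from the question)
home = [7.0, 3.5, 6.0, 4.5, 8.0, 5.5, 6.5, 7.5, 4.0, 6.0]
away = [3.0, 4.5, 2.5, 5.0, 3.5, 4.0, 2.0, 5.5, 3.0, 4.5]

t = pooled_t(home, away)
print(f"t = {t:.3f}")
```

Because the alternative hypothesis is Ha: µ1 - µ2 > 0, we reject H0 only for large positive values of t (the rejection region lies in the right tail of the t distribution); a negative or near-zero t would be consistent with the null.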