The length of time a full-length movie runs from opening to credits is normally distributed with a mean of 1.9 hours and a standard deviation of 0.3 hours. Calculate the following: the probability that a random movie is between 1.8 and 2.0 hours long; the probability that a movie is longer than 2.3 hours; and the length of a movie that is shorter than 94% of the movies.
Answer:
0.26, 0.091, 1.43
Step-by-step explanation:
Given data:
mean μ = 1.9 hours
standard deviation σ = 0.3 hours
Solution:
First, the probability that a random movie is between 1.8 and 2.0 hours long:
P(1.8 < X < 2.0)
z1 = (1.8 - 1.9) ÷ 0.3 = -0.33
z2 = (2.0 - 1.9) ÷ 0.3 = 0.33
From the standard normal table, P(Z < 0.33) = 0.6293, so
P(-0.33 < Z < 0.33) = 0.6293 - (1 - 0.6293)
≈ 0.26
So the probability that a random movie is between 1.8 and 2.0 hours long is about 0.26.
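This table lookup can be double-checked numerically. Here is a minimal sketch assuming SciPy is available (the original answer uses a z-table, not code; the variable names mu and sigma are just placeholders for the given mean and standard deviation):

```python
from scipy.stats import norm

mu, sigma = 1.9, 0.3  # mean and standard deviation of movie length, in hours

# P(1.8 < X < 2.0) = Phi((2.0 - mu)/sigma) - Phi((1.8 - mu)/sigma)
p_between = norm.cdf(2.0, loc=mu, scale=sigma) - norm.cdf(1.8, loc=mu, scale=sigma)
print(round(p_between, 2))  # about 0.26
```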
Next, the probability that a movie is longer than 2.3 hours:
P(X > 2.3)
= P(Z > (2.3 - 1.9) ÷ 0.3)
= P(Z > 1.333)
= 0.091
So the chance that a movie is longer than 2.3 hours is about 0.091.
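The same kind of check works for the upper tail; a sketch assuming SciPy, using the survival function (1 minus the CDF):

```python
from scipy.stats import norm

mu, sigma = 1.9, 0.3  # mean and standard deviation of movie length, in hours

# P(X > 2.3) is the upper-tail (survival) probability at 2.3 hours
p_longer = norm.sf(2.3, loc=mu, scale=sigma)
print(round(p_longer, 3))  # about 0.091
```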
Finally, the length of a movie that is shorter than 94% of the movies:
P(X > a) = 0.94
P(X < a) = 0.06
P(Z < (a - 1.9) ÷ 0.3) = 0.06, so (a - 1.9) ÷ 0.3 = -1.555
a = 1.9 - 1.555 × 0.3 ≈ 1.43
So the length of a movie that is shorter than 94% of the movies is about 1.43 hours.
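This cutoff is just the 6th percentile, which can be read off with the inverse CDF; again a minimal sketch assuming SciPy:

```python
from scipy.stats import norm

mu, sigma = 1.9, 0.3  # mean and standard deviation of movie length, in hours

# Find a with P(X < a) = 0.06, i.e. the 6th percentile of movie lengths
a = norm.ppf(0.06, loc=mu, scale=sigma)
print(round(a, 2))  # about 1.43
```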