Answer:
Step-by-step explanation:
1:
- 1 mile is equal to 1760 yards, and there are 63360 inches in a mile.
inches = miles × 63360
2 × 63360 = 126720
2 miles = 126720 inches
- There are 36 inches in a yard
3600/36= 100
3600 inches = 100 yards
- There are 1000 meters in a kilometer
6800 / 1000 = 6.8
6800 m = 6.8 km
- There are 100 centimeters in a meter
15000 / 100 = 150
15000 cm = 150 m
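These conversions can be checked with a short Python snippet (a minimal sketch; the constant names are just illustrative):

```python
# Standard conversion factors used above:
# 1 mile = 1760 yards = 63360 inches, 1 yard = 36 inches,
# 1 km = 1000 m, 1 m = 100 cm.
INCHES_PER_MILE = 63360
INCHES_PER_YARD = 36
METERS_PER_KM = 1000
CM_PER_METER = 100

print(2 * INCHES_PER_MILE)       # 126720 inches in 2 miles
print(3600 / INCHES_PER_YARD)    # 100.0 yards in 3600 inches
print(6800 / METERS_PER_KM)      # 6.8 km in 6800 m
print(15000 / CM_PER_METER)      # 150.0 m in 15000 cm
```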
2:
a) Johnson ran 9 yards in one second.
1 minute = 60 seconds
Yards Johnson ran in one minute = 9 × 60 = 540 yards
b) Johnson ran 9 yards in one second.
1 hour = 60 minutes × 60 seconds = 3600 seconds
Yards Johnson ran in one hour = 9 × 3600 = 32,400 yards
c) Johnson ran 32,400 yards in one hour.
1 mile = 1760 yards
Miles he ran in one hour = 32,400 / 1760 ≈ 18.4
Johnson ran about 18.4 miles in one hour.
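A quick sketch of parts a) to c) in Python, assuming the same constant pace of 9 yards per second:

```python
# Assumed pace from part a): 9 yards every second.
YARDS_PER_SECOND = 9
SECONDS_PER_MINUTE = 60
SECONDS_PER_HOUR = 60 * 60      # 3600 seconds in an hour
YARDS_PER_MILE = 1760

yards_per_minute = YARDS_PER_SECOND * SECONDS_PER_MINUTE  # 540
yards_per_hour = YARDS_PER_SECOND * SECONDS_PER_HOUR      # 32400
miles_per_hour = yards_per_hour / YARDS_PER_MILE          # ~18.4

print(yards_per_minute, yards_per_hour, round(miles_per_hour, 1))
```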
It would take 120 tests for the probability of committing at least one type I error to be at least 0.7.
In the question, it is given that:
the probability of committing a type I error on a single test is 0.01, and
the probability of committing at least one type I error should be at least 0.7.
We have to find the number of tests. Let the number of tests be n.
The situation above can be written as
P(at least one type I error is committed) = 1 − P(no type I error is committed) ≥ 0.7
which is written as
1 - (1 - 0.01)ⁿ ≥ 0.7
-(0.99)ⁿ ≥ 0.7 - 1
(0.99)ⁿ ≤ 0.3
Taking logarithms of both sides,
n ≥ ln(0.3) / ln(0.99) ≈ 119.8
so the smallest whole number of tests is n = 120.
Therefore, the number of tests required is 120.
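The smallest n can also be checked numerically; here is a minimal Python sketch, assuming the per-test significance level of 0.01 used in the working above:

```python
import math

alpha = 0.01    # assumed probability of a type I error on each test
target = 0.7    # required probability of at least one type I error

# 1 - (1 - alpha)**n >= target  <=>  (1 - alpha)**n <= 1 - target
# <=>  n >= log(1 - target) / log(1 - alpha)
n = math.ceil(math.log(1 - target) / math.log(1 - alpha))

print(n)                        # 120
print(1 - (1 - alpha) ** n)     # ~0.7006, which is >= 0.7
```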