
Suppose the speeds of vehicles traveling on a highway are normally distributed. If the population standard deviation is 2 miles per hour, what minimum sample size is needed to be 90% confident that the sample mean is within 1 mile per hour of the true population mean?

Answer:

The minimum sample size needed is n = 11.

A sample of at least 11 vehicle speeds is required to be 90% confident that the sample mean is within 1 mile per hour of the true population mean.

1) Previous concepts

A confidence interval is a range of values that is likely to contain an unknown population parameter with a stated degree of confidence, often expressed as a percentage.

The margin of error is the range of values above and below the sample statistic in a confidence interval.

The normal distribution is a probability distribution that is symmetric about the mean, indicating that data near the mean occur more frequently than data far from the mean.

σ = 2 miles per hour, the known population standard deviation

μ = the true population mean speed (the parameter of interest)

E = 1 mile per hour, the maximum allowed distance between the sample mean and μ

n = the minimum sample size we need to find

2) Minimum sample size:

When the population standard deviation σ is known, the margin of error of a confidence interval for the mean is given by

                  E = z(α/2) · σ/√n

Solving this for n gives the minimum sample size:

                  n = [ z(α/2) · σ / E ]²

Because the population standard deviation is known (σ = 2), the critical value comes from the standard normal (z) distribution rather than the t distribution, so no degrees of freedom are needed.

The confidence level is 0.90 or 90%, so α = 0.10 and α/2 = 0.05. You can use Excel, a calculator, or a statistical table to find the critical value; the Excel command "=NORM.S.INV(0.95)" returns z(α/2) ≈ 1.645.
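If you prefer Python, a minimal sketch (assuming the SciPy library is installed) gives the same critical value:

from scipy.stats import norm

# inverse of the standard normal CDF at 1 - α/2 = 0.95
z = norm.ppf(0.95)
print(round(z, 4))   # prints 1.6449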

Plugging the values into the sample size formula:

                  n = (1.645 × 2 / 1)² = (3.29)² = 10.8241

Since the sample size must be a whole number and must guarantee the stated precision, round up to the next integer: n = 11.

Check: with n = 11 the margin of error is 1.645 × 2/√11 ≈ 0.99 mile per hour (within 1 mph), while with n = 10 it would be 1.645 × 2/√10 ≈ 1.04 miles per hour (too large), so 11 is indeed the minimum.
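As a cross-check, the short Python sketch below (the variable names confidence, sigma, and E are just illustrative) reproduces the whole calculation and confirms that 11 is the smallest sample size whose margin of error stays within 1 mile per hour:

import math
from scipy.stats import norm

confidence = 0.90   # desired confidence level
sigma = 2.0         # population standard deviation (mph)
E = 1.0             # largest acceptable error of the sample mean (mph)

z = norm.ppf(1 - (1 - confidence) / 2)   # critical value, about 1.645
n_exact = (z * sigma / E) ** 2           # about 10.82
n = math.ceil(n_exact)                   # round up to 11

# margin of error achieved with n and with n - 1 observations
print(n, z * sigma / math.sqrt(n), z * sigma / math.sqrt(n - 1))
# prints 11, about 0.99 and about 1.04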

We need a sample of at least 11 vehicles to be 90% confident that the sample mean speed is within 1 mile per hour of the true mean speed of vehicles traveling on the highway.

Learn more about Confidence interval:

https://brainly.com/question/17741237
