ANOVA Using Microsoft Excel
One-Way Analysis of Variance
These tutorials briefly explain the use
and interpretation of standard statistical analysis techniques. The
examples include how-to instructions for Excel. Although there are
different versions of Excel in use, these instructions should work about
the same in most recent versions. They also assume that you have installed
the Excel Analysis ToolPak, which is free and comes with Excel (go to
Tools, Add-Ins... if it is not already installed in your version of Excel).
© TexaSoft, 2008
Definition: An Independent Group ANOVA is an extension of the independent group t-test where you have more than two groups. This test is used to compare the means of more than two independent groups and is also called a One Way Analysis of Variance.
Assumptions: Subjects are randomly assigned to one of n groups. The distribution of the means by group is normal, with equal variances across groups. Sample sizes do not have to be equal across groups, but large differences in sample sizes by group may affect the outcome of the multiple comparisons tests.
Test: The hypotheses for the comparison of independent groups are: (k is the number of groups)
Ho: μ1 = μ2 = ... = μk (the means of all groups are equal)
Ha: μi ≠ μj for at least one pair i, j (at least two group means are not equal)
The test is performed in an Analysis of Variance (ANOVA) table. The test statistic is an F test with k-1 and N-k degrees of freedom, where N is the total number of subjects. A low p-value for this test indicates evidence to reject the null hypothesis in favor of the alternative. In other words, there is evidence that at least one pair of means is not equal.
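The arithmetic behind the ANOVA table can be sketched in a few lines of Python. The weight-gain numbers below are made-up illustration values, not the FEED_ANOVA data; the point is how the between-group and within-group sums of squares combine into the F statistic with k-1 and N-k degrees of freedom:

```python
from statistics import mean

# Hypothetical weight-gain data for three feeds (illustration only,
# not the tutorial's FEED_ANOVA values)
groups = [
    [60.8, 57.0, 65.0, 58.6],    # Feed A
    [68.7, 67.7, 74.0, 66.3],    # Feed B
    [102.6, 102.1, 100.2, 96.5], # Feed C
]

k = len(groups)                      # number of groups
N = sum(len(g) for g in groups)      # total number of subjects
grand_mean = mean(x for g in groups for x in g)

# Between-group sum of squares (k - 1 degrees of freedom)
ss_between = sum(len(g) * (mean(g) - grand_mean) ** 2 for g in groups)

# Within-group sum of squares (N - k degrees of freedom)
ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)

# F = (between-group mean square) / (within-group mean square)
F = (ss_between / (k - 1)) / (ss_within / (N - k))
print(f"F({k - 1},{N - k}) = {F:.2f}")
```

A large F means the variation between group means is large relative to the variation within groups, which is exactly what Excel's ANOVA table reports.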
Example: Independent Group ANOVA (One-Way Analysis of Variance)
The FEED_ANOVA.XLS file contains information on four different feeds and weight gain of animals after they had been fed one of the feeds for a period of time. You want to know if any feed is better for producing weight gain.
Step 1: Open the file FEED_ANOVA or enter the data into an Excel datasheet.
Step 2: In Excel 2003 or earlier, pull down “Tools” to “Data Analysis.” In Excel 2007, click on Data, then Data Analysis.
Step 3: Select Anova: Single Factor.
Step 4: In the dialog box that appears, enter the input range that corresponds to the data columns ($A$1:$D$5), check the "Labels in First Row" option, and click OK.
The results appear in a new worksheet, as shown here:
In this output, the test statistic, F, is reported in the analysis of variance table: F(3,11) = 39.82. The p-value for this statistic is p < 0.001 (reported in the table as 3.36E-06). This means that there is evidence of differences in the means across groups.
Unfortunately, Excel does not include a standard multiple comparison test you can use to determine which means are different from the others.
Step 5: One way to determine specific differences is to perform paired analyses of the groups, two at a time. For example, compare the mean for group A vs. the mean for group B, then A vs. C, then A vs. D, and so on. In Excel, your option is to do this using multiple two-sample t-tests.
If you do these pairwise comparisons, you should adjust the resulting p-value for each t-test, since performing multiple t-tests increases the probability of finding an incorrect significance (a false positive). To correct for this problem, multiply the p-value for each pairwise comparison by the number of comparisons. This is called a Bonferroni adjustment. For example, in this case your comparisons are
A vs. B, A vs. C, A vs. D, B vs. C, B vs. D, and C vs. D -- 6 pairwise comparisons in all. Thus, you'd correct each t-test p-value by multiplying it by 6.
For example, a t-test comparison of Mean A vs. Mean C (61.025 vs. 89.067) yields an unadjusted two-tail p-value of p = 0.0006. The adjusted p-value (the one you should report) would be 0.0006 × 6 = 0.0036.
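The Bonferroni adjustment above is easy to script. In this sketch, only the A vs. C p-value (0.0006) comes from the tutorial; the other five unadjusted p-values are made-up placeholders. Adjusted p-values are conventionally capped at 1:

```python
from itertools import combinations

group_labels = ["A", "B", "C", "D"]
pairs = list(combinations(group_labels, 2))  # the 6 pairwise comparisons
m = len(pairs)                               # number of comparisons

# Unadjusted two-tail p-values from the pairwise t-tests.
# Only ("A", "C") = 0.0006 is from the tutorial; the rest are hypothetical.
raw_p = {("A", "B"): 0.0410, ("A", "C"): 0.0006, ("A", "D"): 0.0150,
         ("B", "C"): 0.0032, ("B", "D"): 0.2200, ("C", "D"): 0.0009}

# Bonferroni: multiply each p-value by the number of comparisons, cap at 1
adjusted = {pair: min(p * m, 1.0) for pair, p in raw_p.items()}

for (a, b), p in adjusted.items():
    print(f"{a} vs {b}: adjusted p = {p:.4f}")
```

Note that 0.0006 × 6 = 0.0036, matching the hand calculation above.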