http://www.montefiore.ulg.ac.be/~kvansteen/Teaching20082009.html
> conc
[1]  0 10 20 30 40
> signal
[1] 4 22 44 60 82
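The two vectors shown above are created with `c()`; a minimal sketch (five concentrations, matching the five residuals reported by `summary()` below):

```r
# concentration (predictor) and measured signal (response)
conc   <- c(0, 10, 20, 30, 40)
signal <- c(4, 22, 44, 60, 82)
```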
The general form is lm(Y ~ model), where Y is the response and model is the predictor:
> lm(signal ~ conc)
Call:
lm(formula = signal ~ conc)

Coefficients:
(Intercept)         conc
       3.60         1.94
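The two coefficients can be verified by hand with the least-squares formulas, using the same data (a sketch; the variable names slope and intercept are mine):

```r
conc   <- c(0, 10, 20, 30, 40)
signal <- c(4, 22, 44, 60, 82)

# slope = Sxy / Sxx, intercept = mean(y) - slope * mean(x)
slope     <- sum((conc - mean(conc)) * (signal - mean(signal))) /
             sum((conc - mean(conc))^2)
intercept <- mean(signal) - slope * mean(conc)

slope      # 1.94
intercept  # 3.6
```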
> lm.r = lm(signal ~ conc)
> summary(lm.r)
Call:
lm(formula = signal ~ conc)
Residuals:
   1    2    3    4    5
 0.4 -1.0  1.6 -1.8  0.8
Coefficients:
            Estimate Std. Error t value Pr(>|t|)
(Intercept)  3.60000    1.23288    2.92   0.0615 .
conc         1.94000    0.05033   38.54 3.84e-05 ***
---
Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
Residual standard error: 1.592 on 3 degrees of freedom
Multiple R-Squared: 0.998,     Adjusted R-squared: 0.9973
F-statistic: 1486 on 1 and 3 DF, p-value: 3.842e-05
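The headline numbers in summary() follow directly from the residual and total sums of squares; a sketch of the arithmetic:

```r
conc   <- c(0, 10, 20, 30, 40)
signal <- c(4, 22, 44, 60, 82)
fit    <- lm(signal ~ conc)

rss <- sum(resid(fit)^2)                # residual sum of squares
tss <- sum((signal - mean(signal))^2)   # total sum of squares

sqrt(rss / df.residual(fit))  # residual standard error: 1.592 on 3 df
1 - rss / tss                 # multiple R-squared: 0.998
```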
Before accepting the result of a linear regression, it is important to evaluate its
suitability for explaining the data. One of the many ways to do this is to visually
examine the residuals. If the model is appropriate, the residual errors should be random
and normally distributed. In addition, removing any one case should not significantly
affect the model's fit. R provides four graphical diagnostics for evaluating a model via
the plot() command.
> layout(matrix(1:4,2,2))
> plot(lm.r)
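Beyond eyeballing the four diagnostic panels, the normality assumption can also be checked numerically; a sketch using the Shapiro-Wilk test (shapiro.test() is base R, but this step is my addition, not part of the original walkthrough):

```r
conc   <- c(0, 10, 20, 30, 40)
signal <- c(4, 22, 44, 60, 82)
fit    <- lm(signal ~ conc)

# Shapiro-Wilk test on the residuals: a large p-value is
# consistent with (does not prove) normally distributed errors
shapiro.test(resid(fit))
```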
# Suppose we wish to predict the signal for concentrations of 5, 15, 25, 35 and
45, along with a confidence interval for each
> newconc=c(5,15,25,35,45);newconc
[1] 5 15 25 35 45
> predict(lm.r,data.frame(conc = newconc), level = 0.9, interval = "confidence")
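predict() with interval = "confidence" returns a matrix with fit, lwr and upr columns. The interval for any one new point can be reproduced by hand (a sketch; x0 = 25 is an arbitrary choice from newconc, and the t-quantile uses the model's 3 residual degrees of freedom):

```r
conc   <- c(0, 10, 20, 30, 40)
signal <- c(4, 22, 44, 60, 82)
fit    <- lm(signal ~ conc)

x0   <- 25                                  # one new concentration
y0   <- coef(fit)[1] + coef(fit)[2] * x0    # point estimate
s    <- summary(fit)$sigma                  # residual standard error
se   <- s * sqrt(1/5 + (x0 - mean(conc))^2 / sum((conc - mean(conc))^2))
half <- qt(0.95, df = 3) * se               # 90% interval half-width

c(y0 - half, y0 + half)
```

The result matches the row for conc = 25 from `predict(fit, data.frame(conc = 25), level = 0.9, interval = "confidence")`.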
# add regression line to plots
> plot(conc, signal)
> abline(lm.r)