Abstract
Fitting a straight line through data can be done in many ways that seem different at first but, on closer inspection, rest on the same mathematics. In this chapter, we fit a line to data in twelve different ways and compare the resulting parameter estimates. Our goal is to estimate the intercept and the slope of the line.
Exercises
1. Prior uncertainty. Reduce the prior variances, which we specified as being \(10^4\) in Eq. (8.6), to more realistic values such as \(10^2\). Does that increase or decrease the posterior estimates for \(\beta \)? Why?
2. Kalman Filtering (KF) without intercept uncertainty. In our application of KF, we specified a very large prior uncertainty for \(\beta \). Repeat the KF exactly as before, but with \(\Sigma _{\beta }[1,1]\) set to zero. How does that change the posterior distribution for \(\beta \)? What kind of linear regression have you just done? Why is the posterior uncertainty for the slope parameter lower than before?
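Exercise 1 can be explored numerically. The sketch below is a minimal conjugate-prior calculation, not the book's own code: the synthetic data, the zero prior mean, the known measurement variance, and the function name `posterior` are all assumptions made for illustration. It compares the Gaussian posterior for \(\beta = (\text{intercept}, \text{slope})\) under the vague prior variance \(10^4\) of Eq. (8.6) and the tighter \(10^2\).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: y = 1 + 2x + noise (a stand-in for the chapter's dataset)
x = np.linspace(0.0, 10.0, 20)
y = 1.0 + 2.0 * x + rng.normal(0.0, 1.0, x.size)

X = np.column_stack([np.ones_like(x), x])  # design matrix: intercept, slope
sigma2 = 1.0                               # measurement variance, assumed known

def posterior(prior_var):
    """Gaussian posterior for beta with prior N(0, prior_var * I)."""
    S_prior_inv = np.eye(2) / prior_var
    S_post = np.linalg.inv(S_prior_inv + X.T @ X / sigma2)
    m_post = S_post @ (X.T @ y / sigma2)
    return m_post, S_post

m_wide, S_wide = posterior(1e4)    # vague prior, as in Eq. (8.6)
m_tight, S_tight = posterior(1e2)  # more realistic prior variance

# A tighter prior shrinks the posterior mean toward the prior mean (zero)
# and reduces the posterior variances of both parameters.
print("vague prior :", m_wide, np.diag(S_wide))
print("tight prior :", m_tight, np.diag(S_tight))
```

The same machinery hints at Exercise 2: setting a diagonal element of the prior covariance to zero fixes that parameter at its prior mean, so a KF with \(\Sigma_{\beta}[1,1]=0\) effectively regresses with a known intercept, leaving only the slope to estimate.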
Copyright information
© 2020 Springer Nature Switzerland AG
Cite this chapter
van Oijen, M. (2020). Twelve Ways to Fit a Straight Line. In: Bayesian Compendium. Springer, Cham. https://doi.org/10.1007/978-3-030-55897-0_8
DOI: https://doi.org/10.1007/978-3-030-55897-0_8
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-55896-3
Online ISBN: 978-3-030-55897-0
eBook Packages: Mathematics and Statistics (R0)