{"nbformat":4,"nbformat_minor":0,"metadata":{"kernelspec":{"display_name":"Python 3 (Anaconda)","language":"python","name":"anaconda3"},"language_info":{"codemirror_mode":{"name":"ipython","version":3},"file_extension":".py","mimetype":"text/x-python","name":"python","nbconvert_exporter":"python","pygments_lexer":"ipython3","version":"3.5.3"},"colab":{"name":"CurveFitting-Working.ipynb","provenance":[],"collapsed_sections":[]}},"cells":[{"cell_type":"markdown","metadata":{"id":"OmsFis6Mtcdt"},"source":["# Curve Fitting"]},{"cell_type":"markdown","metadata":{"id":"nHkH3jjKtcd0"},"source":["Suppose that you want to fit a set of data points $(x_i,y_i)$, where\n","$i = 1,2,\\ldots,N$, to a function that can't be linearized. For example, the function could be a second-order polynomial, $f(x,a,b,c)=ax^2+bx+c$. There isn’t an analytical expression for finding the best-fit parameters ($a$, $b$, and $c$ in this example) as there is for linear regression with uncertainties in one dimension. The usual approach is to optimize the parameters to minimize the sum of the squares of the differences between the data and the function. How the difference are defined varies. If there are only uncertainties in the y direction, then the differences in the vertical direction (the gray lines in the figure below) are used. If there are uncertainties in both the $x$ and $y$ directions, the orthogonal (perpendicular) distances from the line (the dotted red lines in the figure below) are used."]},{"cell_type":"markdown","source":["