# Tag: Regression

## Sum of the residuals for the linear regression model is zero.

Prove that the sum of the residuals for the linear regression model is zero.

________________________________________________
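One way the proof can be sketched, using the standard straight-line least-squares setup, where the residuals are e_i = y_i − (a_0 + a_1 x_i):

```latex
S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} \left(y_i - a_0 - a_1 x_i\right)^2
```

At the least-squares optimum, the partial derivative with respect to the intercept a_0 is zero:

```latex
\frac{\partial S_r}{\partial a_0} = -2 \sum_{i=1}^{n} \left(y_i - a_0 - a_1 x_i\right) = 0
\quad\Longrightarrow\quad \sum_{i=1}^{n} e_i = 0
```

Note that the argument relies on the model having an intercept term a_0; for a model forced through the origin, the sum of the residuals need not be zero.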

This post is brought to you by

- Holistic Numerical Methods Open Course Ware:
  - Numerical Methods for the STEM undergraduate at http://nm.MathForCollege.com
  - Introduction to Matrix Algebra for the STEM undergraduate at http://ma.MathForCollege.com
- the textbooks on
- the Massive Open Online Courses (MOOCs) available at

## Effect of Significant Digits: Example 2: Regression Formatting in Excel

In a series of pragmatic examples on the effect of significant digits, we discuss the influence of using the default and scientific formats in the trendline function of Microsoft Excel. This is the second example in the series (the first example was on a beam deflection problem).

________________________________________________

This post is brought to you by Holistic Numerical Methods: Numerical Methods for the STEM undergraduate at http://numericalmethods.eng.usf.edu, the textbook on Numerical Methods with Applications available from the lulu storefront, the textbook on Introduction to Programming Concepts Using MATLAB, and the YouTube video lectures available at http://numericalmethods.eng.usf.edu/videos. Subscribe to the blog via a reader or email to stay updated with this blog. Let the information follow you.

## Does the solve command in MATLAB not give you an answer?

Recently, I had assigned a project to my class where they needed to regress *n* *x-y* data points to the nonlinear regression model *y* = exp(*bx*). However, they were NOT allowed to transform the data, that is, to transform the data so that linear regression formulas could be used to find the constant of regression *b*. They had to do it the new-fashioned way: find the sum of the squares of the residuals and then minimize that sum with respect to the constant of regression *b*.

To do this, they conducted the following steps:

- set up the equation by declaring *b* as a syms variable,
- calculate the sum of the squares of the residuals using a loop,
- use the *diff* command to set up the equation,
- use the *solve* command.

However, the *solve* command gave some odd answer like log(z1)/5 + (2*pi*k*i)/5. The students knew that the equation has only one real solution – this was deduced from the physics of the problem.

We did not want to set up a separate function m-file just to use a numerical solver such as *fsolve*. To circumvent setting up a separate function m-file, we approached it as follows. If dbsr=0 is the equation you want to solve, use

```matlab
F = vectorize(inline(char(dbsr)))
fsolve(F, -2.0)
```

What the *char* command does is convert the symbolic expression dbsr to a string; *inline* then constructs an inline function from that string, and the *vectorize* command vectorizes the formula (I do not fully understand this last part myself, or whether it is needed).
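The workflow above is MATLAB-specific, but the underlying computation can be sketched in plain Python: differentiate the sum of the squares of the residuals for *y* = exp(*bx*) by hand and find the root of the derivative numerically. This is only an illustrative sketch with made-up data, and a simple bisection stands in for *fsolve*:

```python
import math

# Made-up data generated from y = exp(0.9 x), so b = 0.9 should be recovered.
xs = [0.2, 0.5, 1.0, 1.5, 2.0]
ys = [math.exp(0.9 * x) for x in xs]

def dsr_db(b):
    # Hand derivative of Sr(b) = sum (y_i - exp(b x_i))^2 :
    # dSr/db = -2 sum x_i exp(b x_i) (y_i - exp(b x_i))
    return -2.0 * sum(x * math.exp(b * x) * (y - math.exp(b * x))
                      for x, y in zip(xs, ys))

def bisect(f, lo, hi, tol=1e-12):
    # Basic bisection root finder; assumes f(lo) and f(hi) bracket a root.
    flo = f(lo)
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        fm = f(mid)
        if abs(hi - lo) < tol:
            break
        if (flo < 0) == (fm < 0):
            lo, flo = mid, fm
        else:
            hi = mid
    return 0.5 * (lo + hi)

b = bisect(dsr_db, 0.0, 2.0)
print(b)  # close to 0.9
```

Setting the derivative of the sum of the squares of the residuals to zero and root-finding is exactly what the *diff*/*fsolve* combination does; only the tooling differs.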

## Does it make a large difference if we transform data for nonlinear regression models?

__________________________________________________


## To prove that the regression model corresponds to a minimum of the sum of the square of the residuals

Many books, when deriving regression models, show only the first-derivative test to find the formulas for the constants of the regression model. Here we take a simple example and go through the complete derivation, including the check that the solution corresponds to a minimum of the sum of the squares of the residuals.
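As a minimal one-parameter illustration of the idea, take the model ŷ = mx through the origin:

```latex
S_r(m) = \sum_{i=1}^{n} \left(y_i - m x_i\right)^2, \qquad
\frac{dS_r}{dm} = -2 \sum_{i=1}^{n} x_i \left(y_i - m x_i\right), \qquad
\frac{d^2 S_r}{dm^2} = 2 \sum_{i=1}^{n} x_i^2 > 0
```

Since the second derivative is positive for any nonzero data, the stationary point found by the first-derivative test is indeed a minimum. For models with more than one constant, the analogous check is that the Hessian matrix of S_r is positive definite.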

_________________________________________________________


## How do I do polynomial regression in MATLAB?

Many students ask me how to do this or that in MATLAB, so I thought: why not have a small series of my next few blogs answer such questions? In this blog, I show you how to do polynomial regression.

- The MATLAB program link is here.
- The HTML version of the MATLAB program is here.
**Do not copy and paste the program below, because single quotes may not translate to the correct single quotes in the MATLAB editor. Download the MATLAB program instead.**

```matlab
%% HOW DO I DO THAT IN MATLAB SERIES?
% In this series, I am answering questions that students have asked
% me about MATLAB. Most of the questions relate to a mathematical
% procedure.

%% TOPIC
% How do I do polynomial regression?

%% SUMMARY
% Language : Matlab 2008a;
% Authors : Autar Kaw;
% Mfile available at
% http://numericalmethods.eng.usf.edu/blog/regression_polynomial.m;
% Last Revised : August 3, 2009;
% Abstract: This program shows you how to do polynomial regression.

clc
clear all
clf

%% INTRODUCTION

disp('ABSTRACT')
disp('   This program shows you how to do polynomial regression')
disp(' ')
disp('AUTHOR')
disp('   Autar K Kaw of https://autarkaw.wordpress.com')
disp(' ')
disp('MFILE SOURCE')
disp('   http://numericalmethods.eng.usf.edu/blog/regression_polynomial.m')
disp(' ')
disp('LAST REVISED')
disp('   August 3, 2009')
disp(' ')

%% INPUTS
% y vs x data to regress
% x data
x=[-340 -280 -200 -120 -40 40 80];
% y data
y=[2.45 3.33 4.30 5.09 5.72 6.24 6.47];
% Where do you want to find the values at
xin=[-300 -100 20 125];

%% DISPLAYING INPUTS

disp(' ')
disp('INPUTS')
disp('________________________')
disp('     x        y ')
disp('________________________')
dataval=[x;y]';
disp(dataval)
disp('________________________')
disp(' ')
disp('The x values where you want to predict the y values')
dataval=[xin]';
disp(dataval)
disp('________________________')
disp(' ')

%% THE CODE
% Using polyfit to conduct polynomial regression to a polynomial of order 1
pp=polyfit(x,y,1);
% Predicting values at given x values
yin=polyval(pp,xin);
% This is only for plotting the regression model
% Find the number of data points
n=length(x);
xplot=x(1):(x(n)-x(1))/10000:x(n);
yplot=polyval(pp,xplot);

%% DISPLAYING OUTPUTS

disp(' ')
disp('OUTPUTS')
disp('________________________')
disp('   xasked   ypredicted ')
disp('________________________')
dataval=[xin;yin]';
disp(dataval)
disp('________________________')
disp(' ')

% Plot first, then label; calling plot after xlabel/ylabel/title
% would erase the labels, since plot resets the axes.
plot(x,y,'o','MarkerSize',5,'MarkerEdgeColor','b','MarkerFaceColor','b')
hold on
plot(xin,yin,'o','MarkerSize',5,'MarkerEdgeColor','r','MarkerFaceColor','r')
plot(xplot,yplot,'LineWidth',2)
hold off
xlabel('x');
ylabel('y');
title('y vs x');
legend('Points given','Points found','Regression Curve','Location','East')
```
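For readers outside MATLAB, the first-order fit that polyfit(x, y, 1) performs can be sketched in plain Python using the closed-form solution of the normal equations. The data below are the ones used in the program above; this is an illustrative sketch, not output copied from MATLAB:

```python
# Straight-line least-squares fit y = a0 + a1*x, mirroring polyfit(x, y, 1).
x = [-340, -280, -200, -120, -40, 40, 80]
y = [2.45, 3.33, 4.30, 5.09, 5.72, 6.24, 6.47]
xin = [-300, -100, 20, 125]  # x values at which to predict y

n = len(x)
sx, sy = sum(x), sum(y)
sxx = sum(v * v for v in x)
sxy = sum(u * v for u, v in zip(x, y))

# Closed-form solution of the normal equations
a1 = (n * sxy - sx * sy) / (n * sxx - sx * sx)
a0 = (sy - a1 * sx) / n

# Predicted values at the requested points (polyval equivalent)
yin = [a0 + a1 * v for v in xin]
print(yin)
```

As a tie-in to the first post on this page, the residuals of this fit sum to zero (to floating-point precision), since the model has an intercept term.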
