COVID-19 Regression Model and Other Thoughts

March 28, 2020
I am not an epidemiologist, but I do regression modeling for a living. I have done regression modeling to predict student grades just after the first test, and now we will be using adaptive learning metrics to improve our identification schemes even earlier in the semester. The point is not to spell doom for a student but only to intervene early with personalized recommendations.
Looking at whatever data I can access and have time to scrape, four things seem reasonably clear to me at this time about COVID-19.
1) First, the rate of infection is exponential at the outset, but it does not stay that way forever; the cumulative count follows a logistic curve (see the equation after this list). The slowing of the growth is analogous to how the mass of a moving rocket decreases as it burns up its fuel: F = ma, but m is not a constant.
2) Second, President Trump is finally thinking about quarantining the NY, NJ, and CT area. A little late, but it will certainly reduce the exponent of the growth.
3) Third, we have to get more testing done, but testing that is truly random. With random testing, we could have found the effect of the spring breakers coming to FL and of the college kids being sent home to parents who had their children late in life. Please do not send your grandkids to grandpa/grandma’s retirement home. They may be the children of the corn.
4) Fourth, Florida and Louisiana need to get their heads straightened out and use tougher rules to keep people inside, along with a method to keep outsiders out. They are the next hot zones.
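
For readers who want the equation behind the first item, here is the standard textbook logistic-growth model; the symbols below are generic and are not fitted to any particular outbreak data.

\[
\frac{dN}{dt} = r\,N\left(1 - \frac{N}{K}\right),
\qquad
N(t) = \frac{K}{1 + \dfrac{K - N_0}{N_0}\,e^{-rt}},
\]

where N is the cumulative number of infections, r is the early growth rate, K is the eventual saturation level, and N_0 is the starting count. When N is much smaller than K, the growth looks exponential; as N approaches K, the curve flattens out.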

Sum of the residuals for the linear regression model is zero.

Prove that the sum of the residuals for the linear regression model is zero.

[Scanned derivation: sum of residuals is zero, pages 1 and 2]
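
For readers who cannot view the scanned pages, here is a sketch of the proof; it assumes the standard straight-line model with an intercept.

For the model \( y = a_0 + a_1 x \) regressed to data \( (x_i, y_i),\ i = 1, \ldots, n \), the residuals are \( e_i = y_i - a_0 - a_1 x_i \), and the constants of regression minimize the sum of the squares of the residuals
\[
S_r = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i)^2 .
\]
Setting the partial derivative with respect to \( a_0 \) equal to zero gives
\[
\frac{\partial S_r}{\partial a_0} = -2 \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i) = 0
\quad\Longrightarrow\quad
\sum_{i=1}^{n} e_i = 0 .
\]
Note that the result relies on the model having the intercept term \( a_0 \); for a model forced through the origin, the residuals need not sum to zero.
________________________________________________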


Effect of Significant Digits: Example 2: Regression Formatting in Excel

In a series bringing pragmatic examples of the effect of significant digits, we discuss the influence of using the default and scientific number formats in the trendline feature of Microsoft Excel.  This is the second example in the series (the first example was on a beam deflection problem).
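
To give a flavor of the issue outside Excel, here is a minimal MATLAB sketch; the data values are made up purely for illustration. It compares predictions made with full-precision regression coefficients against predictions made with coefficients rounded to three significant digits, which is roughly what a default-formatted trendline label may display.

% Illustration with hypothetical data: how reporting regression
% coefficients with too few significant digits changes predictions.
x = [2 4 6 8 10 12];
y = [7.1 12.9 19.2 24.8 31.1 36.9];
p_full = polyfit(x, y, 1);                  % full-precision slope and intercept
p_label = round(p_full, 3, 'significant');  % coefficients as a 3-significant-digit label shows them
y_full = polyval(p_full, 100);              % extrapolated prediction, full precision
y_label = polyval(p_label, 100);            % same prediction with the rounded coefficients
fprintf('Full precision: %g  Rounded: %g  Difference: %g\n', ...
    y_full, y_label, y_full - y_label)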

________________________________________________

This post is brought to you by Holistic Numerical Methods: Numerical Methods for the STEM undergraduate at http://numericalmethods.eng.usf.edu, the textbook on Numerical Methods with Applications available from the lulu storefront, the textbook on Introduction to Programming Concepts Using MATLAB, and the YouTube video lectures available at http://numericalmethods.eng.usf.edu/videos.  Subscribe to the blog via a reader or email to stay updated with this blog. Let the information follow you.

Does the solve command in MATLAB not give you an answer?

Recently, I assigned a project to my class in which they needed to regress n x-y data points to the nonlinear regression model y=exp(b*x).  However, they were NOT allowed to transform the data, that is, transform the data so that linear regression formulas could be used to find the constant of regression b.  They had to do it the new-fashioned way: form the sum of the squares of the residuals and then minimize that sum with respect to the constant of regression b.

To do this, they took the following steps:

  1. set up the equation by declaring b as a syms variable,
  2. calculate the sum of the squares of the residuals using a loop,
  3. use the diff command to set up the equation, and
  4. use the solve command.

However, the solve command gave an odd-looking answer such as log(z1)/5 + (2*pi*k*i)/5, which is a parameterized family of complex solutions.  The students knew that the equation has only one real solution – this was deduced from the physics of the problem.

We did not want to set up a separate function m-file just to use a numerical solver such as fsolve.  To circumvent that, we approached it as follows.  If dbsr=0 is the equation you want to solve, use

F = vectorize(inline(char(dbsr)))
fsolve(F, -2.0)

The char command converts the symbolic expression dbsr to a string, inline constructs an inline function from that string, and vectorize rewrites the operators in the formula as element-wise ones (I do not fully understand whether this last step is needed here).
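
For completeness, here is a minimal sketch of the whole workflow. A few assumptions: the data values below are made up for illustration, matlabFunction is used as the modern replacement for the deprecated inline/vectorize pair, syms needs the Symbolic Math Toolbox, and fsolve needs the Optimization Toolbox.

% Minimal sketch (hypothetical data): regress y = exp(b*x) without
% transforming the data, by minimizing the sum of the squares of the
% residuals with respect to b.
xdata = [0.5 1.0 1.5 2.0 2.5];   % hypothetical x data
ydata = [1.3 1.7 2.1 2.8 3.5];   % hypothetical y data
syms b
ssr = sym(0);
for i = 1:length(xdata)
    ssr = ssr + (ydata(i) - exp(b*xdata(i)))^2;   % sum of the squares of the residuals
end
dbsr = diff(ssr, b);        % d(ssr)/db = 0 is the equation to solve
F = matlabFunction(dbsr);   % symbolic expression -> function handle
bval = fsolve(F, 0.5)       % real root near the initial guess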

This post is brought to you by Holistic Numerical Methods: Numerical Methods for the STEM undergraduate at http://numericalmethods.eng.usf.edu, the textbook on Numerical Methods with Applications available from the lulu storefront, the textbook on Introduction to Programming Concepts Using MATLAB, and the YouTube video lectures available at http://numericalmethods.eng.usf.edu/videos.  Subscribe to the blog via a reader or email to stay updated with this blog. Let the information follow you.

Does it make a large difference if we transform data for nonlinear regression models?

__________________________________________________

This post is brought to you by Holistic Numerical Methods: Numerical Methods for the STEM undergraduate at http://numericalmethods.eng.usf.edu, the textbook on Numerical Methods with Applications available from the lulu storefront, and the YouTube video lectures available at http://numericalmethods.eng.usf.edu/videos and http://www.youtube.com/numericalmethodsguy

Subscribe to the blog via a reader or email to stay updated with this blog. Let the information follow you.

To prove that the regression model corresponds to a minimum of the sum of the square of the residuals

When regression models are derived in books, they often show only the first-derivative test to find the formulas for the constants of the regression model.  Here we take a simple example and go through the complete derivation.

[Scanned derivation: finding the minimum of the sum of the squares of the residuals]
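
For readers who cannot view the scanned pages, here is the flavor of the derivation, worked for the simplest one-constant model y = a x (the choice of model is mine, for illustration only).

The sum of the squares of the residuals is
\[
S_r = \sum_{i=1}^{n} (y_i - a x_i)^2 .
\]
The first-derivative test gives the formula for the constant:
\[
\frac{dS_r}{da} = -2 \sum_{i=1}^{n} x_i (y_i - a x_i) = 0
\quad\Longrightarrow\quad
a = \frac{\sum_{i=1}^{n} x_i y_i}{\sum_{i=1}^{n} x_i^2} .
\]
The second-derivative test then confirms that this value of a corresponds to a minimum of \( S_r \), not a maximum or an inflection point:
\[
\frac{d^2 S_r}{da^2} = 2 \sum_{i=1}^{n} x_i^2 > 0 .
\]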

_________________________________________________________

This post is brought to you by Holistic Numerical Methods: Numerical Methods for the STEM undergraduate at http://numericalmethods.eng.usf.edu, the textbook on Numerical Methods with Applications available from the lulu storefront, and the YouTube video lectures available at http://numericalmethods.eng.usf.edu/videos and http://www.youtube.com/numericalmethodsguy

Subscribe to the blog via a reader or email to stay updated with this blog. Let the information follow you.

How do I do polynomial regression in MATLAB?

Many students ask me how to do this or that in MATLAB.  So I thought, why not devote a small series of my next few blogs to that?  In this blog, I show you how to do polynomial regression.

  • The MATLAB program link is here.
  • The HTML version of the MATLAB program is here.
  • DO NOT COPY AND PASTE THE PROGRAM BELOW BECAUSE THE SINGLE QUOTES DO NOT TRANSLATE TO THE CORRECT SINGLE QUOTES IN MATLAB EDITOR.  DOWNLOAD THE MATLAB PROGRAM INSTEAD

%% HOW DO I DO THAT IN MATLAB SERIES?
% In this series, I am answering questions that students have asked
% me about MATLAB.  Most of the questions relate to a mathematical
% procedure.

%% TOPIC
% How do I do polynomial regression?

%% SUMMARY

% Language : Matlab 2008a;
% Authors : Autar Kaw;
% Mfile available at
% http://numericalmethods.eng.usf.edu/blog/regression_polynomial.m;
% Last Revised : August 3, 2009;
% Abstract: This program shows you how to do polynomial regression.
clc
clear all
clf

%% INTRODUCTION

disp('ABSTRACT')
disp('   This program shows you how to do polynomial regression')
disp(' ')
disp('AUTHOR')
disp('   Autar K Kaw of https://autarkaw.wordpress.com')
disp(' ')
disp('MFILE SOURCE')
disp('   http://numericalmethods.eng.usf.edu/blog/regression_polynomial.m')
disp(' ')
disp('LAST REVISED')
disp('   August 3, 2009')
disp(' ')

%% INPUTS
% y vs x data to regress
% x data
x=[-340  -280  -200  -120  -40  40  80];
% ydata
y=[2.45  3.33  4.30   5.09  5.72  6.24  6.47];
% Where do you want to find the values at
xin=[-300 -100 20  125];
%% DISPLAYING INPUTS
disp('  ')
disp('INPUTS')
disp('________________________')
disp('     x         y  ')
disp('________________________')
dataval=[x;y]';
disp(dataval)
disp('________________________')
disp('   ')
disp('The x values where you want to predict the y values')
dataval=[xin]';
disp(dataval)
disp('________________________')
disp('  ')

%% THE CODE
% Using polyfit to conduct polynomial regression to a polynomial of order 1
pp=polyfit(x,y,1);
% Predicting values at given x values
yin=polyval(pp,xin);
% This is only for plotting the regression model
% Find the number of data points
n=length(x);
xplot=x(1):(x(n)-x(1))/10000:x(n);
yplot=polyval(pp,xplot);
%% DISPLAYING OUTPUTS
disp('  ')
disp('OUTPUTS')
disp('________________________')
disp('   xasked   ypredicted  ')
disp('________________________')
dataval=[xin;yin]';
disp(dataval)
disp('________________________')

plot(x,y,'o','MarkerSize',5,'MarkerEdgeColor','b','MarkerFaceColor','b')
hold on
plot(xin,yin,'o','MarkerSize',5,'MarkerEdgeColor','r','MarkerFaceColor','r')
plot(xplot,yplot,'LineWidth',2)
hold off
% Labels, title, and legend are set after plotting so they are not
% cleared when plot resets the axes
xlabel('x');
ylabel('y');
title('y vs x');
legend('Points given','Points found','Regression Curve','Location','East')
disp('  ')

This post is brought to you by Holistic Numerical Methods: Numerical Methods for the STEM undergraduate at http://numericalmethods.eng.usf.edu, the textbook on Numerical Methods with Applications available from the lulu storefront, and the YouTube video lectures available at http://numericalmethods.eng.usf.edu/videos and http://www.youtube.com/numericalmethodsguy

Subscribe to the blog via a reader or email to stay updated with this blog. Let the information follow you.