Machine Learning: R Squared in Regression
R^2 = 1 - SS_res / SS_tot

SS_res = SUM ( Y_i - Ŷ_i )^2   (residual sum of squares, where Ŷ_i is the predicted value)

SS_tot = SUM ( Y_i - Ȳ )^2   (total sum of squares, where Ȳ is the mean of Y)
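The two sums of squares can be computed directly from the definitions above. A minimal sketch, using made-up observed values `y` and predictions `y_hat`:

```python
import numpy as np

# Hypothetical data: observed values y and model predictions y_hat.
y = np.array([3.0, 5.0, 7.0, 9.0, 11.0])
y_hat = np.array([2.8, 5.1, 7.2, 8.7, 11.2])

ss_res = np.sum((y - y_hat) ** 2)      # SS_res: residual sum of squares
ss_tot = np.sum((y - y.mean()) ** 2)   # SS_tot: total sum of squares
r2 = 1 - ss_res / ss_tot

print(round(r2, 4))  # → 0.9945
```

Here the predictions sit close to the observations, so SS_res is small relative to SS_tot and R^2 is close to 1.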
If Y = b0 + b1 * X1, i.e. a simple linear regression with a single independent variable, R^2 measures goodness of fit: the closer it is to 1, the better the model explains the variation in Y, and hence the better the prediction.
Y = b0 + b1 * X1 + b2 * X2: when a new variable (X2 with coefficient b2) is added to the regression, R^2 never decreases, because least squares will choose b2 so that SS_res either falls or stays the same. R^2 can therefore be inflated simply by adding variables, even useless ones, which is why we need a new measure: Adjusted R^2.
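This "R^2 never decreases" behaviour can be demonstrated numerically. A sketch with synthetic data, fitting by ordinary least squares via `numpy.linalg.lstsq`; the second regressor `x2` is pure noise, unrelated to `y`:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 50
x1 = rng.normal(size=n)
y = 2.0 + 3.0 * x1 + rng.normal(scale=0.5, size=n)

def r_squared(X, y):
    # Least-squares fit with an intercept column, then R^2 from the residuals.
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    y_hat = X @ beta
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1 - ss_res / ss_tot

r2_one = r_squared(x1.reshape(-1, 1), y)

# Add a regressor of pure noise, unrelated to y.
x2 = rng.normal(size=n)
r2_two = r_squared(np.column_stack([x1, x2]), y)

print(r2_two >= r2_one)  # → True: R^2 does not decrease when a variable is added
```

The guarantee holds for any nested pair of models: the larger model can always reproduce the smaller one by setting b2 = 0, so its SS_res can only be smaller or equal.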
Adjusted R Square
Adj R^2 = 1 - ( 1 - R^2 ) ( n - 1 ) / ( n - p - 1 )

p = number of regressors (independent variables)

n = sample size
When p increases (a new variable is added), the denominator ( n - p - 1 ) decreases, so the ratio ( 1 - R^2 )( n - 1 ) / ( n - p - 1 ) increases unless R^2 rises enough to compensate. Since Adj R^2 is 1 minus that term, Adjusted R^2 decreases whenever the new variable does not improve the fit sufficiently, even though plain R^2 still increases. Adjusted R^2 therefore penalizes uninformative variables, so the model we select by it tends to predict better and be more robust.
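The penalty is easy to see numerically. A sketch with hypothetical values, where adding a second regressor nudges R^2 up only slightly (0.900 to 0.901) for a sample of n = 30:

```python
def adjusted_r2(r2, n, p):
    # Adj R^2 = 1 - (1 - R^2) * (n - 1) / (n - p - 1)
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

# Hypothetical: R^2 rises only slightly when a 2nd regressor is added.
n = 30
adj_one = adjusted_r2(0.900, n, p=1)   # one regressor
adj_two = adjusted_r2(0.901, n, p=2)   # two regressors, tiny R^2 gain

print(round(adj_one, 4), round(adj_two, 4))  # → 0.8964 0.8937
```

Even though R^2 went up, Adjusted R^2 went down (0.8964 to 0.8937): the small gain in fit did not justify the extra parameter.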