  1. regression - When is R squared negative? - Cross Validated

    For simple OLS regression with one predictor, R-squared is equivalent to the squared correlation between the predictor and the dependent variable -- again, this must be non-negative.
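
    To see this numerically, here is a minimal NumPy sketch (simulated data, illustrative names): the R-squared of a one-predictor OLS fit matches the squared correlation between x and y, so it cannot be negative.

        import numpy as np

        rng = np.random.default_rng(0)
        x = rng.normal(size=100)
        y = 2.0 * x + rng.normal(size=100)

        # Fit simple OLS y = a + b*x by least squares.
        X = np.column_stack([np.ones_like(x), x])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta

        # R^2 = 1 - SSE/SST; with one predictor it equals corr(x, y)^2, so it is >= 0.
        r2 = 1.0 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))
        r_xy = np.corrcoef(x, y)[0, 1]
        print(r2, r_xy ** 2)  # the two numbers agree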

  2. regression - Interpreting the residuals vs. fitted values plot for ...

    Therefore, the second and third plots, which seem to indicate dependency between the residuals and the fitted values, suggest a different model. But why does the second plot suggest, as …
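
    The kind of pattern that thread discusses can be reproduced with a small sketch (simulated data; NumPy and matplotlib assumed available): fitting a straight line to curved data leaves a U-shaped trace in the residuals-vs-fitted plot.

        import numpy as np
        import matplotlib.pyplot as plt

        rng = np.random.default_rng(1)
        x = rng.uniform(-2, 2, 200)
        y = 1.0 + x + x**2 + rng.normal(scale=0.3, size=200)  # true relation is curved

        # Fit a straight line anyway, then inspect residuals against fitted values.
        X = np.column_stack([np.ones_like(x), x])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        fitted = X @ beta
        resid = y - fitted

        plt.scatter(fitted, resid, s=10)
        plt.axhline(0.0, color="grey")
        plt.xlabel("fitted values")
        plt.ylabel("residuals")
        plt.title("U-shaped pattern: the linear model misses the curvature")
        plt.show()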

  3. Why is ANOVA equivalent to linear regression? - Cross Validated

    Oct 4, 2015 · ANOVA and linear regression are equivalent when the two models test against the same hypotheses and use an identical encoding. The models differ in their basic aim: ANOVA …
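
    A quick numerical check of that equivalence, assuming scipy and statsmodels are available (the three groups are simulated purely for illustration): the one-way ANOVA F statistic matches the overall F test of a regression on k - 1 dummy variables.

        import numpy as np
        from scipy import stats
        import statsmodels.api as sm

        rng = np.random.default_rng(2)
        g1 = rng.normal(0.0, 1.0, 30)
        g2 = rng.normal(0.5, 1.0, 30)
        g3 = rng.normal(1.0, 1.0, 30)

        # One-way ANOVA F test.
        f_anova, p_anova = stats.f_oneway(g1, g2, g3)

        # The same test as a linear regression on two dummy variables (g1 = reference).
        y = np.concatenate([g1, g2, g3])
        d2 = np.r_[np.zeros(30), np.ones(30), np.zeros(30)]
        d3 = np.r_[np.zeros(30), np.zeros(30), np.ones(30)]
        X = sm.add_constant(np.column_stack([d2, d3]))
        fit = sm.OLS(y, X).fit()

        print(f_anova, fit.fvalue)    # identical F statistics
        print(p_anova, fit.f_pvalue)  # identical p-values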

  4. How should outliers be dealt with in linear regression analysis?

    Oftentimes a statistical analyst is handed a dataset and asked to fit a model using a technique such as linear regression. Very frequently the dataset is accompanied by a disclaimer similar...

  5. regression - Why does adding more terms into a linear model …

    Jan 12, 2015 · Many statistics textbooks state that adding more terms into a linear model always reduces the residual sum of squares and in turn increases the R-squared value. This has led to the use …
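
    A minimal NumPy sketch of that behaviour (the extra predictors here are pure noise, added only for illustration): R-squared never decreases as columns are appended to the design matrix.

        import numpy as np

        rng = np.random.default_rng(3)
        n = 100
        x = rng.normal(size=n)
        y = x + rng.normal(size=n)

        def r_squared(X, y):
            # OLS fit with an intercept; R^2 = 1 - SSE/SST.
            X = np.column_stack([np.ones(len(y)), X])
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            resid = y - X @ beta
            return 1.0 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

        # Keep appending pure-noise predictors: R^2 can only stay the same or rise.
        X = x.reshape(-1, 1)
        for _ in range(5):
            print(round(r_squared(X, y), 4))
            X = np.column_stack([X, rng.normal(size=n)])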

  6. What happens when we introduce more variables to a linear regression model?

    Feb 22, 2020 · What happens when we introduce more variables to a linear regression model?

  7. Linear Regression For Binary Independent Variables - Interpretation

    Jan 18, 2019 · For linear regression, you would code the variables as dummy variables (1/0 for presence/absence) and interpret the predictors as "the presence of this variable increases …
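
    A small NumPy sketch of that interpretation (simulated data, hypothetical variable names): with an intercept in the model, the coefficient on a 1/0 dummy equals the difference between the two group means.

        import numpy as np

        rng = np.random.default_rng(4)
        present = rng.integers(0, 2, size=200)          # 1/0 dummy: presence/absence
        y = 3.0 + 1.5 * present + rng.normal(size=200)

        X = np.column_stack([np.ones_like(y), present])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)

        # The dummy's coefficient is the difference between the two group means.
        print(beta[1])
        print(y[present == 1].mean() - y[present == 0].mean())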

  8. model - When forcing intercept of 0 in linear regression is …

    Jun 10, 2014 · The problem is, if you fit an ordinary linear regression, the fitted intercept is substantially negative, which causes some of the fitted values to be negative. The blue line is the OLS fit; the …
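
    A minimal sketch of the two fits with NumPy (simulated data chosen so the true intercept is negative): dropping the constant column is what "forcing the intercept to 0" amounts to, and it distorts the slope when the data sit far from the origin.

        import numpy as np

        rng = np.random.default_rng(5)
        x = rng.uniform(5.0, 10.0, 50)
        y = -4.0 + 1.2 * x + rng.normal(scale=0.5, size=50)  # true intercept is negative

        # Ordinary fit: the intercept is estimated freely (here it comes out negative).
        X = np.column_stack([np.ones_like(x), x])
        b_free, *_ = np.linalg.lstsq(X, y, rcond=None)

        # Forcing the line through the origin: drop the constant column.
        b_zero, *_ = np.linalg.lstsq(x.reshape(-1, 1), y, rcond=None)

        print(b_free)  # [intercept, slope], intercept < 0
        print(b_zero)  # slope only; the fit is biased over the observed range of x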

  9. In linear regression, when is it appropriate to use the log of an ...

    Aug 24, 2021 · Taking logarithms allows these models to be estimated by linear regression. Good examples of this include the Cobb-Douglas production function in economics and the Mincer …
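
    A short NumPy sketch of the Cobb-Douglas case (parameter values are made up for illustration): taking logs turns Y = A * K^a * L^b into a model that is linear in log A, a and b, so ordinary least squares recovers the exponents.

        import numpy as np

        rng = np.random.default_rng(6)
        K = rng.uniform(1.0, 10.0, 200)  # capital
        L = rng.uniform(1.0, 10.0, 200)  # labour
        Y = 2.0 * K**0.3 * L**0.7 * np.exp(rng.normal(scale=0.1, size=200))

        # Cobb-Douglas Y = A * K^a * L^b becomes linear after taking logs:
        #   log Y = log A + a*log K + b*log L
        X = np.column_stack([np.ones_like(Y), np.log(K), np.log(L)])
        beta, *_ = np.linalg.lstsq(X, np.log(Y), rcond=None)
        print(np.exp(beta[0]), beta[1], beta[2])  # roughly recovers A=2, a=0.3, b=0.7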

  10. When is it ok to remove the intercept in a linear regression model ...

    The standard regression model is parametrized as intercept + k - 1 dummy vectors. The intercept codes the expected value for the "reference" group, or the omitted vector, and the remaining …
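
    A small NumPy sketch of both parametrizations (three simulated groups, group 0 as the reference): with an intercept and k - 1 dummies the intercept estimates the reference-group mean, while dropping the intercept and using all k dummies gives the group means directly.

        import numpy as np

        rng = np.random.default_rng(7)
        groups = np.repeat([0, 1, 2], 40)  # k = 3 groups, group 0 is the reference
        y = np.array([1.0, 2.5, 4.0])[groups] + rng.normal(size=120)

        d1 = (groups == 1).astype(float)
        d2 = (groups == 2).astype(float)

        # Intercept plus k - 1 dummies: intercept = reference-group mean,
        # each dummy coefficient = that group's mean minus the reference mean.
        X = np.column_stack([np.ones_like(y), d1, d2])
        b_with, *_ = np.linalg.lstsq(X, y, rcond=None)

        # No intercept, all k dummies: the coefficients are the group means.
        d0 = (groups == 0).astype(float)
        X0 = np.column_stack([d0, d1, d2])
        b_without, *_ = np.linalg.lstsq(X0, y, rcond=None)

        print(b_with)     # [mean(g0), mean(g1)-mean(g0), mean(g2)-mean(g0)]
        print(b_without)  # [mean(g0), mean(g1), mean(g2)]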