weights (optional) in linear regression

Discuss the jamovi platform, possible improvements, etc.
sehoon
Posts: 2
Joined: Sat Jul 09, 2022 12:44 am

weights (optional) in linear regression

Post by sehoon »

Hi,

I am a big jamovi fan. Regarding the linear regression, I wonder what "weights (optional)" means and how it can be used.
When applied using one of the factors, the regression results vary hugely.

It would be great if someone can explain what it is and how different it is from regressions without it.

Thanks!
MAgojam
Posts: 421
Joined: Thu Jun 08, 2017 2:33 pm
Location: Parma (Italy)

Re: weights (optional) in linear regression

Post by MAgojam »

Hey @sehoon,
this may be a bit off-topic for this board, but let's keep the answer short and simple.

One of the key assumptions of linear regression is that residuals are distributed with equal variance at each level of the predictor variable. This assumption is known as homoscedasticity.
When this assumption is violated, we say that heteroscedasticity is present in the residuals, and the regression results can become unreliable.
One way to handle this problem is to use weighted least squares regression instead, which assigns a weight to each observation: observations with small error variance receive more weight, since they carry more information than observations with large error variance.
If you have a numeric variable (positive values only) that contains the weight for each individual case, you can select it in the "Weights (optional)" box to help improve the overall fit of the model.
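To make the idea concrete, here is a minimal sketch of weighted least squares in plain Python/numpy (this is an illustration of the general technique, not jamovi's internal code; the data and the choice of weights as the inverse of the assumed error variance are hypothetical):

```python
import numpy as np

# Simulated heteroscedastic data: the error standard deviation grows with x,
# so observations at large x are noisier and less informative.
rng = np.random.default_rng(0)
x = np.linspace(1, 10, 50)
y = 2.0 + 3.0 * x + rng.normal(0, x)   # true intercept 2, slope 3

X = np.column_stack([np.ones_like(x), x])  # design matrix with intercept

# Ordinary least squares: every observation counts equally.
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# Weighted least squares: weight = 1 / error variance (assumed ~ x^2 here),
# so the precise, low-variance observations get more influence.
w = 1.0 / x**2
W = np.diag(w)
beta_wls = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

print("OLS estimates:", beta_ols)
print("WLS estimates:", beta_wls)
```

Selecting a weights variable in jamovi has the same effect as supplying `w` above: each case's contribution to the fit is scaled by its weight, which is why the results can differ sharply from the unweighted regression, especially if the variable you select isn't really a precision weight.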

Cheers,
Maurizio
zoniasuarez
Posts: 1
Joined: Fri Aug 19, 2022 2:33 am

Re: weights (optional) in linear regression

Post by zoniasuarez »

Thx Maurizio...all clear for me