
/sci/ - Science & Math


>> No.12332631
File: asdc.png (5 KB, 268x153)

>>12332227
Ok, so I tried to ask this question earlier but it was very badly formulated, so here it goes again. It's about linear regression.

If I'm understanding it right, when we want to do statistical inference with a linear regression model, we have to consider our observations (x, y) as random variables, because they have been drawn from random samples.

BUT the Gauss-Markov assumptions tell us to treat the values of x as "fixed", in the sense that they're deterministic, not stochastic.

First question: does this mean that only the y values are considered random variables? And, if the y's are random variables, then the estimated coefficients (the betas) are too, and so are the residuals.

If that reasoning is right, here's my question: why does my textbook (which hasn't been helpful at all) say that the expected value of Y given the values of x is the "systematic (deterministic) part"? That part depends on the values of x, which we agreed are deterministic, but also on the coefficients. The coefficients depend, in turn, on the values of x and y. If Y is a random variable and each observation y is random with a certain probability, then the coefficients beta are also random variables. Why, then, do we call this part "systematic"?

I have the feeling that I'm getting something incredibly wrong, but I can't figure out what; my professor has completely disappeared and the textbooks are not helpful.
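A quick simulation can make the distinction concrete (a minimal sketch in NumPy; the true coefficients, noise level, and design values are made up for illustration). The point: in E[Y|x] = beta0 + beta1*x, the population betas are fixed unknown constants, which is why that part is called systematic, while the OLS *estimates* beta-hat are random variables because they are computed from the random y's — repeating the sampling with the same fixed x gives a different beta-hat each time.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed (deterministic) design: the SAME x values in every hypothetical sample.
x = np.linspace(0, 10, 50)
X = np.column_stack([np.ones_like(x), x])

beta_true = np.array([2.0, 0.5])   # unknown but fixed population coefficients
systematic = X @ beta_true         # E[Y | x]: deterministic once x is fixed

# Repeat the sampling: only the error term (and hence y) changes each time.
estimates = []
for _ in range(1000):
    y = systematic + rng.normal(0.0, 1.0, size=x.size)  # y is random via the errors
    beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)    # OLS estimate for this sample
    estimates.append(beta_hat)

estimates = np.array(estimates)
print(estimates.mean(axis=0))  # close to beta_true: OLS is unbiased
print(estimates.std(axis=0))   # nonzero: the estimators beta-hat are random variables
```

So "systematic part" refers to X @ beta_true (fixed constants), not to X @ beta_hat; the estimates inherit randomness from y, but the population quantity they target does not.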

