
/sci/ - Science & Math



File: 28 KB, 636x162, qba2.jpg
No.4718303

anyone know how to do this? It's econometrics/statistics related

>> No.4718312

You need to minimize it.
Minimization means finding a point where the first derivative is 0 and the second derivative is positive.
It's a sum.
The derivative of a sum is equal to the sum of the derivatives.

Derive that bitch.

Hint: fucking gradients, how do they work?
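Assuming OP's screenshot is the usual least-squares objective (the thread later confirms it's OLS), the setup sketched above would look like this, with the gradient set to zero in each coordinate:

```latex
S(\beta_0, \beta_1) = \sum_{i=1}^{n} (y_i - \beta_0 - \beta_1 x_i)^2
\frac{\partial S}{\partial \beta_0} = -2 \sum_{i=1}^{n} (y_i - \beta_0 - \beta_1 x_i) = 0
\frac{\partial S}{\partial \beta_1} = -2 \sum_{i=1}^{n} x_i (y_i - \beta_0 - \beta_1 x_i) = 0
```

Solving these two first-order conditions simultaneously gives the estimators.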

>> No.4718387

but be careful... differentiate with respect to b0 and b1, treating x and y as constants

>> No.4718588

>>4718312
ok, that does help, thanks. But to differentiate, do I need to expand it first, or just bring the power down in front and multiply by the negative signs on b0 and b1 so it goes back to positive?

>> No.4718592

>>4718588
>But to differentiate, do I need to expand it first, or just bring the power down in front and multiply by the negative signs on b0 and b1 so it goes back to positive?

If you have to ask then you haven't understood the chain rule. Go back and review it.

You do not need to expand it.
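To spell out why no expansion is needed (a sketch of the chain-rule step the post above is pointing at): the power comes down, and you multiply by the derivative of the inside, which is where the negative sign comes from. For the slope coefficient, each term of the sum gives

```latex
\frac{\partial}{\partial \beta_1} (y_i - \beta_0 - \beta_1 x_i)^2
  = 2 (y_i - \beta_0 - \beta_1 x_i) \cdot (-x_i)
```

and similarly the inner derivative with respect to <span class="math">\beta_0[/spoiler] is just <span class="math">-1[/spoiler].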

>> No.4718602
File: 1.09 MB, 965x815, econometrics.png

OP, have you posted some other econometrics threads? We have had an unusual number of them recently. Pic related: it's some econometrics I did for another homework question someone asked.

I know how to do it, although these are a bitch. Whenever I faced problems like this in econometrics, the answer always came from just plugging in the provided formulas and then working through them using the rules of summation.

If I were to take a crack at this that is the first thing I would do.

>>4718312

I am pretty sure calculus won't be necessary to solve OP's problem.

>> No.4718618

>>4718602
who said it was necessary?

>> No.4718630

>>4718618

You (or that guy) said to take the derivative and set it equal to zero.

That's not what we mean by minimization here. The estimators are like the coefficients used to determine some dependent variable. By minimization, we mean: what's the smallest difference between our statistical model and reality that we can achieve?

Although I think it's weird to use the word minimization at all.

>> No.4718681

Damn, this is tough.

I just did about a page of work trying to figure this one out. I think I have made some progress, but not an answer.

>> No.4718976

Ugh, scalar notation. OP, here are two ways to derive coefficient estimates for OLS (split over two posts).

Matrix notation:
<span class="math">Y = Xb + \epsilon[/spoiler]
<span class="math">X'Y = X'Xb + X'\epsilon[/spoiler]
By assumption (exogeneity of the regressors, i.e., no omitted variable bias), <span class="math">X[/spoiler] and <span class="math">\epsilon[/spoiler] are orthogonal, and hence <span class="math">X'\epsilon = 0[/spoiler].
<span class="math">X'Y = X'Xb \Rightarrow b = (X'X)^{-1} X'Y[/spoiler]

This is the smart way of solving this stuff. Since you probably really need the scalar notation, I'll show it as well:
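A quick numerical sanity check of the matrix formula above. This is a sketch with made-up toy data (the coefficients 2 and 3 and the sample size are illustrative, not from the thread), comparing <span class="math">(X'X)^{-1} X'Y[/spoiler] against numpy's built-in least-squares solver:

```python
import numpy as np

# Toy data (made up for illustration): y = 2 + 3x + small noise
rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 2 + 3 * x + rng.normal(scale=0.1, size=100)

# Design matrix with an intercept column
X = np.column_stack([np.ones_like(x), x])

# b = (X'X)^{-1} X'Y, exactly as in the post above
b = np.linalg.inv(X.T @ X) @ X.T @ y

# Cross-check against numpy's least-squares solver
b_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.allclose(b, b_lstsq))  # True
```

(In practice you'd use `lstsq` or a QR/Cholesky solve rather than forming the inverse explicitly, but the closed form is the point here.)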

>> No.4718993

>>4718976
Fucked up my implication sign. Should've been \Rightarrow, of course. Hopefully I haven't fucked up my LaTeX in this post.

Scalar notation:
<span class="math">E[\epsilon] = E[y - \beta_0 - \beta_1 x] = 0[/spoiler]
The sample equivalent is that <span class="math"> n^{-1} \sum^{n}_{i=1} (y_i - \hat{\beta_0} - \hat{\beta_1} x_i) = 0[/spoiler]
We also have the property that <span class="math">E[x\epsilon] = E[x (y - \beta_0 - \beta_1 x)] = 0[/spoiler], which means that our error term is uncorrelated with the explanatory variable. The sample equivalent is <span class="math"> n^{-1} \sum^{n}_{i=1} x_i (y_i - \hat{\beta_0} - \hat{\beta_1} x_i) = 0[/spoiler]
It should be obvious to see that the first equation implies that <span class="math">\bar{y} = \hat{\beta_0} + \hat{\beta_1} \bar{x} \Rightarrow \hat{\beta_0} = \bar{y} - \hat{\beta_1} \bar{x}[/spoiler]
We then use the zero-covariance equation (while dropping the <span class="math">n^{-1}[/spoiler] because it's not important here anyway): <span class="math">\sum^{n}_{i=1} x_i ( y_i - (\bar{y} - \hat{\beta_1} \bar{x}) - \hat{\beta_1} x_i) = 0[/spoiler].
This can be rewritten as <span class="math">\sum^{n}_{i=1} x_i (y_i - \bar{y}) = \hat{\beta_1} \sum^{n}_{i=1} x_i (x_i - \bar{x})[/spoiler]
It's easy to show (really) that <span class="math">\sum^{n}_{i=1} x_i(x_i - \bar{x}) = \sum^{n}_{i=1} (x_i - \bar{x})^2[/spoiler] and that <span class="math"> \sum^{n}_{i=1} x_i (y_i - \bar{y}) = \sum^{n}_{i=1} (x_i - \bar{x})(y_i - \bar{y})[/spoiler]

The solution should be trivial to see now, since there's only one step left.
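For the "easy to show" identities above, the missing step is just that deviations from the mean sum to zero, so writing <span class="math">x_i = (x_i - \bar{x}) + \bar{x}[/spoiler] makes the extra term vanish:

```latex
\sum_{i=1}^{n} x_i (x_i - \bar{x})
  = \sum_{i=1}^{n} (x_i - \bar{x})^2 + \bar{x} \sum_{i=1}^{n} (x_i - \bar{x})
  = \sum_{i=1}^{n} (x_i - \bar{x})^2
```

since <span class="math">\sum_{i=1}^{n} (x_i - \bar{x}) = \sum_{i=1}^{n} x_i - n\bar{x} = 0[/spoiler]. The identity for <span class="math">\sum x_i (y_i - \bar{y})[/spoiler] follows by the same trick.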

>> No.4719000

>>4718993

Goddamn it. Let's try this again. Wish my TeX preview button were still here...

It should be obvious to see that the first equation implies that <span class="math">\bar{y} = \hat{\beta_0} + \hat{\beta_1} \bar{x} \Rightarrow \hat{\beta_0} = \bar{y} - \hat{\beta_1} \bar{x}[/spoiler]
We then use the zero-covariance equation (while dropping the <span class="math">n^{-1}[/spoiler] because it's not important here anyway): <span class="math">\sum^{n}_{i=1} x_i ( y_i - (\bar{y} - \hat{\beta_1} \bar{x}) - \hat{\beta_1} x_i) = 0[/spoiler].
This can be rewritten as <span class="math">\sum^{n}_{i=1} x_i (y_i - \bar{y}) = \hat{\beta_1} \sum^{n}_{i=1} x_i (x_i - \bar{x})[/spoiler]
It's easy to show (really) that <span class="math">\sum^{n}_{i=1} x_i(x_i - \bar{x}) = \sum^{n}_{i=1} (x_i - \bar{x})^2[/spoiler] and that <span class="math"> \sum^{n}_{i=1} x_i (y_i - \bar{y}) = \sum^{n}_{i=1} (x_i - \bar{x})(y_i - \bar{y})[/spoiler]

The solution should be trivial to see now, since there's only one step left.
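The scalar derivation above lands on the usual covariance-over-variance slope. A quick numerical check of those scalar formulas (a sketch with made-up toy data, not from the thread), verified against numpy's `polyfit`:

```python
import numpy as np

# Toy data (illustrative only): y = 1.5 - 0.7x + noise
rng = np.random.default_rng(1)
x = rng.normal(size=50)
y = 1.5 - 0.7 * x + rng.normal(scale=0.2, size=50)

xbar, ybar = x.mean(), y.mean()

# Slope from the demeaned-sums form derived above
b1 = np.sum((x - xbar) * (y - ybar)) / np.sum((x - xbar) ** 2)
# Intercept from the first moment condition
b0 = ybar - b1 * xbar

# Cross-check: np.polyfit returns coefficients highest degree first,
# so a degree-1 fit gives [slope, intercept]
slope, intercept = np.polyfit(x, y, 1)
print(np.allclose([b0, b1], [intercept, slope]))  # True
```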

>> No.4719013

How do I delete failed posts? Pressing the 'delete' button at the bottom didn't seem to do much, and I don't want to spam the thread.

We then use the zero-covariance equation (while dropping the <span class="math">n^{-1}[/spoiler] because it's not important here anyway): <span class="math">\sum^{n}_{i=1} x_i ( y_i - (\bar{y} - \hat{\beta_1} \bar{x}) - \hat{\beta_1} x_i) = 0[/spoiler].
This can be rewritten as <span class="math">\sum^{n}_{i=1} x_i (y_i - \bar{y}) = \hat{\beta_1} \sum^{n}_{i=1} x_i (x_i - \bar{x})[/spoiler]
It's easy to show (really) that <span class="math">\sum^{n}_{i=1} x_i(x_i - \bar{x}) = \sum^{n}_{i=1} (x_i - \bar{x})^2[/spoiler] and that <span class="math"> \sum^{n}_{i=1} x_i (y_i - \bar{y}) = \sum^{n}_{i=1} (x_i - \bar{x})(y_i - \bar{y})[/spoiler]

The solution should be trivial to see now, since there's only one step left.

>> No.4719058

>>4719013

By the name area there is a little check box. Check the boxes and then press delete.

The whole Beta times summation = the summation of x(y-ybar) thing was cool and makes sense. I will remember that one.

>I need to learn how to latex

>> No.4719064

>>4719013
ok, think I get it now, thanks a heap. Although I have no idea how to use LaTeX and don't know much about matrices, it still helped

>> No.4719091
File: 243 KB, 3600x1300, 1310681628183.png

>>4719064
LaTeX is great, but it's kind of arse here on /sci/ if you can't preview it. The odds of making mistakes are somewhat high, and there's no way to edit posts to fix them. In normal documents I use the \Sum command because it's clearer, but apparently that one doesn't work here and I have to use \sum. A tiny typographical difference, but every such typo can fuck up the entire TeX sequence.

>>4719058
I showed both the (simple) matrix and scalar derivation, so hopefully you should have the answer now.

>> No.4719098

>>4719091
You can always check the delete button in the bottom right hand of the page to delete your post. It usually takes me a couple tries to get the LaTeX right. Just select all and copy before you post, then if there's a mistake, delete the post and try again. Or you can download TeX and write them in a TeX editor, then just use snipping tool or some such.

>> No.4719125

>>4719098

Tried to do that several times, but it didn't do anything. Not sure why the delete button didn't work. As for using a TeX editor, I've got several installed and I use them a lot, but /sci/ doesn't seem to support all TeX commands (\Sum doesn't work, but \sum does, for instance), so unfortunately previewing it there isn't quite as helpful as it should be.