
/sci/ - Science & Math



File: 71 KB, 588x391, 0C575C57-6202-4606-BABA-9428948F53FD.jpg
No.10261249

So I'm trying to learn linear regression intuitively, and the thing I'm having a problem with is: how do you come up with the coefficients in an intuitive sense?

y = mx + b, so given a bunch of data points I figured you averaged them in some way to get a line that's the average, but how exactly do you get two average points to find the slope? m = (y2 - y1) / (x2 - x1)

>> No.10261257

You don't do linear regression for the average, you do it for least squares, that is, by minimizing the sum of squared errors.
If you just want something average-like, take differences and pick the value that minimizes the sum of errors for the slope, then minimize again for the intercept.
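
Roughly, for y = mx + b that minimization has a closed form. A minimal sketch of it (numpy, with made-up toy numbers purely for illustration, not anything from the thread):

import numpy as np

# made-up toy data, purely for illustration
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.3, 5.9, 8.2, 9.8])

# minimizing the sum of squared errors for y = m*x + b gives
# slope = cov(x, y) / var(x), and the intercept comes from the means
m = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b = y.mean() - m * x.mean()

print(m, b)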

>> No.10261258

>>10261249
Root mean squared error.
The line that minimizes the total area of all the squares between the line and all the points (the squared vertical errors). The "root mean" part is just taking the square root of the mean of those squared errors.

>> No.10261263

>>10261257
>>10261258
Sorry, when I say average I mean a line that best fits the data, i.e. comes closest to the points overall. I was stuck on the word "average" since I was trying to see if I could figure out a formula for it without looking anything up.

>> No.10261272

>>10261249
>>10261263
It's called the method of least squares. Specifically, linear least squares when the model is linear in its coefficients (which includes polynomial fits); iterative methods are only needed for genuinely nonlinear models. It's actually a pretty simple method.
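
If it helps to see the general setup: stack the data into a design matrix and solve the resulting least-squares problem. A rough sketch (numpy; toy numbers made up, and np.linalg.lstsq is just one way to solve it):

import numpy as np

# made-up toy data, purely for illustration
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.2, 1.9, 3.1, 3.8, 5.2])

# design matrix for y = m*x + b: a column of x values and a column of ones
A = np.column_stack([x, np.ones_like(x)])

# solve the least-squares problem A @ [m, b] ~= y
(m, b), *_ = np.linalg.lstsq(A, y, rcond=None)
print(m, b)

The same setup handles polynomial fits if you add columns of x**2, x**3, and so on, since the model stays linear in the coefficients.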

>> No.10261277

>>10261263
>reinvent
>the
>wheel
It's also just optimization, so I have no idea how you actually got stuck.

>> No.10261282

>>10261249
>how do you come up with the coefficients in an intuitive sense?
You're building a model. A good model will have little error. You are fitting a line through the data such that the error (the distance from the data to the model) is minimized.

>> No.10261286

>>10261272
>>10261277
Hey, sorry guys, I guess I wasn't too clear in the OP. I'm not trying to reinvent the wheel, more like trying to look at a problem and know how to solve it intuitively instead of just googling "linear least squares".
Like, if you forgot the name of the method, would you be able to come up with an approximation just from looking at the graph?

>> No.10261301

>>10261249
Hey, a fellow /an/on!

>> No.10261314

>>10261286
Yeah, I'd just write up a linear system in two variables and minimize.
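
Spelled out, that system comes from setting the partial derivatives of the squared error to zero (the standard normal equations, nothing exotic):

\[ S(m, b) = \sum_i (y_i - m x_i - b)^2 \]
\[ \frac{\partial S}{\partial m} = 0 \;\Rightarrow\; m \sum_i x_i^2 + b \sum_i x_i = \sum_i x_i y_i \]
\[ \frac{\partial S}{\partial b} = 0 \;\Rightarrow\; m \sum_i x_i + n b = \sum_i y_i \]

Two equations, two unknowns; solve for m and b.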

>> No.10261319

>>10261286
Sure, I guess. Like the other anon said, it's really just an optimization problem. If you recognized your goal as minimizing the squared error, I suppose you could figure out how to do it in a reasonable amount of time. I think it's far less intuitive than you're hoping for, though. In my opinion, numerical methods don't really lend themselves to immediate intuitive derivations the way calculus problems do, which is what I think you're looking for.

>> No.10261355
File: 232 KB, 887x900, AC25F4C8-A07B-4388-9332-15C81E541C42.jpg

>>10261301
:3

>>10261314
>>10261319
Thanks anons. I'm trying to implement object recognition with a support vector machine, but I wanted to make sure I understood machine learning a bit better first instead of mindlessly plugging stuff into things I know nothing about.
So I figured I could implement basic linear regression from scratch as a machine learning exercise, but I don't want to just copy what someone else did without understanding the decisions being made.
But if it's not that intuitive, then I guess I feel less bad.
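
For what it's worth, the from-scratch ML version of the same thing is just gradient descent on the squared error. A bare-bones sketch (numpy; the data, learning rate, and iteration count are all made up and untuned):

import numpy as np

# made-up toy data, purely for illustration
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 4.1, 6.2, 7.9, 10.1])

m, b = 0.0, 0.0
lr = 0.01  # learning rate, a guess rather than a tuned value

for _ in range(5000):
    pred = m * x + b
    error = pred - y
    # gradients of the mean squared error with respect to m and b
    grad_m = 2 * np.mean(error * x)
    grad_b = 2 * np.mean(error)
    m -= lr * grad_m
    b -= lr * grad_b

print(m, b)

The closed-form least-squares answer and the gradient-descent answer should land on roughly the same line; the iterative version is just the one that carries over to more general ML models.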