
/sci/ - Science & Math



File: graphic8.gif
No.4619834

I want to program a neural network that will grow in complexity.

My idea is to use evolution as a sort of guide. I will start with some "environment" and the program will try different neural connections until it gets one that "survives", then the "environment" gets more complex and it repeats.

For example, I might start by teaching it addition.
So perhaps the "environment" would consist of two inputs that each send a 0 or a 1, and two outputs that allow it to express a binary number from 0-3.
It will "survive" if it correctly adds the inputs and puts the result in the output.
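That addition "environment" fits in a few lines; here is a minimal Python sketch, where `survives` and `half_adder` are illustrative names rather than anything from an existing library:

```python
def survives(net):
    """True if `net` (a function of two input bits returning two output
    bits, high bit first) adds every pair of inputs correctly."""
    for a in (0, 1):
        for b in (0, 1):
            hi, lo = net(a, b)
            if 2 * hi + lo != a + b:
                return False
    return True

# A hand-wired "network" that happens to be a half adder passes the test:
half_adder = lambda a, b: (a & b, a ^ b)
```

Since there are only four possible input pairs, the survival test can be exhaustive rather than sampled.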

What I need help with is how to model the neurons.
If a neuron gets more than one incoming connection, should it add them? Or output a 1 if it gets a signal from any (or all) of its connections? Or should it output a 1 only when it receives an odd number of signals, so that sending two signals could turn a neuron off?

Should I go beyond 1's and 0's and treat it more like the voltages in real neurons? And should I give each neuron a threshold voltage so that it only fires if its input is high enough?


What do you think /sci/?

>> No.4619842

>>4619834
derp

>> No.4619857

>>4619834
did you even read one textbook about neural networks? no of course not, because in the fucking first chapter they will explain all the methods to do that, with the advantages and disadvantages.

yes, it should add them. then it gives the sum to the activation function, which gives you the output. and all neural networks work the way you want. that's the point of having neural networks.

>> No.4619867

just a little tip to help out: with anything that develops evolutionarily, the most important thing is your method of determining "fitness" and the accuracy of your test for it. the actual mechanisms of evolution are less important.
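One way to act on this tip (an illustrative Python sketch, not anything from the thread's code) is to replace a pass/fail survival test with a graded fitness score, so selection can reward partial progress:

```python
def fitness(net):
    """Count how many of the four input pairs `net` adds correctly
    (`net` takes two bits and returns the sum as two output bits)."""
    score = 0
    for a in (0, 1):
        for b in (0, 1):
            hi, lo = net(a, b)
            score += (2 * hi + lo == a + b)
    return score  # 0..4, where 4 is a perfect adder
```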

>> No.4619874

South-Seeking Magnetic Bacteria?!

http://jeb.biologists.org/content/86/1/345.2.full.pdf

>> No.4619881

>>4619857
Well I would assume they have it in a textbook on neural networks...

Thanks a lot though. So that's all then? Or is there more to it?
any other advice?

I should note that the programming part is just a fun idea I'm entertaining right now, but at the heart of all of this is really just an interest in the topic more than anything.

>> No.4619883

>My idea is to use evolution as a sort of guide
>neural network.

so you want to train it AND use some genetic algorithm? how will you combine it? and how will it be different from pure training?

>> No.4619890

>>4619881
it depends on what you want the thing to do. for a simple one: add them all up, add a bias term (an input that is always 1), and then pass that to your activation function (this can be 1/(1+e^(-x))). what structure are you using? are you gonna use multiple layers? what training are you gonna use? back-propagation works if you don't have loops in the network.
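A minimal sketch of the neuron this post describes: weighted sum of inputs plus a bias input fixed at 1, squashed by the logistic activation. The weights in the example are arbitrary illustrative values:

```python
import math

def neuron(inputs, weights, bias):
    """Weighted sum of inputs plus a bias, squashed by 1/(1+e^(-x))."""
    x = sum(i * w for i, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-x))

# With large weights and a negative bias this approximates an AND gate:
# neuron([1, 1], [10, 10], -15) is close to 1,
# neuron([0, 1], [10, 10], -15) is close to 0.
```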

>> No.4619893

>>4619883
I never said genetic.

It will just adapt to different tasks slowly working up to a larger more complex goal.
Is this not as good of an idea as just training it directly for a specific purpose from the start?

>> No.4619909

>>4619893
>It will just adapt to different tasks slowly working up to a larger more complex goal.

that may be harder than normal training. if you teach it addition, then multiplication, it will forget about addition unless you teach it some combination of the 2 (1 set addition, then 1 set multiplication, repeat). alternatively, you could add more nodes to the network before each new task and lock the weights of the things it already knows; that might actually work. the network will need to be larger in that case though.
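The "1 set addition, then 1 set multiplication, repeat" schedule can be sketched as a round-robin loop (Python; `train_step` is a placeholder for whatever update rule the network uses):

```python
def interleaved_training(tasks, train_step, epochs=100):
    """Cycle through every task each epoch so earlier skills keep
    being rehearsed instead of being overwritten by the newest task."""
    for _ in range(epochs):
        for name, batch in tasks.items():
            train_step(name, batch)  # one batch per task, round robin
```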

>> No.4619913

>>4619890
I wasn't even going to have it be in distinct layers.
I was just thinking it would try all possible connections to every neuron and if it is not good enough for the task (in the example, if it is unsuccessful in adding the numbers) then it will try a different connection.

If it connects the existing neurons in all possible ways and still hasn't found a satisfactory configuration then it adds a new neuron and again tries all possible connections that can exist with those neurons until it finds a configuration that is able to complete the task.
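That grow-when-stuck search can be sketched as follows (Python; `build_net` and `passes_task` are placeholders for the poster's own network representation and survival test). Note that enumerating every subset of connections is exponential in the number of possible edges:

```python
from itertools import combinations

def grow_until_solved(passes_task, build_net, max_neurons=6):
    """Try every directed wiring over n neurons; if none passes the
    task, add a neuron and try again. Exponential in the worst case."""
    n = 1
    while n <= max_neurons:
        pairs = [(i, j) for i in range(n) for j in range(n) if i != j]
        for k in range(len(pairs) + 1):
            for conns in combinations(pairs, k):  # every subset of edges
                net = build_net(n, conns)
                if passes_task(net):
                    return net
        n += 1  # no wiring worked: grow the network
    return None
```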

That was my idea. I guess that's closer to back-propagation, right?
Or am I just way off?

>> No.4619918

>>4619909
what if the next stage just repeats all of the older ones plus the new one?
So it would still test for its ability to complete addition, as well as multiplication when it gets there

>> No.4619921

>>4619918
actually I think I get what you were saying.. nevermind.

>> No.4619943
File: skynet.jpg

Eventually it will learn how to read and write, and then I will set it loose on the interwebs at which point its growth will be beyond my ability to control.

>> No.4619945

>>4619913
the problem with not using layers is that the number of connections for n nodes is ~n^2, whereas if you distribute them between layers it's only ~n. so it will be a lot faster in larger networks. the training is also simplified because signals only travel one way. you do get interesting things when you allow all of them to connect to all of them, though, like the ability to store short-term information.
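A quick count illustrates the scaling claim, assuming the layered network keeps a fixed layer width (so its edge count grows roughly linearly in n, versus quadratically all-to-all); the numbers are just examples:

```python
def fully_connected_edges(n):
    """Directed edges when every neuron may connect to every other."""
    return n * (n - 1)

def layered_edges(n, width):
    """Edges when n neurons form layers of fixed `width`, each layer
    fully connected only to the next (assumes width divides n)."""
    layers = n // width
    return (layers - 1) * width * width

# 100 neurons: 9900 edges all-to-all vs 900 in ten layers of 10.
```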

>If it connects the existing neurons in all possible ways and still hasn't found a satisfactory configuration then it adds a new neuron and again tries all possible connections that can exist with those neurons until it finds a configuration that is able to complete the task.
that work work very well in a neural net. a neural net will always approximate the task it is given (by adjusting the weights of the connections). so you would have to train the neural net, see if the approximation is good enough, if not add a connection, and repeat. that will take forever. you could instead just add all the connections from the start and it will automatically set the connections that aren't used to 0 during training. your method does have the advantage of the neural net kind of localising the abilities in a small number of neurons, which may be good in large nets, but be prepared to leave your computer on for a week just to train it to add, subtract, divide and multiply.

>> No.4619957

>>4619945
>that work work very well in a neural net.
>that wouldn't work...

i think.

>> No.4619985

>>4619945
thanks for all the insight, this is really helpful.

I guess the big thing that I still feel I'm not sure on would be how the weights are determined. Do I just adjust them through trial and error? Or is there a way that through running the neural network it slowly adjusts its own weights?

>> No.4619998
File: unregulated capitalism.jpg

>>4619943

>> No.4620012

>>4619985
the only method I've used is back-propagation of error.

if the weight is w(t), you give the network some input I(t) and the desired output D(t). the neural network then calculates an output O(t) from I(t). you take the error E = (I-O)^2 (for example; you can do it in different ways). then you calculate the new weight as w(t+1) = w(t) - L*(dE/dw), where L is how fast it learns. as you can see, you need to know E (thus O) as a function of the weight you are adjusting. this needs some math to determine but is pretty straightforward and generalizes pretty easily to arbitrary networks.

this will eventually converge to the correct solution.

>> No.4620019

>>4620012
*E = (D-O)^2
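A worked instance of this update rule for the simplest possible case: a single weight with linear output O = w*I, so with the corrected error E = (D-O)^2 the gradient is dE/dw = -2*(D-O)*I. The training data and learning rate here are made up:

```python
def train_single_weight(samples, w=0.0, lr=0.1, steps=100):
    """Gradient descent on one weight: w(t+1) = w(t) - L*(dE/dw)."""
    for _ in range(steps):
        for I, D in samples:
            O = w * I                       # network output
            w -= lr * (-2.0 * (D - O) * I)  # dE/dw for E = (D - O)^2
    return w

# Samples drawn from D = 3*I, so w converges toward 3:
# train_single_weight([(1.0, 3.0), (2.0, 6.0)])
```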

>> No.4620042

>>4620019
awesome thanks, I guess that's probably what takes the most computing time.

sorry, 1 more thing just popped into my mind, what is the strength of the firing neuron?
Do they all just fire with the same strength, or is it the weighted sum of the inputs or something?

>> No.4620092

https://www.coursera.org/course/ml will give you the tools you need.

>> No.4620127

>>4620042
you sum the inputs multiplied by the weights, then you have some function that calculates the output. it can be as simple as: if the weighted sum is positive, output 1, else 0. or it can be some function that smoothly changes from 0 to 1, or from -1 to 1.
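The activation choices mentioned here, in standard textbook form (with tanh as the usual -1 to 1 option; the names are illustrative):

```python
import math

def step(x):
    """Hard threshold: 1 if the weighted sum is positive, else 0."""
    return 1.0 if x > 0 else 0.0

def sigmoid(x):
    """Smooth transition from 0 to 1."""
    return 1.0 / (1.0 + math.exp(-x))

def tanh_act(x):
    """Smooth transition from -1 to 1."""
    return math.tanh(x)
```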

>> No.4620188

thank you gentlemen.

>> No.4620453
File: cyborg.jpg

FINAL QUESTION!!!

What would you name your AI if you ever succeeded in making one?

>> No.4620472

>>4620453
R. Daneel Olivaw