AAAI Video Competition - 2014-07-31
Evolving Neural Networks That Are Both Modular and Regular Joost Huizinga, Jean-Baptiste Mouret, Jeff Clune University of Wyoming Best Video AAAI Video Competition http://aaaivideos.org
Hi, the android picture at the end is property of ESET, the cybersecurity company I work for.
Thank you for these videos; they make research accessible and are really helpful.
Can you help me understand how to implement this connection-cost function? I would hate to have to write a function that does it by randomly choosing.
EDIT: It's been 4 months since I posted this. I have a pretty good imagination, but I can't for the life of me figure out how to implement this cost function.
I'm out of my league here since this is really the first I've heard of this, but what about implementing a cost for distance? I.e., if a connection has to reach further away, it lowers the fitness of the system. This would force connections to remain, and evolve, as locally as possible unless the network just HAD to get data from far away. Genetic algorithms would clean things up and untangle the mess.
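For what it's worth, a distance-based cost like that is not hard to write down. Here's a minimal sketch, assuming each neuron has a 2D position and the cost is the summed squared length of all connections; the neuron_positions and connections structures are made-up placeholders for illustration, not anything taken from the paper:

```python
# Hypothetical representation: neuron_positions maps neuron id -> (x, y),
# and connections is a list of (source_id, target_id, weight) tuples.
def distance_connection_cost(neuron_positions, connections):
    """Summed squared Euclidean length of every connection.

    Long-range connections are penalized more than local ones, which pushes
    evolution toward mostly-local wiring unless a long link really pays off.
    """
    cost = 0.0
    for src, dst, _weight in connections:
        x1, y1 = neuron_positions[src]
        x2, y2 = neuron_positions[dst]
        cost += (x2 - x1) ** 2 + (y2 - y1) ** 2
    return cost

# Tiny example: one local link and one long-range link.
positions = {0: (0.0, 0.0), 1: (1.0, 0.0), 2: (5.0, 5.0)}
links = [(0, 1, 0.7), (0, 2, -0.3)]
print(distance_connection_cost(positions, links))  # 1.0 + 50.0 = 51.0
```

You could then subtract a small multiple of this cost from the task score, so long-range wiring only survives when it earns its keep.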
Fantastic!!
"Connection cost" could you explain that? What kind of cost? Do they have a maximum number of connections, so a current neural connection must be broke in order to add another connection?
For training an AI you need to set a value called "fitness" (how well it is doing relative to the task we want it to do). The program takes actions "at random", and if those actions help the AI get a higher fitness, they are more likely to be repeated in future iterations. If you add a connection cost, every connection between two neurons lowers the score, so a "messy" network gets a lower fitness. Because of that, a network that does the same thing with fewer connections gets a higher score, and due to the evolutionary nature of the process, the ones with higher scores "survive" and get replicated while the lower-scoring ones get eliminated (something like natural selection). I am not an expert; I just started studying neural networks and evolutionary algorithms, so take what I said with a little skepticism. English isn't my first language, so sorry for any mistakes.
Oh I see, that makes sense.
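To make that concrete, here is a minimal sketch of the weighted-sum version of the idea described above. It is not the authors' actual method (the paper treats performance and connection cost as separate objectives, if I recall correctly), and evaluate_on_task, mutate, and net.connections are hypothetical placeholders for your own task evaluation, mutation operator, and network representation:

```python
import random

# Hypothetical placeholders: evaluate_on_task(net) returns a task score,
# mutate(net) returns a slightly modified copy, and net.connections is the
# list of connections in your own network representation.

COST_PER_CONNECTION = 0.01  # keep this small so useful structure can still grow

def penalized_fitness(net):
    task_score = evaluate_on_task(net)                         # how well it does the task
    wiring_cost = COST_PER_CONNECTION * len(net.connections)   # "messier" nets pay more
    return task_score - wiring_cost

def evolve(population, generations=100):
    for _ in range(generations):
        ranked = sorted(population, key=penalized_fitness, reverse=True)
        survivors = ranked[: len(ranked) // 2]             # keep the better half
        offspring = [mutate(random.choice(survivors)) for _ in survivors]
        population = survivors + offspring                 # next generation
    return max(population, key=penalized_fitness)
```

The only change from a plain genetic algorithm is that the score subtracts a small price per connection, so sparser networks win ties against equally capable but messier ones.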
Skyneat -_-
I for one welcome our AI overlords
This makes sense, since connections and nodes in nature also have an energetic cost associated with them.
But how can a complex structure evolve while costs prevent it? That structure will never develop, because it will be eliminated before it can start to benefit fitness. The NEAT paper clearly points out this non-beneficial side.
It depends on how much of a cost you apply to structure. A slight cost will not prevent structures from developing.
As the theory goes, we make memories during sleep: we process our daytime data and convert it into connections. Is the process of weighing each event just a way of doing data reduction, which a computer has no need of since it has infinite resources? Does organic memory need a method of storing data by importance due to limited resources? And because of this limit, do we restrict data by giving it an importance? Yes we do: we label it by how successfully our actions fed us, allowed us to reproduce, and helped us avoid being eaten. But a computer has none of these fears, except battery life. Only by introducing sensors can we understand, and can a CPU apply the required weighing process to make ONLY the connections that are necessary.
This is fine and all, but what is the advantage of having a modular and regular network? I think it means the network could learn to re-use elements, the way our brains do: a section of our brain can solve two completely different problems because they share a common way of being processed. For example, our visual processing uses spatial relationships to identify objects, but that same system can be re-used to determine our position in space, because we know we are somewhere relative to an object.
What if you try this on a regular problem, though?
I think you've made a mistake. The cortex is a generic structure which is about 2 mm thick; see cortical columns (mini- and maxi-columns). The big problem with neural networks is that most people think all the neurons in the brain are just mixed together. To me, the zones you show on the cortex are composed of several maxi-columns.
sikor02 - 2014-11-25
Could it be that a network with more connections (higher connection cost) will perform better than a network with fewer connections on a specific task? If so, how do you balance connection cost against overall network fitness to get the best network with the minimum number of connections?
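One common way to avoid committing to a single trade-off up front is to treat task performance and connection cost as two separate objectives and keep the whole Pareto front of non-dominated networks (the work behind this video uses a multi-objective approach along those lines, if I remember correctly). Here is a minimal sketch of the dominance check, with a hypothetical Individual record standing in for a real evolved network:

```python
from dataclasses import dataclass

@dataclass
class Individual:
    performance: float      # higher is better
    num_connections: int    # lower is better (proxy for connection cost)

def dominates(a, b):
    """a dominates b if it is at least as good on both objectives
    and strictly better on at least one."""
    no_worse = a.performance >= b.performance and a.num_connections <= b.num_connections
    better = a.performance > b.performance or a.num_connections < b.num_connections
    return no_worse and better

def pareto_front(population):
    """All individuals not dominated by anyone else: the best
    trade-offs between performance and wiring cost."""
    return [p for p in population if not any(dominates(q, p) for q in population)]

pop = [Individual(0.95, 120), Individual(0.90, 40), Individual(0.80, 45)]
print(pareto_front(pop))  # Individual(0.80, 45) is dominated by Individual(0.90, 40)
```

A big, accurate network and a small, cheap one can both survive on the front; choosing between them happens after evolution, rather than being baked into a single fitness number.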
Tuc - 2016-10-31
Let's use our neural-network brains to make a neural network brain!