Post by general313 on Nov 17, 2017 1:09:09 GMT
So you think there is something special that happens at a higher level in the brain that is beyond the reach of machine learning designers? Some kind of emergent property? As I said before, given the history of observers imagining some fundamental and special difference between the biological and the non-biological, I wouldn't bet on it.
As neural scientists probe deeper into how human vision works, they are discovering more about how the brain processes images for recognition, and it bears a striking resemblance to the neural networks used in machine learning.
It doesn't matter where the inspiration came from. If we can create artificial neural networks that exhibit the same behavior as brain structures, it takes away some of the mystery of how the brain works. That the behavior can be replicated with inorganic machinery suggests that there's nothing special about biological brain cells. It's simply a consequence of how we evolved, just as we locomote via muscles and bones rather than wheels.
Learning and memory retention in human brains and neural networks work the same way, by adjusting interconnection weights (synapses in the case of humans and layer node coefficients in a neural network).
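To make the weight-adjustment idea concrete, here is a minimal sketch (my own illustration, not anything from this thread) of a single artificial neuron trained with the classic delta rule. The function `train_neuron`, the training data, and all parameter values are invented for the example; the point is only that "learning" here is nothing more than repeatedly nudging connection weights in proportion to the error, which is loosely analogous to synaptic strengthening and weakening.

```python
def train_neuron(samples, lr=0.1, epochs=200):
    """Delta-rule training of one linear neuron: w_i += lr * error * x_i."""
    n = len(samples[0][0])
    weights = [0.0] * n
    bias = 0.0
    for _ in range(epochs):
        for inputs, target in samples:
            output = bias + sum(w * x for w, x in zip(weights, inputs))
            error = target - output
            # Strengthen or weaken each connection in proportion to its
            # input's contribution to the error -- the "adjusting
            # interconnection weights" described above.
            weights = [w + lr * error * x for w, x in zip(weights, inputs)]
            bias += lr * error
    return weights, bias

# Learn the function y = 2*x1 - x2 from a few made-up examples.
data = [([1.0, 0.0], 2.0), ([0.0, 1.0], -1.0),
        ([1.0, 1.0], 1.0), ([2.0, 1.0], 3.0)]
w, b = train_neuron(data)
# The learned weights approach [2.0, -1.0] and the bias approaches 0.0.
```

Of course a real brain (and a modern deep network) stacks enormous numbers of such units in layers, but the core update mechanism is this simple.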
You dismiss "simple mechanical circuits", but brains and smart computers alike derive their power from the organization of very large numbers of simple elements.
That is true so far, but it is mainly a question of the sheer number of neurons in the human brain (around 100 billion, I believe). Admittedly we are still in the infancy of machine learning, but I think there is every reason to expect great advances in our ability to make hardware powerful enough to rival the capabilities of the human brain in the not-too-distant future. Hence the concern of the main topic of this thread.

