Humans have around 400 trillion synapses in the brain;
The largest neural net (2023) is reported at around 100 trillion parameters.
It’s a fascinating analogy to draw between two objects, the human brain and a neural net, and to ask whether this single variable - the synapse - is the fundamental core of engineering intelligence. Are the synapses built into us the core of cognition: 400 trillion of them, designed by evolution to run in an active while loop until death?

Moreover, the brain itself is dynamic: synapses are added, dropped, and modified in response to a particular environment, and constantly reshaped by biological growth. What’s interesting is how such a biological mechanic can be modeled mathematically as a network whose artificial synapses are adjusted solely during the training phase of a neural net. We implement this phenomenon with backpropagation (see Beyond Backpropagation for the differences between artificial and biological learning), which all comes down to training the synapses’ strength - a toy sketch of this follows below.

And a fun side note: AI is branching off of two different fields - one grounded in a model of the brain and cognition, and the other purely mathematical in nature, perceiving a neural net as a composition of functions and matrix operations.
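To make that purely mathematical framing concrete, here is a minimal sketch in Python/numpy. The shapes, the names `f` and `sigma`, and the random weights are all illustrative assumptions, not any particular model:

```python
import numpy as np

# A neural net seen purely mathematically: a composition of matrix
# operations and elementwise nonlinearities, f(x) = W2 @ sigma(W1 @ x).
# Shapes and values are arbitrary illustrations.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))   # layer-1 "synapses": 3 inputs -> 4 hidden units
W2 = rng.normal(size=(1, 4))   # layer-2 "synapses": 4 hidden -> 1 output

def sigma(z):
    return np.tanh(z)          # elementwise nonlinearity

def f(x):
    return W2 @ sigma(W1 @ x)  # function composition: affine, nonlinear, affine

x = np.array([0.5, -1.0, 2.0])
print(f(x))                    # one forward pass is nothing but linear algebra
```

Nothing in that forward pass is brain-like at all; it is just linear algebra threaded through a nonlinearity.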
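And to illustrate the earlier point about backpropagation training synapse strengths, a toy gradient-descent loop - a sketch assuming a single-weight "network" and mean-squared error, with hypothetical names throughout:

```python
import numpy as np

# One "synapse" (weight w) learning y = 2x by gradient descent.
rng = np.random.default_rng(1)
w = rng.normal()        # initial synapse strength, randomly set
lr = 0.1                # learning rate

xs = np.array([0.0, 1.0, 2.0, 3.0])
ys = 2.0 * xs           # targets: the function the net should learn

for step in range(100):
    pred = w * xs                           # forward pass
    grad = np.mean(2.0 * (pred - ys) * xs)  # d(MSE)/dw via the chain rule
    w -= lr * grad                          # adjust the synapse strength

print(w)  # converges near 2.0
```

The contrast with biology is in the last line: the moment the loop ends, `w` is frozen, whereas a biological synapse keeps being added, dropped, and modified for life.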