RE: Would it be possible to download skills into your brain like in the movie Matrix?
The term "information" is difficult to define. In relation to human beings, the reception of information refers not only to conscious but also to unconscious processes. You will know this if you have received training in hypnosis.
Your confusion comes from not properly defining the scope you're talking about.
scope n.
the extent of the area or subject matter that something deals with or to which it is relevant.
At the scope where we can talk about single calculations of a CPU or the firing of a neuron, information is the signal that comes in as well as the signal that comes out. It's really simple at that scope. Note that it's impossible to talk about any kind of subconscious at this level.
At a higher level, well past that level of analysis, we can talk about input in terms of consciously perceived, vs. subliminal.
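To make the lowest scope concrete, here's a minimal sketch of an artificial neuron in Python (the weights, inputs, and threshold are invented for illustration, and this is a toy model, not a claim about biological neurons). At this scope, "information" is nothing but the numbers going in and the number coming out; there's no room for a subconscious in it.

```python
# A single artificial neuron at the lowest scope: signals in, signal out.
# Weights, inputs, and threshold are arbitrary, purely for illustration.

def neuron(inputs, weights, threshold):
    """Fire (return 1) if the weighted sum of inputs crosses the threshold."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

signal_in = [1, 0, 1]          # incoming signals
weights = [0.4, 0.9, 0.3]      # connection strengths
signal_out = neuron(signal_in, weights, threshold=0.5)
print(signal_out)  # at this scope, that's all the "information" there is
```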
People absorb information in their interaction with other people largely through body language, much less through verbal expressions.
This is something that people who teach body-language interpretation would tell you. However, the old adage that 90% of communication is non-verbal (or some other made-up percentage) means exactly that -- non-verbal. Non-verbal in this case means implicit in your behavior and the situation, as contrasted with anything you intentionally make explicit. The easiest way to explicitly communicate something is simply to state it, with words.
The totality of communication is much more than just the conveyance of facts; it includes the interpretation of perception.
I'll go with that, if you keep in mind that the other person will interpret things in their own way. (As the NLPers say: the meaning of any communication is the response you get.)
In particular, symbolic reasoning is a matter that an AI could hardly keep up with, because it is a human activity that has to do with learning language throughout the body.
I tried to find something online that would go into more of what you're talking about but I'm not finding much. If you don't mind, please give me a few examples of "symbolic reasoning" in the way you're talking about it.
The neuronal impulses that are measured only say something about the "where", but nothing about the "what" and "why".
A neural impulse itself doesn't even contain information about "where". All of the above, "where", "what" and "why" are all established somewhere downstream at a higher level of scope. If by "measured" you mean "intercepted" then that's something else entirely.
Information is interpreted in terms of its meaning in the context of its occurrence.
If you mean that context is fundamentally integral to perception, then I agree. A whimsical example: it's fairly common for a chick to choose an ugly friend to make herself appear better looking by comparison.
A person who witnesses an act of violence, for example, can unconsciously or consciously think of historical events such as war. A memory of which he is not aware suddenly flashes up and causes a certain feeling about the situation. Even without ever having come close to such physical violence himself, he can feel compassion and fear.
This sounds a lot like priming. It's a direct consequence of memory being stored in an associative network.
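Here's a toy sketch of what I mean by an associative network (the concept names and link strengths are invented for illustration): activating one memory spreads activation to linked memories, which is the mechanism usually offered to explain priming.

```python
# Toy spreading-activation model of an associative memory network.
# Concept nodes and link strengths below are invented for illustration.

network = {
    "violence": {"war": 0.8, "fear": 0.6},
    "war": {"refugees": 0.7, "fear": 0.5},
    "fear": {},
    "refugees": {},
}

def spread(start, activation=1.0, decay=0.5, floor=0.1):
    """Return how strongly each concept gets activated from a starting cue."""
    activated = {start: activation}
    frontier = [(start, activation)]
    while frontier:
        node, act = frontier.pop()
        for neighbor, strength in network.get(node, {}).items():
            new_act = act * strength * decay
            if new_act > activated.get(neighbor, 0) and new_act >= floor:
                activated[neighbor] = new_act
                frontier.append((neighbor, new_act))
    return activated

print(spread("violence"))
# witnessing "violence" primes "war" and "fear" without conscious recall
```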
From where, for example, should an AI take the collective memory of a family system -- the unconsciously passed-on convictions of, say, parents who were refugees and who pass their fears and judgments on to their children? A human being is not a self-contained operational system; it is an ecosystem that interacts with all other systems.
I never said that instances of AI were humans.
Information that makes sense does not consist merely of an electromagnetic impulse; it is not just a singular physical event.
Theme: wrong scope.
People think in contexts and events to which they give meaning. This sense is not found in the neuronal impulses; they may serve only as a partial expression of a whole organism, in which not only the brain but everything in the human body is involved.
Meaning is a part of the basic structure of a belief.
I'm not sure what you mean by "not only the brain but everything in the body is involved." If you could elaborate or give me a clue about what you mean, I'd appreciate it. Thanks.
So where is the information localized or where is its initial origin to be found when people are interacting? The answer is that we do not know. There is no such thing as a definite beginning and a definite end of an information process.
Whaaaat? Please give me a clue of what information you speak and at what scope, and maybe an example. Thanks.
To simulate a human brain, wet-ware would be needed if it were to be transferred to machine technology. But then you would simply have a wet electronic brain, but no human-like intelligence.
Theme: at no point did I say that any implementation of AI was a person. At any rate, I don't know what would happen nor exactly what it would take to simulate a human brain. Even though I'm a total psychology geek and a programmer, I'm not a computational neuroscientist and any ideas I have about trying to simulate a human brain -- as a whole -- would be directly farted from my rectal nethervoid.
One would have exactly that: a moist artificial brain mass that "processes" electronic impulses.
Theme: wrong scope.
AI can therefore never be better, or a kind of almighty knowledge store. AI is always something else, but just like humans, it must specialize its knowledge in order to be useful.
AI is a tool. Nothing more, nothing less. If AI wasn't useful, no one but hobbyists would bother with it. Or are you implying that some AI instance must, on its own, specialize its knowledge to be useful to itself?
A big difference between a computer AI and a human being is that a machine learns only through 0 and 1, but a human being, when born, learns through all senses, most of all through the sense of feeling as a baby. This completely different way of learning should be clear to us.
I'll just leave this here.
And... I need a break. Will write more in a bit.