The ability to decode neurons could connect a person directly to the net.
AI itself is an impressive tool. If an office uses general-AI-based technology, the director can simply give the AI a to-do list in the morning. The AI then carries out those tasks, such as sending business letters to customers and processing the payments that follow.
But AI also allows us to create things like brain-computer interfaces (BCIs), which link people directly with computers. A fully working BCI system lets people communicate using brainwaves, making a kind of technical telepathy possible. In such systems, a person can control robots using their brainwaves.
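One common way to turn brainwaves into robot commands is to reduce the raw signal to a small feature vector and match it against calibrated prototypes. The sketch below is a minimal, hypothetical illustration of that idea; the command names, feature values, and the nearest-centroid approach are illustrative assumptions, not a description of any real BCI product.

```python
# Minimal sketch of a brainwave-to-command decoder. We assume the raw
# EEG has already been reduced to a small feature vector (e.g. band
# powers); all command names and numbers here are hypothetical.
from math import dist

# Hypothetical calibration data: one prototype feature vector per command.
PROTOTYPES = {
    "move_forward": [0.9, 0.1, 0.2],
    "turn_left":    [0.2, 0.8, 0.1],
    "stop":         [0.1, 0.1, 0.9],
}

def decode(features):
    """Return the command whose prototype is nearest to the features."""
    return min(PROTOTYPES, key=lambda cmd: dist(features, PROTOTYPES[cmd]))

print(decode([0.85, 0.15, 0.25]))  # → move_forward
```

In practice the prototypes would come from a calibration session in which the user imagines each command while the system records their signals.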
A BCI means that a computer can decode thoughts and then display them on a screen. Controlling robots with brainwaves is possible only when the system knows what a person is thinking. The problem is this: if we control robots using brainwaves, most of our thoughts are not meant to control robots. We might accidentally think something we would not dare to say, and the robot would then do something we did not intend. In systems meant for civilian use, the AI should filter those dangerous signals out.
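The filtering idea described above can be sketched very simply: a decoded thought reaches the robot only if it is on an explicit allow-list and the decoder is confident it was deliberate. The function name, commands, and threshold below are illustrative assumptions, not a real safety system.

```python
# Sketch of thought filtering for a civilian BCI: a decoded command is
# executed only if it is allow-listed AND decoded with high confidence.
# Names and thresholds are illustrative assumptions, not a real API.
ALLOWED_COMMANDS = {"move_forward", "turn_left", "turn_right", "stop"}
CONFIDENCE_THRESHOLD = 0.90

def filter_command(command, confidence):
    """Pass a command to the robot only if it is safe and deliberate."""
    if command not in ALLOWED_COMMANDS:
        return None  # stray thought: never reaches the robot
    if confidence < CONFIDENCE_THRESHOLD:
        return None  # probably not a deliberate command
    return command

print(filter_command("stop", 0.97))        # → stop   (executed)
print(filter_command("push_human", 0.99))  # → None   (not allow-listed)
print(filter_command("stop", 0.40))        # → None   (low confidence)
```

A real system would need far more than this, but the principle is the same: the default is to do nothing, and only whitelisted, high-confidence intentions are acted on.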
New simulations can model a neuron's behavior very effectively, and we can say that laboratories have decoded neurons. This is a major advance for many technologies, because full control of a system requires complete knowledge of it. Things like BCIs and neuron-microchip hybrid systems could control things like nanomachine swarms.
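To give a flavor of what "simulating a neuron's work" means in practice, here is a leaky integrate-and-fire (LIF) model, one of the simplest standard neuron models used in computational neuroscience. The parameter values are illustrative, chosen only to make the toy example fire.

```python
# A minimal leaky integrate-and-fire (LIF) neuron: the membrane
# potential leaks toward rest, integrates input current, and emits a
# spike (then resets) whenever it crosses a threshold.
def simulate_lif(input_current, dt=1.0, tau=10.0, v_rest=0.0,
                 v_thresh=1.0, v_reset=0.0):
    """Return the spike times (in time steps) for a list of input values."""
    v = v_rest
    spikes = []
    for t, i_in in enumerate(input_current):
        # Leak toward the resting potential and integrate the input.
        v += dt * (-(v - v_rest) / tau + i_in)
        if v >= v_thresh:       # threshold crossed: emit a spike
            spikes.append(t)
            v = v_reset         # reset after the spike
    return spikes

print(simulate_lif([0.15] * 100))  # → [10, 21, 32, 43, 54, 65, 76, 87, 98]
```

A constant input drives the neuron to fire at a regular rate; real research simulators model far richer dynamics, but the integrate-leak-fire loop is the core idea.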
The ability to transfer information directly between computers and neurons makes it possible to create new types of interfaces in which people control computers directly with their brainwaves. If a person operates a robot through a BCI, the robot becomes like a second body: the person may no longer distinguish between the robot body and themselves.
BCI systems allow the creation of the ultimate augmented reality, one that connects a person directly to the Internet. In some visions, a neural-link type of system would interact with the Internet through a mobile telephone. And when we talk about augmented reality here, we mean multilayer systems in which people can use robots as exobodies but can also operate in fully synthetic simulations.
That could revolutionize everything from surgery to the military, games, and adult entertainment. In some dystopian visions, game addicts could use so many BCI-based augmented-reality layers that they become lost in the virtual world. In the worst case, those people would die of hunger and thirst because they simply forget that they are in virtual reality. But to make those things and visions functional, the creators of such systems must know how neurons interact.
Half-organic microchips would interact with neurons directly. If engineers can program neurons, that would allow them to load complicated programs onto small microchips that interact with machines straight through their kernel.
In the same way, programmed neurons could control nano- or mini-machines. The problem with nanotechnology is that nanomachines are too small to carry effective computers. There are two ways to handle that problem: the nanomachines can form WLAN-based swarms, or they can use living neurons to handle their operations.
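The swarm idea can be illustrated with a toy model: each machine only broadcasts its position and applies one trivial rule (drift toward the swarm's centroid), so no single machine needs to carry an effective computer; the intelligence lives in the shared communication. The rule, gain, and two-dimensional setting below are illustrative assumptions.

```python
# Toy sketch of a "WLAN swarm": every machine hears every broadcast and
# applies one trivial rule, drifting toward the swarm centroid. The
# protocol and parameters are illustrative assumptions.

def centroid(positions):
    """Average position of all broadcast positions."""
    xs, ys = zip(*positions)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def step(positions, gain=0.5):
    """One round: each machine moves a fraction of the way to the centroid."""
    cx, cy = centroid(positions)
    return [(x + gain * (cx - x), y + gain * (cy - y)) for x, y in positions]

swarm = [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0), (4.0, 4.0)]
for _ in range(10):
    swarm = step(swarm)
print(centroid(swarm))  # → (2.0, 2.0): the swarm converges on its centroid
```

Each machine's logic fits in a few arithmetic operations, which is the point: coordination emerges from cheap local rules plus a shared radio channel rather than from onboard computing power.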
https://scitechdaily.com/neurons-decoded-the-universal-workflow-powering-brain-insights/