Everyone is now scrambling to integrate AI into as many facets of human life as possible. Neural nets and machine learning can offer greatly improved processing speeds, yet they still rely on digital pathways that may never fully mimic the biological structure of the human brain. The next step in AI improvement would be to combine the best of the digital and biological worlds. Some scientists are already experimenting with this possibility, and a new article published in the academic journal Frontiers in Science takes a deep dive into the realm of biocomputers and organoid intelligence (OI).
All AI applications today rely on computing power provided by powerful CPUs or GPUs. OI, on the other hand, seeks to bring “unprecedented advances in computing speed, processing power, data efficiency and storage capabilities” by harnessing the complexity of lab-grown brain organoids: cell cultures derived from reprogrammed adult skin cells that form 3D clusters of neurons and other brain cells.
Instead of training computer models to reproduce human thought processes, a more effective solution could be to directly combine digital processing methods with biological brain structures. Thomas Hartung, professor at Johns Hopkins University’s Bloomberg School of Public Health and one of the co-authors of the aforementioned scientific article, explains:
Silicon-based computers are certainly better with numbers. For example, AlphaGo (the AI that beat the world’s number 1 Go player in 2017) was trained on data from 160,000 games. A person would have to play five hours a day for more than 175 years to experience that many games.
We’re reaching the physical limits of silicon computers because we cannot pack more transistors into a tiny chip. The brain is wired completely differently. It has about 100 billion neurons linked through over 10^15 connection points. It’s an enormous power difference compared to our current technology.
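For readers who want to sanity-check the 175-year figure in the quote above, the back-of-the-envelope arithmetic below reproduces it (a minimal sketch; the assumption of roughly two hours per Go game is ours, not from the article):

```python
# Rough check of Hartung's AlphaGo comparison.
# Assumption (not from the article): an average Go game lasts about 2 hours.

GAMES_TRAINED_ON = 160_000   # training games cited for AlphaGo
HOURS_PER_GAME = 2           # assumed average game length
HOURS_PER_DAY = 5            # daily playing time from the quote

total_hours = GAMES_TRAINED_ON * HOURS_PER_GAME
years_needed = total_hours / (HOURS_PER_DAY * 365)

print(f"About {years_needed:.0f} years of daily play")  # ~175 years
```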
While new ways of shrinking transistors beyond the angstrom threshold could be envisioned, the energy efficiency of computer circuits would still be nowhere near that of human brain cells. Hartung notes that “the amount of energy spent training AlphaGo is more than is needed to sustain an active adult for a decade.”
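To get a feel for the scale of that comparison, the rough calculation below estimates how much energy an active adult consumes over a decade (a minimal sketch; the figure of about 2,500 kcal per day is a common estimate, not taken from the article):

```python
# Approximate food energy an active adult consumes in ten years.
# Assumption (not from the article): ~2,500 kcal per day.

KCAL_PER_DAY = 2_500
KWH_PER_KCAL = 0.001163      # 1 kcal is roughly 1.163 Wh
DAYS_PER_DECADE = 3_652      # ten years, including leap days

decade_kwh = KCAL_PER_DAY * KWH_PER_KCAL * DAYS_PER_DECADE
print(f"Roughly {decade_kwh / 1000:.1f} MWh over ten years")  # ~10-11 MWh
```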
For now, scientists have managed to construct organoids containing around 50,000 cells each. It is a promising start, but Hartung believes the organoids will need to scale to more than 10 million cells before they can offer any practical advantage over what AI is capable of today. At the current pace, such complex organoid structures are likely years, if not decades, away, which leaves enough time to develop an ethical framework for this approach. To put things into perspective, another co-author, Dr. Brett Kagan, demonstrated in a December 2022 study how a flat culture of brain cells was able to learn to play Pong.
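A quick calculation using the figures above shows how large that scaling gap is (a minimal sketch based only on the cell counts quoted in the article):

```python
# How far today's organoids are from the cell count Hartung considers useful.
CURRENT_CELLS = 50_000        # cells in organoids built so far
TARGET_CELLS = 10_000_000     # cell count needed for practical OI

scale_factor = TARGET_CELLS / CURRENT_CELLS
print(f"Organoids would need to grow roughly {scale_factor:.0f}x larger")  # ~200x
```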
Apart from boosting the processing power of AI applications, OI could also help identify cures for neurodegenerative diseases such as Alzheimer’s, or even uncover substances that boost human learning and memory.
Source(s)
via Techradar