How computers built with human neurons could cut emissions

A Swiss startup wants to upend the conventional way of building artificial intelligence (AI) models: instead of relying on digital chip processors, it believes the world needs biological ones that use far less energy.

Founded in 2014 in Vevey by Martin Kutter and Fred Jordan, FinalSpark says it has tested 10 million living neurons and that research is already underway on building thinking machines from live human neurons derived from skin cells.

The startup is growing neurons in cell cultures to demonstrate self-sustaining computing capabilities that could underpin future AI models.

How does this work?

Using electrical signals delivered over a wire, the startup is training human neurons to process information the way the human brain does. Co-founder Fred Jordan told Quartz that his team has been testing futuristic approaches to AI, such as in silico spiking neural networks and genetic programming. The aim is to achieve artificial general intelligence.
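To make the term concrete, the sketch below shows a single leaky integrate-and-fire neuron, the simplest building block from which in silico spiking neural networks are typically assembled. It is a generic textbook illustration, not FinalSpark's code, and every name and parameter value in it is hypothetical.

```python
# Illustrative sketch of a leaky integrate-and-fire (LIF) spiking neuron.
# Generic textbook model only; not FinalSpark's implementation, and all
# parameter values are hypothetical.
import numpy as np

def simulate_lif(input_current, dt=1e-3, tau=20e-3, v_rest=-65.0,
                 v_reset=-70.0, v_threshold=-50.0, resistance=10.0):
    """Integrate the membrane potential over time and record spike times."""
    v = v_rest
    spike_times = []
    for step, i_in in enumerate(input_current):
        # Leaky integration: the potential decays toward rest and is
        # pushed upward by the injected current.
        v += (-(v - v_rest) + resistance * i_in) * (dt / tau)
        if v >= v_threshold:          # the neuron "fires" a discrete spike
            spike_times.append(step * dt)
            v = v_reset               # then resets its membrane potential
    return spike_times

# A constant input for 200 ms produces a regular train of spikes.
spikes = simulate_lif(np.full(200, 2.0))
print(f"{len(spikes)} spikes in 200 ms")
```

Unlike conventional artificial neural networks, which pass continuous values between layers, spiking models like this one communicate through discrete pulses, which is closer to how biological neurons behave.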

Unlike current AI models, which mimic human thinking only after months of training on vast amounts of data, FinalSpark wants to achieve genuine human-like reasoning, capable of analyzing emotions and creating new ideas and concepts outside its own experience. “This is what a ‘real’ thinking machine should do,” Jordan said, “since the best-known processor of information is a human neuron.”

The company wants to lead a shift from artificial engineering to biological engineering, predicting that DNA data storage may one day outperform cloud storage in sustainability and efficiency. The website for its lab shows live demonstrations of its biochips working in real time.

In February, scientists led by researchers at Johns Hopkins University in Baltimore detailed a plan for what they called “organoid intelligence.” The idea is to build a thinking system from tiny three-dimensional neural structures grown from human stem cells, connected to sensors and output devices and trained through machine learning. “Looking at this trend, one can imagine that biological neural networks could also replace artificial neural networks for many computing applications, including AI,” Jordan added.

However, a cellular neuroscience paper published by Frontiers notes that synthetic biological intelligence is still in its nascent stage and that obstacles remain in its path. What needs to be improved, the paper states, is the “accuracy and efficiency of the AI algorithms used to analyze the data, as well as the reproducibility of the synthetic biological models themselves.” With continued advancement in this field, “synthetic biological intelligence has the potential to revolutionize the field of medicine,” the paper says.

Shifting away from energy-intensive training of large language models

It is no surprise that continuously training AI algorithms on billions of data points consumes a great deal of energy, and the data centers that do this work require huge amounts of water to stay cool. Data centers around the world emit more carbon than the commercial airline industry. Training a single AI model or chatbot, for instance, can use more electricity than 100 US homes do in a year.

The human brain, which has an estimated storage capacity of 2,500 terabytes, contains at least 86 billion neurons yet uses only about 10 watts of power for its daily computing. It also performs roughly 1,000 trillion operations per second while consuming no more energy than a dim light bulb. To match the brain’s organic computing power, silicon AI chips would require some 10 megawatts, which means human neurons cut energy usage by a factor of a million.
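As a back-of-the-envelope check, that millionfold figure follows directly from the two power numbers cited in this article; the snippet below is only a sketch using those cited values.

```python
# Sanity check of the efficiency factor, using only the figures quoted above.
brain_power_w = 10          # watts, as cited for the human brain
silicon_power_w = 10e6      # 10 megawatts, as cited for equivalent silicon AI chips

factor = silicon_power_w / brain_power_w
print(f"Energy factor: {factor:,.0f}x")   # -> 1,000,000x, i.e. a millionfold saving
```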

“If we can use alternative hardware—such as living neurons—for AI models with the same or better result—this is the best way to stop increasing carbon emissions from AI models,” Jordan said.
