Imagine teaching a robot to understand how one stimulus affects another, Pavlovian style. In my lab, a little Lego robot named Bit has learned that when he hears a high tone, he should jump back, in much the same way lab rats learned to associate a high tone with electric shock. Bit can credit his comprehension to his virtual brain, grown on a computer from virtual DNA. He is part of an exciting experiment that blends the tenets of genetic code and computer code.
From the advent of the Jacquard loom to the historic work in computing done by Alan Turing, John von Neumann, and other early 20th-century math wizards, we have been slowly “rediscovering” the power of the encoded instruction that has always existed within our very DNA. Whereas genetic code is a set of biological instructions that performs the task of growing us, we use computer code, a set of digital instructions, to program machines to perform our own tasks. Over time, we (try to) fix errors in our programs, add functionality, and make them better.
The universe performs a similar function via natural selection. It continually discards ineffective programs and reengineers more effective ones, better suited to survive. This evolution of genetic code has worked so well as to produce the most complex machine on the planet: the human brain.
There is no doubt that traditional programming methods are better suited for engineering simple systems quickly, but they also require a complete understanding of how a system works. Conversely, evolutionary algorithms only require a picture of what the end result should look like. The heavy lifting is done by the trial-and-error nature of artificial selection, which makes it perfect for the problem of recreating intelligence, where we don’t understand all the rules at play. Wouldn’t it be awesome if we could combine the best of both worlds—manually engineering the parts we understand, and letting evolution give us a helping hand with the parts we don’t—to quickly develop AI systems?
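To make that trial-and-error idea concrete, here is a minimal sketch of artificial selection in Python. It is not SynthNet’s algorithm; it is the textbook pattern the paragraph describes: you only specify what the end result should look like (a target bit string), and repeated mutation plus survival of the fittest does the rest. The target, fitness function, and parameter values are all invented for illustration.

```python
import random

TARGET = [1, 0, 1, 1, 0, 1, 0, 1]  # the "picture of the end result"

def fitness(genome):
    """Score a genome by how many positions match the target."""
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome, rate=0.1):
    """Flip each bit with a small probability."""
    return [1 - g if random.random() < rate else g for g in genome]

def evolve(pop_size=20, generations=100):
    # Start from random genomes; no knowledge of how to reach the target.
    population = [[random.randint(0, 1) for _ in TARGET] for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        if fitness(population[0]) == len(TARGET):
            break  # a genome matching the target has evolved
        # Artificial selection: keep the fittest half, refill with mutated copies.
        survivors = population[: pop_size // 2]
        population = survivors + [mutate(random.choice(survivors)) for _ in survivors]
    return max(population, key=fitness)

best = evolve()
```

Nothing in `evolve` knows *how* to build the target; the fitness function only says how close a candidate is, which is exactly the property that makes this approach attractive when we can recognize intelligence but cannot write down its rules.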
This is the heart of my work on SynthNet, a computer program I developed that emulates neural connections. The software combines code capable of emulating the neurophysiology of the brain (akin to other projects like NEURON and GENESIS) with code capable of mimicking the actions of human genes. Using it, neural structures can be grown using virtual DNA. Programmers can create this DNA themselves, or have the computer build it through artificial evolution.
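As a rough illustration of what “growing a neural structure from virtual DNA” can mean, here is a toy sketch in Python. The gene format (`NEURON` and `SYNAPSE` tuples), the field names, and the example genome are all hypothetical inventions for this article; SynthNet’s actual encoding is its own, and real simulators like NEURON model far richer neurophysiology.

```python
# Hypothetical "virtual DNA": each gene is a tuple the grower interprets.
#   ("NEURON", name, threshold)        -> create a neuron
#   ("SYNAPSE", src, dst, weight)      -> connect two neurons

def grow(dna):
    """Interpret gene tuples into a dictionary of neurons with weighted inputs."""
    neurons = {}
    for gene in dna:
        if gene[0] == "NEURON":
            _, name, threshold = gene
            neurons[name] = {"threshold": threshold, "inputs": []}
        elif gene[0] == "SYNAPSE":
            _, src, dst, weight = gene
            neurons[dst]["inputs"].append((src, weight))
    return neurons

dna = [
    ("NEURON", "ear", 0.5),
    ("NEURON", "motor", 0.5),
    ("SYNAPSE", "ear", "motor", 0.8),
]
brain = grow(dna)
```

The point of the indirection is that evolution can now operate on the flat list of genes (copying, mutating, recombining them) rather than on the grown network itself, just as biological evolution edits DNA rather than brains.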
In this way, we can design a neural framework using well-understood principles, then let evolution do its thing to get us the rest of the way. SynthNet has already shown promising success toward its goal of using genetics to grow an intelligent system, thanks to Bit. In that first experiment, I would touch Bit while the high tone played, causing him to move backward. He ignored the tone initially, but after I repeatedly touched him while simultaneously playing the tone, he learned to associate the two, eventually moving backward upon hearing the tone alone.
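Bit’s behavior is classical conditioning, and the standard neural account of it is Hebbian learning: a pathway that fires together with the response gets stronger. The sketch below is a minimal Python model of that idea, not SynthNet’s implementation; the unit, weights, and parameter values are illustrative assumptions.

```python
# Minimal classical conditioning via Hebbian learning: the tone->retreat
# weight strengthens whenever the tone co-occurs with the retreat response,
# until the tone alone crosses the response threshold.

class ConditionedUnit:
    def __init__(self, learning_rate=0.2, threshold=0.5):
        self.tone_weight = 0.0   # initially the tone means nothing
        self.touch_weight = 1.0  # touch is hard-wired to trigger retreat
        self.lr = learning_rate
        self.threshold = threshold

    def step(self, tone, touch):
        """Present stimuli (0 or 1); return whether the unit retreats."""
        activation = tone * self.tone_weight + touch * self.touch_weight
        retreats = activation >= self.threshold
        if tone and retreats:
            # Hebbian rule: strengthen the co-active tone pathway.
            self.tone_weight = min(1.0, self.tone_weight + self.lr)
        return retreats

bit = ConditionedUnit()
before = bit.step(tone=1, touch=0)  # tone alone: no response yet
for _ in range(5):
    bit.step(tone=1, touch=1)       # pair the tone with the touch
after = bit.step(tone=1, touch=0)   # tone alone now triggers the retreat
```

The same association emerges whether the weight update is hand-coded, as here, or produced by synaptic machinery grown from virtual DNA.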
SynthNet is capable of producing even more complex structures. My current work focuses on mirroring the functionality of a rudimentary hippocampus, capable of indexing and recalling declarative memory, then subjecting the DNA to evolution for increased functionality. The applications for this are endless: web-crawling artificial life that sniffs out specific types of information, resulting in truly intelligent search engines; brain-computer interface (BCI)-connected translators grown for speechless communication between individuals, thereby removing language and physical handicap barriers; forecasting agents evolved for economic or weather systems; even the holy grail of the AI community, sentient artificial intelligence connected to robotic systems.
To many, it may all sound like the stuff of Star Trek, but this future is demonstrably here for those ready to embrace it and use it to make the world a better place. It is simply a matter of funding and refining these tools, then putting them to work. I hope you’ll join me in this exciting next step!
This article is commissioned by Qualcomm Incorporated. The views expressed are the author’s own.