I believe that there are fundamental principles of computation that we can learn by studying neurobiology. If we can understand how biological information-processing systems operate, then we can learn how to build circuits and systems that deal naturally with real-world data. My goal is to investigate the organizational and adaptive principles on which neural systems operate, and to build silicon integrated circuits that compute using these principles. I call my approach silicon neuroscience: the development of neurally inspired silicon learning systems.
I have developed, in a standard CMOS process, a family of single-transistor devices that I call synapse transistors. Like neural synapses, synapse transistors provide nonvolatile analog memory, compute the product of this stored memory and an applied input, and allow bidirectional memory updates; furthermore, they perform the analog computation and determine their own memory updates simultaneously and locally. I have fabricated a synaptic array that affords a high synapse-transistor density, mimics the low power consumption of nervous tissue, and combines fast, parallel computation with slow, local adaptation. Like nervous tissue, the array computes and updates its nonvolatile analog memory simultaneously and in parallel.
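To make this computational abstraction concrete, the sketch below models such an array in software. It is a behavioral caricature under assumptions not drawn from this section, not a circuit description: the nonvolatile analog memory is represented as an ordinary weight matrix, the analog computation as a weight-input product summed along each row (as currents would sum on a shared wire), and the local adaptation as a simple product of locally available signals. The names (SynapseArray, learning_rate, adapt) are illustrative, not taken from the fabricated device.

    import numpy as np

    class SynapseArray:
        """Behavioral sketch of a synapse-transistor array (illustrative)."""

        def __init__(self, n_inputs, n_outputs, learning_rate=1e-3):
            # Nonvolatile analog memory: one stored weight per synapse.
            self.w = np.random.uniform(0.5, 1.5, size=(n_outputs, n_inputs))
            self.learning_rate = learning_rate

        def compute(self, x):
            # Fast, parallel analog computation: each synapse outputs the
            # product of its stored memory and the applied input, and the
            # products sum along each row.
            return self.w @ x

        def adapt(self, x, feedback):
            # Slow, local adaptation: each synapse updates its own memory
            # from signals available at the device (its input and a
            # row-wise feedback term). The update is bidirectional: the
            # sign of the feedback raises or lowers the stored weight.
            self.w += self.learning_rate * np.outer(feedback, x)

    # In the silicon array the compute and adapt steps are not alternated
    # in software as they are here; they occur simultaneously and
    # continuously in every device.
    array = SynapseArray(n_inputs=4, n_outputs=2)
    x = np.array([0.2, 1.0, 0.5, 0.8])
    y = array.compute(x)        # parallel multiply-accumulate
    array.adapt(x, feedback=-y) # local, bidirectional memory update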
Although I do not believe that a single transistor can completely model the complex behavior of a neural synapse, my synapse transistors do implement a local learning function. I consider their development to be a first step toward my goal of a silicon learning system.