Booz Allen Hamilton Distinguished Colloquium in Electrical and Computer Engineering
"Energy Efficiency Past Moore's Law: Utilizing all the Physics Possible for Computation"
Professor Jennifer Hasler
School of Electrical and Computer Engineering
Increasing demand for portable electronics without a corresponding increase in battery capability results in an ever-increasing demand for computational resources in a constant-power environment. We effectively want to utilize the physics of computing systems more efficiently; the question is how to engineer such systems. These techniques are even more critical given the saturation of the computational energy efficiency of digital multiply-accumulate structures, the key component for high-performance computing.
One approach would be to consider using analog techniques. These approaches are fueled by recent advances in programmable and configurable large-scale analog circuits and systems, enabling a typical factor-of-1000 improvement in computational power (energy) efficiency over their digital counterparts. Large-Scale Field Programmable Analog Arrays (FPAAs), devices analogous to FPGAs, enable configurable analog approaches. Non-volatile analog memory is the capability that fuels all other innovations. We will overview a few examples in this area, including speech, vision, and sensor interfaces, as well as the CAD tools that enable design of these systems.
Another approach is to take inspiration from neurobiological systems to further improve the resulting energy efficiency. These silicon systems mimic extremely energy-efficient neural computing structures, potentially both for solving engineering applications and for understanding neural computation. These neuromorphic systems should enable additional increases in energy efficiency; recent research has demonstrated some of these examples. Scaling of energy efficiency, performance, and size impacts the application space of neuromorphic systems.
Therefore, research in analog signal processing, neuromorphic engineering, and programmable/configurable analog approaches provides opportunities for continued energy-efficiency scaling. Huge technological opportunities exist for smaller and more energy-efficient systems. At the same time, these advances require, and have been building, a framework to bring these techniques toward a systems perspective, undergoing a transformation similar to that seen in digital design through the early VLSI age.
Jennifer Hasler is a Professor in the School of Electrical and Computer Engineering at the Georgia Institute of Technology. Dr. Hasler received her M.S. and B.S.E. in Electrical Engineering from Arizona State University in 1991, and her Ph.D. in Computation and Neural Systems from the California Institute of Technology in 1997. Her current research interests include low-power electronics, mixed-signal system ICs, floating-gate MOS transistors, adaptive information processing systems, "smart" interfaces for sensors, cooperative analog-digital signal processing, device physics related to submicron and floating-gate devices, and analog VLSI models of on-chip learning and sensory processing in neurobiology. Dr. Hasler received the NSF CAREER Award in 2001 and the ONR YIP Award in 2002. She received the Paul Raphorst Best Paper Award from the IEEE Electron Devices Society in 1997, the IEEE CICC Best Paper Award in 2005, the IEEE ISCAS Sensors Best Paper Award in 2005, the Best Student Paper Award at the IEEE Ultrasound Symposium in 2006, and the Best Demonstration Paper Award at ISCAS 2010. Dr. Hasler is a Senior Member of the IEEE.