Time: 3:00 - 4:00 p.m.
A reception will be held in the café at 2:30 p.m.
Abstract: In the past decade, we have witnessed spectacular successes of machine learning in solving key AI problems, including pattern recognition, image and speech recognition, robot control and autonomy, and game playing, such as chess and Go. Today's AI landscape is dominated by powerful deep learning approaches, which exploit massive data resources and the availability of cheap computing power. They are propelled by demand from various application domains, such as handheld devices and the Internet. Deep learning approaches build on developments in neural networks over the past 40 years.
In this talk we analyze recent AI successes and possible ways to expand on these achievements. We study recurrent network architectures and the corresponding dynamical processes, leading to dynamical memories motivated by brain dynamics. We prove the existence of phase transitions between dynamical states (fixed points, limit cycles, chaos) in a class of cellular neural networks. Specific applications include knowledge elicitation from big data, development of new powerful dynamic memories, self-organized ontogenetic development of autonomous navigation and control, and brain-computer interfaces.
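The transition between dynamical regimes mentioned above can be illustrated with a toy experiment. The sketch below is a generic random recurrent network (a standard rate model, not the specific cellular neural network class analyzed in the talk): at a low coupling gain the activity decays to a fixed point, while at a high gain the same network sustains irregular, chaos-like fluctuations. The network size, gain values, and fluctuation measure are illustrative choices, not parameters from the talk.

```python
import numpy as np

def late_time_fluctuation(gain, n=50, steps=500, tail_len=100, seed=0):
    """Run x_{t+1} = tanh(gain * W x_t) and return the largest per-unit
    standard deviation of the activity over the last `tail_len` steps.
    Near-zero output indicates convergence to a fixed point; a large
    value indicates sustained (cyclic or chaotic) activity."""
    rng = np.random.default_rng(seed)
    # Random weights scaled so the spectral radius of W is ~1;
    # `gain` then controls the effective coupling strength.
    W = rng.normal(0.0, 1.0 / np.sqrt(n), (n, n))
    x = rng.normal(0.0, 1.0, n)
    for _ in range(steps):          # burn-in: let transients die out
        x = np.tanh(gain * W @ x)
    tail = []
    for _ in range(tail_len):       # record the late-time trajectory
        x = np.tanh(gain * W @ x)
        tail.append(x.copy())
    return np.array(tail).std(axis=0).max()

fluct_low = late_time_fluctuation(gain=0.5)   # weak coupling: fixed point
fluct_high = late_time_fluctuation(gain=2.5)  # strong coupling: sustained activity
```

For weak coupling the map is a contraction near the origin, so activity collapses to the zero fixed point; past a critical gain the same architecture exhibits self-sustained irregular dynamics, the kind of qualitative change the abstract refers to as a phase transition.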
Bio: Dr. Kozma is Visiting Professor at CICS, University of Massachusetts Amherst, and Co-Director of the BINDS Lab, where he coordinates a DARPA initiative on brain-inspired AI research. He is also Professor of Mathematics at the University of Memphis, TN, and Director of the Center for Large-Scale Intelligent Optimization and Networks (CLION). Dr. Kozma is a Fellow of the IEEE and a Fellow of the International Neural Network Society (INNS). He has held various leadership roles in the computational intelligence and AI community, including serving as President of INNS (2017-2018).
Faculty Host: Erik Learned-Miller