March 18: 4:00 in PY 101
Perceptual-Motor Learning in Infancy
Department of Psychological Sciences
The first two years of life are marked by profound, rapid changes in human behavior and abilities. Much remains to be understood about these developments, including how infants' developing motor skills enable and constrain their learning about objects. One important form of object interaction that emerges early in life is tool use: as soon as infants begin taking control of tools, they can use them instrumentally to accomplish certain tasks. In this talk I will describe research investigating how infants' experiences using objects to act on other objects facilitate various aspects of their object exploration, object knowledge, and object-directed action. These findings show evidence for multiple factors influencing the development of infants' behavior and abilities, including their own motivation to accomplish certain activities and their tendency to notice the consequences of their own actions. (Background Paper)
April 15: 4:00 in PY 101
Using Analogy to Discover the Meaning of Images
Computer Science Department
Portland State University
Enabling computers to understand images remains one of the hardest open problems in artificial intelligence. No machine vision system comes close to matching human ability at identifying the contents of images or visual scenes or at recognizing similarity between different scenes, even though such abilities pervade human cognition. In this talk I will describe research on bridging the gap between low-level perception and higher-level image understanding by integrating a cognitive model of pattern recognition and analogy-making with a neural model of the visual cortex. (Background Paper)
September 24: 4:00 in PY 101
Mining Event-Related Brain Dynamics
Swartz Center for Computational Neuroscience
University of California, San Diego
Electroencephalography (EEG), the recording of electric potentials produced by the partial areal synchrony of electrical field activity in cortical neuropil, was the earliest and is still the most widely known, most portable, least expensive, and least invasive brain imaging modality. However, for a variety of reasons, until recently EEG imaging received inadequate attention from engineers and applied mathematicians, leaving open the important question of how to extract more of its biologically and psychologically relevant information. Today, neurologists still typically review clinical EEG 'squiggles' by visual inspection alone, and most psychophysiologists consider only peaks in scalp-recorded event-related potential (ERP) averages, thereby ignoring 90-99% of the recorded EEG signals. It is now generally accepted that spatiotemporal changes in EEG activity patterns correlate with changes in cognitive arousal, attention, intention, evaluation, and the like, thereby providing a high temporal-resolution window on the brain/mind. However, the biological mechanisms that link EEG patterns to these and other aspects of cognition are not understood in much detail.
In the last two decades, more adequate signal processing methods, made feasible by ever-faster computers, have greatly increased the amount of meaningful information about brain/mental function that can be mined from high-density EEG signals. My laboratory, SCCN, continues to develop the open-source EEGLAB signal processing environment for Matlab (Delorme & Makeig, 2004), which in particular implements EEG source imaging based on independent component analysis (ICA) and time/frequency analysis. We are also working to develop a new imaging modality, mobile brain/body imaging (MoBI), that combines portable high-density EEG ('what the brain does') with full-body motion capture, eye gaze tracking, and behavioral response recording ('what the brain controls') to better understand and monitor what might be called our 'natural cognition', which guides and evaluates our naturally motivated actions -- and interactions. Using MoBI, macroscopic changes in cortical field synchrony, including interactions between multiple brain areas timed precisely to our actions (and interactions), can be detected and modeled, hopefully leading to better basic understanding of the brain dynamics supporting our daily living (and their pathologies).
In coming years, as well, more adequate, near real-time EEG signal processing for feature extraction and state prediction or recognition, in combination with fast-developing non-invasive, dry, wireless and wearable EEG and other biosensor systems, will likely produce meaningful 3-D functional brain imaging and brain-computer interface (BCI) applications for a wide range of purposes. Thus EEG, the oldest brain imaging modality, is rapidly becoming a 'new' and important imaging modality, both for basic neuroscience and for the quickly evolving field of 'neurotechnology' applications. (Background Paper, Optional Additional Paper)
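The ICA-based source separation described in this abstract can be illustrated with a minimal toy sketch. This is not the EEGLAB implementation (which runs in Matlab); it uses scikit-learn's FastICA on synthetic signals, and all signal and matrix names here are illustrative assumptions:

```python
import numpy as np
from sklearn.decomposition import FastICA  # stand-in for EEGLAB's ICA tools

# Two synthetic "source" signals standing in for independent cortical processes
rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)
S = np.c_[np.sin(7 * t),               # oscillatory source
          np.sign(np.sin(3 * t))]      # square-wave source

# Linearly mix the sources, as scalp electrodes mix cortical field activity
A = np.array([[1.0, 0.5],
              [0.5, 1.0]])             # hypothetical mixing (forward) matrix
X = S @ A.T + 0.05 * rng.standard_normal(S.shape)  # "scalp channels" + noise

# ICA recovers the sources from the mixtures alone, up to permutation and sign
S_est = FastICA(n_components=2, random_state=0).fit_transform(X)
```

The key point, as in EEG source imaging, is that ICA uses only the statistical independence of the sources to unmix them; no knowledge of the mixing matrix is required.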
October 22: 4:00 in PY 101
The Neuroeconomics of Simple Choice
Economics and Neuroscience
Neuroeconomics seeks to characterize the computational and neurobiological basis of different types of decisions. This talk will discuss a series of studies designed to understand how the brain makes simple choices, such as whether to choose an apple or an orange, as well as the quality of the resulting decision. This includes understanding how the brain assigns value to stimuli at the time of choice, how values are computed to make a choice and to generate the motor movements necessary to implement it, and how these basic processes extend to more complex choices. (Background Paper 1, Background Paper 2)
Spring 2012
February 6: Brian Scassellati, Department of Computer Science, Yale University, "Using Human-Robot Interactions to Study Human-Human Social Behavior"
March 19: John Spencer, Department of Psychology, University of Iowa, "The Integration Challenge in Cognitive Science and the Promise of Embodied Neural Dynamics"
Fall 2011
October 24: Jeffrey Krichmar, Dept. of Cognitive Sciences and Dept. of Computer Science, University of California at Irvine, "Neuromodulation as a Brain-Inspired Strategy for Controlling Autonomous Robots and a Means to Investigate Social Cognition during Human-Robot Interactions"
November 28: Anthony Chemero, Psychology, Franklin & Marshall College, "Interaction-Dominant Dynamics, Phenomenology, and Extended Cognition"
Spring 2011
January 24: Hod Lipson, Mechanical and Aerospace Engineering, Cornell University, "Self-reflective machines"
March 21: Lawrence Barsalou, Department of Psychology, Emory University, "Grounding Knowledge in the Brain's Modal Systems"
April 4: Susan Goldin-Meadow, Department of Psychology, University of Chicago, "How Our Hands Help Us Think"
Fall 2010
September 13: Michael Richardson, Center for Cognition, Action and Perception, Psychology Department, University of Cincinnati, "Affording structured coordination: An ecological approach to self-organized social action"
October 18: Ennio Mingolla, Department of Cognitive and Neural Systems, Department of Psychology, Boston University, "Neural models of visually-guided steering, obstacle avoidance, and route selection"
Spring 2010
February 22: Mary Hayhoe, Dept. of Psychology, The University of Texas at Austin, "Adaptive Control of Attention and Gaze in the Natural World".
February 24: Dana Ballard, Dept. of Computer Sciences, The University of Texas at Austin, "Modular Reinforcement Learning as a Model of Embodied Cognition".
April 5: Asif Ghazanfar, Dept. of Psychology, Princeton University, "Vocal Communication Through Coupled Oscillations: Substrates for the Evolution of Speech".