Indiana University Bloomington
NSF
An IGERT Training Program

The Dynamics of Brain-Body-Environment Systems in Behavior and Cognition

Spring 2014

March 31: 4:00 in PY 101
Perception Viewed as a Phenotypic Expression
Dennis Proffitt
Dept. of Psychology
University of Virginia

Visual experience relates the optically specified environment to people's ever-changing purposes and the embodied means by which these purposes are achieved. Depending upon their purpose, people turn themselves into walkers, throwers, graspers, etc., and in so doing, they perceive the world in relation to what they have become. People transform their phenotype to achieve ends and scale their perceptions with that aspect of their phenotype that is relevant for their purposive action. Within near space, apparent distances are scaled by morphology, in particular by the extent of an actor's reach. For large environments, such as fields and hills, spatial layout is scaled by changes in physiology - the bioenergetic costs of walking relative to the bioenergetic resources currently available. When appropriate, behavioral performance scales apparent size; for example, a golf hole looks bigger to golfers when they are putting well. Research findings show that perception is influenced by both manipulations of and individual differences in people's purposive action capabilities. (Background Paper)

April 7: 4:00 in PY 101
Neurocognitive modeling of perceptual decision making
Thomas Palmeri
Psychological Sciences
Vanderbilt University

Mathematical psychology and systems neuroscience have converged on stochastic accumulation of evidence models as a general theoretical framework to explain the time course of perceptual decision making. I describe collaborative research that aims to forge strong connections between computational and neural levels of explanation in the context of saccade decisions by awake behaving monkeys. Viable models predict both the dynamics of monkey behavior and the dynamics of neural activity. I describe how neural measures can be used to constrain models, how model predictions of neural measures can be used for model selection, and how cognitive models inform understanding of neurophysiological signals. I also discuss strengths and limitations of mapping between computational and neural levels. (Background Paper 1, Background Paper 2)

April 21: 4:00 in PY 101
Individual differences as a crucible for testing embodied theories of language comprehension
Arthur Glenberg
Dept. of Psychology
Arizona State University

I will present the results from two projects that examine individual differences in the embodiment of language comprehension. Both projects test the claim that language comprehension is a simulation process that uses neural and bodily systems of perception, action, and emotion. But are embodied effects found only when reading special texts designed to elicit them, or is simulation the basis for comprehension of all types of texts? If the latter is correct, then people who best simulate should best understand. This prediction was tested in the first project. We first measured reading comprehension skill using the Gates-MacGinitie standardized reading test, and then the same participants read a passage using Zwaan & Taylor's reading-by-rotation paradigm, which provides a measure of embodiment. The embodiment prediction is a positive correlation between the two measures, whereas non-embodied positions predict either a negative correlation (simulation is a waste of resources) or a zero correlation (simulation is epiphenomenal). The second project takes seriously the claim that bodily resources contribute to simulation. In this case, large (tall/heavy) people should find it relatively easier to understand sentences such as "You pushed the SUV to the gas station for a fill-up" compared to sentences such as "You entered the house by crawling through the doggie door," and smaller people should show the reverse. (Background Paper)


Fall 2013

September 9: Morana Alac, Communication and Science Studies, University of California, San Diego, "Being social as rooted in practical situations of interaction"

October 21: William Warren, Dept. of Cognitive, Linguistic and Psychological Sciences, Brown University, "Self-organization in human crowds: From individual to collective behavior"

December 2: Nicholas Turk-Browne, Dept. of Psychology, Princeton University, "Statistical learning in the mind and brain"


Spring 2013

March 18: Amy Needham, Department of Psychological Sciences, Vanderbilt University, "Perceptual-Motor Learning in Infancy"

April 15: Melanie Mitchell, Computer Science Department, Portland State University, "Using Analogy to Discover the Meaning of Images"


Fall 2012

September 24: Scott Makeig, Swartz Center for Computational Neuroscience, University of California, San Diego, "Mining Event-Related Brain Dynamics"

October 22: Antonio Rangel, Economics and Neuroscience, Caltech, "The Neuroeconomics of Simple Choice"


Spring 2012

February 6: Brian Scassellati, Department of Computer Science, Yale University, "Using Human-Robot Interactions to Study Human-Human Social Behavior"

March 19: John Spencer, Department of Psychology, University of Iowa, "The Integration Challenge in Cognitive Science and the Promise of Embodied Neural Dynamics"


Fall 2011

October 24: Jeffrey Krichmar, Dept. of Cognitive Sciences and Dept. of Computer Science, University of California, Irvine, "Neuromodulation as a Brain-Inspired Strategy for Controlling Autonomous Robots and a Means to Investigate Social Cognition during Human-Robot Interactions"

November 28: Anthony Chemero, Psychology, Franklin & Marshall College, "Interaction-Dominant Dynamics, Phenomenology, and Extended Cognition"


Spring 2011

January 24: Hod Lipson, Mechanical and Aerospace Engineering, Cornell University, "Self-reflective machines"

March 21: Lawrence Barsalou, Department of Psychology, Emory University, "Grounding Knowledge in the Brain's Modal Systems"

April 4: Susan Goldin-Meadow, Department of Psychology, University of Chicago, "How Our Hands Help Us Think"


Fall 2010

September 13: Michael Richardson, Center for Cognition, Action and Perception, Psychology Department, University of Cincinnati, "Affording structured coordination: An ecological approach to self-organized social action"

October 18: Ennio Mingolla, Department of Cognitive and Neural Systems, Department of Psychology, Boston University, "Neural models of visually-guided steering, obstacle avoidance, and route selection"


Spring 2010

February 22: Mary Hayhoe, Dept. of Psychology, The University of Texas at Austin, "Adaptive Control of Attention and Gaze in the Natural World"

February 24: Dana Ballard, Dept. of Computer Sciences, The University of Texas at Austin, "Modular Reinforcement Learning as a Model of Embodied Cognition"

April 5: Asif Ghazanfar, Dept. of Psychology, Princeton University, "Vocal Communication Through Coupled Oscillations: Substrates for the Evolution of Speech"