NSF Award Abstract - #0427372 |
NSF Org | IIS |
Latest Amendment Date | September 22, 2004 |
Award Number | 0427372 |
Award Instrument | Continuing grant |
Program Manager |
Junku Yuh, IIS Division of Information & Intelligent Systems, CSE Directorate for Computer & Information Science & Engineering |
Start Date | December 1, 2004 |
Expires | November 30, 2005 (Estimated) |
Awarded Amount to Date | $240,000 |
Investigator(s) |
Alan Bovik bovik@ece.utexas.edu (Principal Investigator)
Eyal Seidemann (Co-Principal Investigator) |
Sponsor |
University of Texas at Austin, P.O. Box 7726, Austin, TX 78713, 512/471-6424 |
NSF Program(s) | ITR FOR NATIONAL PRIORITIES |
Field Application(s) |
0104000 Information Systems, 0116000 Human Subjects |
Program Reference Code(s) |
|
Program Element Code(s) |
|
Project Abstract

This study is directed toward developing flexible, general-purpose Visual Search systems capable of searching for objects in real, cluttered environments. The research will include extensive psychophysical and physiological experiments on humans and non-human primates, and the findings will inform prototype artificial systems that mimic this behavior. The goals of the study are divided into four Aims:

Aim 1: Develop and prototype a revolutionary camera gaze-control device dubbed the Remote High-Speed Active Visual Environment, or RHAVEN. RHAVEN will allow telepresent control of the gaze of a remote camera using eye movements, as rapidly and naturally as if viewing the scene directly.

Aim 2: Develop optimal statistical bounds on Visual Search by casting it as a Bayesian problem, yielding maximum a posteriori (MAP) solutions for (1) finding a target in a visual scene using the smallest number of fixations and (2) selecting the next fixation given the current one.

Aim 3: Construct models for Visual Search based on Natural Scene Statistics at the point of gaze. Visually important image structures can be inferred by analyzing the statistics of natural scenes sampled by eye movements and fixations.

Aim 4: Conduct neurophysiological studies on awake, behaving primates during Visual Search tasks: measure and analyze search performance in awake, behaving monkeys while recording the responses of neural populations in the brain's frontal eye fields (FEF), which help control saccadic eye movements.

Broader Impact: The results of this research should significantly impact numerous National Priorities: Searching Large Visual Databases, Robotic Navigation, Security Imaging, Biomedical Search, Visual Neuroscience, and many others. It is easy to envision scenarios that would benefit from a fundamental theory of Visual Search.
For example: searching for suspect faces in airport security systems; examining internet streams for questionable material; semi-automatic search for lesions in mammograms; steering robotic vehicles around obstacles in hostile environments; and navigating huge visual data libraries.