
Gregory Zelinsky, Ph.D.


Ph.D., Brown University (1994)
Professor, Cognitive Science
Joint Appointment: Associate Professor, Computer Science

Dr. Gregory Zelinsky will not be reviewing graduate student applications for the 2024-2025 academic year.


Contact:

gregory.zelinsky@stonybrook.edu
Office: Psychology B-240
Phone: (631) 632-7827


Research Interests:

Visual cognition, visual search, eye movements, visual attention, visual working memory, scene perception.

Current Research:

My work integrates cognitive, computational, and neuroimaging techniques to better understand a broad range of visual cognitive behaviors, including search, object representation, working memory, and scene perception. In one current project, I monitor how people move their eyes as they perform various visual search tasks, then describe this oculomotor behavior in the context of an image-based neurocomputational model. The model "sees" the same stimuli presented to the human observers and outputs a sequence of simulated eye movements as it performs the identical task. This simulated pattern of eye movements is then compared to the human behavior to evaluate and refine the model.
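
To make the last step concrete, the comparison between model and observer can be as simple as measuring how far the simulated fixations fall from the corresponding human fixations. Below is a minimal sketch in Python (the function name and sample coordinates are illustrative assumptions, not code from the project):

    import numpy as np

    def scanpath_distance(human_fixations, model_fixations):
        """Mean Euclidean distance (in pixels) between corresponding fixations."""
        human = np.asarray(human_fixations, dtype=float)
        model = np.asarray(model_fixations, dtype=float)
        n = min(len(human), len(model))  # compare only paired fixations
        return float(np.mean(np.linalg.norm(human[:n] - model[:n], axis=1)))

    # Example: one observer's recorded scanpath vs. the model's simulated scanpath.
    human = [(512, 384), (300, 220), (640, 410)]   # eye-tracker fixations (x, y)
    model = [(512, 384), (310, 240), (600, 400)]   # model-generated fixations (x, y)
    print(scanpath_distance(human, model))         # smaller values = closer agreement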

Representative Publications:

Schmidt, J. & Zelinsky, G. J. (2017). Adding details to the attentional template offsets search difficulty: Evidence from contralateral delay activity. Journal of Experimental Psychology: Human Perception and Performance, 43(3), 429-437. 

Adeli, H., Vitu, F., & Zelinsky, G. J. (2017). A model of the superior colliculus predicts fixation locations during scene viewing and visual search. Journal of Neuroscience, 37(6), 1453-1467.

Wei, Z., Adeli, H., Zelinsky, G. J., Samaras, D., & Hoai, M. (2016). Learned region sparsity and diversity also predict visual attention. Advances in Neural Information Processing Systems (NIPS 2016).

Yu, C-P., Maxfield, J. T., & Zelinsky, G. J. (2016). Searching for category-consistent features: A computational approach to understanding visual category representation. Psychological Science, 27(6), 870-884.

Ryoo, J., Yun, K., Samaras, D., Das, S. R., & Zelinsky, G. J. (2016). Design and evaluation of a foveated video streaming service for commodity client devices. In Proceedings of the 7th International Conference on Multimedia Systems (ACM MMSys '16). Article No. 6.

Yu, C-P., Le, H., Zelinsky, G. J., & Samaras, D. (2015). Efficient video segmentation using parametric graph partitioning. International Conference on Computer Vision (ICCV), Santiago, Chile.

Zelinsky, G. J., & Yu, C-P. (2015). Clutter perception is invariant to image size. Vision Research, Special Issue on Computational Models of Attention, 116, 142-151.

Zelinsky, G. J., & Bisley, J. W. (2015). The what, where, and why of priority maps and their interactions with visual working memory. Annals of the New York Academy of Sciences, 1339, 154-164. DOI: 10.1111/nyas.12606

Maxfield, J. T., Stalder, W. D., & Zelinsky, G. J. (2014). Effects of target typicality on categorical search. Journal of Vision, 14(12):1, 1-11.

Yu, C-P., Samaras, D., & Zelinsky, G. J. (2014). Modeling visual clutter perception using proto-object segmentation. Journal of Vision, 14(7):4, 1-16.

Alexander, R. G., Schmidt, J., & Zelinsky, G. J. (2014). Are summary statistics enough? Evidence for the importance of shape in guiding visual search. Visual Cognition (Special Issue on Eye Movements and Visual Cognition: Honoring George W. McConkie), 22:3-4, 595-609.

Schmidt, J., MacNamara, A., Proudfit, G. H., & Zelinsky, G. J. (2014).  More target features in visual working memory leads to poorer search guidance: Evidence from Contralateral Delay Activity. Journal of Vision, 14(3):8, 1-19.

Zelinsky, G. J. (2013).  Understanding scene understanding. Frontiers in Psychology: Perception Science, 4:954, 1-3. doi: 10.3389/fpsyg.2013.00954

Zelinsky, G. J., Peng, Y., & Samaras, D. (2013).  Eye can read your mind: Using eye fixations to classify search targets. Journal of Vision, 13(14):10, 1-13. doi: 10.1167/13.14.10

Yu, C-P., Hua, W-Y., Samaras, D., & Zelinsky, G. J. (2013). Modeling clutter perception using parametric proto-object partitioning. Advances in Neural Information Processing Systems (NIPS 2013).

Zelinsky, G. J., Peng, Y., Berg, A. C., & Samaras, D. (2013). Modeling guidance and recognition in categorical search: Bridging human and computer object detection. Journal of Vision, 13(3):30, 1-20.  doi: 10.1167/13.3.30

Zelinsky, G. J., Adeli, H., Peng, Y., & Samaras, D. (2013). Modelling eye movements in a categorical search task. Phil. Trans. R. Soc. B, 368(1628), 20130058, 1-12.

Yun, K., Peng, Y., Samaras, D., Zelinsky, G. J., & Berg, T. L. (2013).  Studying relationships between human gaze, description, and computer vision. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 739-746.

Yun, K., Peng, Y., Samaras, D., Zelinsky, G. J., & Berg, T. L. (2013).  Exploring the role of gaze behavior and object detection in scene understanding. Frontiers in Psychology: Perception Science, 4:917.  doi:10.3389/fpsyg.2013.00917

Dickinson, C. A., & Zelinsky, G. J. (2013).  New evidence for strategic differences between static and dynamic search tasks: an individual observer analysis of eye movements. Frontiers in Psychology: Perception Science, 4(8), 1-18.

Maxfield, J. T., & Zelinsky, G. J. (2012). Searching through the hierarchy: How level of target categorization affects visual search. Visual Cognition, 20(10), 1153-1163.

MacNamara, A., Schmidt, J., Zelinsky, G. J., & Hajcak, G. (2012). Electrocortical and ocular indices of attention to fearful and neutral faces presented under high and low working memory load. Biological Psychology, 91, 349-356.

Alexander, R. G., & Zelinsky, G. J. (2012). Effects of part-based similarity on visual search: The Frankenbear Experiment.  Vision Research, 54, 20-30.

Brennan, S. E., Hanna, J. E., Zelinsky, G. J., & Savietta, K. J. (2012). Eye gaze cues for coordination in collaborative tasks. DUET 2012: Dual eye tracking in CSCW; Proceedings of the 2012 ACM Conference on Computer Supported Cooperative Work, Seattle, WA.

Zelinsky, G. J. (2012). TAM: Explaining off-object fixations and central fixation biases as effects of population averaging during search. Visual Cognition (Special Issue on Behavioral and Computational Approaches to Reading and Scene Perception), 20:4-5, 515-545.

Alexander, R.G., & Zelinsky, G. J. (2011). Visual similarity effects in categorical search. Journal of Vision, 11(8):9, 1-15.  doi: 10.1167/11.8.9  

Neider, M.B., & Zelinsky, G. J. (2011). Cutting through the clutter: Searching for targets in evolving complex scenes. Journal of Vision, 11(14):7, 1-16.

Schmidt, J., & Zelinsky, G. J. (2011).  Visual search guidance is best after a short delay. Vision Research, 51, 535-545.  

Zelinsky, G. J., Loschky, L.C., & Dickinson, C.A. (2011). Do object refixations during scene viewing indicate rehearsal in visual working memory?  Memory & Cognition, 39, 600-613.  

Alexander, R.G., Zhang, W., & Zelinsky, G. J. (2010). Visual similarity effects in categorical search. In S. Ohlsson & R. Catrambone (Eds.), Proceedings of the 32nd Annual Conference of the Cognitive Science Society (pp. 1222-1227).  Austin, TX: Cognitive Science Society.  

Neider, M.B., Chen, X., Dickinson, C., Brennan, S.E., & Zelinsky, G. J. (2010). Coordinating spatial referencing using shared gaze.  Psychonomic Bulletin & Review, 17(5), 718-724.

Neider, M.B., & Zelinsky, G. J. (2010).  Exploring the perceptual causes of search set-size effects in complex scenes.  Perception, 39, 780-794.  

Zelinsky, G. J., & Todor, A. (2010). The role of “rescue saccades” in tracking objects through occlusions. Journal of Vision, 10(14):29, 1-13.

Current Research Support:

National Institute of Mental Health (R01 MH063748-06A1), "Eye Movements During Real-world Visual Search: A behavioral and computational study."
4/1/09 - 3/31/14. $1,348,046 (total costs)
Gregory Zelinsky (Principal Investigator)

National Science Foundation (#0527585), "HSD: See Where I'm Looking: Using Shared Gaze to Coordinate Time-Critical Collaborative Tasks."
9/1/05 - 8/31/08. $742,006 (total costs)
Gregory Zelinsky (Principal Investigator)