
Research

Perceptual & Neural Mechanisms of Early Audio-Visual Integration

Although our perceptual experience is ubiquitously multisensory, little is known about how the brain integrates sensory signals registered in separate sensory pathways. My work focuses on the integration of auditory and visual information at the perceptual level. I developed an experimental design using the motion aftereffect (MAE) to pinpoint the perceptual/sensory stage of audio-visual integration.

Three related projects were carried out to address this question:

Genuine sensory integration of audio-visual motion

  • Investigating whether audio-visual information can be integrated at the perceptual and sensory stages of motion perception.

  • Park, M., Blake, R., Kim, Y., & Kim, C. Y. (2019). Congruent audio-visual stimulation during adaptation modulates the subsequently experienced visual motion aftereffect. Scientific Reports, 9(1), 1-11. [Paper]

  • Received the Graduate Award of Thesis Excellence with an accompanying scholarship, Korea University.

Decoding audio-visual direction congruency in the early visual cortex

  • Examining whether the early sensory cortex (e.g., visual cortex) is involved in audio-visual integration, applying multi-voxel pattern analysis (MVPA) to fMRI data.

  • Park, M., & Kim, C. Y. (In preparation). Audio-visual integration during adaptation modulates brain activity in the early visual cortex.

Audio-visual interactions outside visual awareness

  • Exploring whether audible sound potentiates visual adapting motion rendered invisible by continuous flash suppression (CFS).

  • Park, M., Blake, R., & Kim, C. Y. (2024). Audio-visual interactions outside visual awareness during motion adaptation. [Paper]

Cross-Modal Interactions

Cross-modal interactions in visuo-haptic perception & intra-modal correspondences between color and shape

  • Exploring non-random associations between color (hue, luminance, saturation) and shape complexity, and whether these associations are strengthened when haptic input is added to visual exploration.

  • Song, J., Shin, H., Park, M., Nam, S., & Kim, C. Y. (2022). Complex shapes are bluish, darker, and more saturated: Shape-color correspondence in 3D object perception. Frontiers in Psychology, 13, 854574. [Paper]

  • Role: Generating 3D shape models, designing & conducting experiments, analyzing data, presenting results, and revising & reviewing manuscripts.

  • R&D project in collaboration with AMOREPACIFIC Corp.

  • Received an award for Excellent Academic Presentation.

Temporal order judgment in audio-visual events

  • Found that the ability to judge temporal order varies across individuals, with notable difficulty when auditory events closely follow visual events.

  • Wen, P., Opoku-Baah, C., Park, M., & Blake, R. (2020). Judging relative onsets and offsets of audiovisual events. Vision, 4(1), 17. [Paper]

  • Role: Creating stimuli, designing experiments, analyzing data, and reviewing manuscripts.

Predictive Processing in Perception

Bias in the perceived location of a sound toward a predicted visual location

  • Testing whether the ventriloquist effect can occur with a predicted but not physically presented flash.

  • Role: Guiding a graduate student, conceiving the idea, developing experiments, and analyzing data.

Perceptual facilitation by an expected visual feature

  • Testing whether expecting a specific orientation enhances sensitivity to a target's orientation.

  • Role: Guiding a graduate student, conceiving the idea, designing experiments, and analyzing data.

Top-Down Influences on Visual Perception

The perceptual system interprets incoming sensory information according to our expectations and prior knowledge; this top-down processing helps make sense of everything our senses deliver. Studying top-down influences on perception offers clues to how the brain binds sensory information into a unified percept. My specific interest is in whether top-down influences truly modulate what we see. I participated in two projects related to this interest.

The impact of social meaning acquired through associative learning on perceptual selection

  • Investigating how social meaning acquired through associative learning affects perceptual selection and brain activity during binocular rivalry.

  • Role: Designing experiments, performing data collection, and analyzing data.

  • Whang, S. Y., Park, M., Lee, M., & Kim, C. Y. (2021). Brain activity reflecting social values associated with faces during binocular rivalry. Journal of Vision, 21(9), 2890. [Poster]

Functional plasticity in hMT+ induced by learning of implied motion

  • Examining functional plasticity of the human motion-sensitive area (hMT+) following learning of implied motion in 2D abstract paintings.

  • Role: Performing data collection, analyzing data, and writing manuscripts.
