Adult patients with late blindness from retinitis pigmentosa whose visual perception is restored with a retinal prosthesis can also experience improvements in auditory-visual tasks, such as spatial location matching, according to a report published in Vision Research. This successful interaction with audition was seen even in patients who experienced only limited restoration of visual perception. Auditory cueing may also improve the prosthesis’s visual functionality, the research shows.

The research reviewed 10 patients implanted with the Argus II retinal prosthesis and 10 sighted control individuals. The investigators say their findings imply that “remodeling of visual regions during blindness, and the challenges of artificial vision, do not prohibit the rehabilitation of these types of auditory-visual interactions.”

The Argus II’s constraints include low temporal resolution and a narrow field of view. The researchers sought to determine whether auditory-visual interactions could assist patients. The study patients participated in a crossmodal matching experiment. Of the 10 total, 6 also participated in a crossmodal matching task with randomized sound order, a visual search task, and a timed localization task. Also, 10 age-matched, sighted control individuals participated in the crossmodal matching experiments.

All experiments were performed in a dimly lit room. The participants sat before a computer monitor that displayed visual stimuli. Argus II patients were prevented from using natural visual perception and encouraged to feel the monitor’s edges to determine the visual stimuli’s position.

In the first experiment, the researchers tested auditory-visual spatial location matching (left, center, right sounds) and auditory stimuli pitch-visual stimuli elevation (high, middle, low pitch) matching by asking participants to locate the visual stimulus and choose the sound that best matched it.

Argus II patients correctly matched auditory and visual stimuli at rates “significantly above chance” and comparable in strength to those of age-matched sighted participants. The researchers noted no significant differences between the groups in the second experiment, which differed only in further randomizing the order of the multisensory stimuli.

In another experiment, patients using the Argus II were verbally trained to locate a visual search target that flashed amid distractors. Again, these patients’ localization of the visual target was “significantly above chance” for both the auditory-visual task (M = .92, SD = .10; t(5) = 14.61, P = 2.71×10⁻⁵) and the vision-alone task (M = .90, SD = .12; t(5) = 11.16, P = 1.01×10⁻⁴). They located the visual target significantly faster when auditory cues were present than when the visual stimuli appeared alone (paired t-test, t(5) = –3.03, P = .03).

In a localization task presenting both congruent and incongruent auditory-visual stimuli without visual clutter, participants localized the visual stimulus faster, on average, with congruent auditory-visual stimuli than with incongruent stimuli or visual stimuli alone.

Argus II patients showed significant mapping between auditory pitch and visual elevation, suggesting that crossmodal mapping for both operates at the same perceptual level. Two Argus II patients had “significantly faster” localization with auditory cueing driven by auditory-visual mappings, while 4 had “faster” localization for congruent relative to incongruent stimuli. The crossmodal mappings in Argus II patients were not significantly different from those of sighted controls, though Argus II patients showed greater variability in crossmodal interaction strength.

A cueing algorithm could benefit patients with ultra-low artificial vision, and relearned crossmodal interactions could enhance artificial vision by integrating it with natural auditory perception, the researchers say.

Disclosure: One study author declared affiliation with the biotech or pharmaceutical industries. Please see the original reference for a full list of authors’ disclosures.

Reference

Stiles NRB, Patel VR, Weiland JD. Multisensory perception in Argus II retinal prosthesis patients: Leveraging auditory-visual mappings to enhance prosthesis outcomes. Vision Res. Published online February 17, 2021. doi:10.1016/j.visres.2021.01.008