Computer Science and Software Engineering

CSSE Seminar Series (CSSESS)

Quick links: Past seminars · Future seminars · CSSESS Home


Seminar

~ But can the Holodeck do a good Pinot Noir? ~


Speaker
Assoc. Prof. Jeremy Cooperstock

Institute
McGill University

Time & Place
15:00 hrs, Thursday, 26 January, in Room 446, Erskine Building

All are welcome

Abstract

The Star Trek Holodeck experience was pitched as the ultimate in VR environments, perhaps embodying the dream of true telepresence, in which users receive the full sensory experience of being in another location. However, most so-called "telepresence" systems today offer little more than high-resolution displays, as if all one needs to achieve the illusion is Skype on a big screen. Despite the hype, such systems generally fail to deliver a convincing level of co-presence between users and come nowhere close to providing the sensory fidelity or supporting the expressive cues and manipulation capabilities we take for granted with objects in the physical world.

My lab's objectives in this domain are to simulate a high-fidelity representation of remote or synthetic environments, conveying the sights, sounds, and sense of touch in a highly convincing manner, allowing users, for example, to collaborate with each other as if physically sharing the same space. Achieving this goal presents challenges along the entire signal path, including sensory acquisition, signal processing, data transmission, display technologies, and an understanding of the role of multimodality in perception. This talk surveys some of our research in these areas and demonstrates several applications arising from this work, including support of environmental awareness for the blind community, remote medical training, multimodal synthesis of ground surfaces, and low-latency cross-continental distributed jazz jams. Nevertheless, simulating reality is a far cry from achieving it, and I doubt that even in the future of real Holodecks, a VR wine tasting would compete effectively with the real thing.

Biography

Jeremy Cooperstock (Ph.D., University of Toronto, 1996) is an associate professor in the Department of Electrical and Computer Engineering, a member of the Centre for Intelligent Machines, and a founding member of the Centre for Interdisciplinary Research in Music Media and Technology at McGill University. He directs the Shared Reality Lab, which focuses on computer mediation to facilitate high-fidelity human communication and the synthesis of perceptually engaging, multimodal, immersive environments, and also leads the Enabling Technologies theme of GRAND, a new Network of Centres of Excellence on Graphics, Animation, and New Media.

Cooperstock's accomplishments include the Intelligent Classroom; the world's first Internet streaming demonstrations of Dolby Digital 5.1, uncompressed 12-channel 96 kHz/24-bit audio, multichannel DSD audio, and multiple simultaneous streams of uncompressed high-definition video; and a simulation environment that renders graphic, audio, and vibrotactile effects in response to footsteps. His work on the Ultra-Videoconferencing system was recognized with an award for Most Innovative Use of New Technology from ACM/IEEE Supercomputing and a Distinction Award from the Audio Engineering Society. He has worked with IBM at the Haifa Research Center, Israel, and the T.J. Watson Research Center in Yorktown Heights, New York, as well as the Sony Computer Science Laboratory in Tokyo, Japan, and was a visiting professor at Bang & Olufsen, Denmark, where he conducted research on telepresence technologies as part of the World Opera Project. He chaired the Audio Engineering Society (AES) Technical Committee on Network Audio Systems from 2001 to 2009 and is currently an associate editor of the Journal of the AES.
