Invited Talks

Invited Talk 1: Language and thought: talking, gesturing (and signing) about space

Speaker: John Haviland

Time: Monday, Nov. 8, 2010, 9:00-10:00

Abstract
Recent research has reopened debates about (neo)Whorfian claims that the language one speaks has an impact on how one thinks, claims long discounted by mainstream linguistics and anthropology alike. Some of the most striking evidence for such possible impact derives, not surprisingly, from understudied "exotic" languages and, somewhat more surprisingly, from multimodal and notably gestural practices in the communities that speak them. In particular, some of my own work on Guugu Yimithirr, a Paman language spoken by Aboriginal people in northeastern Australia, and on Tzotzil, a language spoken by Mayan peasants in southeastern Mexico, suggests strong connections between linguistic expressions of spatial relations, gestural practices in talking about location and motion, and cognitive representations of space, what have come to be called spatial "Frames of Reference." In this talk, I will present some of the evidence for such connections, and add to the mix evidence from an emerging, first-generation sign language developed spontaneously in a single family by deaf siblings who have had contact with neither other deaf people nor any other sign language.

Speaker Biography

John Haviland received his Ph.D. in Social Relations from Harvard in 1972 and is Distinguished Professor of Anthropology at the University of California, San Diego, where he joined the department in 2005. His research centers on the social life of language, including gesture and emerging sign languages, and his fieldwork concentrates on indigenous Mexico and Aboriginal Australia. His recent books include Old Man Fog and the Last Aborigines of Barrow Point, the story of the last speaker of the Barrow Point language from Queensland, Australia (1998, Smithsonian Institution), as well as the Spanish edition, with José Antonio Flores Farfán, of Bases de la Documentación Lingüística (2007, INALI). Address for correspondence: Department of Anthropology, UCSD, 9500 Gilman Drive, La Jolla, CA 92093-0532 USA (jhaviland@ucsd.edu, anthro.ucsd.edu/~jhaviland).

Invited Talk 2: Musical performance as multimodal communication: Drummers, musical collaborators, and listeners

Speaker: Richard Ashley

Time: Tuesday, Nov. 9 2010, 9:00-10:00

Abstract
Musical performance provides an interesting domain for understanding and investigating multimodal communication. Although the primary modality of music is auditory, musicians make considerable use of the visual channel as well. This talk examines musical performance as multimodal, focusing on drumming in one style of popular music (funk or soul music). The ways drummers interact and communicate with their musical collaborators and with listeners are examined in terms of the structure of the different musical parts; processes of mutual coordination, entrainment, and turn-taking (complementarity) are highlighted. Both pre-determined (composed) and spontaneous (improvised) behaviors are considered. The way in which digital drumsets function as complexly structured human interfaces to sound synthesis systems is examined as well.
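
The drumset-as-interface point can be made concrete with a small sketch. The Python fragment below is purely illustrative and not from the talk: the pad numbers follow the General MIDI percussion map, but the function name, voice labels, and parameter mapping are all invented for illustration. It shows how a pad hit arriving as structured data (which pad, how hard) might be translated into synthesis parameters.

    # Purely illustrative: one way a digital drumset can act as a
    # structured interface to a synthesis engine. Pad numbers follow
    # the General MIDI percussion map; everything else is invented.
    PAD_TO_VOICE = {
        36: "kick",          # GM 36: Bass Drum 1
        38: "snare",         # GM 38: Acoustic Snare
        42: "closed_hihat",  # GM 42: Closed Hi-Hat
        46: "open_hihat",    # GM 46: Open Hi-Hat
    }

    def pad_hit_to_synth_event(note, velocity):
        """Translate one pad hit (MIDI note, velocity 0-127) into
        parameters a synthesizer could render. Harder hits are louder
        and, as a simple stand-in for velocity-dependent timbre,
        slightly brighter."""
        voice = PAD_TO_VOICE.get(note, "unassigned")
        gain = velocity / 127.0
        brightness = 0.5 + 0.5 * gain  # harder hit -> brighter sound
        return {"voice": voice, "gain": gain, "brightness": brightness}

    print(pad_hit_to_synth_event(38, 100))  # a firm snare hit

The point of the sketch is only that the drummer's physical gesture reaches the system as structured data that the synthesis engine interprets, which is what makes the drumset an interface rather than simply an instrument.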

Speaker Biography

Dr. Richard Ashley is Associate Professor of Music and Cognitive Science at Northwestern University. His research interests are in cognitive aspects of musical structure and expressive performance of music; his research has been published in Music Perception, Journal of Neuroscience, Computer Music Journal, Proceedings of the New York Academy of Sciences, and Journal of New Music Research, among others. He has served as President of the Society for Music Perception and Cognition and is a member of the editorial boards of the journals Music Perception and Psychology of Music. Dr. Ashley's work has been supported by numerous agencies, including two Fulbright grants to the Netherlands (one as a student and one as a faculty member), the National Endowment for the Humanities, the U.S. Department of Education, and the Dutch Science Foundation. He remains active as a performer on acoustic and electric bass.


Invited Talk 3: Activity-based UbiComp: A New Research Basis for the Future of Human-Computer Interaction

Speaker: James Landay

Time: Wednesday, Nov. 10 2010, 9:00-10:00

Abstract
Ubiquitous computing (UbiComp) is bringing computing off the desktop and into our everyday lives. For example, an interactive display can be used by the family of an elder to stay in constant touch with the elder's everyday wellbeing, or by a group to visualize and share information about exercise and fitness. Mobile sensors, networks, and displays are proliferating worldwide in mobile phones, enabling this new wave of applications that are intimate with the user's physical world. In addition to being ubiquitous, these applications share a focus on high-level activities: long-term social processes that take place in multiple environments and are supported by complex computation and inference over sensor data. However, the promise of this Activity-based UbiComp is unfulfilled, primarily due to methodological, design, and tool limitations in how we understand the dynamics of activities. The traditional cognitive psychology basis for human-computer interaction, which focuses on our short-term interactions with technological artifacts, is insufficient for achieving the promise of Activity-based UbiComp. We are developing design methodologies and tools, as well as activity recognition technologies, both to demonstrate the potential of Activity-based UbiComp and to support designers in fruitfully creating these types of applications.
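
To give a flavor of the sensor inference such applications depend on, here is a minimal Python sketch of activity recognition over accelerometer data. It is purely illustrative and not from the talk; the function name, the magnitude-spread feature, and the thresholds are all invented assumptions standing in for the far richer inference the abstract describes.

    # Purely illustrative: a toy activity recognizer over a window of
    # (x, y, z) accelerometer samples, measured in g. Thresholds are
    # invented for illustration, not taken from any real system.
    import math
    from statistics import stdev

    def classify_window(samples):
        """Label a window of accelerometer samples with a coarse
        activity, using the spread of acceleration magnitude: near
        zero at rest, moderate while walking, high while running."""
        mags = [math.sqrt(x * x + y * y + z * z) for (x, y, z) in samples]
        spread = stdev(mags)
        if spread < 0.05:
            return "resting"
        if spread < 0.40:
            return "walking"
        return "running"

    # A still sensor reads about 1 g of gravity with only tiny jitter.
    still = [(0.0, 0.0, 1.0 + 0.001 * (i % 3)) for i in range(50)]
    print(classify_window(still))  # -> "resting"

Real activity-based systems would, of course, fuse many sensors and use learned models over much longer time scales; the sketch only shows the basic shape of turning raw sensor windows into activity labels.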

Speaker Biography

James Landay is the Short-Dooley Professor of Computer Science & Engineering at the University of Washington, specializing in human-computer interaction. From 2003 through 2006 he was also the Laboratory Director of Intel Labs Seattle, a university-affiliated research lab exploring ubiquitous computing. His current research interests include Automated Usability Evaluation, Demonstrational Interfaces, Mobile & Ubiquitous Computing, User Interface Design Tools, and Web Design. He is spending his 2009-2011 sabbatical at Microsoft Research Asia in Beijing.
Landay received his B.S. in EECS from UC Berkeley in 1990 and M.S. and Ph.D. in CS from Carnegie Mellon University in 1993 and 1996, respectively. His Ph.D. dissertation was the first to demonstrate the use of sketching in user interface design tools. He was also the chief scientist and co-founder of NetRaker. In 1997 he joined the faculty in EECS at UC Berkeley, leaving as an Associate Professor in 2003.