PhD Defence • Human-Computer Interaction • Pervasive Desktop Computing by Direct Manipulation of an Augmented Lamp

Friday, September 20, 2024 9:00 am - 12:00 pm EDT (GMT -04:00)

Please note: This PhD defence will take place in DC 3317.

Yuan Chen, PhD candidate
David R. Cheriton School of Computer Science

Supervisors: Professors Daniel Vogel, Géry Casiez, Sylvain Malacria

Desktop computing, despite its long-standing dominance in personal productivity, remains largely confined to screens. Many efforts to expand beyond a single screen, from multiple monitors to projector-camera units and head-mounted displays, have shown promise. However, these efforts typically only extend the desktop display to other devices, and they lack awareness of the physical environment and of user activities. This thesis explores a novel form of direct manipulation projector-camera system that leverages the unique characteristics of physical lamp movement to move content to and from the desktop display, as well as to and from other devices and the physical environment, while maintaining awareness of the workspace.

Three projects examine the design, prototyping, and human factors aspects of an augmented lamp system in which the lamp works as an input and output device connecting desktop computing and the physical environment.

In the first project, an interaction design space is introduced for physical direct manipulation using an architect lamp, realized as a proof-of-concept system built with a projector and motion tracking. We demonstrate its potential usage through three scenarios, describe study results evaluating its potential, and summarize design implications.

In the second project, we study the impacts on user performance and interaction strategies when interacting with an augmented lamp in a desktop space. We conduct a controlled experiment in Virtual Reality to understand the impact of two control mechanisms for target acquisition tasks in a dynamic peephole display: “coupled,” where the display centre is used for selection, and “decoupled,” where selection is handled by separate input like direct touch. We find that the two control mechanisms differ only subtly in total time and error, but that people follow different strategies for coordinating the movement of the dynamic peephole display across target acquisition techniques.

In the third project, we explore this observation in a more general context. Using a controlled Virtual Reality environment, we conduct an experiment to investigate whether what users intend to do with a virtual target affects how they plan and perform the initial target acquisition. Our results yield an understanding of user motion profiles before acquisition for different intended interactions with the same target. We discuss how these motion profiles could be used to improve the lamp design, such as integrating force sensors into the lamp to improve activity awareness. Together, these findings establish a promising way to connect current desktop computing with the surrounding physical desktop environment, grounded in a deeper understanding of user activities in that space.