3DUI 2015 keynote talk given by Mark Billinghurst on March 24th, 2015, as part of the 3DUI 2015 conference. The talk is a survey of Augmented Reality and Empathic Computing.
1. The Reality of Augmented Reality:
Are we there yet?
Mark Billinghurst
mark.billinghurst@hitlabnz.org
The HIT Lab NZ, University of Canterbury
March 24th 2015
10. Lessons Learned
! Have a clear driving vision
! Enjoy the journey
! Travel with others
! Have a well-equipped vehicle
! The end always seems nearer than it really is
12. The Ultimate Display
The ultimate display would .. be a room within which
the computer can control the existence of matter. A
chair displayed in such a room would be good enough
to sit in. Handcuffs .. would be confining, and a bullet ..
would be fatal.
With appropriate programming such a display could
literally be the Wonderland into which Alice walked.
Ivan E. Sutherland
Sutherland, I. E. (1965). The ultimate display. In Proceedings of the IFIP Congress 65 (pp. 506-508).
13. Realizing the Vision
! 3D computer graphics
! Physical input devices
! Speech interaction
! Eye-gaze input
! Kinesthetic force feedback
Sutherland, I. E. (1968, December). A head-mounted three dimensional display. In Proceedings
of the December 9-11, 1968, fall joint computer conference, part I (pp. 757-764). ACM.
16. The Super Cockpit (1980s)
! Furness - USAF
Furness, T. A. (1986, September). The super cockpit and its human factors challenges. In Proceedings of the Human
Factors and Ergonomics Society Annual Meeting (Vol. 30, No. 1, pp. 48-52). SAGE Publications.
23. To Make the Vision Real..
! Automatically detecting real environment
! Environmental awareness, Physically based interaction
! Gesture interaction
! Free-hand interaction
! Multimodal input
! Speech and gesture interaction
! Intelligent interfaces
! Implicit rather than Explicit interaction
24. Environmental Awareness
! AR MicroMachines
! AR experience with environment awareness and
physically-based interaction
! Based on MS Kinect RGB-D sensor
! Augmented environment supports
! occlusion, shadows
! physically-based interaction between real and
virtual objects
Clark, A., & Piumsomboon, T. (2011). A realistic augmented reality racing game using a
depth-sensing camera. In Proceedings of the 10th International Conference on Virtual
Reality Continuum and Its Applications in Industry (pp. 499-502). ACM.
25. Physics Simulation
! Create virtual mesh over real world
! Updated at 10 fps – real objects can be moved
! Used by physics engine for collision detection (virtual/real)
! Used by OpenSceneGraph for occlusion and shadows (see the sketch below)
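To make the data flow concrete, here is a minimal sketch of the pipeline this slide describes. It is not the original implementation (which used a depth camera with a physics engine and OpenSceneGraph in C++); get_depth_frame() and the commented engine calls are hypothetical stand-ins.

```python
# Hedged sketch: depth frames become a triangle mesh that a physics
# engine and a renderer could share for collision, occlusion and shadows.
import time
import numpy as np

def get_depth_frame(w=64, h=48):
    """Stub: a depth image in metres (replace with a real RGB-D read)."""
    return np.full((h, w), 1.2) + 0.01 * np.random.rand(h, w)

def depth_to_mesh(depth, fx=525.0, fy=525.0):
    """Back-project each pixel to a 3D vertex, then triangulate the grid."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    cx, cy = w / 2.0, h / 2.0
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    vertices = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
    # Two triangles per grid cell (indices into the vertex array).
    idx = (v[:-1, :-1] * w + u[:-1, :-1]).ravel()
    faces = np.concatenate([
        np.stack([idx, idx + 1, idx + w], axis=1),
        np.stack([idx + 1, idx + w + 1, idx + w], axis=1),
    ])
    return vertices, faces

# ~10 fps update loop: refreshing the mesh lets moved real objects keep
# colliding with (and occluding) the virtual ones.
for _ in range(3):
    verts, faces = depth_to_mesh(get_depth_frame())
    # physics_engine.set_static_mesh(verts, faces)   # collision proxy
    # renderer.set_occluder_mesh(verts, faces)       # occlusion + shadows
    time.sleep(0.1)
```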
27. Natural Hand Interaction
! Using bare hands to interact with AR content
! MS Kinect depth sensing
! Real time hand tracking
! Physics-based simulation model (sketched below)
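One common way to realise the physics-based model named above is to push the tracked hand into the physics world as kinematic colliders. This is a sketch under that assumption, not the system's actual code; track_hand_joints() stands in for whatever hand tracker the depth sensor feeds.

```python
# Hedged sketch: tracked hand joints become kinematic sphere colliders
# so virtual objects can be pushed and grasped naturally.
import numpy as np

def track_hand_joints():
    """Stub: 3D positions (metres) of palm + 5 fingertips."""
    return np.random.rand(6, 3)

class KinematicHandProxy:
    """Sphere colliders that follow the tracked joints each frame."""
    def __init__(self, radius=0.012):
        self.radius = radius
        self.centers = np.zeros((6, 3))

    def update(self, joints, dt):
        # Velocity derived from tracked motion lets a physics engine
        # transfer momentum to virtual objects the hand touches.
        self.velocity = (joints - self.centers) / dt
        self.centers = joints

proxy = KinematicHandProxy()
proxy.update(track_hand_joints(), dt=1 / 30)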
28. Skeleton Based Interaction
! 3Gear Systems
! Kinect/Primesense Sensor
! Two hand tracking
! http://www.threegear.com
30. Skeleton Interaction + AR
! HMD AR View
! Viewpoint tracking
! Two hand input
! Skeleton interaction, occlusion
Piumsomboon, T., Altimira, D., Kim, H., Clark, A., Lee, G., & Billinghurst, M. (2014, September). Grasp-Shell
vs gesture-speech: A comparison of direct and indirect natural interaction techniques in augmented reality.
In Mixed and Augmented Reality (ISMAR), 2014 IEEE International Symposium on (pp. 73-82). IEEE.
31. Multimodal Interaction
! Combined speech and gesture input
! Gesture and speech are complementary
! Speech: modal commands, quantities
! Gesture: selection, motion, qualities
! Previous work found multimodal interfaces
intuitive for 2D/3D graphics interaction
! However, few multimodal AR interfaces exist (see the fusion sketch below)
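A typical way to combine the two channels is frame- or event-level late fusion: a spoken command is bound to whatever object the most recent pointing gesture selected within a short time window. The sketch below illustrates that idea only; the event names and the 0.8 s window are illustrative assumptions, not the study system.

```python
# Minimal late-fusion sketch: speech carries the modal command,
# gesture supplies the referent.
FUSION_WINDOW_S = 0.8
last_pointing = None  # (timestamp, object_id)

def on_gesture(kind, target, t):
    global last_pointing
    if kind == "point":
        last_pointing = (t, target)

def on_speech(command, t):
    if last_pointing and t - last_pointing[0] < FUSION_WINDOW_S:
        return (command, last_pointing[1])
    return (command, None)  # no recent deictic gesture to fuse with

on_gesture("point", "red_block", t=10.0)
print(on_speech("make it bigger", t=10.3))  # ('make it bigger', 'red_block')
```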
32. Free Hand Multimodal Input
! Use free hand to interact with AR content
! Recognize simple gestures
! Open hand, closed hand, pointing
(Gestures mapped to actions: Point, Move, Pick/Drop; a toy classifier is sketched below)
Lee, M., Billinghurst, M., Baek, W., Green, R., & Woo, W. (2013). A usability study of multimodal
input in an augmented reality environment. Virtual Reality, 17(4), 293-305.
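The three-gesture vocabulary above can be approximated very simply. The real system classified depth images; counting extended fingers, as below, is a common simplified proxy and an assumption on my part, not the published method.

```python
# Hedged sketch: classify a hand as open, closed, or pointing from the
# number of extended fingers the tracker reports.
def classify_hand(extended_fingers: int) -> str:
    if extended_fingers == 0:
        return "closed"   # pick/drop
    if extended_fingers == 1:
        return "point"    # selection
    return "open"         # move/release

for n in (0, 1, 5):
    print(n, "->", classify_hand(n))
```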
35. Results - Performance
! Average performance time
! Gesture: 15.44s
! Speech: 12.38s
! Multimodal: 11.78s
! Significant difference across conditions (p < 0.01)
! Difference between gesture and speech/MMI
36. Subjective Results (Likert 1-7)
! User subjective survey
! Gesture significantly worse, MMI and Speech same
! MMI perceived as most efficient
! Preference
! 70% MMI, 25% speech only, 5% gesture only
                  Gesture   Speech    MMI
Naturalness         4.60     5.60     5.80
Ease of Use         4.00     5.90     6.00
Efficiency          4.45     5.15     6.05
Physical Effort     4.75     3.15     3.85
37. Lessons Learned
! Multimodal interaction significantly better than
gesture alone in AR interfaces for 3D tasks
! Shorter task time, more efficient
! Multimodal input was more natural, easier,
and more effective than gesture/speech only
! Simultaneous input rarely used
! More studies need to be conducted
! What gesture/speech patterns? Richer input
38. Intelligent Interfaces
! AR interface + intelligent tutoring system
! ASPIRE constraint-based system (from the University of Canterbury)
! Constraints: relevance condition, satisfaction condition, feedback (sketched below)
Westerfield, G., Mitrovic, A., & Billinghurst, M. (2013). Intelligent Augmented Reality Training for
Assembly Tasks. In Artificial Intelligence in Education (pp. 542-551). Springer Berlin Heidelberg.
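The constraint triple named above has a natural data-structure reading: a constraint fires its feedback when it is relevant to the student's current step but not satisfied. This is a toy sketch of that idea, not ASPIRE's actual representation; the assembly-task example is invented for illustration.

```python
# Hedged sketch of constraint-based tutoring: relevance condition,
# satisfaction condition, feedback message.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Constraint:
    relevance: Callable[[dict], bool]     # does this constraint apply?
    satisfaction: Callable[[dict], bool]  # is the step correct w.r.t. it?
    feedback: str                         # shown in AR when violated

def check(constraints, state):
    return [c.feedback for c in constraints
            if c.relevance(state) and not c.satisfaction(state)]

# Toy assembly-task constraint: if the motor is being mounted, the
# bracket must already be in place.
c = Constraint(
    relevance=lambda s: s["current_part"] == "motor",
    satisfaction=lambda s: "bracket" in s["placed"],
    feedback="Attach the bracket before mounting the motor.",
)
print(check([c], {"current_part": "motor", "placed": set()}))
```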
42. The Ultimate AR Display
! Digital content indistinguishable from the real world
! Multi-sensory display
! Natural, unencumbered input
! Seamless blend between real and virtual
! Huge technical and non-technical challenges
43. AR Research Trends
! Zhou's 10-year ISMAR survey (1998-2008)
! 276 papers reviewed
! Most researched topics
! (1) Tracking techniques (20%)
! (2) Interaction techniques (15%)
! (3) Calibration and registration (14%)
! (4) AR applications (14%)
! (5) Display techniques (12%)
Feng Zhou, Henry B.L. Duh, Mark Billinghurst. Trends in Augmented Reality Tracking, Interaction and Display:
A Review of Ten Years of ISMAR. In Proceedings of ISMAR 2008, Cambridge, UK, 15-18 September 2008.
58. Vision
“Using technology to create shared
emotional experiences between users
and so create a deeper sense of
empathy and understanding”
60. Empathic Computing
1. Computing systems that can
understand your feelings and emotions
2. Computing systems that help you
better understand the feelings of others
62. Appliances That Make You Happy
! Jun Rekimoto – Univ. Tokyo
! Smile detection + smart devices
63. Can we develop interfaces
that allow us to be more
empathetic to others?
64. Movies are like a machine
that generates Empathy
Roger Ebert
65. Empathic Computing Requirements
! Basic Requirements
! Making the technology transparent
! Empathy Definition
! Seeing with the eyes of another
! Hearing with the ears of another
! Feeling with the heart of another
66. Using AR for Empathy
! Augmented Reality can:
! Remove technology barriers
! Enhance communication
! Change perspective
! Share experiences
! Enhance interaction in real world
68. Current Collaboration on Wearables
! First person remote conferencing/hangouts
! Limitations
! Single POV, no spatial cues, no annotations, etc.
69. Sharing Space: Social Panoramas
! Capture and share social spaces in real time
! Enable remote people to feel like they’re with you
70. Technology
! Google Glass
! Image capture, viewpoint sharing
! Remote device (desktop, tablet)
! Immersive viewing, live annotation
71. Key Research Questions
! Where is my partner looking?
! Enhanced radar display, context compass (offset math sketched below)
! How can we interact together?
! Shared pointers, Shared drawing
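The context compass reduces to one piece of angle arithmetic: where is my partner looking, relative to my own heading, inside the shared panorama? A minimal sketch, assuming both viewpoints are available as yaw angles in degrees:

```python
# Hedged sketch of a "context compass": signed offset of the partner's
# view direction relative to mine, wrapped into (-180, 180].
def compass_offset(my_yaw_deg: float, partner_yaw_deg: float) -> float:
    """Negative = partner is looking to your left."""
    delta = (partner_yaw_deg - my_yaw_deg) % 360.0
    return delta - 360.0 if delta > 180.0 else delta

print(compass_offset(350.0, 20.0))   # 30.0  (partner is to your right)
print(compass_offset(10.0, 300.0))   # -70.0 (partner is to your left)
```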
74. Wearable Interface
! Google Glass + e-Health + Spydroid + SSI
! Measure GSR, pulse oxygen, ECG, pitch
! Share video and audio remotely
! Represent emotions back to the Glass user (4 states; see the sketch below)
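One plausible reading of the four states (the analysis later names Happy, Sad, Neutral and Excited) is a circumplex-style grid over arousal and valence, with GSR as an arousal proxy. The thresholds and the valence source below are illustrative assumptions, not the values used in the actual system.

```python
# Hedged sketch: map normalised arousal/valence in [-1, 1] to the four
# states reported in the study. All thresholds are illustrative.
def classify_emotion(arousal: float, valence: float) -> str:
    if max(abs(arousal), abs(valence)) < 0.2:
        return "Neutral"
    if arousal >= 0.5:
        return "Excited"
    return "Happy" if valence >= 0 else "Sad"

print(classify_emotion(0.8, 0.3))   # Excited
print(classify_emotion(0.1, -0.6))  # Sad
```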
75. Desktop Interface
! Live video, real time emotion data
! See what sender sees, emotion representation
76. Emotion Representation
! How can we show what you’re feeling?
! Tested: raw data, visual tinting, emotion labeling (tinting sketched below)
(Figure: example tinted views labelled Excited and Happy)
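Visual tinting is just an alpha-blend of the video frame toward a colour keyed to the detected emotion. The palette below is an assumption on my part; the results note that users disagreed on the best colour coding.

```python
# Hedged sketch of emotion-keyed tinting of an RGB video frame.
import numpy as np

EMOTION_TINT = {           # RGB, illustrative palette only
    "Happy":   (255, 200,   0),
    "Excited": (255,  60,  60),
    "Sad":     ( 60,  90, 255),
    "Neutral": (128, 128, 128),
}

def tint_frame(frame: np.ndarray, emotion: str, alpha: float = 0.25):
    """frame: HxWx3 uint8 RGB; returns a tinted copy."""
    tint = np.array(EMOTION_TINT[emotion], dtype=np.float32)
    out = (1 - alpha) * frame.astype(np.float32) + alpha * tint
    return out.astype(np.uint8)

frame = np.zeros((480, 640, 3), dtype=np.uint8)
print(tint_frame(frame, "Excited")[0, 0])  # [63 15 15]
```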
77. Early Results
! Colour overlay + video stream preferred
! Easier to understand
! Disagreement over best colour coding
! Different emotional response with different stimuli
! Scary movie best
! Remote users felt connected to local user
! Understanding of emotions
79. Analysis
Table 2. Correlation between different streaming scenarios obtained from
the feedback of the local and remote users

Experiment 3 (Creating a shared experience while watching an audio-visual)
                                        Happy   Sad    Neutral   Excited
Case 1 (Only video stream)              0.77    0.84    0.85      0.85
Case 2 (Case 1 + Color Overlay)         0.92    0.91    0.85      0.98
Case 3 (Case 2 + Graphical Analysis)    0.86    0.75    0.85      0.94
80. Capturing Space: Real World Capture
! Hands free AR
! Portable scene capture (color + depth)
! Projector/Kinect combo, remote-controlled pan/tilt
! Remote expert annotation interface
86. Scaling Up
! Seeing actions of millions of users in the world
! Augmentation on city/country level
87. AR + Smart Sensors + Social Networks
! Track population at city scale (mobile networks)
! Match population data to external sensor data
! medical, environmental, etc.
! Mine data to improve social services
90. Research Challenges
! How to convey emotion?
! How to measure empathy?
! Interface/interaction models?
! How to communicate emotion?
! Scaling up to city/country scale?
92. Take Home Messages
! Have a Vision
! Find people to travel with
! Equip the vehicle
! Enjoy the journey
Most Important: We’re nowhere near the end
of interesting research in Augmented Reality
93. More Information
• Mark Billinghurst
– mark.billinghurst@hitlabnz.org
– @marknb00
• Website
– www.hitlabnz.org