Lecture 3 in the COMP 4010 course on Augmented and Virtual Reality, taught at the University of South Australia. This lecture was given by Bruce Thomas on August 13th, 2019.
1. LECTURE 3: VR TECHNOLOGY
COMP 4010 – Virtual Reality
Semester 5 - 2019
Bruce Thomas, Mark Billinghurst, Gun Lee
University of South Australia
August 13th 2019
3. Presence
“The subjective experience of being in one place or
environment even when physically situated in another”
Witmer, B. G., & Singer, M. J. (1998). Measuring presence in virtual environments: A presence
questionnaire. Presence: Teleoperators and virtual environments, 7(3), 225-240.
4. How do We Perceive Reality?
• We understand the world through
our senses:
• Sight, Hearing, Touch, Taste, Smell
(and others..)
• Two basic processes:
• Sensation – Gathering information
• Perception – Interpreting information
6. Creating the Illusion of Reality
• Fooling human perception by using
technology to generate artificial sensations
• Computer generated sights, sounds, smell, etc
7. Reality vs. Virtual Reality
• In a VR system there are input and output devices
between human perception and action
8. Using Technology to Stimulate Senses
• Simulate output
• E.g. simulate real scene
• Map output to devices
• Graphics to HMD
• Use devices to
stimulate the senses
• HMD stimulates eyes
Visual
Simulation
3D Graphics HMD Vision
System
Brain
Example: Visual Simulation
Human-Machine Interface
9. Creating an Immersive Experience
•Head Mounted Display
•Immerse the eyes
•Projection/Large Screen
•Immerse the head/body
•Future Technologies
•Neural implants
•Contact lens displays, etc
14. Tracking in VR
• Need for Tracking
• The user turns their head and the VR graphics scene changes
• The user wants to walk through a virtual scene
• The user reaches out and grabs a virtual object
• The user wants to use a real prop in VR
• All of these require technology to track the user or object
• Continuously provide information about position and orientation
Head Tracking
Hand Tracking
• Degree of Freedom (DoF) = independent movement about or along an axis
• 3 DoF Orientation = roll, pitch, yaw (rotation about the x, y, or z axis)
• 3 DoF Translation = movement along the x, y, or z axis
• Different requirements
• User turns their head in VR -> needs a 3 DoF orientation tracker
• Moving in VR -> needs a 6 DoF tracker (r, p, y) and (x, y, z)
Degrees of Freedom
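The 3 + 3 DoF split above can be sketched as a simple pose data structure. This is a hypothetical illustration (the class and field names are made up, not from the slides):

```python
from dataclasses import dataclass

@dataclass
class Pose6DoF:
    # 3 DoF translation: position along the x, y, z axes (metres)
    x: float
    y: float
    z: float
    # 3 DoF orientation: rotation about x (roll), y (pitch), z (yaw), in degrees
    roll: float
    pitch: float
    yaw: float

# A 3 DoF orientation-only tracker reports only the rotation part;
# translation stays fixed at the origin
head = Pose6DoF(x=0.0, y=0.0, z=0.0, roll=0.0, pitch=15.0, yaw=-30.0)
```

A 6 DoF tracker would update all six fields every frame; a 3 DoF orientation tracker updates only roll, pitch, and yaw.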
20. Magnetic Tracker (Active)
• Idea: measure the field difference between a magnetic transmitter and a receiver
• ++: 6 DOF, robust
• -- : wired, sensitive to metal, noisy, expensive
• -- : error increases with distance
Flock of Birds (Ascension)
21. Example: Razer Hydra
• Developed by Sixense
• Magnetic source + 2 wired controllers
• Short range (1-2 m)
• Precision of 1 mm and 1°
• $600 USD
34. Tracking Coordinate Frames
• There can be several coordinate frames to consider
• Head pose with respect to the real world
• Coordinate frame of tracking system with respect to HMD
• Position of hand in coordinate frame of hand tracker
35. Example: Finding your hand in VR
• Using Lighthouse and LeapMotion
• Multiple Coordinate Frames
• LeapMotion tracks the hand in the LeapMotion coordinate frame (H_LM)
• LeapMotion is fixed in the HMD coordinate frame (LM_HMD)
• HMD is tracked in the VR coordinate frame (HMD_VR) (using Lighthouse)
• Where is your hand in the VR coordinate frame?
• Combine the transformations in each coordinate frame
• H_VR = H_LM × LM_HMD × HMD_VR
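The chain of transforms above can be sketched with 4×4 homogeneous matrices. This is a minimal sketch using a column-vector convention (so the matrices are applied right to left); all the offsets and positions are made up for illustration:

```python
import numpy as np

def translation(tx, ty, tz):
    """Build a 4x4 homogeneous translation matrix."""
    T = np.eye(4)
    T[:3, 3] = [tx, ty, tz]
    return T

# Hand position in the LeapMotion frame: 30 cm in front of the sensor
hand_in_lm = np.array([0.0, 0.0, 0.3, 1.0])

# LeapMotion mounted 8 cm in front of the HMD origin (made-up offset)
lm_to_hmd = translation(0.0, 0.0, 0.08)

# HMD pose in the VR (Lighthouse) frame (made-up position, no rotation)
hmd_to_vr = translation(1.0, 1.6, 2.0)

# Chain the transforms: LeapMotion frame -> HMD frame -> VR frame
hand_in_vr = hmd_to_vr @ lm_to_hmd @ hand_in_lm
# hand position in the VR frame: (1.0, 1.6, 2.38)
```

In a real system each matrix would also carry a rotation, but the composition pattern is the same.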
37. Haptic Feedback
• Greatly improves realism
• Hands and wrists are most important
• High density of touch receptors
• Two kinds of feedback:
• Touch Feedback
• information on texture, temperature, etc.
• Does not resist user contact
• Force Feedback
• information on weight, and inertia.
• Actively resists contact motion
38. Active Haptics
• Actively resists motion
• Key properties
• Force resistance
• Frequency Response
• Degrees of Freedom
• Latency
41. Haptic Glove
• Many examples of haptic gloves
• Typically use mechanical device to provide haptic feedback
42. Passive Haptics
• Not controlled by system
• Use real props (Styrofoam for walls)
• Pros
• Cheap
• Large scale
• Accurate
• Cons
• Not dynamic
• Limited use
46. Vibrotactile Cueing Devices
• Vibrotactile feedback has been incorporated into many
devices
• Can we use this technology to provide scalable, wearable
touch cues?
52. Audio Displays
• Spatialization vs. Localization
• Spatialization is the processing of sound signals to make
them emanate from a point in space
• This is a technical topic
• Localization is the ability of people to identify the source
position of a sound
• This is a human topic, i.e., some people are better at it than others.
55. Head-Related Transfer Functions (HRTFs)
• A set of functions that model how sound from a source at
a known location reaches the eardrum
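In practice an HRTF is applied by convolving the mono source signal with the left- and right-ear impulse responses (HRIRs) measured for that direction. A minimal sketch with made-up impulse responses (real HRIRs are hundreds of samples long):

```python
import numpy as np

# Mono source signal (a short click, for illustration)
source = np.array([1.0, 0.5, 0.25])

# Made-up HRIRs for a source on the listener's left:
# the left ear hears it earlier and louder than the right
hrir_left  = np.array([0.9, 0.3])
hrir_right = np.array([0.0, 0.0, 0.4, 0.2])  # leading zeros = interaural delay

# Convolve the source with each ear's impulse response
left_ear  = np.convolve(source, hrir_left)
right_ear = np.convolve(source, hrir_right)
# Playing left_ear/right_ear over headphones places the sound in space
```

The level and delay differences between the two convolved channels are the cues the listener uses to localize the source.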
56. Measuring HRTFs
• Put microphones in mannequin or human ears
• Play sound from fixed positions
• Record the response
57. Capturing 3D Audio for Playback
• Binaural recording
• 3D Sound recording, from microphones in simulated ears
• Hear some examples (use headphones)
• http://binauralenthusiast.com/examples/
58. OSSIC 3D Audio Headphones
• https://www.ossic.com/3d-audio/
63. Motivation
• Mouse and keyboard are good for desktop UI tasks
• Text entry, selection, drag and drop, scrolling, rubber banding, …
• 2D mouse for 2D windows
• What devices are best for 3D input in VR?
• Use multiple 2D input devices?
• Use new types of devices?
vs.
64. Input Device Characteristics
• Size and shape, encumbrance
• Degrees of Freedom
• Integrated (mouse) vs. separable (Etch-a-sketch)
• Direct vs. indirect manipulation
• Relative vs. Absolute input
• Relative: measure difference between current and last input (mouse)
• Absolute: measure input relative to a constant point of reference (tablet)
• Rate control vs. position control
• Isometric vs. Isotonic
• Isometric: measure pressure or force with no actual movement
• Isotonic: measure deflection from a center point (e.g. mouse)
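The relative/absolute distinction above can be illustrated in a few lines (a hypothetical sketch, not from the slides): a relative device such as a mouse reports deltas that the system must integrate, while an absolute device such as a tablet reports the position directly.

```python
def apply_relative(position, delta, lo=0, hi=1920):
    """Relative device (mouse): integrate a delta into an absolute
    position, clamped to the screen range."""
    return max(lo, min(hi, position + delta))

# Relative input: successive mouse deltas are accumulated
cursor = 100
for dx in (15, -40, 5):
    cursor = apply_relative(cursor, dx)
print(cursor)  # 80

# Absolute input: each tablet report simply *is* the new position
cursor = 640
```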
65. Hand Input Devices
• Devices that integrate hand input into VR
• World-Grounded input devices
• Devices fixed in real world (e.g. joystick)
• Non-Tracked handheld controllers
• Devices held in hand, but not tracked in 3D (e.g. Xbox controller)
• Tracked handheld controllers
• Physical device with 6 DOF tracking inside (e.g. Vive controllers)
• Hand-Worn Devices
• Gloves, EMG bands, rings, or devices worn on hand/arm
• Bare Hand Input
• Using technology to recognize natural hand input
66. World Grounded Devices
• Devices constrained or fixed in real world
• Not ideal for VR
• Constrains user motion
• Good for VR vehicle metaphor
• Used in location based entertainment (e.g. Disney Aladdin ride)
Disney Aladdin Magic Carpet VR Ride
67. Non-Tracked Handheld Controllers
• Devices held in hand
• Buttons, joysticks, game controllers, etc.
• Traditional video game controllers
• Xbox controller
68. Tracked Handheld Controllers (3 or 6 DoF)
• Handheld controller with 6 DOF tracking
• Combines button/joystick input plus tracking
• One of the best options for VR applications
• Physical prop enhancing VR presence
• Providing proprioceptive, passive haptic touch cues
• Direct mapping to real hand motion
HTC Vive Controllers Oculus Touch Controllers
71. Example: WMR Handheld Controllers
• Windows Mixed Reality Controllers
• Left and right hand
• Combine computer vision + IMU tracking
• Track both in and out of view
• Button input, Vibration feedback
73. Cubic Mouse
• Plastic box
• Polhemus Fastrack inside (magnetic 6 DOF tracking)
• 3 translating rods, 6 buttons
• Two handed interface
• Supports object rotation, zooming, cutting plane, etc.
Fröhlich, B., & Plate, J. (2000). The cubic mouse: a new device for three-dimensional input.
In Proceedings of the SIGCHI conference on Human Factors in Computing Systems (pp. 526-
531). ACM.
75. Hand Worn Devices
• Devices worn on hands/arms
• Glove, EMG sensors, rings, etc.
• Advantages
• Natural input with potentially rich gesture interaction
• Hands can be held in comfortable positions – no line of sight issues
• Hands and fingers can fully interact with real objects
76. Myo Arm Band
• https://www.youtube.com/watch?v=1f_bAXHckUY
77. Data Gloves
• Bend sensing gloves
• Passive input device
• Detecting hand posture and gestures
• Continuous raw data from bend sensors
• Fiber optic, resistive ink, strain-gauge
• Large DOF output, natural hand output
• Pinch gloves
• Conductive material at fingertips
• Determine if fingertips touching
• Used for discrete input
• Object selection, mode switching, etc.
78. How Pinch Gloves Work
• Contact between conductive
fabric completes circuit
• Each finger receives voltage
in turn (T3 – T7)
• Look for output voltage at
different times
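The scanning scheme above can be sketched in Python (the finger names and the `read_voltage` callback are hypothetical): each finger line is driven in turn, and a pinch is detected when voltage appears on another finger during that time slot.

```python
FINGERS = ["thumb", "index", "middle", "ring", "pinky"]

def scan_pinches(read_voltage):
    """Drive each finger line in turn (the T3-T7 time slots) and check
    which other fingers read a voltage, i.e. whose conductive pads
    are in contact with the driven finger's pad."""
    pinches = set()
    for driven in FINGERS:
        for sensed in FINGERS:
            if sensed != driven and read_voltage(driven, sensed):
                pinches.add(frozenset((driven, sensed)))
    return pinches

# Simulated circuit: the thumb and index pads are touching
contact = {frozenset(("thumb", "index"))}
touching = lambda driven, sensed: frozenset((driven, sensed)) in contact
result = scan_pinches(touching)  # detects the thumb-index pinch
```

Driving one line at a time is what lets a single output wire per finger disambiguate which pair of fingers completed the circuit.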
79. Example: Cyberglove
• Invented to support sign language
• Technology
• Thin electrical strain gauges over fingers
• Bending a sensor changes its resistance
• 18-22 sensors per glove, sampled at 120 Hz
• Sensor resolution 0.5°
• Very expensive
• >$10,000/glove
• http://www.cyberglovesystems.com
84. Comparison of Glove Performance
From Burdea, Virtual Reality Technology, 2003
85. Bare Hands
• Using computer vision to track bare hand input
• Creates a compelling sense of Presence and natural interaction
• Challenges still to be solved
• No sense of touch
• Line of sight to the sensor required
• Fatigue from holding hands in front of the sensor
86. Leap Motion
• IR based sensor for hand tracking ($50 USD)
• HMD + Leap Motion = Hand input in VR
• Technology
• 3 IR LEDs and 2 wide-angle cameras
• The LEDs emit patternless IR light
• IR reflections picked up by cameras
• Software performs hand tracking
• Performance
• 1m range, 0.7 mm accuracy, 200Hz
• https://www.leapmotion.com/
88. Non-Hand Input Devices
• Capturing input from other parts of the body
• Head Tracking
• Use head motion for input
• Eye Tracking
• Largely unexplored for VR
• Microphones
• Audio input, speech
• Full-Body tracking
• Motion capture, body movement
89. Eye Tracking
• Technology
• Shine IR light into eye and look for reflections
• Advantages
• Provides natural hands-free input
• Gaze provides cues as to user attention
• Can be combined with other input technologies
91. Pupil Labs VIVE/Oculus Add-ons
• Adds eye-tracking to HTC Vive/Oculus Rift HMDs
• Mono or stereo eye-tracking
• 120 Hz eye tracking, gaze accuracy of 0.6° with precision of 0.08°
• Open source software for eye-tracking
• https://pupil-labs.com/pupil/
92. HTC Vive Pro Eye
• HTC Vive Pro with integrated eye-tracking
• Tobii systems eye-tracker
• Easy calibration and set-up
• Auto-calibration software compensates for HMD motion
94. Full Body Tracking
• Adding full-body input into VR
• Creates illusion of self-embodiment
• Significantly enhances sense of Presence
• Technologies
• Motion capture suit, camera based systems
• Can track large number of significant feature points
95. Camera Based Motion Capture
• Use multiple cameras
• Reflective markers on body
• E.g. OptiTrack (www.optitrack.com)
• 120 – 360 fps, < 10ms latency, < 1mm accuracy
103. Omnidirectional Treadmills
• Infinadeck
• 2 axis treadmill, flexible material
• Tracks user to keep them in centre
• Limitless walking input in VR
• www.infinadeck.com
106. Input Device Taxonomies
• Helps to determine:
• Which devices can be substituted for each other
• What devices to use for particular tasks
• Many different approaches
• Separate the input device from interaction technique (Foley 1974)
• Mapping basic interactive tasks to devices (Foley 1984)
• Basic tasks – select, position, orient, etc.
• Devices – mouse, joystick, touch panel, etc.
• Consider Degrees of Freedom and properties sensed (Buxton 1983)
• motion, position, pressure
• Distinguish between absolute/relative input and individual axes (Mackinlay 1990)
• separate translation, rotation axes instead of using DOF
107. Foley and Wallace Taxonomy (1974)
Separate device from
interaction technique
108. Buxton Input Device Taxonomy (Buxton 1983)
• Classified according to degrees of freedom and property sensed
• M = device uses an intermediary between hand and sensing system
• T = touch sensitive