Week 11: Project Week 4 (of 7)

Assignment for Wednesday

Present STUDIES to help you make production decisions for your project. You can use Ricoh Thetas, audio recording, post-production AV tools like After Effects, and 3D game engines like Unreal. It sounds heavy, but keep your studies simple and useful. Please present them both in group mode on the big flat display and in VR, using GoPro VR Player or Unreal-to-VR. Ask us if you need help.

Current Plan

The Obsidian S camera arrives Friday by noon and stays here for one month. The workflow is: shoot, stitch (Windows only), post-produce the video, then move assets to Unreal for spatial sound and interactive branching. Use the production calendar to coordinate camera time. Assume you’ll want a second iteration after your first shoot.
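One way to keep the four workflow stages straight across teams is to mirror them in a per-shoot folder layout. A minimal sketch, assuming a shared project drive; the stage names below are an illustrative convention, not an official requirement:

```python
from pathlib import Path

# Hypothetical stage folders mirroring the workflow:
# shoot -> stitch -> post-produce -> Unreal assets
STAGES = ["01_shoot", "02_stitch", "03_post", "04_unreal"]

def make_shoot_dirs(project_root: str, shoot_name: str) -> Path:
    """Create one folder per pipeline stage for a single shoot."""
    root = Path(project_root) / shoot_name
    for stage in STAGES:
        # parents=True creates the shoot folder itself if missing;
        # exist_ok=True makes reruns safe for a second iteration.
        (root / stage).mkdir(parents=True, exist_ok=True)
    return root
```

Rerunning the script for your second iteration is harmless, since existing folders are left in place.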

Week 9: Project Week 2 (of 7)

Things to Address
1) Title/Topic/Content
2) Who/What is in front of the camera
3) Sound
4) Interactivity

Production Specs

Hardware: VR headset and handheld pushbutton

Clips are all video sequences.
– Destination clips are 360-degree video, stationary camera, spatial sound.
– Route clips are ~270-degree video, sped up, moving camera, music track.

Hotspots are still frames with highlighted areas which, when clicked, call up a new clip.
– Destination hotspots have >1 option to allow users to “click to choose.”
– Route hotspots may have 1 option, which means “click to continue.”

The goal is total spatial continuity: perfect (or near-perfect) “match cuts.”
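The clip-and-hotspot spec above amounts to a small branching graph. A minimal Python sketch of the rules, assuming nothing about any particular engine (the class and field names are illustrative only):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Hotspot:
    label: str    # highlighted area shown on the still frame
    target: str   # name of the clip a click calls up

@dataclass
class Clip:
    name: str
    kind: str     # "destination" or "route"
    hotspots: List[Hotspot] = field(default_factory=list)

    def is_valid(self) -> bool:
        # Destination hotspots need >1 option ("click to choose");
        # route hotspots may have exactly 1 ("click to continue").
        if self.kind == "destination":
            return len(self.hotspots) > 1
        return len(self.hotspots) >= 1
```

Checking each clip this way before a shoot is one cheap form of the careful planning described below: a destination with only one option is really a route.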

+++++

Some Guidelines

It’s by far easiest to have all imagery be actual stuff in front of the camera, even text.

Stuff in front of the camera can be digitally added or deleted if the camera hasn’t moved and the lighting stays ~the same.

Sound can be more flexible, recorded during camera shoots (for synced voices) or recorded separately and added later.

Careful planning makes for easy shooting. When the camera is at a destination, shoot all options at the same time.

Options mean interactivity:

– Print several different text messages to be held in the same spot, or different spots.

– Shoot multiple performances, like a student demoing their project in 5 takes expressing different levels of satisfaction or frustration.

– People can be hotspots (“follow me”) or parts of people (“choose which hand”).

– Props can be hotspots.

Week 8: Project Week 1 (of 7)

The Plan
Our own productions inspired by the NY Times “Met” video but interactive and in VR.

Routes & Destinations in a Game Engine
Routes

– VR camera on dolly tripod
– Sped up video
– Auto-Freeze on Easter Eggs
 
Destinations
– Close-up center of attention
– Composited video and sound
– Options for what’s next
 
Game Engine
– Interactive hotspots for branching
– Spatial sound
– Possible CGI component
 
Themes (3 teams)
Activist
Poet
Entertainer

Week 7: Current Dilemmas & Opportunities

This is a pre-production lead-in to the second half of the semester.

Project Priorities
Imagery is stereoscopic
Imagery is immersive (at least 180 degrees)
—- Imagery may be shot <180 degrees

Sound is spatial
—- in VR video
—- in game engine

Interactive (in a game engine)
—- manipulating elements
—- branching only

Relevant Videos
NYTimes at the Met (7 min)
Ode to Joy with audio (4 min)
6DOF Depth maps from stereo (15 min)
Close Up in VR (15 min)

Possible Venues
Moving around a space
Extreme stereoscopic 3D
Interactive people

Cameras
Jaunt
Yi Halo
DIY stereo rig

Style
Documentary / Political
Art / Conceptual
Entertainment / Humor

Exercise: 5 Groups of 3
Name 3 things you’d like to see more of in VR titles today.
(10 minutes)

Assignment for Wednesday
Your 3 most
[compelling / novel / cool / interesting]
[techniques / innovations / styles / moments]
you’ve seen / heard / read in VR.

Please post and present.

VR/AR Fundamentals 4/5: Input & Interactivity

Michael’s Presentation (PDF minus videos, 62 slides, 5.2MB)

Week 5 – VR / AR FUNDAMENTALS 4: Input & Interactivity
Using our effectors and intentions as inputs and how they shape interactive experiences

Input
I/O
Computation
Sensors (Non-AV)
—- Mechanical
—- Position & Movement (IMUs)
—- Biometric & Environment
Vision/Cameras
—- Body
—- Hands
—- Eyes
Sound/Microphones
—- Voice Control
—- Smart Mics
Mind
—- Electrodes & Big Machines
—- Optical Imaging & Holography
—- Psi & Consciousness Research

Interactivity
UI
Navigation v Manipulation
Symmetry
Responsiveness
Inference
Control v Illusion of Control

VR/AR Fundamentals 3/5: Other Senses (Haptic, Smell, Taste, Mind)

Michael’s Presentation (PDF minus videos, 51 slides, 1.3MB)

Week 4 – VR / AR FUNDAMENTALS 3: Other Senses (Haptic, Smell, Taste, Mind)
Fooling the other senses.

Haptics
—- Haptics & Force-Feedback
—- Seats & Motion Platforms
—- “4D”
—- Skin as Input
—- Hands & Controllers
—- Touching Real Things / MR
—- Non-Contact Haptics

Smell & Taste
—- Smell-O-Vision & the Food Simulator
—- Current Work

Mind
—- Remote Viewing (ESP), Brainstorm (the movie), & Science
—- Hacks

Issues
—- When is suggestion good enough?
—- Synaesthesia

VR/AR Fundamentals 2/5: Audiovisual Spatiality & Immersion

Michael’s Presentation (PDF minus videos, 88 slides, 2.7MB)

Week 3 – VR / AR FUNDAMENTALS 2: Audiovisual Spatiality & Immersion
What does it take to fool two eyes and two ears with an unframed image?

Stereoscopic & Stereophonic Displays
—- Parallax & Disparity
—- Interpupillary Distance (IPD)
—- Convergence
—- Stereo & Binaural Sound

Multiscopic & Multiphonic Displays
—- Holography
—- Head Tracking
—- Volumetric Video
—- Spatial Sound

Panoramic Displays
—- History
—- Monoscopic Panoramas & Nodal Points
—- Stereoscopic Panoramas & Head Rotation
—- Multiscopic Panoramas

Putting It All Together

Hacks