Back To The Future - The Ride



  • Thank you so much :-)

    That would be amazing. Something more streamlined would do wonders for adding motion to first-person clips of our favorite old theme park rides.

  • Yep, the SRS 4D player/recorder will support 360 and 3D YouTube videos, and the Yaw1/Yaw2 will sync with the motion.

  • edited July 2021

    Hey Folks! This is Mike. I don't have time to create a tutorial at the moment but I can detail a little bit of how I got this to work.

    This is all made in Unreal Engine using the Yaw VR SDK that I worked on; you can download that from Yaw's site, I believe. I didn't want to simply display a video and create a motion profile for it. Instead, I was aiming to emulate the actual ride, where you sit in a DeLorean and watch the ride video on an IMAX screen, with smaller screens in the DeLorean itself showing the communications between you and Doc.

    I did this by modeling a simple IMAX-like screen in Blender, grabbing and modifying a free DeLorean model from a website, and downloading some source videos and audio of the ride media and communications media from YouTube.

    I then took all this media and aligned it in Sequencer inside Unreal Engine. This way it is all synced when I want to play or scrub through it while creating my motion profile. The motion profile itself is an extra track in Sequencer that I animate along with the media.
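    The motion track described above is essentially keyframed pose data evaluated against the media timeline. A minimal sketch of how such a track could be sampled (plain C++, not the actual Unreal Sequencer API; the key layout is an assumption for illustration):

```cpp
#include <algorithm>
#include <vector>

// A single keyframe on the motion track: a time (seconds) and a pose.
struct MotionKey {
    double time;
    double yaw, pitch, roll;  // degrees
};

// Sample the track at time t with linear interpolation between keys,
// roughly how an animated float track is evaluated during playback.
// Assumes keys is non-empty and sorted by time.
MotionKey SampleTrack(const std::vector<MotionKey>& keys, double t) {
    if (t <= keys.front().time) return keys.front();
    if (t >= keys.back().time)  return keys.back();
    // Find the first key at or after t.
    auto hi = std::lower_bound(keys.begin(), keys.end(), t,
        [](const MotionKey& k, double v) { return k.time < v; });
    auto lo = hi - 1;
    double a = (t - lo->time) / (hi->time - lo->time);
    return { t,
             lo->yaw   + a * (hi->yaw   - lo->yaw),
             lo->pitch + a * (hi->pitch - lo->pitch),
             lo->roll  + a * (hi->roll  - lo->roll) };
}
```

    Because the same timeline drives both the media and this track, scrubbing the sequence scrubs the motion with it.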

    The effect isn't that you are watching a video of the ride. Instead, you are sitting in a DeLorean, looking at a dome playing the media, just like the real ride. It sounds like I'm nitpicking the experience, but you'll understand what I mean if you try it yourself.

    Once I play the sequence, I grab that motion data each tick and send it over to my Yaw SDK to be sent to the Yaw VR for playback.
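    That per-tick grab-and-send loop can be sketched roughly like this (plain C++; `FakeSdk::SendPacket` and the packet format are hypothetical stand-ins, not the real Yaw SDK API):

```cpp
#include <string>
#include <vector>

// Stand-in for the SDK: records each formatted packet instead of
// transmitting it, so the loop can be exercised offline.
struct FakeSdk {
    std::vector<std::string> sent;
    void SendPacket(const std::string& p) { sent.push_back(p); }
};

// Run the playback loop at a fixed tick rate, sending one pose per tick.
// pose(t) stands in for sampling the sequence's motion track at time t.
template <typename PoseFn>
void RunPlayback(double duration, double tickRate, PoseFn pose, FakeSdk& sdk) {
    const double dt = 1.0 / tickRate;
    for (double t = 0.0; t <= duration; t += dt) {
        double yaw = pose(t);
        sdk.SendPacket("yaw=" + std::to_string(yaw));
    }
}
```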

    There is another track in that sequence for motions such as impacts and engine rumble. These run separately from the normal motion animation, since their oscillations are highly repeatable: they use code-driven triggers rather than keyframed animation.
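    A code-triggered oscillation like those impact and rumble effects could look something like this damped sinusoid (the parameters are illustrative, not the ride's actual values):

```cpp
#include <cmath>

// Procedural "impact" effect: a damped sinusoid started by a trigger in
// code, layered on top of the keyframed motion track. Amplitude,
// frequency, and damping here are made-up illustrative numbers.
struct ImpactEffect {
    double startTime = -1.0;  // negative means "not yet triggered"
    double amplitude = 5.0;   // degrees of pitch kick
    double frequency = 8.0;   // oscillation rate, Hz
    double damping   = 4.0;   // exponential decay rate, 1/s

    void Trigger(double now) { startTime = now; }

    // Current contribution of the effect, in degrees.
    double Evaluate(double now) const {
        if (startTime < 0.0 || now < startTime) return 0.0;
        const double kPi = 3.14159265358979323846;
        double t = now - startTime;
        return amplitude * std::exp(-damping * t)
                         * std::sin(2.0 * kPi * frequency * t);
    }
};
```

    Because the shape is fully determined by its parameters, a single trigger event in code reproduces the same oscillation every time, with no keyframes needed.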

    On top of sending my motion profile to the Yaw VR, I also offset the VR camera using an algorithm I created to counter all of the motion, so that your viewpoint is correct. This eliminates the need for an external tracking device, such as a stationary controller. This aspect is very much a work in progress that I'm planning to continue once I move to a new location.
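    The underlying idea of that counter-motion offset is that if the platform rotates the rider (and headset) by some rotation Q, rotating the in-game camera rig by Q's inverse cancels it out. Mike's actual algorithm isn't published, so this is only a minimal unit-quaternion sketch of the cancellation principle, yaw only for brevity:

```cpp
#include <cmath>

// If the motion platform rotates the rider by Q, applying Q's inverse to
// the in-game camera rig cancels that rotation, keeping the viewpoint
// anchored to the virtual DeLorean without an external tracker.
struct Quat {
    double w, x, y, z;

    static Quat FromYaw(double deg) {  // rotation about the vertical axis
        double h = deg * 3.14159265358979323846 / 360.0;  // half angle, rad
        return { std::cos(h), 0.0, std::sin(h), 0.0 };
    }
    // The inverse of a unit quaternion is its conjugate.
    Quat Inverse() const { return { w, -x, -y, -z }; }
    // Hamilton product: composes two rotations.
    Quat operator*(const Quat& r) const {
        return { w * r.w - x * r.x - y * r.y - z * r.z,
                 w * r.x + x * r.w + y * r.z - z * r.y,
                 w * r.y - x * r.z + y * r.w + z * r.x,
                 w * r.z + x * r.y - y * r.x + z * r.w };
    }
};
```

    Composing the platform rotation with its inverse yields the identity rotation, i.e. the rider's view stays put even as the simulator moves.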

    Obviously, the entire process is much more complicated than this but hopefully this outlines a little bit of how it all works.


  • Wow Mike, thank you for providing insight into this process. Even when you simplify it, it sounds a bit daunting, so kudos to you for this amazing job.

    Hopefully the upcoming add-on to SRS mentioned above will simplify the process of adding motion to YouTube 360 videos, as I think with my know-how that is all I would be able to do. :-)

    Much respect, and I hope you continue to release these from time to time!

  • @MesaMarshall Welcome to the Yaw VR family, huni, and thank you for your continued work for Yaw VR. Hope to see and feel other experiences soon 😉

  • Man. What an awesome thing to put out for the community. Seriously, thanks Mike-ster!! Speaking of... it's been a couple years since we heard from the legend.... anyone heard from Mike? He doing alright? Still out there... doin Mike things?
