February 12th-18th, 2018:

This week, one of my main goals was to try to get some of our data into Unity. We wanted to start testing animation in Unity, but needed to get the FBX mocap file working. The video below shows the process of how we got the animation working. It was a bit complicated getting the right scripts and components attached so the animation would run on play. From there, the model's positioning needed to be adjusted. The second video is Abby's attempt at getting the animation to react to her controls via the Vive setup in Unity. This is going to require some more technical research, though.


On Sunday (02/11/2018), we were lucky enough to have two members of the performing arts department come and do some improvisational performance work for us. We had them act out different characters, giving us a broad range of data to work with for the story. The characters we decided to use were:

- Police Officer(s)

- Mob (multiple roles)

- Mrs. Barbara Henry (teacher)

- Ruby

- Mrs. Bridges (mom)

We then continued to capture more data on 02/18/2018 with two other members of the theater department. We did the same setup, with the same scene capture, but with two-person capture instead of individual captures. This allowed us to see the data with the people interacting with each other, try to act to scale, and also get it live-streaming into Unity. Abby and I had some issues getting Unity and Steam to cooperate, but this time, unlike last week, we had Lakshika's help. This will allow the actors to interact within the scene rather than having to imagine the space.


Below are links to some relevant videos I watched this week, covering a range of topics across our mediums of motion capture and virtual reality, as well as the history of Ruby Bridges. These notes document my gathering of material and my thoughts in response to these videos.

"With virtual reality we have the opportunity to actually let the actors see their environment and move to those cues and have virtual teleprompters and all kinds of fun stuff you'd have in a stage environment, but now they can actually see what they're acting towards."

This company is using the same types of tools we are: motion capture, virtual reality with the HTC Vive, and Unity for game design. Seeing that this can work, and that live-captured performance is possible, backs up our technological research.

(Original Link: https://www.engadget.com/2016/01/13/vr-motion-capture/)

"Control VR is the world's first full upper body motion tracking solution."

This company also has its own mocap system outside of VR, connecting the two so you can see your body and hands within the full range of motion with up to 1/10th accuracy. They also discuss how the technology can be used by everyday people rather than just big companies, giving us insight into the common user.

This technology also uses optical tracking, so your hands are still tracked even when they aren't within the VR headset's view.

This is a Kickstarter campaign for the company, but it's interesting to see the integration of VR and mocap.

(Original link: https://www.youtube.com/watch?v=qZhcINr8bw4)

