February 19th-25th, 2018:

4-Week Prototype Presentation

(Link: https://docs.wixstatic.com/ugd/f1daf1_d2d57c4d3f9340aeb2a07f71f2e63d03.pdf)

Work of the Week

This week I focused on:

1. Designing prototype characters that we can map the data onto for more lifelike scenarios than just random 3D characters. For now this is just editing free models from Mixamo and changing the textures.

2. Getting data from our main character database (Ruby, mom, teacher, cops/officials, mob) all into one scene that we can mess around with in Unity.

Here is my current process for getting all the data into one scene file. I started merging clips together (some, not all) and have gotten them into one scene, but they play on their own separate takes/timelines. I still need to experiment to get them all playing simultaneously in one file, while keeping the ability to edit and offset individual clips.
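To make the offsetting idea concrete, here is a minimal sketch in plain Python of what "playing simultaneously with per-character offsets" amounts to. This is illustrative only, not the actual Maya/MotionBuilder workflow, and the character names, frame numbers, and offsets are all made up:

```python
# Illustrative sketch: each character's clip starts at frame 0 on its
# own take; shifting keyframe times by a per-character offset lets the
# clips line up on one shared timeline.

def offset_clip(keyframes, offset):
    """Shift every keyframe time in a clip by `offset` frames."""
    return [frame + offset for frame in keyframes]

# Hypothetical clip data: keyframe times per character.
clips = {
    "Ruby":    [0, 12, 30, 55],
    "Mom":     [0, 10, 24, 48],
    "Teacher": [0, 20, 40],
}

# Hypothetical offsets chosen while editing the merged scene.
offsets = {"Ruby": 0, "Mom": 15, "Teacher": 30}

merged = {name: offset_clip(frames, offsets[name])
          for name, frames in clips.items()}
print(merged["Mom"])  # [15, 25, 39, 63]
```

In Maya itself this kind of shift would be done on the animation curves (or by sliding clips in the Time Editor), but the arithmetic is the same.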

Here is all of the data brought into one Maya scene, without offsets edited yet, but it gives a feel for multiple characters in the same shot.

Here is the same data in MotionBuilder, with all of the character variations in one scene. I'm still trying to bring in the characters I adjusted above, but I'm having issues getting the geometry changes to show.


"The Research of Motion Capture Technology Based on Inertial Measurement"

This article was more objective than subjective: the researchers essentially built their own mocap system based on inertial measurement and explain their methods. They first go into detail about the systems that already exist, then compare those to their own system design. Math equations and algorithms come into play, but that is the more technical side they dive into.

(Original Link: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6844369)

"Real-Time Motion Capture System From Disney Research Uses as Few Sensors as Possible"

This article describes using only five inertial sensors to capture data, cutting down setup time, which would be extremely helpful when incorporating mocap into virtual reality. The study aims at faster setup and live feedback with motion capture. It also discusses how the system can get away with so few markers: it's built on a physics model that infers how the body would 'plausibly' move given input from only five sensors.

(Original Link: https://techcrunch.com/2016/12/12/real-time-motion-capture-system-from-disney-research-uses-as-few-sensors-as-possible/)

"Virtual Dance and Music Environment Using Motion Capture"

This article discussed translating motion capture into analyzed sound data through triggers and mapping. It got me thinking about how Abby and I can implement some sort of data transfer into the 'interactive' aspect of our project. It involved collaboration with other departments, and I want to somehow get the message of 'movement language' across.

(Original Link: https://www.researchgate.net/publication/228880972_Virtual_dance_and_music_environment_using_motion_capture)

"Designing Games to Foster Empathy"

Of all the articles I've read so far about VR, mocap, games, and empathy, this is the most relevant to our project. It really dives into how to design for 'values at play' and the different types of empathy that should be goals in game design. The distinction between cognitive and emotional empathy helps pin down what feeling(s) we are aiming for, who our audience is, and how they may react. The article also goes into design principles for games and empathy, touching on player attitudes and the mechanics of approach.

This is an article I want to discuss more with Abby in furthering our project.

(Original Link: http://www.tiltfactor.org/wp-content/uploads2/cog-tech-si-g4g-article-1-belman-and-flanagan-designing-games-to-foster-empathy.pdf)
