April 2nd-8th, 2018:

DSN 6400:

For the past couple of weeks, after narrowing down the specifics of our project ahead of the ACCAD Open House, Abby and I put our focus on what we call the 'Prologue Scene'. This scene is where the user experiences Ruby's first walk up to her new school. At this stage the scene is all about immersion, not interaction, so I wanted the animations to be the soul of the experience first.


This week I worked on figuring out how to get all of the motion capture animations into one scene, how to get them to loop, and how to replicate them so that there was a crowd effect.


In MotionBuilder, I haven't figured out a way to get all of the data into one scene - the takes stay separate regardless of whether I merge them or import them into each other's scenes. This may have been a glitch, since the merge function should work, but it is something I need to look into more. So instead, I took the FBX takes I had edited in MotionBuilder and imported them into a Maya scene. There I could bring in multiple takes, edit their keys, and offset them to where I wanted them. Here is where I placed most of them in the Maya scene:

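For reference, this Maya step can also be scripted with Python (maya.cmds). This is only a rough sketch of the idea - the file names and frame offsets below are made up for illustration, and it assumes Maya's FBX plugin is available:

```python
import maya.cmds as cmds

# Make sure Maya's FBX plugin is loaded before importing takes.
if not cmds.pluginInfo('fbxmaya', query=True, loaded=True):
    cmds.loadPlugin('fbxmaya')

# Hypothetical list of edited MotionBuilder takes, paired with the
# frame each one should start on once it is in the shared Maya scene.
takes = [
    ('walk_take_01.fbx', 0),
    ('walk_take_02.fbx', 24),
    ('crowd_take_01.fbx', 48),
]

for path, start_frame in takes:
    # Import under a namespace so each take's skeleton stays separate.
    namespace = path.split('.')[0]
    new_nodes = cmds.file(path, i=True, type='FBX', namespace=namespace,
                          returnNewNodes=True)

    # Shift every keyframe on the imported nodes so the takes are
    # offset against each other instead of all starting at frame 0.
    animated = cmds.ls(new_nodes, type='transform') or []
    if animated:
        cmds.keyframe(animated, edit=True, relative=True,
                      timeChange=start_frame)
```

Importing each take under its own namespace keeps the skeletons from clashing when several takes share the same joint names.
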
From there, I exported these back out as individual FBXs. I had brought them into Maya to work out their timing against each other; now I needed them as individual FBXs again so that I could bring them into Unity and apply an individual character to each one. Here is a screenshot of how many clips of data I used as FBXs:

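The export back out can be scripted the same way. A minimal sketch, again with hypothetical node and file names, using the FBXExport command that Maya's FBX plugin provides:

```python
import maya.cmds as cmds
import maya.mel as mel

# Hypothetical mapping of each character's root node in the scene
# to the individual FBX file Unity will receive.
characters = {
    'walk_take_01:Hips': 'ruby_walk_01.fbx',
    'walk_take_02:Hips': 'crowd_walk_02.fbx',
}

for root, out_file in characters.items():
    # Select just this character's hierarchy so only it gets exported.
    cmds.select(root, hierarchy=True, replace=True)

    # FBXExport is a MEL command from the fbxmaya plugin;
    # the -s flag exports only the current selection.
    mel.eval('FBXExport -f "{}" -s'.format(out_file))
```
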
Next, I brought them into Unity to see what the process of applying 'avatars' to the characters would be like and how long it might all take. After getting the 'avatars' on them, this is what the Unity scene turned out to look like:

From here, I wanted to loop the animations, but also make more characters (just duplicates) so that there was a crowd effect. We didn't have time to learn how to use a crowd simulation in Unity, but clipping down the mass of the motion capture data allowed me to duplicate the characters. From there, Abby instantiated them as prefabs so that the animation wouldn't lag as much; a rough sketch of the loop-and-duplicate idea follows below.
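In our build, the duplication and prefab instancing happened in Unity, which is C# scripting territory; purely to illustrate the same loop-and-scatter idea, here is what it might look like on the Maya side in Python. The node name, counts, and offsets are all made up:

```python
import random
import maya.cmds as cmds

# Hypothetical root of one cleaned-up, clipped-down walk cycle.
source_root = 'walk_take_01:Hips'

# Make the clip loop by cycling its animation curves before and
# after the keyed range.
cmds.select(source_root, hierarchy=True, replace=True)
cmds.setInfinity(preInfinite='cycle', postInfinite='cycle')

# Duplicate the character several times and scatter the copies to
# fake a crowd; a random key offset keeps them from moving in sync.
for i in range(10):
    copy = cmds.duplicate(source_root, upstreamNodes=True)[0]
    cmds.move(random.uniform(-10.0, 10.0), 0, random.uniform(-10.0, 10.0),
              copy, relative=True)
    cmds.select(copy, hierarchy=True, replace=True)
    cmds.keyframe(edit=True, relative=True,
                  timeChange=random.randint(0, 24))
```
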

From here, we worked together to add the sound, attach it to the menu, and put together a full 'prologue' sequence. Here is some of our testing the night before the open house to see what glitches we still needed to overcome:

At the Open House, we got a lot of feedback and praise for this project. So far, we had only been hearing from our peers and instructors, so it was beneficial to have a fresh set of eyes on the project to give critique.


We explained Ruby's story and her importance in history to those who approached. Then we discussed how we are in the prototype testing phase of this simulation, and offered up the headset and headphones for them to experience the current state of the project.


We had a wide range of participants, from toddlers all the way up to an elderly woman who I would guess was in her mid-to-late 90s. Some of the feedback we received was on how immersive the experience was across all age groups. For me, that was a success, because I wanted the animation to make you feel surrounded and immersed. Between that and the audio, I think Abby and I are moving in the right direction on immersion.


Other comments included hesitation to turn around or spin because of dizziness. I think this is partly because it was some users' first time in VR, while for others the ongoing motion may have caused a vertigo-like feeling. This is one of the downsides of virtual reality that we are going to look into more, in the hope of not causing motion sickness or a fear of moving.


Moving forward, Abby and I will take what we learned at the Open House into account and try to implement it in the future production of this project. We want to get the crowd simulation working so that the Unity file isn't enormous and laggy. We also want to extend the experience, while giving the user the ability to walk around in the space rather than standing still. This week I will also start adding in the models I have been creating in Fuse, to get more of a real-life look rather than the robot features we have now.


ANIMATION PRODUCTION:

In preparation for the open house, I made a rigging video to show off the final rig of the Kai character in "The Stargazer" movie. This shows where the simulations will take place, the functionality of the controls, and how the skin weights work.


