Hey Rifters!

This week is another change-up: a live stream I did last night with some VR devs and enthusiasts!
Here are some highlights:

15:55 Oculus at E3 Speculation
17:05 Browser Apps and Productivity Apps
19:00 Music Games and VR
26:25 “Concert Attendant Hero”
26:55 VR and Stage Fright
32:20 Donkey Kong VR
38:23 Planet 1 Playtest
47:18 First Law Playtest
1:02:40 Notch Cubes Playtest
1:07:20 Rift Amp Playtest
1:12:00 Rift Runner Playtest
1:19:38 What I am looking forward to with the Rift
1:23:00 Is the Rift Consumer Ready?
1:28:50 Malfate’s Castle Playtest
1:32:05 Rift Software Resources
1:36:18 Head Bob or no Head Bob?
1:43:10 Heli Hell 2 Playtest
1:56:12 Rift, 3D, and Eye Strain
2:00:55 Watching an SBS 3D movie in Media Player
2:06:12 Single vs. Double Screen HMDs

I will also be putting a Torque 3D playtest on my YouTube channel. I am thinking of switching over from UDK to Torque; I will keep you posted on that.

Still working on tutorial writing, trying to make time!

Logging Out,
–Cymatic Bruce

Hi Rifters!

Here we go with another VR Dev journal entry. Not a whole lot to show off here – this past week was a crazy one! I have been helping test alpha builds, writing articles for Road To VR, making videos, and even doing an interview or two. Whew!

In this video, I show off the Kismet toggle switch. A great place to learn how to do that is the 3DBuzz tutorials:


Also, I have begun working on the key functionality of the Holodeck – taking you somewhere else. My first test is to gradually take the player from the holodeck room to the epic courtyard. In my first attempt, I simply made the walls invisible when opening the door. DOH!

Game development requires a lot of determination, and sometimes things don’t progress as quickly or as smoothly as we hope. Don’t give up!

Logging Out,


Hey Rifters!

What a week!

It’s strange to know that I have only had my Rift dev kit for 7 days, given everything that has happened. One first-impressions vid turned into two, which turned into a ridiculous number of videos, views, and comments. Oh, and an article on PCGamer.com:


A huge THANK YOU to all the folks who have watched my videos, subscribed, left feedback and encouraging words. Words fail to express how moved I am!

So, a different type of blog entry this time around. I wanted to discuss some impressions and observations after week 1 of Rifting. Discussed in this video are:

Getting Acclimated

-Looking around before beginning the game experience really helps. Force the player to do this, but nest it inside the game narrative… kind of like Halo?
-Recommended game order: Tiny Room > Tuscany > Microstar > Dear Esther > Mirror’s Edge > TF2 > Skyrim
-Stay hydrated! Water, Ginger Ale, and frequent breaks are good.

Visible Area

-Not looking at a lot of the screen at all!
-FOV: 108–112 feels great, depending on the game and which lens cups you are using.
-“Rift Peripheral Vision”, when you are able to see a little more looking straight than to the side. Gameplay application for this?

My best estimate of the RPV (Rift Peripheral Vision) of all three lens cups, A, B, and C:  http://i.imgur.com/iUiwRl7.jpg


Controls

-Giving the player options is key
-Controller hierarchy: Hydra > Controller > Mouse & Keyboard
-Learned behavior: using the mouse/analog stick to accelerate turns. Analogous to spinning around the body?

The Game Narrative

-Camera “head bob” is not such a bad thing! In fact, I prefer it. Glide motion still makes me dizzy, even after a week of play.
-Cutscenes are probably best Half-Life 2 style – let the player keep control. However, slow and steady camera movements work well. Avoid fast motion and radical rotation.
-Transitions are best with a fade-to-black or fade-to-white effect. Clean cuts are not as jarring as I expected them to be, but fade-outs are better.
-The importance of sound cannot be overstated. Make the environment immersive with high-quality, realistic stereo sound.
-Horror implications? Rift peripheral + sound exploitation = NOPE. XD

So that pretty much covers it. This next week will be more on my own development progress with my game project. Also, I plan on a regular bi-weekly video release schedule for gameplay stuff on my YouTube Channel.

Honeymoon’s over, time to do the REALLY fun stuff – creating a world of my own!

Until the Next VR Experiment,

–Cymatic Bruce

Hey VR Heads!

Special Update! So my Rift arrived on Monday, and I have hardly slept since I got it. I have tried a whole lot of random software, including:

-Oculus Tiny Room
-Oculus Tuscany Demo
-Rust Ltd’s Museum of the Microstar
-UDK Maps
-VR Player
-Mirror’s Edge with Vireio
-Various YouTube videos and webpages
-An SBS 3D movie in Media Player

And finally, my own creation!! Walking down hallways and looking around a room that I put together was surreal. Is this how architects feel?

In other news, my videos on YouTube of my first impressions have become quite popular! The immense amount of support from this community has floored me. This spirit of support and cooperation will definitely take us to the next level in VR experiences!

Back to editing video, replying to comments, and logging more hours in the Rift!



Hey VR Heads!

Work and volunteering took up a lot of my time this week, but I managed to make some meager progress with Hydra integration in UDK. I used examples from a couple of places:

Craig just keeps posting awesome things! I took the camera modes that he constructed from his latest release.

This post by Comicaztaway is a tutorial for getting the arm of a skeletal mesh to move with the mouse, Little Big Planet style.

After a lot of experimentation and frustration, I arrived at what you see in the video. Nowhere close to where I want to be, but it’s progress nonetheless. I will make a tutorial video outlining what I did in the near future. In the meantime, here is the quick-and-dirty tutorial:

  • Make a copy of, and then edit CH_AnimHuman_Tree. Add b_RightWeapon and b_LeftWeapon to the AnimTree. Create 2 new SkelControl_CCD_IKs named RightArmIK and LeftArmIK.
  • Make the edits that Comicaztaway suggests in his post, then close and save the package.
  • Add the following code to your Pawn.uc:

var SkelControl_CCD_IK RightArmIK;
var SkelControl_CCD_IK LeftArmIK;

simulated event PostInitAnimTree(SkeletalMeshComponent SkelComp)
{
    super.PostInitAnimTree(SkelComp);

    // Cache references to the two IK controllers added to the AnimTree.
    RightArmIK = SkelControl_CCD_IK( Mesh.FindSkelControl('RightArmIK') );
    LeftArmIK = SkelControl_CCD_IK( Mesh.FindSkelControl('LeftArmIK') );
}

defaultproperties
{
    Begin Object class=SkeletalMeshComponent Name=SkeletalMeshComponent0
    End Object
}

  • Add the following code to your PlayerController.uc:

var vector RightArmLocation;
var vector LeftArmLocation;
var HydraGamePawn Utp;
var HydraGamePlayerInput Hgp;

// function for SkelControl movement.
function UpdateRotation(float DeltaTime)
{
    super.UpdateRotation(DeltaTime);

    Hgp = HydraGamePlayerInput(PlayerInput); // gives us access to the Hydra variables
    Utp = HydraGamePawn(Pawn); // this simply gets our pawn so we can then point to our SkelControl

    // Assigns vector RightArmLocation to the position of the right Hydra wand.
    RightArmLocation.Z = Pawn.Location.Z + Hgp.RightHandPosition.Z;
    RightArmLocation.Y = Pawn.Location.Y - 200 - Hgp.RightHandPosition.Y;
    RightArmLocation.X = Pawn.Location.X - 400 - Hgp.RightHandPosition.X;
    Utp.RightArmIK.EffectorLocation = RightArmLocation;

    // Assigns vector LeftArmLocation to the position of the left Hydra wand.
    LeftArmLocation.Z = Pawn.Location.Z + Hgp.LeftHandPosition.Z;
    LeftArmLocation.Y = Pawn.Location.Y - 400 - Hgp.LeftHandPosition.Y;
    LeftArmLocation.X = Pawn.Location.X - 400 - Hgp.LeftHandPosition.X;
    Utp.LeftArmIK.EffectorLocation = LeftArmLocation;

    // Optional log feed, so you can see the number values and decide how to tweak the assignments above.
    `log("RightArmIK.EffectorLocation Value is :"$Utp.RightArmIK.EffectorLocation);
}


I have weird numbers in there to offset position based on how I have my Hydra situated on my desk. The base station is off to my right. This solution is brute force and is not at all flexible, but may serve as a jump-off point for someone out there.
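Eventually those magic numbers could move out into config variables, so anyone can tune them for their own desk setup in an .ini file instead of recompiling. A rough sketch of the idea (the variable names here are my own invention, not from my actual files):

// PlayerController.uc (sketch): desk-specific offsets as config vars.
var config vector RightHandOffset; // e.g. X=-400, Y=-200, Z=0
var config vector LeftHandOffset;  // e.g. X=-400, Y=-400, Z=0

// Then inside UpdateRotation, something along the lines of:
// RightArmLocation = Pawn.Location + RightHandOffset - Hgp.RightHandPosition;
// (the sign on each axis will depend on where your base station sits)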

I will post the full files to GitHub sometime tomorrow; for now, I have to get some sleep.

Oh yeah, one other thing: my Rift is due to arrive tomorrow! Next week’s video will include impressions and direct feeds if I can swing it.



Hey VR Heads!

So, this past week has been lots of random tutorials and reading the documentation in the newly opened Oculus Developer Center. Not a whole lot new to show off, but I definitely feel like I am making progress!

I got crouch working with the Hydra. Yet again, UDK is simpler than I make it out to be. No need for Pawn.ShouldCrouch(bool bCrouch) or StartCrouch(float HeightOffset). Just use bDuck. Here’s the code from my HydraGamePlayerController.uc:

function LeftTriggerPress()
{
    if (bDuck == 0) bDuck = 1; // start crouching
}

function LeftTriggerRelease()
{
    if (bDuck == 1) bDuck = 0; // stand back up
}

Also, I have been slowly but steadily learning more about the UDK editor. Kismet and Matinee are not terribly intuitive, but I am getting the hang of it.

Finally, after several recommendations, I ended up watching the anime series Sword Art Online on CrunchyRoll. It is a really cool show about gamers stuck in a VRMMO. The battle scenes and story were pretty fantastic.
I was impressed with how they presented the player UI in the show. It has a really clean, gesture-based design. You can see an example here (skip ahead to 8:45):

Also in this vid (skip ahead to 5:52 and 18:45):

I began investigating how I would do such a thing in UDK. As it turns out, Scaleform has 3D UI functionality in UDK! Awesome! I have no idea how the layering of 2D objects will look in the Rift, but I am going to attempt to throw together a UI mock up that is similar to SAO.

Other than that, I am still working on binding the arms/hands of a Pawn to the position of the Hydra controllers. I have been slowly working through the info on this page: http://udn.epicgames.com/Three/UsingSkeletalControllers.html
Once I have that done, I will put together a comprehensive tutorial for adding Hydra support to UDK on a separate page, so future devs don’t have to go digging through blog posts.

Hoping to receive my dev kit this week!



Hey VR Heads!

So most of this week was spent learning how to make static meshes in Blender. Blender is complex – definitely not the “jump in and push buttons” type of program. After poking around a bit, I found a great tutorial that was highly recommended:

Blender 3D: From Noob to Pro (wikibook)

I am in the beginning of Unit 2, and I was able to put together the stuff you see in the video above. Not bad. 😀

The UDK tutorials from The New Boston are also fantastic:

The New Boston’s UDK Tutorial playlist

A lot of great knowledge and step-by-step walkthrough for the UDK Editor, Kismet, and Matinee.

Finally, I was able to get some buttons on the Hydra linked to in-game actions! If you are using Craig’s source that I posted in part 3.5, you simply need to add Pawn instance functions (or whatever else you want) within the PlayerController class. A list of Pawn instance functions is here:

UDK List of Pawn Instance Functions

Here are some examples I was able to get working:

function RightTriggerPress()
{
    Pawn.StartFire(0); // 0 = primary fire
}

function RightTriggerRelease()
{
    Pawn.StopFire(0);
}

function RightBumperPress()
{
    Pawn.StartFire(1); // 1 = alternate fire
}

function RightBumperRelease()
{
    Pawn.StopFire(1);
}

function RightB1Press()
{
    Pawn.DoJump(false); // any Pawn instance function can go here
}

StartFire and StopFire need a byte argument (number between 0-255). What firing type each number value corresponds to is up to the weapon. By default, 0 = primary fire and 1 = alternate fire.

Other than that, I changed the AimMode variable within PlayerInput to 2, which allows the movement of the right Hydra wand to control looking around. It feels pretty nice so far. I also slapped together a Pawn file that would silence the footsteps in game. Those footsteps are loud, and ruin my videos! XD
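Both tweaks are tiny. For reference, a sketch of what they look like, assuming Craig’s HydraGamePlayerInput exposes AimMode as a default property and the Pawn extends UTPawn (the footstep hook name comes from UTPawn, not my files):

// HydraGamePlayerInput.uc (sketch): switch to wand-based aiming.
defaultproperties
{
    AimMode=2 // right Hydra wand controls looking around
}

// HydraGamePawn.uc (sketch): silence footsteps by overriding
// UTPawn's footstep hook with an empty body.
simulated function ActuallyPlayFootstepSound(int FootDown)
{
    // intentionally empty: no footstep sounds in my recordings
}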

That about wraps it up. Look forward to the hallway getting fancier and fancier!