
Game Development

01

Milsey - Portalgraph + DIYSlimeVR mocap Live Virtual Production Projector Wall

In January 2026, I learned how to use the PortalGraph Unity package and SlimeVR to create a new live virtual production workflow. An HTC Vive tracker mounted on the back of this selfie-stick camera rig lets the virtual camera follow the real-life camera, giving the illusion of looking into a 3D-scanned environment captured with the 3D Scanner App on iOS. The SlimeVR trackers connect to the Unity demo through the EVMC4U Unity package: by setting the VMC input and output ports to match, the SlimeVR trackers can drive this basic VRoid Studio avatar, creating animations a viewer can look around without the use of a VR headset.
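Outside of Unity, the core of the camera-follow idea (applying the tracker's pose plus a fixed mounting offset to the virtual camera) can be sketched in a few lines. This is a simplified, yaw-only illustration; `Pose` and `follow` are made-up names, not part of PortalGraph or SlimeVR:

```python
from dataclasses import dataclass
import math

@dataclass
class Pose:
    # position in metres, yaw in radians (full 6-DoF omitted for brevity)
    x: float
    y: float
    z: float
    yaw: float

def follow(tracker: Pose, offset: Pose) -> Pose:
    """Place the virtual camera at the tracker's pose plus a fixed
    mounting offset (the tracker sits on the back of the rig, so the
    offset compensates for the tracker-to-lens distance)."""
    c, s = math.cos(tracker.yaw), math.sin(tracker.yaw)
    # rotate the local offset into the tracker's frame, then translate
    return Pose(
        x=tracker.x + c * offset.x - s * offset.z,
        y=tracker.y + offset.y,
        z=tracker.z + s * offset.x + c * offset.z,
        yaw=tracker.yaw + offset.yaw,
    )

# a tracker facing +Z with the lens 10 cm in front of it
cam = follow(Pose(1.0, 1.5, 0.0, 0.0), Pose(0.0, 0.0, 0.1, 0.0))
print(cam)  # Pose(x=1.0, y=1.5, z=0.1, yaw=0.0)
```

In Unity the same thing would be done per frame by copying the tracked transform onto the camera with an offset child transform; this sketch just shows the math.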


02

Penelope Scott - Y2K Baby Intro VRChat Virtual Production Set

In January 2026, I discovered that Penelope Scott was running a multi-animator project for her latest single, "Y2K Baby," where fans could submit small animated segments of the song for a chance to be included in a new community visualizer. I decided to use my Unity development skills to build a set with interactable props inside VRChat so I could use the in-app camera to film a music video on a short turnaround. Additional benefits of recording inside VRChat compared to Blender include hair physics and the ability to puppet facial expressions with controller inputs. One downside is that I cannot get individual finger input from the flex-sensor controller I built, which leaves me less freedom to use finger gestures without also triggering facial shape keys.


03

TMU - Unreal Engine Portrait Lighting and Still Camera Workshop

In November 2025, I created these three images as part of TMU's Unreal Engine workshop on lighting and camera work centred around MetaHumans, hosted by Robert Delarosa. I experimented with incorporating 3D LiDAR scans captured with the 3D Scanner App on iOS to give the subjects context while exploring various lighting techniques. One of the most interesting techniques was gobo lighting, which involves placing a pattern over the light to project it onto a surface (the window frames in the bottom image, for example).
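Conceptually, a gobo just multiplies a light's intensity by a pattern sampled in the light's projection space (in Unreal this is typically set up with a Light Function material rather than code). A toy sketch of the idea, with a made-up window-frame-like pattern:

```python
# toy gobo: a light's intensity masked by a 2D pattern ("cookie");
# 1 = light passes, 0 = blocked
pattern = [
    [0, 1, 0, 1, 0],
    [1, 1, 1, 1, 1],
    [0, 1, 0, 1, 0],
]

def lit_intensity(base_intensity, u, v):
    """Sample the gobo pattern at (u, v) in [0, 1) in the light's
    projection space and attenuate the light accordingly."""
    row = int(v * len(pattern))
    col = int(u * len(pattern[0]))
    return base_intensity * pattern[row][col]

print(lit_intensity(5.0, 0.1, 0.1))  # 0.0, a blocked cell of the frame
print(lit_intensity(5.0, 0.3, 0.1))  # 5.0, an open cell
```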

Images: TMU UE lighting workshop portrait, lighting graffiti, TTC businessperson.

04

Milsey - LucidVR Glove Haptic Animations Unity Demo

This Unity demo demonstrates the potential of combining haptic gloves, SlimeVR limb trackers, and 3D scanning into an immersive viewing experience. The haptics on the brown-haired VRoid Studio avatar (myself) are driven by cube and sphere hitboxes attached to each part of the body (hands, lower arms, legs, etc.).
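The hitbox idea reduces to a simple containment test: each collider covers a body part, and a contact point inside it fires the matching haptic channel. A minimal sketch outside Unity, using sphere hitboxes and made-up part names:

```python
import math
from dataclasses import dataclass

@dataclass
class SphereHitbox:
    part: str      # body part this hitbox covers (e.g. "left_hand")
    cx: float      # centre x
    cy: float      # centre y
    cz: float      # centre z
    radius: float

def parts_hit(hitboxes, px, py, pz):
    """Return the body parts whose sphere contains the contact point;
    each hit would trigger the matching glove or limb haptic channel."""
    return [h.part for h in hitboxes
            if math.dist((h.cx, h.cy, h.cz), (px, py, pz)) <= h.radius]

body = [
    SphereHitbox("left_hand", 0.0, 1.0, 0.0, 0.10),
    SphereHitbox("lower_arm", 0.0, 1.2, 0.0, 0.15),
]
print(parts_hit(body, 0.0, 1.04, 0.0))  # ['left_hand']
```

In the actual demo this is handled by Unity's physics (trigger colliders and `OnTriggerEnter`); the sketch only shows the mapping from overlap to haptic target.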

 

The grow-and-shrink tubes also demonstrate to viewers the difference between haptics on a small milk carton and the same object enlarged to the size of a wall. Finally, the button controls let the viewer pause or play the DIY SlimeVR animations and enable or disable the size-changing tubes if anything starts to glitch out.
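The pause and size-change controls boil down to a small state machine: scale the object back and forth between two sizes each frame unless a pause flag is set. A rough sketch outside the engine (values and names are illustrative, not the demo's actual code):

```python
class ScalingTube:
    """Scales an object between a small and a large size, with a
    pause toggle like the demo's button controls."""
    def __init__(self, small=1.0, large=10.0, speed=0.5):
        self.small, self.large = small, large
        self.speed = speed        # fraction of the range per second
        self.scale = small
        self.growing = True
        self.paused = False

    def toggle_pause(self):
        self.paused = not self.paused

    def tick(self, dt):
        """Advance the animation by dt seconds and return the scale."""
        if self.paused:
            return self.scale
        step = self.speed * (self.large - self.small) * dt
        self.scale += step if self.growing else -step
        # reverse direction at either end of the range
        if self.scale >= self.large:
            self.scale, self.growing = self.large, False
        elif self.scale <= self.small:
            self.scale, self.growing = self.small, True
        return self.scale

tube = ScalingTube()
print(tube.tick(0.5))   # 3.25, half a second of growth
tube.toggle_pause()
print(tube.tick(0.5))   # 3.25, paused so no change
```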

3D scanning the room with an Insta360 X4 and Metashape also shows how we can capture the environments around us for use in video production. In the future, I hope to expand on this concept by controlling 3D scans of people with my custom SlimeVR "Trillium" trackers.
