On this page I'll show some demo videos of various projects I developed over the last few years. Feel free to skip the lengthy autobiography and just peruse the videos. Though if you do read through this entire page, you should get a good impression of me, my diverse capabilities, and my recent history as a developer. I'm skipping over the earlier decades of my career, as I'm focused on my current path. Though I am not averse to merging VR and AR into eLearning development, as there is SOOOO much potential in such a combination!

The videos I am showcasing are projects for which I was the sole programmer. Literally. The last couple of years were spent mostly working from home (thanks, COVID!), and our little team got very good at working remotely - we stayed in contact all day, every day via Google Meet video conferencing!
The company where I began truly delving into VR was a startup, so there was a lot of room for experimentation with new technologies. We had awesome 3D artists (a great team, and I have to drop a shout-out to Jade and Omar - "Team JOD!", artists extraordinaire! - as I have no skills in 3D modelling, which they would both agree with!), but the code - the development of these projects - I can honestly say was lovingly written solely by me. It was a small team back then. Very early on, at a previous company (which no longer exists, and well before our 3D team joined the new startup), I worked on other projects alongside a couple of other programmers, but they each left at different points, leaving me as the sole game-engine (Unity/Unreal) simulation/VR/AR/XR programmer.

There were quite a few other projects I worked on that I just don't have any media or video for - and even if I did, I wouldn't feel comfortable showing it, as the content is proprietary. Without divulging any company secrets, I can say that I worked on simulations and built tools for internal use by the 3D team. I feel comfortable showing the videos below, as none of these projects fall under the new direction the company has taken, none of their source code is available, and nothing proprietary to the new company is divulged in them. I am sharing these videos purely to present my own experience and accomplishments; none of it should in any way reflect on my prior employers.

I've developed everything below using Unity, though I also have experience developing quite a few Unreal-based projects and experiments. Disclaimer: I like Unity better. Don't get me wrong - Unreal Engine is fantastic graphically; its rendering capabilities can be stunning. But I would much rather program in C#. There is just so much more you can do with Unity. Unity is well documented, with a massive and helpful community, whereas Unreal's documentation is notoriously sparse, conflicting, and sometimes just outright incorrect. Also... once you need something beyond Blueprints in Unreal, you have to dig into Unreal's flavor of C++... I've had my taste of C++ and I don't ever want to go back. 😉
Time to start viewing some of my projects!
Ah, the Apollo Lander Simulation. This was one of the early projects I worked on.
This project used a Vive headset connected to a beefy PC, and a beefier VR chair - a special 6-degrees-of-freedom, servo-operated chair, custom modified by us (and by "us" I mean a colleague of mine who was a certified genius, a tinkerer, and the hardware guy - absolutely amazing with technology). On this project, he did the hardware side; I did the coding.

The controllers were flight sticks mounted to a custom-built console attached to the chair. I coded the chair to lean opposite to the motion of the Lander while in flight - based on its acceleration, not its rotation. That way it leaned in the direction opposite the acceleration, making it feel as if you were being "pushed" - like in a car: when you turn sharply to the right, you feel yourself pushed to the left... But I'm going off on tangents again. Just watch the video and see how cool it looks.
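The counter-lean idea above can be sketched in a few lines. This is a minimal illustration of the concept, not the original project code (the function names, gain, and clamp values are my assumptions for the example): estimate acceleration from successive velocity samples, then tilt the chair in the opposite direction, clamped so it can never over-tilt.

```python
# Hypothetical sketch of the chair's counter-lean logic (not the original
# project code): finite-difference acceleration from velocity samples,
# mapped to opposite roll/pitch angles so the rider feels "pushed".

def estimate_acceleration(prev_velocity, velocity, dt):
    """Finite-difference acceleration from two velocity samples."""
    return tuple((v - p) / dt for p, v in zip(prev_velocity, velocity))

def chair_lean_angles(acceleration, gain=2.0, max_angle=15.0):
    """Map lateral/forward acceleration to roll/pitch (degrees), opposite
    the acceleration direction, clamped so the chair can't over-tilt."""
    ax, _, az = acceleration  # x = right, y = up, z = forward
    roll = max(-max_angle, min(max_angle, -ax * gain))
    pitch = max(-max_angle, min(max_angle, az * gain))
    return roll, pitch

# Lander accelerates hard to the right: the chair rolls left.
accel = estimate_acceleration((0.0, 0.0, 0.0), (4.0, 0.0, 0.0), dt=1.0)
print(chair_lean_angles(accel))  # (-8.0, 0.0)
```

The clamp doubles as the "you can't flip it" safety: no matter how hard the virtual lander accelerates, the physical chair stays within a comfortable range.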
The terrain was standard game terrain, but the lander was literally scanned by my previous company, and the awesome graphic artists there assembled the scan data into the most authentic (virtual) Lunar Lander you will ever see. Very cool stuff! A visually (and viscerally!) stunning demo. The flight controls were even "dumbed down" to ensure you couldn't flip the lander, and the motion was hopefully damped enough that virtual astronauts wouldn't become nauseous...
The rover was my idea - as I figured it should be FUN once you land! And what could be more fun than driving a lunar rover around in low-G?! Later in development, I added tire track effects and more. This was an early demo video from my view inside the Vive VR headset while sitting in the 6DOF chair.
Experimentation with HoloLens: merging video from the HoloLens with a PC's external camera (a webcam, in this case) and synchronizing their views so that the PC can render the objects I'm seeing in the HoloLens into the video at the correct position and orientation. I did this by adjusting a virtual camera object's position and orientation by hand in AR to line it up with the real-life camera, then using a "green screen" effect in OBS to composite the two feeds. Pretty nice results!
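The reason the hand-alignment trick works can be shown with a little projection math. This is a simplified illustration under my own assumptions (yaw-only camera, symmetric pinhole model - not the original Unity/OBS setup): once the virtual camera shares the real webcam's position, orientation, and field of view, a hologram projects to the same pixel the real camera would see it at, so the two feeds line up in the composite.

```python
# Illustrative pinhole projection (not the original project code): project a
# world-space hologram position into the external camera's pixel coordinates.
import math

def project(point, cam_pos, cam_yaw, fov_deg, width, height):
    """Project a world point into pixel coordinates for a yaw-only camera.
    Returns None if the point is behind the camera."""
    dx = point[0] - cam_pos[0]
    dy = point[1] - cam_pos[1]
    dz = point[2] - cam_pos[2]
    c, s = math.cos(-cam_yaw), math.sin(-cam_yaw)
    x = dx * c + dz * s           # rotate the offset into camera space
    z = -dx * s + dz * c
    if z <= 0:
        return None               # behind the camera: nothing to draw
    # focal length in pixels from the horizontal field of view
    f = (width / 2) / math.tan(math.radians(fov_deg) / 2)
    return (width / 2 + f * x / z, height / 2 - f * dy / z)

# A hologram 2 m straight ahead of the camera lands dead centre of the frame.
print(project((0.0, 1.5, 2.0), (0.0, 1.5, 0.0), 0.0, 60.0, 1280, 720))
# (640.0, 360.0)
```

If the hand-aligned virtual camera's pose or FOV is even slightly off, every hologram lands at the wrong pixel - which is exactly why the manual alignment step mattered.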
And this is the setup from my view while wearing the Hololens headset.
Next, below is a video of a COVID transmission simulator I started at the very beginning of the pandemic. Though not VR, it was a simple project developed in Unity. I actually started it at home as a "fun project", and we ended up using it on our website as a demonstration. There was still much to discover about COVID while it was so new (as we all remember!). I fed in the latest data available at the time to estimate the chance of transmission, and used gamification to navigate characters around an environment. The chance of transmission from one person to the next was used to calculate infection rates, then scaled up to visualize the spread among populations. Adding "stay at home" measures of differing durations at different points in the timeline, along with the ability to modify settings such as chance to spread and survival rate, produced a graph over time showing the results of different scenarios. It was scary as heck back then. And the numbers this simulator generated were eerily accurate.
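The core loop of such a simulator can be sketched very compactly. This is a deliberately simplified illustration of the idea, not the original code (all the parameter names and values here are mine, chosen for the example): each day, every infectious person meets some number of random people, each contact transmits with a fixed probability, and a "stay at home" window cuts the contact rate.

```python
# Simplified stochastic transmission sketch (not the original simulator):
# per-contact transmission probability, scaled up over a population, with a
# lockdown window that reduces daily contacts.
import random

def simulate(population=1000, days=120, contacts_per_day=8,
             p_transmit=0.05, infectious_days=10,
             lockdown=(30, 60), lockdown_contacts=2, seed=1):
    random.seed(seed)
    infected = {0: infectious_days}   # person id -> infectious days left
    recovered = set()
    curve = []                        # infectious count per day
    for day in range(days):
        rate = (lockdown_contacts if lockdown[0] <= day < lockdown[1]
                else contacts_per_day)
        newly = set()
        for person in list(infected):
            # each contact is a random person; susceptible ones may catch it
            for _ in range(rate):
                other = random.randrange(population)
                if other not in infected and other not in recovered:
                    if random.random() < p_transmit:
                        newly.add(other)
            infected[person] -= 1
            if infected[person] == 0:
                del infected[person]
                recovered.add(person)
        for p in newly:
            infected[p] = infectious_days
        curve.append(len(infected))
    return curve

curve = simulate()
print(max(curve))   # peak number of simultaneously infectious people
```

Plotting `curve` for different lockdown windows and transmission probabilities gives exactly the kind of scenario-comparison graph the simulator produced.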
ViewAR was a very early prototype (experimenting with remote telepresence) connecting a PC to Oculus Quest headsets or mobile devices over a network/internet connection (using PUN) in a multiplayer session. The AR user, on a mobile device (ARKit/ARCore), could synchronize a target location (on a table), and users in VR (multiplayer over the internet) could see a representation of the mobile phone user in VR space on an Oculus Quest. Each user would see the others as avatars oriented relative to the central object! This was early in the life cycle of the Oculus Quest, and so much more has since become available in the current SDKs that this project seems almost archaic now.
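The trick that makes "each user sees the others relative to the central object" work is expressing every pose relative to the shared anchor (the table) rather than in anyone's world space. Below is a minimal sketch of that idea under my own assumptions (2D positions, yaw-only rotation, invented function names - not the PUN project code): the sender converts its pose into anchor space, and each receiver converts it back using its *own* anchor pose.

```python
# Illustrative shared-anchor pose conversion (not the original project code).
# Positions are (x, z) ground-plane pairs; yaw is the anchor's rotation.
import math

def to_anchor_space(world_pos, anchor_pos, anchor_yaw):
    """Express a world-space position relative to the shared anchor."""
    dx = world_pos[0] - anchor_pos[0]
    dz = world_pos[1] - anchor_pos[1]
    c, s = math.cos(-anchor_yaw), math.sin(-anchor_yaw)
    return (dx * c - dz * s, dx * s + dz * c)

def to_world_space(local_pos, anchor_pos, anchor_yaw):
    """Inverse: place an anchor-relative position into this client's world."""
    c, s = math.cos(anchor_yaw), math.sin(anchor_yaw)
    x = local_pos[0] * c - local_pos[1] * s
    z = local_pos[0] * s + local_pos[1] * c
    return (x + anchor_pos[0], z + anchor_pos[1])

# The AR phone sees its table at (2, 0); a VR user has the same table at
# (0, 5), rotated 90 degrees. A point 1 m past the phone's table maps to
# the matching spot beside the VR user's table.
rel = to_anchor_space((2.0, 1.0), (2.0, 0.0), 0.0)
print(to_world_space(rel, (0.0, 5.0), math.pi / 2))
```

Only the small anchor-relative pose needs to travel over the network; every client reconstructs world positions locally, so all avatars stay consistent around the table no matter where each device thinks the table is.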


I'm eagerly awaiting the release of the source for the "Unity Slices: Table" tech demo, as it is exactly what I was trying to accomplish! Below is a video from the PC user's point of view.
Below is a video from the point of view of the Oculus Quest headset. At one point, I connected the user's voice input to online speech-recognition and language-translation services to see how far I could take it. Essentially, a real-time universal translator! This was also before hand tracking was available - which, if you follow Oculus (now Meta), has since been updated to Hand Tracking 2.0!
And lastly, a "blast from the past"! It seems so long ago, with all that's occurred since. One of my earliest projects at the aforementioned startup...
Below is a video of the early-stage prototype of a project I developed for a museum, using ARKit/ARCore (again, early versions of those SDKs) to bring Augmented Reality to mobile devices (phones). It also pushed that technology to its limits at the time. There was an "easter egg" I added where the floor-targeted models would randomly spawn a spider that used the camera's position and the floor plane to navigate (skitter!) toward the feet of the person holding the phone. Fun times!
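The spider's pathing boils down to two small steps, sketched below under my own assumptions (illustrative names and values - not the original project code): project the AR camera's position straight down onto the detected floor plane to find the user's feet, then step the spider toward that point each frame without overshooting.

```python
# Hedged sketch of the spider "easter egg" navigation (not the original
# code): target = camera position dropped onto the floor plane; movement =
# a fixed-speed step toward the target each frame.
import math

def feet_position(camera_pos, floor_y):
    """Project the camera's 3D position straight down onto the floor plane."""
    x, _, z = camera_pos
    return (x, floor_y, z)

def step_toward(spider_pos, target, speed, dt):
    """Move a fixed distance per frame toward the target, snapping to it
    when close enough to avoid overshooting."""
    dx, dy, dz = (t - s for s, t in zip(spider_pos, target))
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    if dist <= speed * dt or dist == 0.0:
        return target
    k = speed * dt / dist
    return (spider_pos[0] + dx * k,
            spider_pos[1] + dy * k,
            spider_pos[2] + dz * k)

target = feet_position((0.0, 1.6, 3.0), floor_y=0.0)  # person holding the phone
spider = (4.0, 0.0, 3.0)
spider = step_toward(spider, target, speed=2.0, dt=0.5)
print(spider)  # (3.0, 0.0, 3.0) - one metre closer to the user's feet
```

Because the target is recomputed from the live camera pose every frame, the spider keeps skittering toward you even as you back away - which is, of course, the fun part.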
There were so many more projects I had the opportunity to develop over the last few years. I've built applications for so many devices, and gained so much real-world experience with best practices in the worlds of AR, VR, and XR. And at home I kept experimenting and tinkering with each new update as it became available. It is amazing what you can do with these technologies.

The SDKs available for these devices are advancing rapidly (especially for the Quest), and new devices are coming soon - they're right around the corner! I'm looking forward to having access to color cameras on the upcoming Quest, as that will open up so many promising applications. Color pass-through cameras on such an inexpensive VR headset will essentially give us access to an AR headset like the HoloLens (with all the potential for AR applications, à la the MRTK SDK - which is already possible today, just with grayscale pass-through video that was never intended for it, until we developers got access!), as well as a fully self-contained VR headset!

Such amazing potential! In fact, hand tracking on the Quest got a massive boost just weeks before I typed this! I wish I could show more samples. What you see here is just a taste of what I'm capable of developing - a diverse set of examples that I hope demonstrates the breadth of my capabilities.

I'm ever grateful for the opportunities I had at my previous employers to experiment with more technology than most developers get the chance to touch. I hope the next chapter of my career allows me to continue exploring what's possible, and to leverage these technologies as they evolve. It is for sure an exciting time for a developer with the passion to create interactive worlds that users can viscerally feel they are a part of!