Thursday, 25 July 2013

VR, AR and colouring in

The Virtual Reality (VR) of the Oculus Rift is stellar (first look yesterday), as are advances in surgery with VR (New 3D Surgeon Helmet Turns Complex Surgery Into Virtual Reality-Like Experience), but the idea of Augmented Reality (AR) is perhaps the real way forward as far as general take-up goes. With Jordi, Google Glass and a host of smartphone apps, the precursors are here.

I thought it was worth a quick post to show that the consumer end is often driving advanced technologies forward. The ColAR app is conceptually simple: print out one of a defined set of drawings, let the kids colour them in, and then use the iPad app to bring those coloured versions to life in AR 3D. Such a clever idea to bring the digital and physical worlds a touch closer together...

Wednesday, 24 July 2013

Oculus Rift

Today I took a moment to install and try out the game-changing Oculus Rift. With education in the Built Environment disciplines spending much of its endeavour on the practice of designing potential spaces, being able to experience them seems like a no-brainer. At the moment we primarily do this in our heads from stills; some get us to video, and others use game engines like CryEngine to deliver a real-time experience. Still fewer take that real-time idea to a 3D screen, and the leap to Virtual Reality has always been a very big one.


The glasses I have used of late were like watching a 3D TV: a screen in the distance, delivering neither the full field of view nor elegant head tracking. Well, enter the Oculus Rift, which ballooned on Kickstarter and whose dev kits are now available.

The Faculty (Built Environment UNSW) has a few already and it was a pretty simple setup and install of the base functionality. The hardware itself is very slick and feels quite high-grade for a dev unit. Once you get used to how close the lenses are meant to sit to your eyes, it is immediately impressive how vast the display appears. Bringing up the Oculus World Demo (a Tuscan-inspired mini level) we get the full effect: looking around by moving our head is very natural, and being able to look up through the trees or over a balcony railing is just so immersive. Once you find your mouse and the movement keys you are off and exploring, using a combo of movement and looking about in a way that makes it seem silly we haven't had this before now.
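To make that combination concrete, here is a minimal sketch of the general scheme: the head tracker drives where you look while the keyboard drives where you walk, and the two stay independent. This is plain illustrative Python, not the Oculus SDK or the demo's code; get_head_yaw_pitch and keys_pressed are hypothetical stand-ins for the tracker and input layer.

```python
import math

def get_head_yaw_pitch():
    """Hypothetical: returns (yaw, pitch) in radians from the head tracker."""
    return 0.0, 0.0

def keys_pressed():
    """Hypothetical: returns the set of movement keys currently held down."""
    return {"w"}

def update(position, move_speed, dt):
    yaw, pitch = get_head_yaw_pitch()

    # The view direction comes entirely from the head tracker...
    view_dir = (math.cos(pitch) * math.sin(yaw),
                math.sin(pitch),
                math.cos(pitch) * math.cos(yaw))

    # ...while walking is driven by the keyboard, flattened onto the ground
    # plane so looking up or down doesn't turn walking into flying.
    forward = (math.sin(yaw), 0.0, math.cos(yaw))
    right = (math.cos(yaw), 0.0, -math.sin(yaw))

    dx = dz = 0.0
    keys = keys_pressed()
    if "w" in keys: dx += forward[0]; dz += forward[2]
    if "s" in keys: dx -= forward[0]; dz -= forward[2]
    if "d" in keys: dx += right[0];   dz += right[2]
    if "a" in keys: dx -= right[0];   dz -= right[2]

    x, y, z = position
    new_position = (x + dx * move_speed * dt, y, z + dz * move_speed * dt)
    return new_position, view_dir

if __name__ == "__main__":
    pos, view = update((0.0, 0.0, 0.0), move_speed=1.5, dt=0.016)
    print(pos, view)
```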

The experience of being in a world and really feeling like you are in it with the VR headset is impressive. Add to this that moving through the environment while our actual bodies stay still means we have to deal with motion sickness: zipping up and down those stairs, and even just general movement, was enough to make me queasy. This is a hard thing to eliminate as the logic will always be there: we 'see' movement but our body doesn't 'feel' it. With a conventional screen the illusion isn't all-encompassing; now it is. [detailed look at this issue]

The dev unit isn't without fault of course. The current 1280x800 screen resolution means those pixels are definitely visible and makes for a blocky experience in many ways. The production versions are meant to be higher res, but perhaps the larger issue is that the gap between pixels is quite apparent, creating a marked grid across our vision. If future versions could make those gaps vanish, then even if the resolution doesn't go up much the effect will be more transparent to us as we run around (a rough back-of-the-envelope on the pixel density follows below).
It might also be handy to have the lens distance adjustable with a knob rather than needing a coin...
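That blockiness is easy to put rough numbers on. The sketch below assumes the commonly quoted figure of roughly 90 degrees of horizontal field of view per eye for the dev kit; the exact value depends on lens choice and eye relief, so treat it as ballpark arithmetic only.

```python
# Rough back-of-the-envelope: why individual pixels are so visible on the dev kit.
# Assumes ~90 degrees of horizontal field of view per eye (an assumption; the
# exact figure varies with lens choice and eye relief).

panel_width_px = 1280                   # full panel width, shared by both eyes
per_eye_width_px = panel_width_px / 2   # 640 px per eye
horizontal_fov_deg = 90                 # assumed per-eye horizontal FOV

rift_ppd = per_eye_width_px / horizontal_fov_deg
print(f"Rift dev kit: roughly {rift_ppd:.1f} pixels per degree")

# For comparison, a 1920-wide desktop monitor filling about 30 degrees of your
# view gives nearly ten times the pixels per degree, which is why the Rift
# image reads as blocky and gridded by contrast.
desktop_ppd = 1920 / 30
print(f"Typical desktop view: roughly {desktop_ppd:.1f} pixels per degree")
```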

All up, the Oculus Rift is a remarkable thing, and so approachable and beneficial for the exploration of design concepts in the built environment disciplines. More news as I get to explore this tech further...


Monday, 22 July 2013

UV Monster dives into CryEngine

I set out today to UV unwrap the base mk3 Monsterpiece shape and bring that into CryEngine with a texture. It turns out I had a fair bit to brush up on, learn or look up again to complete all those steps for a basic test.

First I went through the UV Mapping Workflows in 3ds Max tutorials, as the last time I did this was in Maya some time ago and remembering the principles just wasn't enough for me to actually do the deed. So, bringing up the current mk3 Monsterpiece 3ds Max model, I decided that each hump would use the same texture for this test, and thus I overlaid the UVs for each section on top of one another with the head and tail sticking out. I resized things a little so I had a bit more resolution on the head, and exported out the UV template.
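As a rough picture of that layout, the sketch below shows the idea in miniature: the hump sections all claim the same patch of texture space so they share one painted surface, while the head and tail sit in their own islands, with the head given a larger share for extra detail. The part names and coordinates are purely illustrative, not taken from the actual Max file.

```python
# Illustrative UV layout only: hump sections deliberately overlap on one
# texture region; head and tail get separate islands. Coordinates are made up.
uv_layout = {
    "hump_1": {"u": (0.00, 0.50), "v": (0.0, 0.5)},  # overlaid...
    "hump_2": {"u": (0.00, 0.50), "v": (0.0, 0.5)},  # ...on the same
    "hump_3": {"u": (0.00, 0.50), "v": (0.0, 0.5)},  # ...texture region
    "head":   {"u": (0.50, 1.00), "v": (0.0, 0.7)},  # bigger island, more texel density
    "tail":   {"u": (0.50, 1.00), "v": (0.7, 1.0)},
}

for part, rect in uv_layout.items():
    u0, u1 = rect["u"]
    v0, v1 = rect["v"]
    share = (u1 - u0) * (v1 - v0)
    print(f"{part}: {share:.0%} of the texture sheet")
```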

In Photoshop I quickly sketched in a rough texture, guessed at locations for the mouth, eyes, nostrils and gills, and then brought that back into 3ds Max to see how it looked. I edited things a few more times before it made for a passable test, then exported out the .tif and .dds using the Crytif exporter.

Then I exported the revised UV version of the geometry and brought it into the current version of the 3D loch. I ended up creating the material by hand, when I could have generated it from Max, but once I added the right texture to the diffuse and specular slots I suddenly had a decent-looking monster standing there proudly as the water washed back and forth.
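For reference, the material step boils down to pointing both the diffuse and specular slots at the same painted sheet. The snippet below is purely illustrative Python that writes a small XML description in that spirit; it is not the real CryEngine material schema, and the element names, attributes and file paths are assumptions made up for the example.

```python
import xml.etree.ElementTree as ET

# Illustrative only: a tiny XML material description mirroring what was set up
# by hand in the editor. Element and attribute names are assumptions, not the
# actual CryEngine .mtl schema; the texture path is likewise invented.
material = ET.Element("Material", {"Name": "monster_mk3", "Shader": "Illum"})
textures = ET.SubElement(material, "Textures")

# The same painted texture feeds both the diffuse and specular slots.
ET.SubElement(textures, "Texture", {"Map": "Diffuse",  "File": "textures/monster_mk3.dds"})
ET.SubElement(textures, "Texture", {"Map": "Specular", "File": "textures/monster_mk3.dds"})

ET.ElementTree(material).write("monster_mk3.mtl", xml_declaration=True, encoding="utf-8")
```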

Clearly this could be more refined, and the next step is to create a much more detailed model, but I am happy that at least this process yielded a sane result.