At the Intel Developer Forum, Microsoft Executive VP for Windows and Devices Terry Myerson announced a partnership with Intel to collaborate on mixed reality computers and head-mounted displays. As part of this initiative, Myerson also announced that the technology behind Microsoft's HoloLens would be coming to Windows 10 next year.
With the Windows Holographic Shell, Microsoft is aiming to bring virtual reality to the masses. The Holographic Shell will allow both 2D and 3D applications to run in virtual reality, augmented reality, and mixed reality scenarios. The great thing is that Windows Universal applications will be supported in the Holographic Shell: if you have Windows 10 and a head-mounted display, you should be able to use it. Since the Intel-Microsoft partnership is focused on traditional applications instead of just high-end games, even low-end computers should be able to run the Holographic Shell.
Picture yourself standing at a desk with an unlimited amount of space for displays. Using the device in your hand, you can control the windows in front of you: accept a meeting invitation and check the weather without having to shuffle windows around. It is one step closer to the fanciful interface that wowed so many people in the Spielberg movie Minority Report. But… there is always a gotcha to bring you back down to reality.
Lack of Information
As is typical for Microsoft, nearly all of the details were left out. Beyond the info above, there is no definitive release date or list of supported hardware. It's also unclear how this will impact the HoloLens project. My assumption is that it uses the same software-based tech as HoloLens but opens it up to third-party hardware.
From what I have heard, the HoloLens system is nothing short of amazing. Seeing and interacting with holographic images within your real environment is undeniably cool. However… the HoloLens is an expensive piece of equipment that is only available to developers at this point.
Microsoft and Intel are wise to try to broaden the market by allowing third-party hardware onto the scene, but based on the demo video Microsoft put together, this new technology appears to have several practical limitations.
For example, in the video, the user interacts with several open windows using a small device held in her hand, which implies the system she is using doesn't support hand gestures. The video also shows a very neat-looking scenario in which the user clicks a location on a holographic globe and is transported there while a museum guide discusses the location's architectural history. In theory, it's pretty cool, and I think we could all dream of taking a virtual reality vacation to some of the world's hottest tourist destinations. But on a practical level, how would this work? You can't really walk around your house without hitting furniture. Walking in place? Sounds silly.
What about typing? In the demo, the user sticks to simple applications she can click through with the peripheral in her hand. I'm not sure about you, but most applications I use rely on a keyboard at least part of the time. I'm sure Microsoft and Intel are well aware of these issues, but before you dive in headfirst when the first iteration comes out, be sure to do your research.
It's an exciting time for VR enthusiasts. Are you excited about what's coming, or are you more cynical? Leave a comment below!