UWP enables natural transition to HoloLens development
Zengalt used UWP and standard Windows APIs to create a HoloLens app that museum staff can control from a Microsoft Surface Pro 3.
“When you create something for HoloLens, you expect that HoloLens is a completely different animal compared to Xbox or PC or mobile phones, but it's not,” Evdokimov said. “In terms of development, it's the same platform, and that's really amazing because you can actually run the same app on the HoloLens or a Surface Pro 3 or an IoT device. UWP and being able to use the same stack of technologies for the back end really helps.”
By building applications on Windows 10, developers can support multiple hardware platforms with a single code base. This enabled Evdokimov to draw from his skillset in developing apps for Xbox and capitalize on new business opportunities in 3D mixed reality.
Mixed reality maximizes immersion for viewers
The 3D An American Supercar experience starts with the hum of an engine; then the museum wall crumbles and the Ford GT40 races into view. The exhibit illustrates the evolution of the GT40, showing the progress of innovation that led to the 2017 model. Visitors watch virtual imagery merge onto the physical cars on the floor: a 3D model of an engine is projected as a hologram to show off its inner workings, and simulated airflow showcases the aerodynamic features of the body designs.
A leader in developing mixed reality experiences, Zengalt used spatial mapping and sharing to anchor virtual content onto real cars and objects within the museum. Combining virtual and real items this way maximizes immersion for users. In addition, all sounds in the app use spatial sound, giving virtual objects a convincing presence in the real world.
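The spatial-sharing idea can be illustrated with a small coordinate-frame sketch. This is not Zengalt's implementation (all names and transforms here are hypothetical); it only shows the underlying principle: each headset observes a shared anchor, say a marker on a physical car, in its own world frame, and holograms are positioned relative to that anchor, so every viewer sees them at the same real-world spot.

```python
import math
from dataclasses import dataclass

@dataclass
class Pose:
    """Position (x, z) and yaw (radians) of the shared anchor
    as observed in one device's own world frame."""
    x: float
    z: float
    yaw: float

def anchor_to_world(anchor: Pose, local_x: float, local_z: float):
    """Convert a hologram position expressed relative to the shared
    anchor into this device's world coordinates."""
    cos_y, sin_y = math.cos(anchor.yaw), math.sin(anchor.yaw)
    return (anchor.x + cos_y * local_x + sin_y * local_z,
            anchor.z - sin_y * local_x + cos_y * local_z)

# Two headsets observe the same physical anchor from different frames.
device_a = Pose(x=0.0, z=0.0, yaw=0.0)
device_b = Pose(x=2.0, z=1.0, yaw=math.pi / 2)

# A hologram placed 1 m in front of the anchor, in shared coordinates.
hologram_local = (0.0, 1.0)

# Each device resolves the hologram into its own frame; both points
# correspond to the same physical location in the museum.
print(anchor_to_world(device_a, *hologram_local))
print(anchor_to_world(device_b, *hologram_local))
```

On HoloLens, the anchor resolution itself is handled by the platform's spatial mapping and anchor-sharing facilities; the sketch only shows why a shared anchor is enough to keep holograms registered to the real cars across devices.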
“We wanted to create an experience that was both simple and compelling,” Evdokimov said. “You want everybody to go for the experience and not spend a lot of time learning how to use HoloLens.”
For museum staff, Zengalt implemented remote monitoring at the device level. This way, staff can control each user's experience, including any commands given through voice or gestures. All users are synchronized in time and see exactly the same experience.
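The control model described above can be sketched as a controller that holds the authoritative playback state and pushes it to every headset. This is a minimal in-process illustration under assumed names (the real app would presumably synchronize state over the network), not Zengalt's actual code:

```python
from dataclasses import dataclass

@dataclass
class ExperienceState:
    """Authoritative playback state broadcast by the staff controller.
    Field names are illustrative, not the actual protocol."""
    scene: str = "idle"
    timestamp: float = 0.0
    paused: bool = False

class StaffController:
    """Runs on the staff tablet; pushes state to every registered headset."""
    def __init__(self):
        self.state = ExperienceState()
        self.headsets = []

    def register(self, headset):
        self.headsets.append(headset)
        headset.apply(self.state)  # late joiners immediately sync up

    def broadcast(self, **changes):
        for key, value in changes.items():
            setattr(self.state, key, value)
        for headset in self.headsets:
            headset.apply(self.state)

class Headset:
    """Each viewer mirrors the controller's state rather than
    advancing playback on its own clock."""
    def __init__(self, name):
        self.name = name
        self.state = None

    def apply(self, state):
        self.state = ExperienceState(state.scene, state.timestamp, state.paused)

controller = StaffController()
viewers = [Headset(f"hololens-{i}") for i in range(3)]
for viewer in viewers:
    controller.register(viewer)

# One staff command moves every headset to the same point in the experience.
controller.broadcast(scene="gt40-engine", timestamp=42.0)
assert all(v.state.timestamp == 42.0 for v in viewers)
```

Keeping a single authoritative state on the staff device is what makes the "everyone sees exactly the same experience" guarantee straightforward: headsets never advance independently, they only apply what the controller broadcasts.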