Performance recommendations for immersive headset apps

Hardware targets

Windows Mixed Reality Ultra PCs will consist of desktops and laptops with discrete graphics (plus additional requirements) and will support experiences at 90Hz.

Windows Mixed Reality PCs will consist of desktops and laptops with integrated graphics (plus additional requirements) and will support experiences at 60Hz. To more easily distinguish the two PC targets, we'll refer to these PCs as "core" (differentiated from "ultra") through the rest of this article.

If you target just Windows Mixed Reality Ultra PCs for your experience, it will have more power at its disposal, but you'll also be limiting your audience. Conversely, if you target Windows Mixed Reality "core" PCs for your experience, you'll have a much larger audience, but won't be offering unique performance value to customers with higher-end Windows Mixed Reality Ultra PCs. Thus, a hybrid approach for your VR experience may be the best of both worlds.

We recommend testing your app on the lowest-end hardware in each category you intend to support. When targeting Windows Mixed Reality Ultra PCs, that would be a PC with an NVIDIA GTX 1050 or Radeon RX 460 GPU. Since laptops often have additional performance constraints, we recommend testing with a laptop that has one of those GPUs. A list of Windows Mixed Reality PCs and Windows Mixed Reality Ultra PCs you can purchase for testing purposes will be coming soon.

Performance targets

Framerate targets

The target framerate for your VR experience on Windows Mixed Reality immersive headsets will be either 60Hz or 90Hz depending on which Windows Mixed Reality compatible PCs you wish to support.

For the PC you're currently using, you can determine its target framerate by checking the holographic frame duration or, in Unity, by checking the device's refresh rate.
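In Unity 2017.2 or later, for example, the refresh rate is exposed through the XR device API; a minimal sketch (the script name is illustrative) might look like:

```csharp
using UnityEngine;
using UnityEngine.XR;

public class TargetFramerateLogger : MonoBehaviour
{
    void Start()
    {
        // XRDevice.refreshRate reports the connected headset's display refresh
        // rate (60 or 90 on Windows Mixed Reality), or 0 if no device is active.
        Debug.Log("Target framerate: " + XRDevice.refreshRate + "Hz");
    }
}
```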

CPU budget

Coming soon.

Content guidance

Optimizing performance for Unity apps

Unity has many resources available with guidance on how to optimize your app.

Hitting performance targets on Windows Mixed Reality PCs is a two-step process. First, we recommend you improve the overall performance of your app as much as possible. Second, we recommend finding viewport and/or quality settings between predefined buckets that give the highest possible visual quality while hitting framerate. We have provided a tool to help with both of these tasks via a Unity package on GitHub (note: there are two versions of this package; use the one that corresponds to the version of Unity you're using). This package contains a visualizer of the current frame rate and an adaptive quality and viewport manager. The visualizer shows several numbers:

  • the time interval between frames
  • the amount of time spent on rendering
  • the current estimated frame rate
  • the target framerate on the PC you're currently using (based on compatibility specifications)
  • the current viewport scale factor
  • the current Unity quality level

The visualizer has input bindings that let you change the viewport scaling value and/or the quality settings to help you find the values that meet your framerate and visual quality requirements. The adaptive quality and viewport manager can update the viewport scaling and quality settings based on the application's performance.

First, download FPSViewportQualityBundle.unitypackage from the link provided. Then import it into your project:

  • Select Assets > Import Package > Custom Package.
  • Select the downloaded FPSViewportQualityBundle.unitypackage.
  • Import all files in the package.

You should see an FPSCanvas prefab in MixedRealityToolkit > Utilities > Prefabs > Performance. If you drop this prefab onto the main camera of your scene, it will display all of this information on a canvas in front of you. You can toggle the display on/off using the 'm' key. To start optimizing performance, first build a version of your app in Unity that is not a developer build. Target Master and x64 in your Visual Studio configuration, and run it on a minimum spec computer without the debugger attached. Once the game is running, you can change viewport scale and quality values on the fly. The default keys are:

  • '-' : decrease viewport scale by 0.05 (0.05 is the minimum value)
  • '' : increase viewport scale by 0.05 (1.0 is the maximum value)
  • ',' : decrease quality by 1 step
  • '.' : increase quality by 1 step

You may use this tool and take the following steps to improve the overall performance of your app:

  • First, look at how the actual framerate compares with the target framerate. Run through your whole experience this way to see where the biggest changes need to be made.
    • If your framerate is consistently below the target in a scene, try decreasing the viewport scaling factor. If that improves your framerate, you are likely GPU bound, and shader improvements, polygon count reduction, and simplifying long-lasting screen space effects will help you. Conversely, if decreasing the viewport does not improve your FPS, you are likely CPU bound, and script, animation, and physics simplifications will help you.
    • If you find framerate drops on events in your game, it is likely that things such as particle effects or temporary screen space effects are reducing your frame rate. You may find that reducing the number of particles or simplifying your screen space effects gives you a reasonable framerate improvement while achieving the same effect.
  • Once you have identified likely performance issues in your app, a good tool to start debugging them is the Unity profiler.
    • You may be able to find shaders that are taking up a majority of your time in a scene by switching off one object at a time and comparing the overall render times. If you find an object with a high render time on its own, you can try swapping out its shader for one of the fast shaders found in the Mixed Reality Toolkit. You may also find that simplifying a complicated object down to a mesh with fewer triangles improves your speed.
  • Unity can also compile your shaders for you and show you how many operations they call, which can help compare relative speed. To see this, navigate to a shader in your project menu, and in the inspector menu click the “compile and show code” button. A Visual Studio window should open with stats similar to these:

Shader stats output
// Stats for Vertex shader:
//        d3d11: 39 math
// Stats for Fragment shader:
//        d3d11: 4 math, 1 texture
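If you prefer to script the viewport test yourself rather than use the bundle's bindings, a minimal sketch is shown below (the script name and key choices are illustrative):

```csharp
using UnityEngine;
using UnityEngine.XR;

public class ViewportTestKeys : MonoBehaviour
{
    void Update()
    {
        // Shrink or grow the rendered viewport in 0.05 steps, clamped to [0.05, 1.0].
        // If a smaller viewport restores your framerate, you are likely GPU bound.
        if (Input.GetKeyDown(KeyCode.Minus))
            XRSettings.renderViewportScale =
                Mathf.Max(0.05f, XRSettings.renderViewportScale - 0.05f);
        if (Input.GetKeyDown(KeyCode.Equals))
            XRSettings.renderViewportScale =
                Mathf.Min(1.0f, XRSettings.renderViewportScale + 0.05f);
    }
}
```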

Once you've improved performance across your app, you can use the same tool to set up adaptive quality and viewport adjustment for your app.

With the FPS canvas in your scene you will also get the adaptive performance manager. Before you can use it, you'll need to predefine a set of performance buckets with values for viewport scale factor and quality settings. The set must be ordered in a list starting with the highest performance settings (lower quality, small viewport) and going up to the lowest performance settings (high quality, full viewport). The set of buckets is hard-coded as an array field in the AdaptivePerformance.cs script. You should have something like this in your code:

Adaptive Performance Bucket Set
private PerformanceBucket[] perfBucketList =
        new PerformanceBucket[]
        {
            new PerformanceBucket()
            {
                QualityLevel = 0,
                ViewportScale = 0.5f,
                ShaderLevel = 2
            },
            new PerformanceBucket()
            {
                QualityLevel = 1,
                ViewportScale = 0.7f,
                ShaderLevel = 2
            },
            new PerformanceBucket()
            {
                QualityLevel = 2,
                ViewportScale = 0.75f,
                ShaderLevel = 1
            },
            new PerformanceBucket()
            {
                QualityLevel = 3,
                ViewportScale = 0.8f,
                ShaderLevel = 1
            },
            new PerformanceBucket()
            {
                QualityLevel = 4,
                ViewportScale = 0.9f,
                ShaderLevel = 0
            },
            new PerformanceBucket()
            {
                QualityLevel = 5,
                ViewportScale = 1.0f,
                ShaderLevel = 0
            }
        };

Aside from the viewport scale and the quality level, there is also a shader level example. This demonstrates how the adaptive performance code can be extended beyond the parameters we provide by default. For example, a game developer may want to add a parameter modifying the behavior of a particle effect. To use the shader level, add the ShaderControl component to the prefab you want to manage dynamically. In the inspector, add the default material as the first entry in the material list, then add more materials as needed. The first (default) material is expected to be the most expensive; materials in the following entries should be computationally less expensive than the preceding materials. ShaderControl subscribes to callbacks from AdaptivePerformance when the performance bucket changes. When a bucket change occurs, if the component has a material with the same level as indicated by the new performance bucket, that material will be used to render the object.

By default, adaptive performance management is not running. The AdaptivePerformance class offers several methods to manage it.

Adaptive Performance API
// public property for retrieving the current bucket index
public int CurrentBucketId;

// public method for getting the performance parameters of the current bucket
public PerformanceBucket GetCurrentBucket();

// public method for explicitly moving a bucket up (lower perf, higher quality)
public int SwitchToHigherBucket();

// public method for explicitly moving a bucket down (higher perf, lower quality)
public int SwitchToLowerBucket();

// public method for starting adaptive performance management
// during adaptive performance, performance settings are changed automatically
// depending on the current performance: bad performance moves to a lower bucket,
// exceeded performance moves to a higher bucket
// adaptive performance will run until StopAdaptivePerformance() is called
public void StartAdaptivePerformance();

// public method for starting adaptive performance with a timeout
// adaptive performance stops running after the given number of seconds
public void StartAdaptivePerformance(float time);

/// <summary>
/// Event that is raised when the performance bucket changes
/// </summary>
public class PerformanceBucketChangedEvent : UnityEvent<PerformanceBucket> { }

// instance of the event that clients can subscribe to and receive notifications
// when the performance bucket changes
// to subscribe, call OnPerformanceBucketChanged.AddListener(callbackFunc)
// when done, call OnPerformanceBucketChanged.RemoveListener(callbackFunc) to unsubscribe
public PerformanceBucketChangedEvent OnPerformanceBucketChanged;
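A typical usage sketch follows, assuming the AdaptivePerformance component lives on the FPSCanvas prefab already placed in your scene (the helper script name is illustrative; exact component placement may differ in your package version):

```csharp
using UnityEngine;

public class AdaptiveQualityStarter : MonoBehaviour
{
    void Start()
    {
        // Find the adaptive manager provided by the FPSCanvas prefab.
        var adaptive = FindObjectOfType<AdaptivePerformance>();
        adaptive.OnPerformanceBucketChanged.AddListener(OnBucketChanged);
        adaptive.StartAdaptivePerformance();   // runs until StopAdaptivePerformance()
    }

    void OnBucketChanged(PerformanceBucket bucket)
    {
        Debug.Log("Now in quality level " + bucket.QualityLevel
                  + ", viewport scale " + bucket.ViewportScale);
    }
}
```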

The starting performance bucket is determined in the following way:

  • If we're running in Unity editor, the starting bucket is picked from the startBucket field in the inspector.
  • If adaptive performance was run or a performance bucket was selected explicitly, the last performance bucket used is saved to a file and loaded the next time the game starts.
  • If there is no last known bucket saved in a file and we're not running in the editor, the starting bucket is picked from the startBucket field in the class, with the value that was saved in the Unity scene from the inspector.

When the adaptive manager is running, it will analyze performance over a period of time before deciding whether to move to a different quality bucket. Currently the adaptive manager supports two analyzers:

  • Frame rate analyzer
  • GPU render time analyzer

Frame rate analyzer

The frame rate analyzer measures the FPS over a time sample of half a second. Once it has accumulated samples over a period of 1 minute, the analyzer checks whether at least 80% of the samples meet the target frame rate. If fewer than that meet the target frame rate, the adaptive manager switches to a lower quality bucket. If we are meeting the target frame rate consistently over 3 minutes, the adaptive manager will try a higher quality bucket. Note that if we had previously switched from a higher quality bucket to a lower one, the adaptive manager will not attempt a higher quality bucket even if the frame rate is consistently on target.
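The decision rule described above can be sketched in plain C# (the type and method names here are hypothetical, not the package's actual implementation):

```csharp
using System.Collections.Generic;
using System.Linq;

public enum BucketDecision { MoveLower, MoveHigher, Stay }

public static class FrameRateAnalyzerSketch
{
    // fpsSamples: one minute of half-second FPS averages (about 120 entries).
    // consecutiveGoodWindows: how many one-minute windows in a row were on target.
    public static BucketDecision Analyze(IList<float> fpsSamples, float targetFps,
                                         int consecutiveGoodWindows)
    {
        float fractionOnTarget =
            fpsSamples.Count(fps => fps >= targetFps) / (float)fpsSamples.Count;

        if (fractionOnTarget < 0.8f)
            return BucketDecision.MoveLower;   // under 80% of samples hit the target
        if (consecutiveGoodWindows >= 3)       // roughly 3 minutes consistently on target
            return BucketDecision.MoveHigher;
        return BucketDecision.Stay;
    }
}
```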

GPU render time analyzer

The GPU render time analyzer measures render time on the GPU. If that time exceeds 95% of the target time consistently, the adaptive manager will switch to a lower quality bucket. If the time is consistently less than 75% of the target time over the course of a number of frames, the adaptive manager will try a higher quality bucket.

Considerations for Windows Mixed Reality "core" PCs

In order to hit performance goals on Windows Mixed Reality "core" PCs, you may need to reduce your quality settings in Unity and/or reduce the viewport for those devices. Both of these modifications have visual fidelity implications. However, low framerate can induce motion sickness in users, so we strongly recommend treating the target framerate as a requirement for shipping your game. If you decide that the loss of visual fidelity in your game would be too great on lower spec machines, update your Windows Store description to discourage users with lower specifications from buying your game.

Loading screens and quad planes

Coming soon.

Default Render Target Size

Windows Mixed Reality immersive headsets contain lenses which distort the presented image, giving higher pixel density in the center of view and lower pixel density in the periphery. In order to have the highest visual fidelity on Windows Mixed Reality Ultra devices, we set the render target's pixel density to match the highly-dense center of the lens area. As this high pixel density is constant across the whole render target, we end up with a higher resolution than the headset's display.

By contrast, other VR platforms may default to the render size of the display, which would require you to increase this size to get the correct pixel density in the center of the lensed image. This means that if you keep the default settings, your app may render more pixels than on other VR platforms, which might decrease performance but increase visual fidelity. If you have found on other platforms that you need to increase your render scale to achieve this high pixel density (in Unity 2017 the line would be something like UnityEngine.XR.XRSettings.renderScale = 1.x), you will likely want to remove this logic for our platform, as it won't gain you any added visual fidelity and will cost you performance.

In order to hit the more difficult performance target of Windows Mixed Reality "core" PCs, we also lower the resolution target.

For either sort of device, you may want to scale the default resolution target down in order to get back some GPU memory and reduce the number of rendered pixels. You can do this by setting the "Windows.Graphics.Holographic.RenderTargetSizeScaleFactorRequest" key in the property bag on the CoreApplication; however, this must be done before you create your holographic space and cannot be changed afterward. To help you determine which systems might need such a change, we have provided a sample project you can use to get information about the system you are running on here.

In order to use this project, add it to your Visual Studio solution and add a reference to it in your app's project. If you are using C#, you may then use something like the following snippet in the Initialize function of your App.cs file:

Sample C# code for setting render scale with SystemInfoHelper Project
var holographicDisplay = Windows.Graphics.Holographic.HolographicDisplay.GetDefault();
if (holographicDisplay != null) // if null, no HMD is connected!
{
    double targetRenderScale = 1.0d;

    SystemInfoHelper.SystemInfo systemInfo = new SystemInfoHelper.SystemInfo(holographicDisplay.AdapterId);

    var renderScaleOverride = await systemInfo.ReadRenderScaleAsync();
    if (renderScaleOverride != null && renderScaleOverride.MaxVerticalResolution == holographicDisplay.MaxViewportSize.Height) // ensure we are on the same type of headset
    {
        targetRenderScale = renderScaleOverride.RenderScaleValue;
    }
    else
    {
        // You may insert logic here to help you determine what your resolution
        // should be if you don't have one saved.
        // Here are some potentially useful calls other than what SystemInfo provides:
        // holographicDisplay.DisplayName
        // holographicDisplay.MaxViewportSize
    }

    CoreApplication.Properties.Add("Windows.Graphics.Holographic.RenderTargetSizeScaleFactorRequest", targetRenderScale);
}

If you are using Unity with a .NET scripting backend, this App.cs file will be generated for you when you build from Unity.

If you are building in C++ or using the IL2CPP backend of Unity, add the following code to the Initialize function in your App.cpp file:

Sample C++ code for setting render scale with SystemInfoHelper Project
auto holographicDisplay = Windows::Graphics::Holographic::HolographicDisplay::GetDefault();
if (nullptr != holographicDisplay) // if null, no HMD is connected!
{
	double targetRenderScale = 1.0;

	auto systemInfo = ref new SystemInfoHelper::SystemInfo(holographicDisplay->AdapterId);

	auto readAction = systemInfo->ReadRenderScaleAsync();
	while (readAction->Status == Windows::Foundation::AsyncStatus::Started)
	{
		// spin until the async read completes
	}

	auto renderScaleOverride = readAction->GetResults();
	if (renderScaleOverride != nullptr && renderScaleOverride->MaxVerticalResolution == holographicDisplay->MaxViewportSize.Height) // ensure we are on the same type of headset
	{
		targetRenderScale = renderScaleOverride->RenderScaleValue;
	}
	else
	{
		// You may insert logic here to help you determine what your resolution
		// should be if you don't have one saved.
		// Here are some potentially useful calls other than what SystemInfo provides:
		// holographicDisplay->DisplayName
		// holographicDisplay->MaxViewportSize
	}

	CoreApplication::Properties->Insert("Windows.Graphics.Holographic.RenderTargetSizeScaleFactorRequest", targetRenderScale);
}

As this value must be set before you can actually run your program and do any performance evaluation, you may find that you need to adjust its value for the next startup. SystemInfoHelper can save and load a different value that better suits how your app actually runs on the hardware.

Dynamic resolution scaling

Viewport scaling (dynamic resolution scaling) is the practice of rendering your image to a smaller render target than your output device can display and sampling from those pixels to display your final image. It trades visual fidelity for speed. Windows Mixed Reality devices support viewport scaling at the platform level. This means that if you set the viewport to be smaller (in Unity: UnityEngine.XR.XRSettings.renderViewportScale = 0.7f), Unity will inform the platform that it is rendering to a smaller section of the render target, and the platform will composite its display from that smaller section.
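As a simplified illustration of this idea (thresholds, smoothing, and the script name are all illustrative; the adaptive viewport manager in the GitHub package does this with far more careful sampling), a script could drive the viewport scale from measured frame time:

```csharp
using UnityEngine;
using UnityEngine.XR;

public class SimpleViewportScaler : MonoBehaviour
{
    const float MinScale = 0.5f;   // illustrative floor; tune for your content
    const float Step = 0.05f;

    float averagedFrameTime;

    void Update()
    {
        // Smooth the frame time a little so a single hitch doesn't trigger a change.
        averagedFrameTime = Mathf.Lerp(averagedFrameTime, Time.unscaledDeltaTime, 0.05f);
        float targetFrameTime = 1.0f / XRDevice.refreshRate;

        if (averagedFrameTime > targetFrameTime * 1.1f)
            XRSettings.renderViewportScale =
                Mathf.Max(MinScale, XRSettings.renderViewportScale - Step);
        else if (averagedFrameTime < targetFrameTime * 0.8f)
            XRSettings.renderViewportScale =
                Mathf.Min(1.0f, XRSettings.renderViewportScale + Step);
    }
}
```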



Detecting Windows Mixed Reality "core" PC vs. Ultra PC

Coming soon.

Performance Tools

Visual Studio

Visual Studio Graphics Diagnostics can debug immersive applications running on Windows Mixed Reality. Note that GPU Usage is not supported for Windows Mixed Reality.

Unity Performance Profiler

The Unity Profiler is particularly useful if you are CPU bound, as it shows how long you are spending in each update function. The most accurate performance measurements come from profiling a deployed UWP app. To profile a built UWP app, make sure you have turned on the InternetClient capability and built with the developer build checkbox marked. To turn on the InternetClient capability, go to Edit > Project Settings > Player, select “Publisher Settings” and under “Capabilities” check “InternetClient”. If you already know that you need to improve performance in a given scene, you can use play mode to iterate quickly, and you will likely see proportionate improvements in your UWP solution. If your bottleneck is in the GPU, you can still start with the Unity Profiler and make significant progress. For example, you can isolate which object is causing the most render issues by turning off all objects in the Unity hierarchy and turning them on selectively until you find one that takes a particularly long time to render. Once you have discovered it, you can either try to simplify the object or improve the performance of its shader. The Mixed Reality Toolkit has some excellent fast shaders that might be helpful.

Windows Device Portal

The Windows Device Portal lets you configure and manage your device remotely over a network or USB connection. It also provides advanced diagnostic tools to help you troubleshoot and view the real time performance of your Windows device.

Intel Power Gadget

Intel® Power Gadget is a software-based power usage monitoring tool for Intel® Core™ processors (2nd through 6th Generation); Intel® Atom™ processors are not supported. It includes an application, driver, and libraries to monitor and estimate real-time processor package power information in watts using the energy counters in the processor.

See also