Game Stack Blog

Enabling build access from home during COVID-19

[Image: woman in front of a computer wearing a HyperX gaming headset]

COVID-19 has been challenging for many disciplines, and game development is no exception. With the bulk of Microsoft's workforce now working from home, the content development teams within our Xbox Game Studios faced a unique challenge.

For our game developers to work safely and productively from home, they need to access and check in content across various build systems within a reasonable timeframe. Downloading daily builds is a critical step in the development and validation processes. Development machines are typically co-located with build cache servers in the office and connected through high-speed local area networks (40-80 Gbps), but working from home requires engineers to download large builds onto their development machines over their existing home ISP connections (40 Mbps to 1 Gbps download speed). The problem we ran into: how best to enable speedy downloads of ~300 Gb builds, multiple times a day, to 300+ developers located in the greater Seattle area (Washington state).
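As a back-of-envelope check (taking the post's ~300 Gb build-size figure at face value and ignoring protocol overhead), the raw transfer time across that range of home connection speeds spans minutes to hours:

```python
# Back-of-envelope: time to move a ~300 Gb build over typical home ISP links.
# Figures come from the post; protocol overhead and congestion are ignored.

BUILD_SIZE_GBIT = 300  # ~300 Gb per build, downloaded multiple times a day

def download_minutes(size_gbit: float, link_mbps: float) -> float:
    """Raw transfer time in minutes at a given link speed."""
    return size_gbit * 1000 / link_mbps / 60

for mbps in (40, 120, 1000):
    print(f"{mbps:>5} Mbps -> {download_minutes(BUILD_SIZE_GBIT, mbps):6.1f} min")
```

At the slow end of home broadband a single build takes over two hours, which is why raw SMB-over-VPN transfers were not viable at this scale.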

To address this challenge, teams in Xbox Game Studios partnered with Azure to create a fast and secure build transfer solution. Using Azure Blob Storage and AzCopy, the team rolled out a proof of concept with 30 engineers to ensure the solution could scale. The proof of concept, developed and tested within a few weeks, exhibited reasonably good throughput of ~100-120 Mbps for off-site uncompressed downloads. It also stress-tested the limits of Azure Blob Storage, Azure Networking, and AzCopy by simulating simultaneous connections throughout the day.

These were the steps undertaken by the team to establish a fast and secure off-site build download:

  1. Created six premium blob storage accounts in the West US 2 region, to minimize network latency between engineers in Washington state and the Azure Data Center
  2. Rewrote the build download scripts to invoke AzCopy for secure and quick off-site downloads (Script 1)
  3. Leveraged off-the-shelf compression techniques such as gzip, in combination with AzCopy's decompress copy feature, to make the builds more lightweight where possible. Compression ratios of ~1.14 for content builds and ~4.8 for binary builds were achieved
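The very different ratios in step 3 reflect the data itself: already-packed game content compresses little, while binaries compress well. A minimal sketch of measuring such a ratio with gzip (using synthetic stand-in data, not real build output):

```python
import gzip
import os

def compression_ratio(data: bytes) -> float:
    """Original size divided by gzip-compressed size."""
    return len(data) / len(gzip.compress(data, compresslevel=9))

# Highly repetitive bytes (a stand-in for binaries) compress far better than
# high-entropy bytes (a stand-in for already-packed game content) -- which is
# why the binary and content builds saw such different ratios.
repetitive = b"0123456789abcdef" * 4096
random_ish = os.urandom(16 * 4096)

print(f"repetitive data: {compression_ratio(repetitive):.1f}x")
print(f"random-ish data: {compression_ratio(random_ish):.2f}x")
```

In the actual pipeline, the upload side tags blobs with `--content-encoding=gzip` so that AzCopy can decompress them at the destination on download.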

    Figure 1: Before COVID-19, typical set-up
    Figure 2: During COVID-19, off-site build Storage + AzCopy set-up

    While the proof-of-concept throughput was reasonably high for off-site development, higher performance would significantly speed up the development process. Innovating further, teams used compression techniques coupled with AzCopy's decompress-at-destination feature, further reducing build download time. With the overall redesign above, a 5X-7X throughput improvement was achieved: download throughput went from ~120 Mbps using SMB over VPN to around 600-700 Mbps using AzCopy and Azure Blob Storage. This cut the time for an engineer to download a build from ~1 hour down to 8-12 minutes, while working from home!
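These numbers hang together because download time scales inversely with effective throughput: a 5x-7x throughput gain turns a roughly one-hour download into the 8-12 minute range. A quick sanity check (assuming ~60 minutes as the baseline, per the figures above):

```python
# Sanity-check the reported speed-up: time scales inversely with throughput.
BASELINE_MBPS = 120      # SMB over VPN (from the post)
BASELINE_MINUTES = 60    # ~1 hr per build (from the post)

for new_mbps in (600, 700):
    speedup = new_mbps / BASELINE_MBPS
    new_minutes = BASELINE_MINUTES / speedup
    print(f"{new_mbps} Mbps: {speedup:.1f}x faster, ~{new_minutes:.0f} min per build")
```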

    The above success points to a possible paradigm shift for content developers, especially in the entertainment sector. While development of entertainment software such as games and movies relies heavily on strong on-premises infrastructure with high-throughput network and storage requirements, the work-from-home situation has inevitably forced innovation in off-site development strategies. Azure has demonstrated the ability to support such challenging use cases, which can eventually eliminate the need for content collaborators to be tied to a fixed workplace. The architecture suggested here applies not only to game development but to any team that wishes to enable working from home where content distribution is essential to business continuity.

    Using the custom script available here, and the Azure storage tutorials linked, you can quickly stand up a similar solution for your organization's work-from-home scenarios. See below for a sample script.

    <#
    .SYNOPSIS
        Copies build folders to a destination share.
    .OUTPUTS
        System.Int32: The exit code from the operation.
    #>
    [CmdletBinding(SupportsShouldProcess = $true)]
    param
    (
        # The source path of the build.
        [Parameter(Mandatory = $true)]
        [string] $SourcePath,
    
        # The root destination path of AzStorage (ex: https://test.blob.core.windows.net).
        [Parameter(Mandatory = $true)]
        [string] $AzStorageRootPath,
    
        # The location where compressed builds are stored while being copied to AzStorage (ex: D:\AzCopyTemp).
        [Parameter()]
        [string] $CompressionRootPath
    )
    
    begin
    {
        # Echo the bound parameters for diagnostics.
        [PSCustomObject] [hashtable] $PSBoundParameters | Format-List | Out-Host
    
        #REPLACEME with your specific build number pattern
        $Script:BuildNumberPattern = "\d+\.\d\d\.\d\d\.\d\d\.\d{4}[^\\]*" # RegEx pattern for extracting build number from folder path.
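        # Example with a hypothetical path: "F:\Builds\1.23.45.67.8901-retail\Client"
        # would yield folderName "1.23.45.67.8901-retail\Client", preserving the
        # build number and any sub-folders for the destination path.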
    
        $success = $false
        $exitCode = 9999
    }
    
    process
    {
        # Normalize the source path; remove trailing \ from the path.
        $SourcePath = $SourcePath -as [string] -replace "\\$", ""
    
        # Strip off the path prefix; keep the build number and sub-folders if available, otherwise just keep the last segment.
        $Matches = $null
        $folderName = $null
        if ($SourcePath -match "$Script:BuildNumberPattern.*")
        {
            $folderName = $Matches[0]
        }
        else
        {
            Write-Information "Split-Path -Leaf -Path $SourcePath"
            $folderName = Split-Path -Leaf -Path $SourcePath
        }
    
        # Remove trailing \ from path.
        $folderName = $folderName -replace "\\$", ""
        Write-Information "FolderName: $folderName."
    
        # Use AzCopy to cache to Azure Storage for Offsite
        $AzStorageCachePath = $null
        $AzStorageCachePath = "$AzStorageRootPath//$folderName" #REPLACEME: insert your blob container name between the slashes
    
        if ([string]::IsNullOrWhiteSpace($CompressionRootPath))
        {
            $CompressionRootPath = "D:\AzCopyTemp"    
        }
    
        # Get certificate password
        Write-Information "Retrieving password for AzCopy"
        $password = $null #REPLACEME with AzCopy service account password retrieval; KeyVault, password manager, etc.
        Write-Information "AzCopy password received"
    
        $env:AZCOPY_SPA_CERT_PASSWORD = $password
    
        # Perform the copy.
        "_____________________________________________" | Out-Host
        try
        {
            # Compression
            Write-Information "Starting pre-copy compression..."
            $CompressionFullPath = Join-Path $CompressionRootPath $folderName
    
            $ArgumentList = @()
            $ArgumentList += "/source:$SourcePath" # Source Path
            $ArgumentList += "/target:$CompressionFullPath" # Target Path
            $ArgumentList += "/compressionlevel:Optimal" # Compression Level
            $ArgumentList += "/NoSuffix" # Remove .gz from file names
            Write-Information "Start-Process -FilePath """" -ArgumentList $($ArgumentList | ConvertTo-Json) -PassThru"
            $Result = Start-Process -FilePath "" -ArgumentList $ArgumentList -PassThru #REPLACEME with location of compression tool: Zipper exe
            $TimedOut = $null
            $Result | Wait-Process -Timeout 2700 -ErrorAction SilentlyContinue -ErrorVariable TimedOut
    
            if ($null -ne $TimedOut)
            {
                throw $TimedOut
            }
    
            Write-Information "Pre-copy compression exit code: $($Result.ExitCode)"
            if ($Result.ExitCode -ne 0)
            {
                throw "Issue with pre-copy compression: $($Result.ExitCode)"
            }
    
            # Login and Upload to Azure Storage
            Write-Information "Starting AzCopy process..."
            $CertPath = "" #REPLACEME with location of AzCopy certificate (.pfx)
    
            $ArgumentList = @()
            $ArgumentList += "login"
            $ArgumentList += "--service-principal"
            $ArgumentList += "--certificate-path=""$CertPath"""
            $ArgumentList += "--tenant-id=" #REPLACEME with your tenant id
            $ArgumentList += "--application-id=" #REPLACEME with your AzStorage app id
            Write-Information "Start-Process -FilePath """" -ArgumentList $($ArgumentList | ConvertTo-Json) -Wait"
            Start-Process -FilePath "" -ArgumentList $ArgumentList -Wait #REPLACEME with local path to AzCopy Exe
    
            $ArgumentList = @()
            $ArgumentList += "cp"
            $ArgumentList += "$CompressionFullPath\*"
            $ArgumentList += $AzStorageCachePath
            $ArgumentList += "--recursive"
            $ArgumentList += "--content-encoding=gzip"
            Write-Information "Start-Process -FilePath """" -ArgumentList $($ArgumentList | ConvertTo-Json) -Wait"
            $Result = Start-Process -FilePath "" -ArgumentList $ArgumentList -Wait -PassThru #REPLACEME with local path to AzCopy Exe
    
            $exitCode = $Result.ExitCode
        }
        catch
        {
            # Log an error and save an error notification.
            $_ | Out-Host
        }
    
        Write-Information "Caching ExitCode: $exitCode."
        if ($exitCode -eq 0)
        {
            $success = $true
    
            # Delete pre-copy compressed folder
            if (Test-Path $CompressionFullPath)
            {
                Remove-Item -LiteralPath $CompressionFullPath -Recurse -Force -ErrorAction SilentlyContinue
            }
    
            # Try to remove any old compressed caches from the transfer folder
            $DaysBack = (Get-Date).AddDays(-3)
            Get-ChildItem -Path $CompressionRootPath -Verbose | Where-Object {$_.CreationTime -le $DaysBack} | Remove-Item -Recurse -Force -ErrorAction SilentlyContinue
    
            # Remove large, old AzCopy log files
            $DaysBack = (Get-Date).AddDays(-3)
            Get-ChildItem -Path "$env:UserProfile\.azcopy\" -Exclude *.json -Verbose | Where-Object {$_.CreationTime -le $DaysBack} | Remove-Item -Recurse -Force -ErrorAction SilentlyContinue
        }
        elseif (!$exitCode)
        {
            $exitCode = 9999
        }
    }
    
    end
    {
        if ($success)
        {
            # Explicitly return 0 to avoid confusion between non-zero success/fail codes.
            Write-Information "END RESULT: Success"
            return 0
        }
        else
        {
            Write-Information "END RESULT: Fail ($exitCode)"
            return $exitCode
        }
    }
    

    Script 1: Sample script

Check out these useful links to learn more