business context
The original game was aimed at PS VR 2 and Steam VR, platforms with relatively powerful hardware. The client trusted us to develop a flat (non-VR) version of the game and, after a month of work, asked us to port it to Oculus devices.
Our goal was to port the PS VR game to Oculus Quest devices. The hardware of some of the target devices resembles that of 2012-era gaming platforms—mid-tier smartphones or PS3-type consoles.
Pingle Studio recreates the VR part of the remarkable horror franchise on Oculus and Meta VR devices in partnership with Devoted Studios and Steel Wool Studios.
project’s summary
Pingle Studio brings Five Nights At Freddy’s: Help Wanted 2 to Oculus Quest and Meta Quest VR, gaming devices with weaker hardware and tricky software architecture. All devices hold a stable 72+ FPS throughout gameplay, and all builds passed Meta certification on time.
player is our priority
Altogether, Five Nights At Freddy’s: Help Wanted 2 is a great all-ages horror game that makes excellent use of VR. It’s a well-executed return to form for the franchise. While its story may leave new players scratching their heads, the variety of solid, replayable stages more than makes up for it. Whether you’ve been facing down Freddy and co. since 2014 or are just looking for something spooky to play on your headset, this one is definitely worth grabbing!
what do gamers say
challenge
keep all the details of the original game
on Oculus, despite Android limitations
on rendering, the number and quality of light sources, and the number of VFX
The biggest issue we had was the platform. Oculus is an Android platform with a huge number of limitations. While the original game was developed for PS VR 2 and Steam VR, on Oculus the rendering approach, the number of effects, and their quality all had to be cut back. That was a massive challenge for the porting team.
Step 1: Forward or deferred rendering, that is the question… (c). Despite all the guides, documentation, and experience, the team decided to put some effort into making this game work with deferred rendering on Oculus. It didn’t work out of the box, so first things first, we fixed the renderer (with considerable help from rendering devs)… and ultimately decided not to use it: the performance was awful. In the most straightforward scene, with only a few actors, we reached just 23–25 FPS. So, at that point, forward rendering was the only option. Later in development, we figured out most of the caveats of this rendering method and built a strict pipeline that let us reach almost the same visual quality as the original title.
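As a rough sketch, the forward-vs-deferred choice in UE4.27 comes down to a handful of renderer settings. The cvar names below are stock UE4; the exact values used on this project are our assumptions:

```ini
; DefaultEngine.ini (sketch) – mobile forward shading plus the usual Quest flags
[/Script/Engine.RendererSettings]
r.Mobile.ShadingPath=0   ; 0 = forward shading; 1 = the experimental mobile deferred path
r.MobileHDR=False        ; LDR mobile rendering, generally required for Quest performance
vr.MobileMultiView=True  ; render both eyes in a single pass
```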
Step 2: Platform limitations: no post-processing, no decals, only 4 active dynamic lights, etc.
So, we redesigned all the levels to fit these limitations.
We replaced decals with a plane-plus-material combo where possible and replaced post-processing with tone mappers (carefully adjusted to reach the same or nearly the original visual quality). But lighting was the trickiest part. Each level was carefully redesigned to use as much static lighting as possible. Every dynamic light that genuinely needed to illuminate skeletal meshes was actively reused (in some cases we had to implement very complex solutions, like the BreakerRoom minigame, where dynamic lights were actively used for gameplay purposes).
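The dynamic-light budget mentioned above maps onto stock UE4 renderer settings; this is a sketch, not the project’s actual config:

```ini
; DefaultEngine.ini (sketch) – lighting limits on mobile
[/Script/Engine.RendererSettings]
r.MobileNumDynamicPointLights=4  ; per-draw cap on movable point lights
r.AllowStaticLighting=True       ; keep lightmaps, which carry most of the scene lighting
```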
Step 3: Content size limitation. Out of the box, UE4 supports up to 3 OBB files (each no more than 4 GB) for content.
The Meta store supports an unlimited number of additional asset files. The original game’s content was about 10 GB compressed (24 GB uncompressed), so the team approached the issue with a set of solutions. The first caveat was performance, which demanded shipping only uncompressed content. The next was extending support beyond 3 OBBs, which was successfully implemented: we can now support up to 5 OBBs with a total content size of up to 20 GB.
The last one: operating on large textures produced multiple spikes and performance hiccups, so we found a middle ground, fitting into around 10 GB of uncompressed content by using ASTC texture compression settings.
This allowed us to quickly scale any additional content and easily manage the build size. The minor tasks, like outputting builds in the correct form for Meta store upload and generating the files that describe the extra content, were handled as well.
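For reference, the stock packaging knobs involved look roughly like this (the 5-OBB support itself required engine-code changes and is not shown; values are assumptions):

```ini
; DefaultEngine.ini (sketch) – Android packaging and texture format
[/Script/AndroidRuntimeSettings.AndroidRuntimeSettings]
bPackageDataInsideApk=False   ; ship content in OBB / asset files, not inside the APK
bMultiTargetFormat_ASTC=True  ; cook textures to ASTC, the format used to fit ~10 GB
```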
Step 4: Android API support. Out of the box, UE4.27 (even in the forked version) supports a maximum API level of 29.
From the beginning, the project team needed to extend this support to API 32 (a Meta store requirement to match the latest device, the Quest 3). This was implemented and proven stable: we researched a compatible toolchain to provide the correct Gradle, NDK, JDK, and Android API setup, which allowed us to make builds targeting API level 32.
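In stock UE4.27, the relevant settings look roughly like this; the toolchain versions that actually proved compatible are project-specific and omitted:

```ini
; DefaultEngine.ini (sketch) – target API level for the Meta store
[/Script/AndroidRuntimeSettings.AndroidRuntimeSettings]
MinSDKVersion=29
TargetSDKVersion=32  ; needs a matching Gradle/NDK/JDK toolchain installed
```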
Step 5: Permissions, manifests, and other boring Android stuff.
Despite the lack of information in official guides and release notes, the team was able to set up the correct flow for manifest generation to match Meta store requirements. We achieved this by setting up our own App Lab organization and testing against it before integrating the final changes into the manifest generation code.
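UE4 can extend the generated manifest from config rather than hand-editing it; a hedged sketch (the permission shown is a hypothetical placeholder, not one the game necessarily requests):

```ini
; DefaultEngine.ini (sketch) – manifest extensions without touching engine code
[/Script/AndroidRuntimeSettings.AndroidRuntimeSettings]
+ExtraPermissions="android.permission.VIBRATE"  ; hypothetical example permission
; ExtraManifestNodeTags / ExtraApplicationNodeTags can inject raw XML nodes as well
```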
multiple performance spikes all over the game: VFX, sounds, render PSO
That was the most challenging part of the whole project.
We’ve separated them into 3 categories: visual effects spikes, sound and SFX spikes, and render PSO spikes.
There are also spawn spikes, you’ll say.
Correct. They were addressed with a “cache screen”: an additional loading screen shown when the player is technically already on a level but the game is still preparing things for them. This includes pre-spawning, hot-path game logic warmup, caching, etc.
Nevertheless, each main spike category was addressed in different ways:
1. VFX spikes. These occurred only once per application run and were caused by the Niagara cache logic. Careful tuning of that cache’s settings, plus manually managing which VFX should be in memory at any given moment, let us prevent about 90% of these hiccups. Heavy VFX caused the remaining 10%, so the tech art team spent its time optimizing those assets.
An additional gain came from setting up the built-in warmup feature for Niagara effects.
2. Sound and SFX spikes. Roughly 70% of the sounds in the project were built as SoundCues. This approach is usually quite effective, but here it was precisely the reverse: Android platforms (and Oculus is no exception) have a lousy time processing SoundCues. And bad things come in numbers: we had to re-set up almost all the sounds with the correct category and concurrency settings and adjust attenuations and source effect chains. We also made a few changes in the engine code to prevent loading hiccups in the Random node and set up a pseudo-cache (since the native cache doesn’t work properly with Cues). There is also a platform limitation of no more than 12 sound waves playing at runtime, which forced us to redesign some gameplay logic (especially on levels like DJ Music Man, where sound is essential to the gameplay).
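The 12-voice platform cap can be mirrored in config so the engine never tries to exceed it; a sketch using the legacy [Audio] setting (names vary across engine versions, so treat this as an assumption):

```ini
; AndroidEngine.ini (sketch) – keep the engine's voice count under the platform cap
[Audio]
MaxChannels=12
```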
3. The third category, render PSO spikes, was the hardest to address. These usually occur when the lighting changes suddenly. We addressed some of the spikes by making unimportant meshes unlit by default, but the rest couldn’t be fixed without setting up the PSO cache. This is a tricky and time-consuming operation (collect records -> compile the cache -> build the application with the cache -> repeat). Still, in the end, it completely fixed all the heaviest spikes during gameplay.
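The playback side of that cycle relies on stock shader-pipeline-cache cvars; a hedged sketch (recording additionally requires PSO logging in an instrumented build, and exact values here are assumptions):

```ini
; DefaultEngine.ini (sketch) – play back a precompiled PSO cache
[/Script/Engine.RendererSettings]
r.ShaderPipelineCache.Enabled=1
r.ShaderPipelineCache.BatchSize=16  ; compile a few PSOs per frame to avoid new spikes
```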
achieving 72fps on the target device with the weakest hardware
Of all the target devices, the Oculus Quest 2 was the weakest.
The team focused its performance checks on that device alone. Achieving a rock-stable 72 FPS was tough, but measured against the overall goal of average FPS, we succeeded.
Meta certification requirements: certification slots are predefined by Meta.
Booking a particular slot is not as easy as in other platforms’ certification processes.
The development was scheduled for three months. As we approached the finish date, the client began booking a slot for submission. They assumed they would submit the game whenever it was ready, but it turned out that Meta had a long queue.
Thus, a reminder to developers: submission slots need to be booked in advance. Some may not know about this possibility, but you can make a demo version, show how complete the game is, and book a certification slot for an earlier date. Our demo showed the five starting locations polished to the required level, and it worked: Meta liked the demo and granted us an earlier certification slot.