Developer deep dive – “It’s a breeze” game

If you haven’t played the “It’s a breeze” game go and do it now and then come back, we’ll wait….

It's a breeze - splash screen

Project summary

At the end of February, researchers from the University of Bristol’s Bio-inspired Flight Lab and Royal Veterinary College (RVC) approached Research IT to see if we could produce a web game for their part in The Royal Society’s Summer Science Exhibition. The exhibition was to run between the 8th–11th July, but all content needed to be finalised and made available to The Royal Society by 10th June. After initial meetings, Research IT developed an estimate and a very rough storyboard.

Example pages from the game storyboard

The aim of the game was to allow players to adjust parameters of a bird’s wing to improve how the bird copes with a gust of wind. The first two levels let the player change a bird wing and see how this affects the flight. After this, a third level has the player designing a wing for an unmanned aerial vehicle, showing how research with birds helps in designing aircraft wings. The game is backed by real data generated by the research team from their wind tunnel experiments with real birds and from computer simulations.

Technology selection

The Research IT team use common “toolchains” across many of our projects, but it was obvious that this project was going to require us to “skill up” in something new.

The way we approach tool selection is not unusual: we list the core functionality we’re looking for, compile a list of possible tools and assess them against the requirements. For this project we didn’t have time to do this in great depth or create any example projects, but even a rapid critical review at this stage can pay dividends later. A tight timescale can even be beneficial here, because how quickly you can pull information out of a tool’s documentation is itself an important consideration.

The requirements we looked for were:

  • Desktop and mobile support
  • Translation (i.e. movement) and rotation of objects
  • Scaling of objects
  • Scenes (support for levels)
  • Support for spritesheets
  • Particles and trails
  • UI controls – specifically sliders and buttons
  • Support and documentation
  • Licensing/cost

I’ve written games before in a variety of tools and, in the (distant) past, I would’ve used Adobe Flash for this project, but that’s been dead for some time. I have experience of heavyweight games engines such as Unity and Unreal, and both can be used to create games that play in the browser (using WebGL and WebAssembly). Even though you can create 2D games in these 3D engines, we decided it would be better to create “It’s a breeze” in a JavaScript games engine, as we were more likely to be able to deliver something high quality in the required time. We then compiled a shortlist of tools to look at.

The final choice came down to Phaser and PlayCanvas, with both clearly capable of delivering what we needed. We settled on Phaser due to the number of tutorials available and its very active community, and because PlayCanvas uses a custom editor, which we would have needed to learn in addition to the library itself. PlayCanvas looks like a very capable library but, given the short timescales, we needed to minimise the number of new things to learn.

Phaser uses the concept of “scenes”: each scene defines a section of your game, such as a welcome scene, a scene for adjusting the wing settings or a scene showing the bird in flight. Each scene has its own game loop and, if you’ve written games before, you’ll know that game loops define the flow of your game: it’s where you move objects, do hit detection, play particle effects and so on. Phaser’s game loop runs at a default rate of 60 frames per second. Phaser also has the ability to scale the game automatically to fit the screen resolution, which makes supporting a range of devices much easier.
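As a rough sketch of how this fits together (the scene name, asset names and dimensions below are invented for illustration, not taken from the actual game), a Phaser 3 game is configured with a list of scenes and the scale settings that handle different screen sizes:

```javascript
// Hypothetical sketch of a Phaser 3 setup; assumes the Phaser library is loaded.
// Scene and asset names are invented for illustration.
class WelcomeScene extends Phaser.Scene {
  constructor() { super('welcome'); }
  preload() { this.load.image('owl', 'assets/owl.png'); } // queue assets for download
  create() { this.add.image(480, 270, 'owl'); }           // build the scene once loaded
  update(time, delta) { /* per-frame logic: movement, hit detection, particles etc. */ }
}

const config = {
  type: Phaser.AUTO,               // WebGL where available, Canvas otherwise
  width: 960,
  height: 540,
  scale: {
    mode: Phaser.Scale.FIT,        // scale the game to fit the screen
    autoCenter: Phaser.Scale.CENTER_BOTH,
  },
  scene: [WelcomeScene],           // further scenes would be listed here
};

new Phaser.Game(config);
```

Each scene’s `update` method is that scene’s slice of the game loop, called every frame.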

The tool selection process happened alongside work to refine what the game would include and the research team generating data for us to work with for two bird levels and one aircraft level. The data and core game mechanics were largely finalised by the 19th April which left Research IT seven weeks to create, refine and deliver the game.

Working with short timescales and a strict deadline

The Research IT team commonly work on projects using a Scrum methodology and (usually) sprints of two to four weeks. In this case the short, fixed timescales meant that our ability to review and iterate was going to be compromised, and shortening sprints to one week would, we felt, require a disproportionate time commitment from the research team (who not only have teaching and research commitments but also needed to produce other resources for the exhibition). Scrum is a great methodology but can break down when working to very short, fixed deadlines: fixed deadlines affect your ability to iterate and, when sprints are very short, the time needed for the various review and planning meetings becomes disproportionate to the development time. Within the team we still organised the work into sprints but relaxed requirements around the review and planning meetings to fit the time available.

In addition to work from the researchers and Research IT, the artwork for the game was being created by Tom Waterhouse from 2DForever but this wouldn’t be finalised until later on in the development process.

With this in mind, and because we were using a new toolchain, Research IT created placeholder artwork using Vectr so that we could develop the game and provide the researchers with something to play before the final artwork was available.

Screenshot of the prototype game showing the placeholder artwork

Fast prototyping may not create something pretty, but it allowed us to get to grips with Phaser, gain familiarity with the data from the researchers and let the researchers see how the raw data would affect on-screen visuals. This allowed the research team to refine the game ideas and model settings to make the impact of the wing changes clearer to players.

Once the artwork was finished, we would be able to replace the placeholders and, in theory, there should only be minimal adjustments.

As things turned out, replacing the placeholder artwork was reasonably straightforward but we did need to adjust factors in the game to give the best experience once the artwork was in (for example, making sure the trails lined up with the wings). As part of the prototyping process I’d made it easy to adjust flight speed and vertical movement and this made the adjustments easier to do than if I hadn’t built in this flexibility from the outset. The game prototype also showed that players would need to be shown a demonstration flight before they adjusted the wings and this would form a good introduction to the game. We were able to use a lot of the code from the prototype in the final game and the prototype also allowed us to test out the game ideas early on and without being concerned about how the game looked. This really helped us refine the game and was a critical factor in us successfully meeting the deadline.
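The flexibility around flight speed and vertical movement can be sketched in plain JavaScript (the parameter names and values here are invented, not the game’s actual code): keeping these factors in a single tuning object means the feel of the flight can be adjusted in one place when the final artwork arrives.

```javascript
// Hypothetical tuning object; names and values are invented for illustration.
const tuning = {
  flightSpeed: 220,   // horizontal pixels per second
  verticalScale: 1.5, // multiplier applied to the model's vertical velocity data
};

// Advance the bird's position by one frame; dt is the frame time in seconds.
function stepBird(bird, modelVerticalVelocity, dt) {
  return {
    x: bird.x + tuning.flightSpeed * dt,
    y: bird.y + modelVerticalVelocity * tuning.verticalScale * dt,
  };
}
```

Adjusting `tuning` alone then re-times the whole flight, which is what made lining the trails up with the final wing artwork relatively painless.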

Game screenshot showing the final artwork (using the same wing settings as in the prototype screenshot shown previously)

Managing server load

The Royal Society Summer Science Exhibition is normally held in person and attracts more than 10,000 visitors over the week, but it was not held in 2020 and was online only this year. Content from the invited researchers had to be hosted externally rather than by The Royal Society, so we needed to host the game on our existing infrastructure without knowing what traffic to expect; we did ask, but this was the first time the exhibition had been run online.

No developer wants to optimise prematurely but, as the exhibition runs over a four-day period (two of which are over the weekend), we had to take some reasonable optimisation steps in advance as our ability to make rapid changes during the exhibition itself would be low. Our priorities would be to minimise the amount of content players would need to download and minimise the number of requests that browsers would need to make. Browsers are limited on the number of concurrent requests they can make to the same hostname (commonly between 6–13 depending on browser). The fewer requests that are needed, and the faster we can serve a request and move on to the next, the better. Aside from basic changes such as configuring caching and the web server gzipping content, there were several other things we did to make sure the server could handle the load during the exhibition:

  • Minify JavaScript – this is an easy task but minifying the JavaScript code of the game reduced its size by around 45%. Smaller files download quicker.
  • Minimise HTTP requests – in addition to the concurrency limit there is an overhead for any request (TCP handshake, headers being sent etc) so a lot of requests for small files can really mount up for the server handling the requests. We can minimise requests in several ways, but texture packing is one that gives big benefits for games. Texture packing involves combining many small images into a single larger image, for example, all the artwork for the side-on view for the owl with different wing positions. A corresponding JSON (JavaScript Object Notation) file tells the games engine the positions of the images within the larger image and these are unpacked by the games engine. This means that instead of requests for, say, 15 individual images, the browser just makes two requests (one for the large image and one for the JSON).
  • Benchmarking the web server – using a JavaScript games engine meant we could host the game on our existing “static” web server by creating a new Apache virtual host. However, we wanted to know what performance we could expect from the server, so we benchmarked it using ‘ab’. Other tools, such as ‘Locust’, exist but in this case ‘ab’ was good enough for our needs and easily available. Benchmarking the server at the outset showed it could serve around 50 requests per second (benchmarked with 1000 requests with up to 200 concurrent requests). Jon Hallett and I made a few server changes followed by more benchmarking and Jon found that the bottleneck was ‘MPM Prefork’, and not the number of workers, so we switched to ‘MPM Event’ and the benchmarks increased three-fold so that the server could handle around 150 requests per second.
  • Reducing file size – the game runs client-side so the smaller the files the faster they transfer and the greater throughput of requests the server can handle as they aren’t hanging around as long. After I’d created the packed textures Tom was able to reduce their file size by around 70% by converting the images to 8-bit PNGs without me needing to regenerate the JSON files.
  • Using a CDN – we don’t have access to a centrally provided CDN unfortunately. However, as we were using the popular Phaser games engine this was available via cdnjs from Cloudflare so we could use that for the main library at least. Using a separate hostname also increases the number of concurrent requests a browser can make as the limit is per hostname.
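The texture-packing idea above can be illustrated with a small plain-JavaScript sketch (the frame data is invented, but its shape follows the common “JSON hash” atlas format that texture packers emit and Phaser understands): the JSON tells the engine which rectangle to cut out of the packed image for each named sprite.

```javascript
// Invented atlas data for illustration; real files come from a texture packer.
const atlas = {
  frames: {
    'owl-wing-up':   { frame: { x: 0,   y: 0, w: 128, h: 96 } },
    'owl-wing-down': { frame: { x: 128, y: 0, w: 128, h: 96 } },
  },
};

// Return the rectangle to cut from the packed image for a named sprite.
function frameRect(atlas, name) {
  const entry = atlas.frames[name];
  if (!entry) throw new Error(`Unknown frame: ${name}`);
  return entry.frame;
}
```

The browser downloads one image and one JSON file, and every wing position is then unpacked locally from that single texture.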

The changes meant the game was delivered in ~45 requests (including all game code, artwork, sound, CSS, fonts etc) for a total download of ~2.8 MB in less than a second (assuming a reasonable connection). This content would then be cached so, if players returned to the game later, they wouldn’t need to download the assets again.
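The caching and compression configuration mentioned above might look something like the following for an Apache virtual host (the hostname, paths and cache lifetimes are illustrative, not our actual setup):

```apache
<VirtualHost *:80>
    # Illustrative hostname and path
    ServerName breeze.example.org
    DocumentRoot /var/www/breeze

    # Compress text-based assets before sending them (mod_deflate)
    AddOutputFilterByType DEFLATE text/html text/css application/javascript application/json

    # Let browsers cache static assets so returning players re-use them (mod_expires)
    ExpiresActive On
    ExpiresByType image/png "access plus 1 month"
    ExpiresByType application/javascript "access plus 1 month"
</VirtualHost>
```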

Testing

Anyone that’s been to a talk by our team knows we’re big fans of automated testing and, although testing games is difficult, we wanted this project to be no exception – it provided an opportunity for us to try out automated “visual testing” and the knowledge gained will benefit future projects.

Testing traditional web applications is easy (no excuses) and your aim is always to remove as much manual testing as possible. We write unit tests, integration tests (utilising a test client) and acceptance tests (usually running in headless browsers). In the case of integration tests and acceptance tests it’s easy to check the content of the resulting web page to determine what’s being shown to the user even if that’s being created/manipulated with JavaScript. In the case of web games, the issues become more difficult. The games engine is rendering your game (often using WebGL) within an element on the page but unless it exposes the content (which isn’t traditional web content etc) in some way it’s hard to test directly. For example, we need to be able to test that a particular sprite (the image of the owl for example) has been shown on-screen in a specific position in response to actions of the player etc.

One way to do tests but avoid the issue of what the games engine allows you to test directly is to use visual testing. This involves running and interacting with the game programmatically via a headless browser by ‘feeding’ it locations (in pixels) of where to run a click event (e.g. simulating a mouse click or a tap event on a phone), performing key presses and so on. So, we program the headless browser to act like a human would (without the unreliable human being involved) and it plays through the game. At points of interest in the game you get the test framework to take a screenshot of what is shown in the browser. By generating a set of “known good” reference images, the test can run through the various scenarios in the game and do a pixel-by-pixel comparison between the reference image and an image taken during the latest test run; if discrepancies are found, the test fails.

The team is currently moving away from Selenium as our acceptance testing tool and adopting Cypress for new projects. Cypress comes bundled with Electron which includes the Chromium browser (but you can also use Firefox, Chrome and Edge) and there are several third-party plugins for visual testing. Some of these use an external service to do the image comparison but that introduces external dependencies and we want to be able to run this offline as well as in our Continuous Integration (CI) setup. So, we used the cypress-image-diff plugin with Cypress running in a Docker container and running the tests against a copy of the game running in another Docker container using an NGINX server. We can then write tests that run through various success and failure scenarios within the game and confirm that not only can the user interact with the game but what’s shown on the screen is what we expect… so we’ve got end-to-end testing with no humans involved and that’s perfect!

Or is it?

Predictable games may not be fun. Even in an educational/academic game such as this a bit of randomness gives more interest. For example, the particle emitters we use to create the gust animation or the smoke from the volcano are different every time you play the game. If we’re doing a pixel-by-pixel comparison then we’re going to get false negatives in our test results on any scene with an element of randomness. To alleviate this we set a threshold based on the percentage of variation we’ll accept between the screen shots. For example, the particles account for around 20% of the screen so we allow this amount of variation to avoid false negatives.
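The thresholding idea can be sketched as a plain JavaScript comparison (a simplified stand-in for what a plugin like cypress-image-diff does internally; real plugins compare decoded PNG data): count the differing pixels and fail only when the proportion exceeds the allowed threshold.

```javascript
// Simplified stand-in for a visual-diff check; images are represented here
// as flat arrays of pixel values rather than decoded PNGs.
function visualDiffPasses(reference, test, threshold) {
  if (reference.length !== test.length) return false; // dimensions must match
  let differing = 0;
  for (let i = 0; i < reference.length; i++) {
    if (reference[i] !== test[i]) differing++;
  }
  // Pass if the fraction of differing pixels is within the allowed threshold.
  return differing / reference.length <= threshold;
}
```

With a threshold of 0.2, a scene where the particle effects account for around 20% of the screen can vary between runs without producing a false negative.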

Results of a failed visual test showing the reference image, the comparison image from the test and the “diff” image highlighting where the pixels differ.

Automated testing is invaluable in any project but, with a project of this nature, we also needed to do a lot of device testing. Testing on the browsers you can run on your laptop or via the Remote Desktop will only get you so far and testing on a range of mobile devices is difficult aside from your own personal device and those of friends and family. Research IT has a Browserstack subscription for exactly this reason so we can do device testing on genuine devices (not emulators) even with code that’s only running in a local environment. This enabled us to test the game on a wide variety of operating systems and browser combinations as well as on a wide range of mobile devices.

Accessibility

Accessibility is a fundamental right and a legal requirement under the Equality Act 2010, so it’s important to make sure we don’t introduce barriers for users. Highly interactive content such as online games poses a greater challenge than simpler content such as traditional web pages. We did several things to make the content as accessible as possible, and these changes provided benefits for all users, not just those with a disability affecting their use of the web:

  • Colour choice – the colour palette and contrast levels should mean people with colour blindness or other visual problems would be able to read the text and see the wing trails clearly. The path of the trails and line thickness also means that if the user cannot differentiate between the colours it is still unambiguous which trail relates to the bird’s body and which to the wings.
  • Keyboard controls – not everyone can use a mouse, so it is important to provide keyboard controls. The game can be played without the use of a mouse and, on ‘tabbing’ to give the content focus, the keyboard controls are shown to the player.
  • Sound toggle – users with cognitive issues may find sound distracting so we added functionality to allow users to disable sound.
  • Large ‘hit’ areas – the user interface of the game has large buttons and big ‘hit’ areas and this benefits people with motor control or dexterity issues.
  • Alternative version – it’s not possible to make content like this accessible to all users in all situations. For example, a user may have several issues that combine to pose significant difficulties. To allow these users to access the content we created a web page version of the game that explained several scenarios and presented similar information to that available within the game.
  • Help page – we also produced a help page for the game covering browser versions, keyboard controls and a link to the alternative version of the game.
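The keyboard support can be sketched in plain JavaScript (the key bindings below are illustrative, not the game’s actual controls): mapping key names to the same game actions that mouse clicks trigger keeps the two input paths consistent.

```javascript
// Illustrative key bindings; the real game's controls may differ.
const keyBindings = {
  ArrowUp: 'increase-setting',
  ArrowDown: 'decrease-setting',
  Enter: 'start-flight',
  m: 'toggle-sound',
};

// Translate a KeyboardEvent's key name into a game action, or null if unbound.
function actionForKey(key) {
  return keyBindings[key] ?? null;
}
```

A `keydown` listener can then dispatch `actionForKey(event.key)` to the same handlers the on-screen buttons use.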

Conclusion

This was a fun project to work on: Tom’s artwork was amazing, the project included some interesting challenges, and it had an international reach that helped showcase the work of researchers at the University of Bristol. During the exhibition, the game was featured in press releases and social media posts from the University of Bristol, RVC, The Royal Society and on aviation websites (e.g. BlueSkyNews).

From a technical perspective, Phaser is a mature and extremely capable library and enabled us to deliver the game on time and with a high degree of confidence that it would work well on a wide range of devices. Cypress, once again, proved itself to be an excellent test framework, and this project added to the testing expertise within the team: we now have experience with automated visual testing, and that will feed into other projects.