Whilst there were numerous challenges, we helped the teams overcome them. The project was successfully delivered and became an instant hit with customers.
Air New Zealand started this project in partnership with Panasonic Aviation to replace the software in its existing Panasonic IFE (In-Flight Entertainment) system. Whilst the existing system was already world renowned for being best in its class, the experience was beginning to look dated.
This project aimed to create a new, modern, and performant customer experience that would last the next 5 to 10 years.
18 months into the project, it was both overdue and not in a state that could be put in front of customers.
- The system was so unresponsive and slow it was causing motion sickness in customers - while the plane was still on the ground.
- There was no consistency to the implementation of the design, and so instead of feeling like a single cohesive system it felt like a disparate collection of loosely related apps.
- There were hundreds of bugs and issues on the backlog and they were increasing as development progressed.
The entire project was on the verge of being scrapped in favour of retheming the existing system to current brand guidelines.
Roles & responsibilities
I was brought into the project as Technical Lead alongside the new Design Lead, Asher Pilbrow. Our mandate was threefold:
- Re-establish positive working relationships between Air New Zealand and Panasonic.
- Get the project up to Air New Zealand’s customer experience and design standards.
- Get the system stable and the backlog of bugs and issues consistently decreasing over time.
The IFE hardware was already decided and locked in. Anything installed in an aircraft goes through a multi-year certification process, and will thus end up flying for up to 10 years. Some of the hardware the new experience had to run on was already 8 years old.
Whilst we were based in New Zealand, the Panasonic Aviation development team and the test hardware were based in Lake Forest, California. This is a 4 hour timezone difference, or a 12 to 13 hour flight followed by a 2 hour drive away.
The only place to view and review code in progress was on racks of test hardware in Lake Forest.
Due to NDAs no one from Air New Zealand could ever work on, or even see, the code for the project directly.
Half the original Air New Zealand team had been moved to other projects.
We spent the first few weeks on the project in discovery, working out where we were and how we’d got there. The key issues we identified were:
Our design mockups were intended to be used in conjunction with documentation to communicate intent only. Unfortunately we had not communicated this to the development team, who were treating them as pixel-perfect reference designs.
Our in-house standardised design system had also not been communicated to the development team. The result, combined with the design mockups problem, was multiple varied implementations of design elements such as buttons, labels, titles, and movie and audio items, and inconsistent spacing and layout throughout the system.
Our designs were heavily based on using real-time blurs as key elements. However, my analysis and experimentation had proven that none of our seatback screen hardware was powerful enough to do these in real-time as originally intended.
Features that were considered “feature complete” by Panasonic Aviation were considered unacceptably slow by Air New Zealand.
The product owner, the new Design Lead Asher Pilbrow, and myself flew to California to meet the Panasonic Aviation team, rebuild working relationships, and establish the same goals and standards across both teams.
Working with both teams we created performance and consistency goals as acceptance criteria for all work.
As part of this work we established more open and flexible ways of working that bridged the different cultures between the teams. The Panasonic development and testing teams were used to very hierarchical relationships with their partners, and this took quite a bit of undoing. These new ways of working became one of the cornerstones that enabled the project to be successful.
Asher and I collaborated to modify our designs based on the CPU + GPU capabilities of the seatback hardware so that they were performant whilst still conveying the intent of the original designs.
Asher then rebuilt all our reference screen designs to be pixel-perfect and thus able to be used by developers as blueprints. He then worked alongside the development team to make several hundred pixel level tweaks across the entire system to align it with our reference designs and design system. Part of this work was also ensuring sizing of labels, buttons, and text fields would correctly allow for more verbose languages.
The Panasonic Aviation development lead and I collaborated to create common implementations of design elements (buttons, titles, labels, spacings, margins, various on screen controls) that were then consistently used throughout the system.
We also identified key interactions and transitions that needed performance attention, and worked to tweak existing implementations or create performant alternatives.
Smoke & mirrors
In both the design and development solutions I was able to use many tricks and optimisations from my demo-scene and gaming development years. We introduced the Panasonic team to game development tools and techniques to measure performance. Open any of the sections below to see some of the stories.
The blur of a screen to create backgrounds as originally specified took 2 seconds to render on the seatback screen hardware, and the system was effectively frozen during this render time.
We replaced this with a different, less calculation intensive blur algorithm but added a darkening effect on top to give largely the same visual result as the original. The resulting blur takes only 0.2 seconds to render.
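A minimal sketch of the idea (the production algorithm isn't public; this 1D box blur and its numbers are purely illustrative): swap the faithful but expensive blur for a cheap sliding-window average, then darken the result so the visual difference is hard to spot.

```python
def box_blur_darken(pixels, radius, darken):
    """Cheap box blur (sliding-window average) plus a darkening pass.

    Illustrative stand-in for the real seatback blur; `pixels` is a flat
    list of grayscale values, `darken` a 0..1 multiplier.
    """
    out = []
    for i in range(len(pixels)):
        window = pixels[max(0, i - radius): i + radius + 1]
        avg = sum(window) / len(window)   # the cheap approximation of a blur
        out.append(int(avg * darken))     # darkening hides the approximation
    return out
```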
Originally we animated a blur fading in over an entire screen by rendering the blur hundreds of times in real time, which visually stuttered and made the system completely unresponsive for the duration of the animation.
We replaced this with a pre-calculated blur rendered into an invisible layer over the screen, and then animated the opacity of that layer to become visible. The result looks almost indistinguishable from the original, but is hardware accelerated and runs at 60FPS.
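The shape of the technique: render the blur once, then the only per-frame work is computing a single opacity value for the compositor to apply. A hypothetical helper for a linear fade-in (not the actual code):

```python
def fade_alpha(elapsed_ms, duration_ms):
    """Per-frame layer opacity for a linear fade-in, clamped to 0..1.

    Illustrative only: in the real system the blur is pre-rendered into an
    offscreen layer and only this alpha changes each frame, which the GPU
    compositor handles cheaply.
    """
    return min(max(elapsed_ms / duration_ms, 0.0), 1.0)
```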
Originally transitions between applications had multiple layers of transparent elements that were animated with different motions, blurs, zooms, and distortion effects. Unfortunately they overwhelmed the capabilities of both the CPU and GPU in the seatback screen hardware.
We simplified the animations to use only GPU accelerated transforms of the key elements and moved them all to a single layer. The result looks almost indistinguishable from the original, but is hardware accelerated and runs at 60FPS.
Going from the poster art of a media item to the detailed synopsis made the system freeze for 2 to 3 seconds, then jarringly jump into the synopsis screen.
Investigation revealed that for a variety of reasons there is a lot happening in the system during this transition, too much to successfully animate at the same time. Instead we broke the process into parts so it felt like an animation; we blur the current screen, then we display an empty synopsis outline over the top, then we load the data and images into it, then we load in the previous and next synopsis on either side and scroll their edges into the current view.
Whilst the end result is actually slightly slower than before, perceptually it feels much faster because it responds instantly and gives progressive visual feedback. Additionally, the new animation of the previous and next synopsis sliding into view now highlights to customers that the screen can be scrolled.
The landing screen in most apps has a navigation bar that is translucent and shows a blurred version of what's behind it. Unfortunately this halved the scrolling speed.
We replaced the real-time blur effect with a simple translucent panel tinted to the IFE brand color. This was the biggest visual concession the designers had to make. The result keeps some of the intent of the original, but now runs at 60FPS, making it a classic trade-off between performance and visuals.
The landing screen in most apps has large "hero" images with transparent gradients that reveal our color gradient backgrounds. Unfortunately, scrolling dropped to under 10FPS when they were on screen.
We traced the performance drop to the large size of the alpha blended area, so we removed the real-time alpha blending. Instead we reworked the background gradient to be horizontal only and baked the gradient blend and new background gradient into the hero image. The result is an invisible overlap of the edge of the hero image and the background, no matter where you scroll it.
The result looks indistinguishable from the original, but now runs at 60FPS.
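The offline baking step can be sketched like this (the pixel values and the per-pixel composite are illustrative; the real content pipeline works on full images): blend the hero over the background once, ahead of time, so the aircraft never alpha blends at scroll time.

```python
def bake_blend(hero, hero_alpha, background):
    """Pre-composite a hero image over its background gradient.

    Illustrative stand-in for the offline content pipeline step: `hero` and
    `background` are flat lists of grayscale pixels, `hero_alpha` the hero's
    per-pixel transparency. Done once at build time, not per frame.
    """
    return [int(h * a + b * (1 - a))
            for h, a, b in zip(hero, hero_alpha, background)]
```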
Lists of media items had visible pop-in as image loading couldn't keep up with the speed of scrolling, and the scrolling wasn't as fast or responsive as we desired.
We discovered the media images used in the system were at a much higher resolution than required, forcing real-time resizing before display, which caused the slowdown in responsiveness. The oversized images also used more memory and overwhelmed the image caching strategies.
After discussions with the media team, we created different content generation pipelines that created correctly sized and compressed images for each aircraft's screen resolution. The result was no real-time resizing and reduced memory usage, and thus more items fitting in the image cache - all of which gave us a massive reduction in image popping and a much more responsive system.
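A rough sketch of the sizing step in such a pipeline (function and parameter names are hypothetical, not the real tooling): given the fraction of the screen an image slot occupies on a particular seatback model, emit exactly that pixel size so no on-aircraft resizing is needed.

```python
def target_image_size(slot_frac_w, slot_frac_h, screen_w, screen_h):
    """Pixel dimensions the pipeline should emit for one screen model.

    Hypothetical helper: the slot fractions come from the design system,
    the screen dimensions from the aircraft's seatback hardware spec.
    """
    return (int(screen_w * slot_frac_w), int(screen_h * slot_frac_h))
```

Run once per (image, screen model) pair at content-build time, the output images need no further processing on the aircraft.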
Launching apps was slow - 3 seconds for the icon to show it had been tapped and an additional 2 to 5 seconds to switch to the selected app.
Investigation revealed that the delay came from logging and other housekeeping code that had grown throughout development; these tasks now added up to significant delays. We worked out how to move some of these tasks onto a low priority background thread and distribute others to non-impacting event hooks.
Whilst switching speed was now 1.8 seconds, there was still a delay in the visual response to a touch. To give instant feedback, we re-used the icon highlighting from the handset navigation feature, showing it instantly before starting the switching process.
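The pattern can be sketched like this (the callbacks are stand-ins for the real system's internals, not actual APIs): respond to the tap instantly, push housekeeping onto a background thread, then begin the switch without waiting for it.

```python
import threading

def on_app_icon_tapped(show_highlight, log_launch, switch_app):
    """Illustrative tap handler: instant feedback, housekeeping off the
    critical path, then start switching immediately."""
    show_highlight()                                           # instant visual response
    threading.Thread(target=log_launch, daemon=True).start()   # off the critical path
    switch_app()                                               # begin switching right away
```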
Searching for video or audio was taking 10 to 15 seconds to show any results.
Investigation revealed there were multiple searches happening sequentially, and the system was waiting for all the results to come back before displaying anything. For instance, in the Audio app we searched album name, artist name, and song title.
We modified the search to work on a background thread and to add to the displayed results as each search type returned. User testing revealed we should change the search order to artist name, album name, song title. Coincidentally this was also fastest to slowest in search time, as each data set was bigger than the last.
The result is that searches now show first results in under a second.
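A simplified sketch of the incremental-display pattern (in the real system each search type runs on a background thread; here the searchers are plain callables to keep the sketch self-contained):

```python
def incremental_search(query, searchers, display):
    """Hand each batch of results to the display as it arrives.

    Illustrative only: `searchers` are ordered artist -> album -> song
    (fastest dataset to slowest), and `display` appends to what's on screen
    instead of waiting for every search type to finish.
    """
    for search in searchers:
        display(search(query))   # results appear per completed search type
```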
The dev team had ported a complex system from their previous Linux/Qt based IFE system that dynamically changed the fonts based on which language localisation you had selected. Unfortunately this was messing with our layouts differently in each supported language and creating a nightmare of hard-coded pixel adjustments. Worst of all, it stopped us from using the Air New Zealand font in non-Roman character languages (Korean, Japanese, Chinese).
I convinced the development lead, and then the IFE technical lead, and then the platform lead at Panasonic to use the built-in Android font fallback system instead. The end result was thousands of lines of unnecessary code thrown away, all the hard-coded pixel adjustment hacks removed, and we were able to use the custom Air New Zealand brand font across all languages.
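As a rough illustration of the Android mechanism involved (file and font names here are invented for the example, not the project's actual resources): you declare only the brand font in a font-family resource, and when that font lacks glyphs for a script such as Korean, Japanese, or Chinese, Android's built-in fallback chain supplies them automatically.

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- res/font/brand_font.xml (illustrative names).
     Only the brand font is declared; glyphs the font doesn't cover fall
     back to Android's system fonts with no per-language switching code. -->
<font-family xmlns:android="http://schemas.android.com/apk/res/android">
    <font android:fontStyle="normal"
          android:fontWeight="400"
          android:font="@font/brand_font_regular" />
</font-family>
```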
The physical controls for the volume and screen brightness were laggy to use. Worse, they aggregated multiple quick presses causing jumps in volume or brightness.
Investigation revealed that there is a lag between the hardware changing its settings and the system being able to confirm what the new values are. The development team had debounced and aggregated input to work around this lag.
I worked with the development team to allow the hardware buttons to take effect instantly and to remove the debouncing and aggregation of presses. Whilst this meant the on-screen controls lagged behind the actual values, user testing confirmed the system felt far more responsive.
As a two continent team we’ve delivered a new, modern and performant customer experience that we’re now proudly rolling out to our entire fleet.
- Added Te Reo Māori language support across the entire system
- Added full accessibility support with text-to-speech and physical controls
- Added a beautiful new wake up screen feature, complete with shooting stars
- Clean, consistent, and modern design and user experience
- No touch left behind - system responds to input in an immediate and visible way everywhere
- Beautiful transition animations to spatially guide customers and help mask slow transitions
- Interaction performance is consistently 50 to 60 frames per second, even on 8 year old hardware
- System stability is dramatically higher than the previous system, which was already an industry leader
The feedback we’ve had from customers on this project has been very positive and often extraordinary. My personal favourite was hearing Adam Savage of Mythbusters and Tested.com rave about it on his podcast; as a fellow maker, this shout-out was particularly gratifying.
This has been one of the most challenging and fun projects of my entire career. I would do this again in a heartbeat.