Tourism and Gaming Case Study
FanHub Media is a company that designs engaging software for sports fans, ranging from bracket contests to trivia challenges. Several years ago, they were hired by a large newspaper in the UK to develop a fantasy football app. As software experts, they knew whatever product they built would need rigorous testing before it gained a large user base. In 2015, iLAB was called in to confirm what user load the site could handle. From the results of our testing, we consulted with FanHub to identify performance issues they were then able to optimize.
Possible Impact of the Problem
The period when the app needs to bear the most traffic is the start of the season, when fans are registering, creating leagues, and joining leagues. Any lag or other malfunction during this window would send fans to find another service to facilitate their game. The app needed to perform well throughout the season, but especially early on, while users were forming first impressions. A bad experience would not only mean wasted time and investment by the newspaper; it could also damage their brand and lead readers to view them as less of an authority.
Client’s Existing Technology
As expert developers, the team at FanHub tests their software manually during development. However, something as specific as peak load testing isn't part of that initial process of simply making sure the software works. To make sure the app is a great experience for every fan, they needed to test thousands of user journeys at the same time, a task that is impractical to perform manually without thousands of employees. They decided to call in iLAB to help with automated software quality assurance testing.
iLAB tested three functions of the software: registration, creating a league, and joining a league. FanHub knew that there were things to worry about later in the season too, but the main priority was making sure the fantasy football app wouldn’t crash within the first few days or weeks of being on the market.
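To illustrate the idea of exercising many user journeys at once, here is a minimal sketch of a concurrent load test. It is a hypothetical illustration, not iLAB's actual tooling: the journey functions below are local stand-ins where a real test would issue HTTP requests against the app's registration, create-league, and join-league endpoints, and all function names and timings are assumptions.

```python
import random
import time
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-ins for the three journeys under test.
# A real load test would call the app's actual endpoints here.
def register_user(user_id):
    time.sleep(random.uniform(0.001, 0.003))  # simulated response latency
    return ("register", user_id)

def create_league(user_id):
    time.sleep(random.uniform(0.001, 0.003))
    return ("create_league", user_id)

def join_league(user_id):
    time.sleep(random.uniform(0.001, 0.003))
    return ("join_league", user_id)

def user_journey(user_id):
    """One simulated fan: register, then create or join a league."""
    steps = [register_user(user_id)]
    # Assumed mix: roughly one in five users creates a league,
    # the rest join an existing one.
    if user_id % 5 == 0:
        steps.append(create_league(user_id))
    else:
        steps.append(join_league(user_id))
    return steps

def run_load_test(concurrent_users=100):
    """Drive many user journeys concurrently and time the run."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        journeys = list(pool.map(user_journey, range(concurrent_users)))
    elapsed = time.perf_counter() - start
    return journeys, elapsed

journeys, elapsed = run_load_test(100)
print(f"{len(journeys)} journeys completed in {elapsed:.3f}s")
```

In practice, tools such as JMeter or Locust handle this kind of concurrency, plus response-time percentiles and ramp-up schedules, but the core pattern is the same: many independent user journeys executing against the system at once, with timing recorded for each.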
In the first year of testing, we helped identify performance issues that FanHub addressed, improving their application by more than 222%, with 0-2 second response times. The second year, our testing supported an additional 250% increase in load, this time with sub-second response times. In the third year of our partnership, comparable gains were achieved again, thanks in large part to FanHub's commitment to innovation between seasons.
Testing at this volume of concurrency gives our clients a wealth of valuable information about their software. Unless developers observe thousands of simultaneous user journeys on the system in a controlled environment, they cannot anticipate the major issues and risks that arise under such heavy traffic. Especially when software has been scaled rapidly, those insights can make the difference between disaster and success.
We've been happy to see how our iTEST methodology has supported FanHub in scaling their software in a short period of time. After their innovations improved the app's capability, we were able to test it to ensure that all those new users wouldn't degrade the experience for any single one. Further, the insights from our testing process have helped FanHub make the in-app registration and signup process smoother and shorter. This means fans get to the good part faster, and will keep coming back every season for another great experience.