The Les Mills Virtual app also had an on-demand feature that let gym owners offer Virtual classes to gym goers directly from the app. The on-demand market was a natural segment to expand into at hotels, apartment complexes and schools. However, the user experience was flawed and the interface looked dated. After interviewing a number of gym owners and gym goers, we realised there was a real need to optimise the interface so that classes were easier to use and the on-demand market could be captured. Finally, the app's branding did not match the new Les Mills branding, which was an important selling tool for gyms around the world.


We undertook a user experience research session in which we collected qualitative and quantitative data to establish the parameters that mattered most to delivering an experience users loved. Using human-centred design processes, we co-designed the experience with gym owners and gym goers. We put together early wireframes to test internally, followed by rich prototypes to test with real users, which gave us the chance to iterate. We collected the feedback, finalised the designs and moved on to designing the final prototypes.



Customer journey maps

Through interviews and survey data, the most frequently performed tasks emerged as obvious low-hanging fruit to fix and optimise. Customer journey maps were created for high-priority tasks where improvements would make a significant impact on both cost and time to serve. Each journey map analysed multiple channels and the touchpoints across each channel, including the navigation through Velocity required to complete a task. The map served as a starting point for identifying the points along the customer journey best suited to fixing in order to improve the speed and reduce the effort required to complete a task.


Well-researched personas are one of the best references for any work on product design. Depending on the product, a persona could include either a daily schedule or a Myers-Briggs scale. I usually began by creating proto-personas derived from interviews with relevant stakeholders or from workshops using empathy-mapping tools. These yielded quick and effective results, especially where interviewing actual users was not viable. Personas evolved constantly as new information became available and stronger evidence of behavioural patterns overruled previous assumptions.

Some of the companies where I used personas:

Mediaworks: Researching archetypes from our demographics at 3 News and at Radio Live, which had a more niche audience. These personas included daily schedules so we could see how our products might fit into their lives. For our early strategy sessions we defined basic persona types to understand our target demographics, which gave us a good starting point for our content and design.

Mobile app builder: As our user base grew, we had to identify both positive and negative persona archetypes. This allowed us to focus on our target personas while eliminating those we didn't want to serve, so we could create a better-defined product for the few personas we cared about. Once these persona types were identified, I helped gather more evidence about their pain points, jobs to be done and frustrations through multiple UX methodologies, including interviews, surveys, customer feedback and immersion.

Gentrack: Part of the research involved visiting client sites and immersing myself in an actual working environment. Each persona represented a group of software users, and each used a completely different subset of features within the software. This led to ideation on how the software could be specialised and customised for each persona.

Creating personas opened up a whole slew of ideas for future product development. Because each persona type had a distinct role with distinct tasks, it opened up the possibility of creating apps that were specialised subsets of the main platform, with tasks tailored to each persona type. The hypothesis was that this would get users trained faster and improve their productivity substantially. Personas also allowed us to distinguish influencers who were responsible for technology purchases from passive users who had no influence on the software that was used.

Usability testing

Usability testing always yielded some of the best insights for product improvement. It involved defining objectives, putting together a set of tasks, and recording sessions while interviewing users as they completed the tasks. The results were compiled, reviewed against quantitative data to eliminate false positives, and put together as a report of improvements to be made to the final product.

Some of the companies where I used usability testing:

Mediaworks: Usability testing of the 3 News and Radio Live websites before launch, inviting a select group of actual users, giving them tasks to complete and interviewing them as they worked through the tasks. The testing led to big usability improvements to the products before launch.

Mobile app builder: Usability testing was conducted on the onboarding journey to understand how easy it was for new users to sign up and build their own apps. The studies were recorded and the interview results were tallied, leading to big usability improvements to the product.

Gentrack: Before releasing the new version of the software to retailers, I conducted usability testing at client sites comparing how the new version tracked against the old one across various metrics. I set up a test group of users and designed usability tests with tasks specialised for their roles. The sessions were recorded using Camtasia. From the recordings I extracted a number of performance metrics, such as time to complete the task, mouse clicks, scrolling, errors and navigation time. The results were compiled into a report outlining how the two versions of the software compared across the different tasks.
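The version-comparison step above can be sketched as a small script. All task names and figures here are illustrative placeholders, not actual study data; the real metrics came from the recorded Camtasia sessions.

```python
# Compare usability metrics between two software versions.
# Task names and numbers below are hypothetical examples only.

from statistics import mean

# Each record: (task, version, seconds_to_complete, mouse_clicks, errors)
observations = [
    ("create_invoice", "old", 94, 31, 2),
    ("create_invoice", "old", 88, 28, 1),
    ("create_invoice", "new", 61, 19, 0),
    ("create_invoice", "new", 66, 22, 1),
    ("update_tariff",  "old", 140, 45, 3),
    ("update_tariff",  "new", 97, 30, 1),
]

def summarise(task, version):
    """Average each metric for one task under one software version."""
    rows = [o for o in observations if o[0] == task and o[1] == version]
    return {
        "time_s": mean(r[2] for r in rows),
        "clicks": mean(r[3] for r in rows),
        "errors": mean(r[4] for r in rows),
    }

def compare(task):
    """Percentage change per metric, new version relative to old."""
    old, new = summarise(task, "old"), summarise(task, "new")
    return {m: round(100 * (new[m] - old[m]) / old[m], 1) for m in old}

print(compare("create_invoice"))
```

Negative percentages indicate the new version needed less time, fewer clicks or fewer errors for that task; tabulating `compare` over every task gives the per-task comparison that fed the final report.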