project 1

FairFX: customer acquisition

FairFX provide a range of financial services for consumers who are either travelling or sending money overseas. Their core consumer product, a prepaid currency card, can be used abroad either to pay for goods directly in shops and restaurants, or to withdraw cash from ATMs. They had recently gone through a redesign and rebuild of the website to make it fully responsive (for desktop, tablet and mobile) but had not seen any significant increase in new business as a result. I was asked to look at ways to improve the acquisition of new customers by improving the user experience.

% increase in conversion

£m increase in turnover

% increase in CEO happiness

background

introducing user centred design

FairFX entered the travel money space just over 10 years ago, when there was little competition in the market. The business was very technology driven at the time and had little expertise in modern approaches to product design. So when I joined them as Head of Product, my intention for this first project was not only to improve the conversion rate of new customer sign-ups, but also to take the business on the journey and introduce them to the virtues of user centred product design. I set out to involve stakeholders at all levels (from the CEO down), and to demonstrate how concepts such as user research, MVP thinking, prototyping, iterative design and user testing would lead to better outcomes. The result was not only a massive 18% uplift in conversion, but also a new understanding of, and appreciation for, the value of user centred design. Here’s how I did it…
discovery

mind mapping

At the start of the project, the first thing I did was a brain dump of all of the things I wanted to find out during my discovery.

This included information about the product, customers, stakeholders, the business, site traffic, what data was currently available and what additional data we would need to acquire through further research.

This formed the basis of a high-level plan of areas to investigate in more detail.

product and stakeholder research

Getting to know the product and the people in the business with product knowledge was key. I held a series of meetings with people from customer services, marketing, data science, engineering and the senior management team. I also put myself through the customer experience so I would know first-hand what it’s like to order a card, receive it, activate it, use it and administer it through the website.

finding existing data

I wanted to build up a picture of what sources of useful data existed within the business that could be used. After establishing the data owners during my stakeholder research, I compiled a list of current data sources. This allowed me to start gaining insight from existing data, and to establish what additional data I was going to need to source myself to provide the insight I needed.

storyboarding

I created a user journey storyboard which I put on the wall to use as a constant physical source of reference throughout the project. I find it helps to be able to easily see what you are working on, to make notes and to use it to facilitate conversations with stakeholders and people in the team.

usability lab testing

I wanted to see first-hand how people understand, perceive and use the website. I recruited people that worked in the local area that matched the profile of the target customer. I created a temporary user lab environment within the office with facilities to test and record users on desktop and mobile devices. This proved to be an invaluable source of insight and data gathered in these sessions was key in deciding where to focus my efforts when it came to the design stage.

competitor analysis

Using insight from stakeholders plus my own market research, I put together a list of key competitors in the market. I analysed their sites and completed a product teardown of the key players. I wanted to know not only where competitors were doing badly (which might give us an opportunity to create differentiation), but also where they were doing well. I wanted to understand how competitors were solving the same problems we needed to solve for our customers. This provided a benchmark for the minimum we needed to achieve (product parity), but also allowed me to see how high we needed to raise the bar in order to be competitive.

google analytics

Using GA, I examined the data to establish how customers were coming to the site (traffic sources), how they were navigating through the funnel, behaviour by device and the conversion rates for various segments.

heatmaps and session recording

Using Visual Web Optimiser (VWO), I set up a series of heatmaps, scrollmaps and session recordings to analyse the various pages and stages in the journey I was interested in. Using data from GA, I was able to target the data capture on specific segments, i.e. the traffic sources that were of most interest to me.

nanorep analysis

Nanorep was the company’s smart FAQ system, which proved to be a great source of insight. The system allows you to see what questions people are asking on each page of the website. This allowed me to see which information was important to customers, and where we could make improvements by better communicating information about the products (without customers having to resort to using FAQs).
analysis

affinity diagramming

With all of the data gathered from the various sources, I created an affinity diagram to group together and categorise insights. This visual representation of the information allowed me to triangulate the data i.e. to identify where the same insights had emerged from more than one data source.

prioritisation

I then put each of the categories on a chart which prioritised them by the level of pain caused and the number of customers affected by the issue. This allowed me to focus on the issues causing the most pain to the greatest number of customers (the cluster in the top right of the chart).
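The ranking behind that chart can be expressed as a simple score: pain multiplied by reach. As a sketch (the issue names and scores below are invented for illustration; the real chart used the categories from the affinity diagram):

```python
# Hypothetical issues scored 1–5 on two axes: how painful the problem is,
# and how many customers it affects. These are illustrative values only.
issues = [
    ("unclear product differences", {"pain": 5, "customers_affected": 5}),
    ("fees hard to find",           {"pain": 4, "customers_affected": 5}),
    ("mobile nav friction",         {"pain": 2, "customers_affected": 3}),
    ("slow card activation email",  {"pain": 3, "customers_affected": 1}),
]

# Rank by combined impact: issues that would sit in the top right of the
# chart (high pain, many customers) float to the top of the backlog.
ranked = sorted(
    issues,
    key=lambda i: i[1]["pain"] * i[1]["customers_affected"],
    reverse=True,
)
for name, s in ranked:
    print(f"{s['pain'] * s['customers_affected']:>2}  {name}")
```

A two-axis chart gives you the same ordering visually, but a score like this is handy once the backlog grows beyond what fits on one wall.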
ux design

design hypothesis

The research was showing that customers were confused by the products. Although there were some issues with navigation through the funnel, the main area to address was the communication of information on the product pages.

We needed to help customers better understand which product they needed, what the costs were, how the products worked and what the benefits were of using them. My hypothesis was therefore that if we restructured the site navigation and re-organised and re-wrote the content on the product pages, customers would better understand how the products worked and would be more likely to purchase.

site architecture

One key insight was that customers were confused by the differences between each of the products with regards to where they can be used, and associated costs. This was because all of the product information was on a single page. So I decided to create separate pages for each of the 3 individual products, and one high-level summary page that would link customers through to the details.

In addition to making the navigation easier, it would also enable us to land traffic (e.g. from paid search) directly on the page related to their specific search. So, for example, a search on the keyword ‘euro card’ could take the user directly to the Euro Card product page.

page designs

Starting with low-fidelity wireframes, I progressed to a high-fidelity prototype which I built in Sketch and InVision. I tested the prototypes by inviting more people into the lab, and iterated the designs each time until I was happy that the problems identified during the original research had been solved.

The information was arranged on the page according to the priority of importance to customers. I added a sticky secondary nav bar to the product pages so that wherever the user was, they could always quickly find the key information they needed.

testing

split traffic (a/b) testing

I set up an A/B test and exposed the new site architecture and page designs to 50% of site traffic (with 50% seeing the control variant).

The test ran for 8 weeks until reaching statistical significance and showing a conversion uplift of 18%.
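Statistical significance here means something like a two-proportion z-test on the control and variant conversion rates. As a sketch (the visitor and conversion counts below are invented for illustration, not the real FairFX figures, though they show the same ~18% relative uplift):

```python
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """z statistic for the difference between two conversion rates.
    |z| > 1.96 is roughly significant at the 95% confidence level."""
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Illustrative numbers: control converts 1,000 of 20,000 visitors (5.0%),
# the variant 1,180 of 20,000 (5.9%) — an 18% relative uplift.
z = two_proportion_z(1000, 20000, 1180, 20000)
print(round(z, 2))  # |z| ≈ 3.96, comfortably above 1.96
```

Tools like VWO run an equivalent check for you; the point of the 8-week run was to accumulate enough traffic for a difference of this size to clear the threshold.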

Success!