
Test & Learn Examples

An eCommerce project.

GOAL
To test design variants in order to find the best solution.
MY ROLE
UX Lead
 
TOOLS
Google
Hotjar

Project A: Should the customers do the math?

Our team at Expedia faced a recurring question regarding the best way to display savings on the member price badge: should we show a percentage discount (e.g., "Save 15%") or the exact dollar amount (e.g., "Save $25")? It was unclear which would resonate more with users and lead to higher conversion rates.

 

Qualitative research provided mixed insights. Some users expressed that percentage-based savings felt like a better deal. However, other users found percentages confusing and mentioned they didn’t want to "do the math" to figure out how much they were saving exactly.

 

Given these conflicting opinions, we decided that the most effective approach was to run a round of 'test and learn' to see how real user behavior would be impacted by the different messaging styles. The goal was to determine which format would not only improve user clarity but also lead to higher engagement and conversion rates.

(A) Control Variant

(B) Test Variant

Hypothesis

Changing the message on the price badge from "X% off" to "$X amount off" will lead to higher user engagement and increased conversion rates, as users find the dollar amount more relatable and easier to understand than percentages.

Testing Design

Test Groups (we recruited travellers who take 2-3 leisure trips a year)

Control Group (A): Users see the current version of the price badge with "X% off."

Test Group (B): Users see the new version of the price badge with "$X amount off."

Metrics to Measure
Primary metrics

1. Conversion Rate: The percentage of users who complete a purchase after viewing the price badge.

2. Click-Through Rate (CTR): How often users click through to the checkout page after viewing the price.

Secondary metrics

1. Time on Page: Time spent on the product page. Longer times could indicate confusion, while shorter times could indicate clarity and quick decision-making.

2. Bounce Rate: How many users leave the site after viewing the price.

 

Test Duration

3 weeks

Success Criteria

An increase of 2% or more in CTR for test group (B) would indicate that the new "$X amount off" message is more effective at moving users toward checkout and purchase.
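The success criteria above also imply a minimum sample size per variant. As a rough sketch (reading the 2% target as an absolute 2-point lift, and assuming a hypothetical 10% baseline CTR, since the real figure isn't stated here), a standard two-proportion power calculation estimates the traffic each group needs:

```python
from math import ceil, sqrt

def sample_size_per_group(p_base, lift):
    """Rough users-per-variant estimate for detecting an absolute lift
    in CTR with a two-sided two-proportion z-test (95% confidence, 80% power)."""
    z_alpha = 1.96  # two-sided 95% confidence
    z_beta = 0.84   # 80% power
    p_test = p_base + lift
    p_bar = (p_base + p_test) / 2
    n = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
         + z_beta * sqrt(p_base * (1 - p_base)
                         + p_test * (1 - p_test))) ** 2 / lift ** 2
    return ceil(n)

# Assumed baseline of a 10% CTR on the control badge, targeting +2 points:
print(sample_size_per_group(0.10, 0.02), "users per group")
```

If the expected daily traffic can't reach this number within the planned three weeks, the test would need either a longer duration or a larger minimum detectable lift.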
 

Data Segmentation

• Region: Check if users from different regions respond differently (e.g., US vs. Europe).

• Customer Segment: Look at new vs. returning users, as returning users may already be familiar with percentage-based discounts.
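Segment-level read-outs like these can be produced by grouping the raw badge-view events. A small sketch with made-up event rows (the field names and values are illustrative only, not the real logging schema):

```python
from collections import defaultdict

# Hypothetical badge-view events: (region, user_type, clicked)
events = [
    ("US", "new", True), ("US", "returning", False), ("US", "new", True),
    ("Europe", "new", False), ("Europe", "returning", True),
    ("Europe", "returning", False),
]

def ctr_by(events, key_index):
    """Click-through rate per segment value (region, user type, ...)."""
    clicks, views = defaultdict(int), defaultdict(int)
    for row in events:
        views[row[key_index]] += 1
        clicks[row[key_index]] += int(row[2])
    return {segment: clicks[segment] / views[segment] for segment in views}

print(ctr_by(events, 0))  # CTR split by region
print(ctr_by(events, 1))  # CTR split by new vs. returning users
```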

Results and Analysis
 

We compared the conversion rate and CTR between the control and test groups. This is what we learnt:

We saw a statistically significant improvement in CTR for Group B, which validates the hypothesis.

We also analysed secondary metrics, such as time on page and bounce rate, to confirm that the new message improved clarity rather than simply prompting clicks without understanding.
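The significance claim for Group B would typically come from a two-proportion z-test on the click counts. A minimal sketch with hypothetical traffic numbers (the actual counts aren't shown in this write-up):

```python
from math import sqrt, erf

def two_proportion_z(clicks_a, n_a, clicks_b, n_b):
    """Two-sided z-test for a difference in click-through rates."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts: control at a 10.0% CTR vs. test at 12.1%
z, p = two_proportion_z(400, 4000, 484, 4000)
print(f"z = {z:.2f}, p = {p:.4f}")  # significant at 95% if p < 0.05
```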
 

Project B: What it takes to sign in

Our team at Expedia aimed to increase user sign-ins on the trip page, as many users were browsing and booking without logging in, limiting personalised experiences and retention. We hypothesised that adding value propositions—such as points (OneKeyCash earning), personalised offers, and easier trip management—would encourage more users to sign in.

To validate this, we ran tests to see if surfacing these benefits would increase the authentication rate. The goal was to determine if these value propositions could drive higher engagement and more sign-ins, allowing us to make a data-backed decision on optimising the trip page.

Hypothesis

Adding clear value propositions on the Trip page will encourage users to sign in, leading to an uplift in authentication rates and ultimately improving personalisation and user retention.

(A) Control Variant

(B) Test Variant 1

(C) Test Variant 2

Testing Design

Test Groups (we recruited travellers who take 2-3 leisure trips a year)

Control Group (A): Users see the current trip page with no additional value propositions.

Test Group (B): Users see the version of the trip page with one value proposition.

Test Group (C): Users see the version of the trip page with multiple propositions.

Metrics to Measure

Primary metrics

Authentication Rate: The percentage of users who sign in after viewing the value propositions on the trip page.

Secondary metrics

1. Conversion Rate: Track if authenticated users are more likely to complete purchases (since increased sign-in rates can often lead to better-personalised experiences and higher conversion rates).

2. Time on Page: Determine if users spend more time engaging with the trip page after value propositions are introduced.

 

Test Duration

2 weeks

Data Segmentation

• User Type: Measure differences between users who have signed in previously vs. new users.

• Device Type: Ensure that value propositions perform equally well across desktop, mobile, and app.

• Trip Status: Evaluate if the value propositions perform differently for users at different stages in the trip-planning process (e.g., booked trips vs. browsing).

 

Success Criteria

An 8-10% increase in authentication rates for either Group B or C would indicate that the value propositions are effectively motivating users to sign in, validating the hypothesis. 

Results and Analysis

This is what we learnt:

1. We saw an 11% uplift in authentication rate for Group C (the winning test group) compared with Group A (control), a meaningful increase in sign-in behaviour.

2. We also cross-analysed engagement and conversion rates. Signed-in users converted at a rate more than 40% higher than guest customers, confirming that the increased sign-ins translate into meaningful interactions with the platform.
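The reported 11% uplift is a relative comparison of the two groups' authentication rates. A tiny sketch with hypothetical rates (the real rates aren't reported here) shows the arithmetic and the check against the 8-10% success bar:

```python
def relative_uplift(rate_control, rate_test):
    """Relative uplift of a test variant over control, as a fraction."""
    return (rate_test - rate_control) / rate_control

# Hypothetical rates: a 20.0% control authentication rate rising to 22.2%
# would produce the ~11% relative uplift described above.
uplift = relative_uplift(0.200, 0.222)
print(f"Group C uplift: {uplift:.1%}")
print("Success criteria met:", uplift >= 0.08)
```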
