Quality Selection on PDP

Product Strategy

UX Design

CRO

Duration

28 days

Role

UX/UI Designer

Team

CTO, CX Lead, 3 Engineers

Overview

Miss Pompadour offers multiple paint qualities designed for different surfaces and use cases. However, user research revealed that customers often struggled to understand the differences between these qualities. This led to slower decision making, occasional confusion and missed conversion opportunities.
To address this, we redesigned the Quality Selector on the product detail page (PDP), focusing on clarity, hierarchy, and ease of comparison. The goal was to simplify the selection process, reduce cognitive load, and support users in making faster, more confident choices.
The test was deployed across mobile, tablet and desktop to understand how different visitor groups responded to the new structure.

+10.88 %

Revenue Per User for new visitors


+3.16 %

Conversion Rate for returning visitors


+8.88 €

AOV for new desktop visitors


The redesign improved clarity, reduced hesitation and encouraged more confident paint quality selection across devices.

Challenge

The previous Quality Selector created friction across several areas:

  • Users found it difficult to compare qualities due to the dense layout

  • Important details were buried, causing slow scanning and uncertainty

  • Mobile users struggled with readability and scrolling effort


  • The lack of a clear hierarchy produced hesitation in purchase decisions

  • Visitors were unsure which quality was suitable for their specific project

These issues directly affected conversion rate and revenue on the PDP.


Our goal was to make the selector more intuitive so that users could understand quality differences quickly and confidently.

Process

We followed a user-centered design approach with multiple iterations and testing phases to ensure the final solution met both business goals and user needs.

1

Research and Insights

We analyzed clarity issues, scroll behaviour, and support queries.


2

Visual Design and Prototyping

We simplified the structure to surface the most important quality attributes.


3

A/B Test Deployment

The variant and the original were tested against each other for 28 days.


Solution

Clarity and Comparison

  • Users can understand the main differences between qualities at a glance


  • Important characteristics are visible without long scrolling


  • Comparison points are easy to scan and evaluate


Improved Mobile Experience

  • Better text readability

  • Cleaner spacing

  • Reduced cognitive load


Decision Support

  • Descriptive labels guide users to the correct quality for their project


  • Icons and highlights help communicate differences faster


The test's primary metric, conversion rate, showed no significant change for new desktop visitors. However, revenue per user showed a significant uplift of 10.88 % with a significance level of 94 %.

For returning visitors, the conversion rate showed a significant uplift of 3.16 % with a significance level of 90 %.
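For context, significance figures like these are typically derived from a two-proportion z-test comparing the control and variant conversion rates. A minimal sketch of that calculation, using illustrative visitor counts (the actual traffic numbers for this test are not public):

```python
from math import sqrt
from statistics import NormalDist


def two_proportion_z_test(conversions_a, visitors_a, conversions_b, visitors_b):
    """Return the z-score and significance (1 - two-sided p-value)
    for the difference between two conversion rates."""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    # Pooled proportion under the null hypothesis of no difference
    p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided
    return z, 1 - p_value


# Illustrative numbers only -- not the real test data
z, significance = two_proportion_z_test(
    conversions_a=950, visitors_a=20_000,   # original
    conversions_b=1_040, visitors_b=20_000, # variant
)
print(f"z = {z:.2f}, significance = {significance:.0%}")
```

A test is usually called significant once this value crosses a pre-agreed threshold (commonly 90 % or 95 %), which is why the 94 % and 90 % figures above are reported alongside the uplifts themselves.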