Boosting retention through a more rewarding proposal experience

Halo is a technology partnering platform where 3,500+ scientists connect directly with leading companies like PepsiCo and Bayer for research collaborations and funding opportunities. While attracting scientist users has never been a challenge, retaining them has proven difficult. How could design help them feel seen, confident, and motivated to return?

Role

Product Designer

Timeline

Launched in 1 month

Keywords

#Retention, #Iteration, #Web

Impact

To improve retention, we made the application process more transparent and rewarding. We began by improving the findability of the “My Proposals” page, then embedded positive reinforcement throughout the journey. As a result, we not only increased the 90-day retention rate from 10% to 50%, but also encouraged deeper participation across the platform.

50%

90-day user retention rate

1s

Time to find "My Proposals"

2.3×

Number of applications per user

86%

Users feeling more confident

The Problem

User drop-off after submitting just 1–2 applications

Despite new users signing up each month, Halo struggled to keep them engaged after their first or second proposal submission. We saw a 15% 90-day user retention rate and a growing number of inactive accounts. We were losing promising talent before they could build meaningful momentum.

To find out why, we interviewed 6 users who had dropped off.

The Initial Frustration🔥

Scientists felt lost after applying

The excitement of submitting a first proposal quickly faded when users tried to track their application status. The “My Proposals” page, the only place to view submissions, was buried deep within the user profile, making it difficult to find, track progress, or revisit past proposals.

Tension Was Building🔥🔥

They were left neglected for too long

The proposal evaluation process often took weeks, and in that time, users received no meaningful updates — only vague status labels like “evaluating.” While they returned in hopes of good news, the lack of clarity and feedback left them feeling ignored and discouraged.

[Journey map: stages Awareness, Preparation, Application, Evaluation, and Project, each lasting from 1 day to 5 months. Actions: browse home page, explore research opportunities, choose topics, attend webinar, draft proposal, fill in application, add publication, submit proposal, communicate with sponsor (where we have low visibility), wait, and receive an automated email with the evaluation result (accepted or rejected).]

The Final Drop-off🔥🔥🔥

They lost faith in the platform

After weeks of waiting, most applicants received nothing more than a cold, automated rejection. The experience felt unrewarding; their time, effort, and confidence seemed wasted. Faced with uncertainty and disappointment, they had little reason to return.

Reframing the Problem

How might we keep scientists informed of their proposal status and emphasize their success?

Transparency

We broke down the previously vague evaluation journey into clear, trackable stages, and added clearer messaging around each phase to set expectations and rebuild trust in the experience.

Redefine 'Success'

Instead of framing outcomes as a binary win or loss, we introduced positive reinforcement at every stage of the evaluation. Even when scientists are not selected, they are encouraged to engage with future opportunities.

Remapping the User Journey

We added more checkpoints to keep users informed

I collaborated with the Growth and Sponsor Evaluation teams to map the evaluation flow. Rather than leaving scientists waiting for months without feedback, we restructured the evaluation into 3 clear stages and increased the checkpoints from 5 to 12. By tracking the application journeys of multiple scientists, I identified the points where timely updates would boost transparency, reinforce progress, and keep users meaningfully engaged throughout the process.

Before

STAGE: Apply
Draft: Your proposal draft is in progress.
Submitted: Your proposal has been successfully submitted. The review team will start the evaluation shortly.

STAGE: Evaluation
Wait for evaluation: Your proposal is under evaluation.

STAGE: Result
Receive results: Your evaluation result is now available.

After

STAGE: Apply
Draft: Your proposal draft is in progress.
Submitted: Your proposal has been successfully submitted.

STAGE: Evaluation + Staged Result
1st evaluation: [Sponsor name] started Phase 1 evaluation. You will receive feedback before [Phase 1 Deadline + 5 days].
1st result: Great job! You are qualified for the next phase of evaluation. We will let you know when the next phase begins.
2nd evaluation: [Sponsor name] started Phase 2 evaluation. You will receive feedback before [Phase 2 Deadline + 5 days].
2nd result: Great job! You are shortlisted for this opportunity. We will let you know when the last phase begins.
3rd evaluation: [Sponsor name] started Phase 3 evaluation. You will receive feedback before [Phase 3 Deadline + 5 days].
3rd result: Great job! You are one of the finalists of this opportunity. We will let you know when the next phase begins.

STAGE: Result
Final result: Congratulations! You are officially the winner of this opportunity. / The review team has decided to move forward with other proposals.
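To make the remapped flow concrete, below is a minimal sketch of how the staged checkpoints and their messages could be modeled on the front end. Every name in it (the Checkpoint values, ProposalStatus fields, and helper functions) is hypothetical; the copy simply mirrors the table above, not Halo's actual implementation.

```typescript
// A minimal sketch of the staged checkpoints and their user-facing copy.
// All names below are illustrative, not Halo's real data model.
type Checkpoint =
  | "draft"
  | "submitted"
  | "phase_1_evaluation" | "phase_1_result"
  | "phase_2_evaluation" | "phase_2_result"
  | "phase_3_evaluation" | "phase_3_result"
  | "final_result";

interface ProposalStatus {
  checkpoint: Checkpoint;
  sponsorName: string;
  phaseDeadline?: Date; // deadline of the phase currently under evaluation
  won?: boolean;        // only meaningful at "final_result"
}

// Positive reinforcement shown after each staged result.
const RESULT_MESSAGES: Record<string, string> = {
  phase_1_result: "Great job! You are qualified for the next phase of evaluation.",
  phase_2_result: "Great job! You are shortlisted for this opportunity.",
  phase_3_result: "Great job! You are one of the finalists of this opportunity.",
};

// Expected feedback date: the phase deadline plus a 5-day buffer.
function addDays(date: Date, days: number): Date {
  const copy = new Date(date);
  copy.setDate(copy.getDate() + days);
  return copy;
}

// Maps each checkpoint to the message shown on the proposal card and in notifications.
function statusMessage(s: ProposalStatus): string {
  if (s.checkpoint === "draft") return "Your proposal draft is in progress.";
  if (s.checkpoint === "submitted") return "Your proposal has been successfully submitted.";
  if (s.checkpoint === "final_result") {
    return s.won
      ? "Congratulations! You are officially the winner of this opportunity."
      : "The review team has decided to move forward with other proposals.";
  }
  if (s.checkpoint.endsWith("_evaluation")) {
    const phase = s.checkpoint.split("_")[1]; // "1" | "2" | "3"
    const feedbackBy = s.phaseDeadline
      ? addDays(s.phaseDeadline, 5).toDateString()
      : "the posted deadline";
    return `${s.sponsorName} started Phase ${phase} evaluation. You will receive feedback before ${feedbackBy}.`;
  }
  return RESULT_MESSAGES[s.checkpoint];
}
```

Driving both the proposal card and the notification emails from a single message map like this would keep the copy at each checkpoint consistent across channels.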

Final Design

MVP design in 3 steps

After finalizing the expanded set of engagement points, I moved into design iterations that bring transparency and positive reinforcement into the experience.

Step 1

Prioritized “My Proposals” in global navigation

We relocated "My Proposals" to the main navigation for easier access and separated submissions into "Active" and "Archive" tabs to create a cleaner workspace.

#Accessible

#Organized
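As a small illustration of the Active/Archive split, here is a sketch of how submissions could be partitioned by status. The Proposal shape and status values are assumptions for the example, not Halo's actual schema.

```typescript
// Illustrative proposal shape for the "Active" / "Archive" tabs (fields are assumptions).
interface Proposal {
  id: string;
  title: string;
  // "closed" stands in for any finished journey: won, not selected, or withdrawn.
  status: "draft" | "submitted" | "evaluating" | "closed";
}

// Active: proposals the scientist is still working on or waiting to hear back about.
// Archive: proposals whose journey has ended, kept out of the way but still revisitable.
function splitForTabs(proposals: Proposal[]): { active: Proposal[]; archive: Proposal[] } {
  return {
    active: proposals.filter((p) => p.status !== "closed"),
    archive: proposals.filter((p) => p.status === "closed"),
  };
}
```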

Step 2

Bringing transparency by dissecting the evaluation process

Instead of a single vague “Evaluating” status, we introduced 3 evaluation phases along with other key statuses. We also added expected feedback dates to set clearer expectations and give users a sense of momentum.

#Clarity

#Engagement
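A sketch of how the card's staged progress indicator and expected feedback date could be derived. The step labels and the 5-day buffer follow the checkpoint copy shown earlier; the function and constant names are hypothetical.

```typescript
// Ordered steps of the proposal card's progress indicator (labels are illustrative).
const TIMELINE_STEPS = [
  "Submitted",
  "Phase 1 evaluation",
  "Phase 2 evaluation",
  "Phase 3 evaluation",
  "Final result",
] as const;

type StepState = "done" | "current" | "upcoming";

// Derives the state of every step from the index of the step currently in progress,
// so the card can render trackable phases instead of a single "Evaluating" label.
function timeline(currentStep: number): { label: string; state: StepState }[] {
  return TIMELINE_STEPS.map((label, i) => {
    const state: StepState =
      i < currentStep ? "done" : i === currentStep ? "current" : "upcoming";
    return { label, state };
  });
}

// Expected feedback date shown next to the current step: phase deadline plus a 5-day buffer.
function expectedFeedbackDate(phaseDeadline: Date, bufferDays = 5): Date {
  const d = new Date(phaseDeadline);
  d.setDate(d.getDate() + bufferDays);
  return d;
}
```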

Step 3

Turning stage successes into verified status

We designed three badges to celebrate each stage of a proposal’s progress. These badges also appear on scientists’ profiles as verified credentials, showcasing their capabilities and helping them stand out for future opportunities.

#Reward

#Retention

Qualified

Shortlisted

Finalist

Applicant profile
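A sketch of how stage results could roll up into profile badges; the badge names match the design above, while the types and helpers are assumptions for illustration.

```typescript
// Badge names follow the design; the surrounding types are illustrative only.
type Badge = "Qualified" | "Shortlisted" | "Finalist";

interface ProposalOutcome {
  proposalId: string;
  phasesPassed: 0 | 1 | 2 | 3; // how many evaluation phases this proposal cleared
}

// Each phase a proposal clears earns the corresponding badge.
const BADGE_BY_PHASE: Record<1 | 2 | 3, Badge> = {
  1: "Qualified",
  2: "Shortlisted",
  3: "Finalist",
};

// Collects the badges a scientist has earned across all proposals, so they can be
// surfaced on the profile as verified credentials, ordered by prestige.
function badgesForProfile(outcomes: ProposalOutcome[]): Badge[] {
  const earned = new Set<Badge>();
  for (const o of outcomes) {
    for (let phase = 1; phase <= o.phasesPassed; phase++) {
      earned.add(BADGE_BY_PHASE[phase as 1 | 2 | 3]);
    }
  }
  return (["Qualified", "Shortlisted", "Finalist"] as const).filter((b) => earned.has(b));
}
```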

Full Interaction

A more rewarding application experience

How did we get here

Design iterations

Badge iterations

We designed multiple versions of the badges to make sure each level feels rewarding on its own, while higher tiers become increasingly distinctive and visually prestigious.

Not rewarding enough

Not visually distinctive

Not visually coherent

Card iterations

I also prototyped 5+ proposal card designs and tested them with the internal team and early users to decide on the final design.

Progress bar is neither rewarding nor accurate

Reward badge is not 'flashy' enough

Overemphasis on "Next Update Date"

Result

Fewer drop-offs, stronger trust

Working closely with engineering, we built and launched the MVP within just two months. The success badge design received particularly positive feedback from both scientists and sponsors. With a 47% user retention rate, we gained greater confidence in growing the scientist network and attracting new research opportunities.

Reflection

A product is never truly built; it's a process of building and testing

Designing features was just the beginning. I also took the lead in identifying usability issues through Smartlook analysis and solving them with the cross-functional team. It was fascinating to see how users interacted with different features, and how small changes profoundly changed user engagement.

Halo Team 🚀

Designed by Hailey Yixuan Li

LinkedIn

Email

Instagram