TealBook Reporting Dashboard

Making supplier diversity data clearer, more trustworthy, and easier to act on.

About the project

TealBook is a supplier intelligence platform for procurement teams. It centralizes supplier data, diversity certifications, and risk information, helping companies source responsibly and track their supplier diversity programs.

As TealBook built their new supplier data platform, we tackled one of the most-used tools first: the Supplier Diversity Spend Reporting tool. Users had been vocal about their frustrations, so this was our chance to address them.

Our goal was to migrate 6 customers from the old platform and reduce support tickets by 5%.

Client

TealBook

Services

User Experience, User Interface

Role

Lead Product Designer

Year

2023

What we needed to understand

  • What made the current reporting tool frustrating to use?

  • What parts were actually working well?

  • How were people sharing and presenting their data?

  • What did their workflows look like for setting up programs and hitting diversity goals?

How we found our answers

Audited the current experience using product analytics (Sisense) to find where people were struggling.

Studied competitors like Supplier.io to see what else was out there.

Talked to customers to hear their pain points and wish lists.

Workshopped with internal teams (Solutions and Customer Service) who heard customer frustrations daily.

The problems we uncovered

Doubt about accuracy: The data updated constantly in real time, which confused users and made them question whether the numbers were even right.

Hard to understand visualizations: Complex charts undermined confidence instead of building it.

No control: Users wanted to manually adjust data but couldn't; they were stuck with what the system gave them.

What did we learn?

Users wanted control: The platform was too rigid. People resorted to workarounds, manual calculations, and external tools just to get the view they needed.

Trust was missing: Without knowing where data came from, how it was validated, or when it updated, users couldn't confidently present reports to their stakeholders and clients.

The workflow was inefficient: Manual data input and manually exported reports, both handled by the TealBook team, slowed everyone down.

Guidance would help: Users wanted alerts for expiring certifications and recommendations to help them hit diversity goals.

What will we do now?

Give users filtering options and control over what data to include.

Add pre-set date ranges that match their reporting periods.

Fix double-counting issues in certification spend.

Make data sources and expiration dates visible and clear.

Let users create reports themselves instead of relying on the TealBook team.

Build in goal-tracking and benchmarking so users can make confident decisions.

How did we do it?

From sketches to screens

We explored different ways to display supplier information and tested our solutions with 6 clients to see what resonated.

Wireframes

Before moving to high-fidelity designs, we tested wireframes with our 6 clients and internal stakeholders.

The wireframes helped users visualize the experience, which surfaced needs they hadn't mentioned in initial interviews and gave us valuable feedback for the high-fidelity designs:

  • Our solution for tracking supplier diversity status changes (like when suppliers lose their certification) wasn't clear enough.

  • Users wanted benchmarking data to compare their performance against other companies in their industry. This turned out to be more important than we'd realized from the initial research interviews.

Different iterations of the dashboard

I experimented with different ways to visualize the data, getting feedback at all stages to iterate and improve at each step.

The dashboard gives users what they'd been asking for: real-time goal tracking, notifications, and trend insights. Clear comparisons between diverse spend and total spend replace the confusing visualizations, while interactive widgets let users dig deeper when they need to.

Certification and supplier information

We broke down spend by certification and supplier so users could see exactly where their money was going. No more guessing or surprises.

Supplier details

Quick modals show the details that matter: expiration dates, qualifiers, and context for metrics and definitions. This helps users confidently assess their data and makes the platform easier to navigate, especially for newcomers.

Advanced filter

Advanced filtering gives users the control they wanted—tailoring views by supplier type, region, or certification status to match their specific needs.

Spend reclassification

Users can now classify or disqualify potential spend themselves, fixing double-counting and validating their data. This makes their reports more accurate and eliminates second-guessing.

Qualification Rules

Admins can now define which certifications count toward diversity spend and prioritize subcategories to fix double-counting issues. This directly impacts how Qualified Diverse Spend gets calculated, and it's a first step toward giving users more control of their data.

What happened and what did we learn?

Early testing showed a lot of promise. Clients loved the flexibility and control. Unfortunately, we couldn't track final metrics because priorities shifted and feature development was put on hold.

Working on this project reminded me of some fundamentals I sometimes lose sight of:

Challenge internal assumptions: What the internal team and stakeholders thought users needed often missed the mark. This reinforced how critical research is to getting design direction right.

Roll with the changes: Projects rarely go as planned. Even though we didn't launch, this taught me to stay flexible and resilient when priorities shift. The work still had value, even if it didn't ship when we expected.