PlayTest AI · 2024 · Founding Product Designer · 8-week redesign

Redesigning an AI‑backed game testing platform

PlayTest AI is an AI-driven game testing platform built to streamline the QA workflow for PC, VR, and mobile game development. By replacing manual test-scenario creation with autonomous bots, it analyzes gameplay in real time, detects bugs, and surfaces actionable insights.

Key outcomes

+55% · Engagement rate
20→80% · Trial-to-paid conversion
+41% · Feature adoption
3→11 · Client growth

Disclaimer: All data presented has been generated for illustrative purposes only and does not represent actual company data.

Project Snapshot

Company

PlayTest AI

Role

Founding Product Designer

Timeline

8 weeks · 2024

Tools

Figma, FigJam, Jira

Responsibilities

Founding Product Designer · Dashboard Redesign · AI Product Design · Webapp · Design Systems

Team

CEO · CTO · PM · Front-End Dev

Constraints

With limited access to user feedback and only 8 weeks of runway, design decisions were driven by close collaboration with engineers and by informed assumptions about known pain points in manual QA.

Impact TL;DR

+55% · Engagement rate
20→80% · Trial-to-paid conversion
+41% · Feature adoption
3→11 · Client growth
The Challenge

A powerful tool held back by usability issues

The original platform was built by the CTO, not a designer—and it showed. Users hit a wall of complexity immediately after sign-up.

80–90%

User drop-off rate

Users abandoned the dashboard within minutes of signing up.

0

Design involvement

The original dashboard was built by the CTO — no designer had ever touched it.

Low

Dashboard interaction

Users barely engaged with the core scenarios view. Most actions required support.

None

Product moat

No differentiation from competitors — product lacked scalability and a clear USP.

Original PlayTest AI dashboard showing usability challenges — product lacking scalability, low dashboard interaction, no product moat

Who are we designing for?

Persona: Mikala Reiman — Late 20s QA tester who spends hours manually testing for bugs and writing test cases. Motivated to eliminate manual labor, maintain work quality, and reduce errors.
Research

Framework for prioritization

Innovating within constraints—balancing business viability, user desirability, and technical feasibility.

Prioritization framework — Venn diagram of Business, Tech, and User with Viability, Desirability, and Feasibility axes
Viability

Business Goal

Reduce operational costs related to client support. The product had to pay for itself by reducing the support burden.

Desirability

User Needs

Enable game testers to quickly upload, create and test game scenarios from the dashboard — without hand-holding.

Feasibility

Tech Constraints

Maintain competitive edge while working within time and budget limitations of an 8-week sprint.


Competitive analysis — PlaytestCloud and similar platforms

Define + Ideate

Restructuring the information architecture

The old IA was flat and feature-centric. The new IA introduces workflow-centric navigation, AI-powered scenario generation, and phased feature rollout.

Old Information Architecture
Old IA — flat structure with limited navigation: Workspace Name, Profile Picture, Start Run CTA, Test Game Name, Upload Build, Scenarios (Queue, In Progress, Completed), Settings (Delete Account)
New Information Architecture
New IA — structured hierarchy: Studio Name, Game Name/Version, Help Center, Scenarios/Builds/Settings nav, Active Scenarios with AI Feature CTA, status categories (In-Progress, Failed, Passed), Phase 2 features

Ideation

Ideation wireframes — Part A: Game Versions & Workspace, Part B: Generating Scenarios with AI, Part C: Clear Headings for Test Cases, Part D: Segregating Failed/Passed Cases
Part A

Game Versions & Workspace

Top and side nav bars for uploading different games and game versions. Users could also set up separate workspaces.

Part B

Generating Scenarios with AI

A strategic top-right placement for AI-powered 'New Scenario' generation, so users no longer need to manually type test-case scenarios.

Part C

Clear Headings for Test Cases

Clear headings for at-a-glance scanning, plus bulk-edit functionality for more control and ease of use.

Part D

Segregating Failed / Passed Cases

Separating In Progress, Passed, and Failed test cases so users can quickly identify and debug failed scenarios.

Interface Anatomy

User journey of a QA tester

Side-by-side comparison — New Dashboard with organized sidebar, game list, status categories vs Old Dashboard with dark kanban layout and limited navigation

User Journey — 4 stages

01 Upload Game Build · 02 Create Scenario · 03 Monitoring Progress · 04 Reviewing Results
01

Upload Game Build

Before
Upload Game Build — Before
02

Create Scenario

Before
Create Scenario — Before
03

Monitoring Progress

Before
Monitoring Progress — Before
04

Reviewing Results

Before
Reviewing Results — Before
Final Design

The redesigned product

A clean, structured dashboard with clear status categories, AI-powered scenario generation, and a design system that scales.

Final Design — polished dashboard with Active Scenarios, Statistics, Archive tabs; In Progress, Failed, Passed status categories; New Scenario AI CTA; severity tags
#1 UX Law

Aesthetic-Usability Effect

Users tend to perceive aesthetically pleasing designs as more usable. The visual polish of the redesign directly improved perceived reliability and trust.

#2 UX Psychology

Design drives decisions

Research suggests design influences roughly 80% of buying decisions and user satisfaction. The redesigned experience directly contributed to the 20% → 80% trial-to-paid conversion lift.

The Impact

Numbers that tell the story

The redesign didn't just look better—it fundamentally changed how users engaged with the product.

+55%

User Engagement

Increased the share of 30-minute-plus user sessions from 15% to 70% by resolving usability issues, significantly reducing support tickets.

20→80%

Trial-to-Paid Conversion

Increased the trial-to-paid conversion rate from 20% to 80% in 8 weeks, growing the client base from 3 pre-MVP to 11 post-redesign.

+41%

Feature Adoption

Replacing manual scenario creation with AI-driven generation increased user interactions from 30 to 56 clicks, enabling 41% more scenarios generated and tested.

3→11

Clients Post-Redesign

Users struggled with bug detection and test-case resolution pre-redesign. The client base grew from 3 pre-MVP clients to 11 post-redesign.

PlayTest AI dashboard mockup on laptop