Recreating the SAT Experience—Turning Practice into Progress Through Clear, Familiar Design
We set out to design an experience that didn’t just feel smart but also felt supportive. Our challenge was to mirror the structure and pressure of the SAT while reducing cognitive friction and keeping students focused.
Wireframes
We used wireflows and page-level wireframes to define clear, intuitive interactions across the Practice, Analytics, and Improve sections. Our wireflows mapped the complete user journey, highlighting key decision points like starting a new test, reviewing results, and tracking progress over time. These flows ensured that users could move through the app with minimal friction and maximum clarity.

We also designed detailed wireframes that explored content hierarchy, functional layout, and visual grouping. Each page was structured to support focused actions and quick comprehension, following principles like Fitts’ Law for tap targets and Hick’s Law to reduce decision fatigue.

Visual Design (UI)
In the high-fidelity UI stage, we applied a token-based design system to maintain visual consistency and ease scalability. We used Figma variables for color schemes, spacing, and typography to streamline iteration and responsiveness.
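Token-based systems like this are often mirrored in code so that design and implementation share one source of truth. A minimal sketch of what such tokens might look like; the names and values below are illustrative assumptions, not our actual Figma variables:

```typescript
// Illustrative design tokens mirroring Figma variables for color,
// spacing, and typography. All names and values are hypothetical.
export const tokens = {
  color: {
    primary: "#2B5CE6", // key actions and score highlights
    success: "#1E9E5A", // mastered topics
    warning: "#E6A700", // topics needing review
    surface: "#FFFFFF", // page background
  },
  spacing: {
    xs: 4, sm: 8, md: 16, lg: 24, xl: 32, // 4px base scale
  },
  type: {
    heading: { size: 24, weight: 700 },
    body: { size: 16, weight: 400 },
    caption: { size: 12, weight: 400 },
  },
} as const;
```

Centralizing values this way means a change to a single token propagates to every screen that consumes it, which is what makes token-based systems easy to iterate on and scale.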

Our visual design choices were shaped by three guiding principles:
Clarity: Use whitespace and color coding to highlight key scores and insights
Familiarity: Mimic the look and feel of the real SAT interface to reduce test anxiety
Encouragement: Reinforce progress and mastery through subtle animations, badges, and affirming copy

Key screens such as the Analytics Dashboard and Book of Improvement received extra attention to ensure the data they presented was both actionable and motivating. We used stacked visual indicators, progress bars, and color-coded topic groupings to support quick comprehension and informed study decisions.
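The color-coded topic groupings can be thought of as a simple mapping from a topic’s mastery percentage to a color band. A sketch of that logic; the band names, thresholds, and groupings here are illustrative assumptions, not the shipped values:

```typescript
// Hypothetical mapping from a topic's mastery percentage to the
// color band used for Analytics Dashboard topic groupings.
type Band = "needs-review" | "developing" | "mastered";

function masteryBand(pct: number): Band {
  if (pct < 50) return "needs-review"; // warning-colored grouping
  if (pct < 80) return "developing";   // neutral grouping
  return "mastered";                   // success-colored grouping
}
```

Keeping this mapping in one place ensures every progress bar and topic chip interprets the same score the same way, supporting the quick comprehension the dashboard aims for.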

Iterations from Usability Tests
To validate our design decisions, we conducted usability testing with 10 participants across our target user group. Using the think-aloud method, we observed how users navigated through key flows—taking a practice test, reviewing results, and analyzing performance.
The testing surfaced subtle but critical friction points that shaped our final design:
“Book of Improvement” Terminology Caused Confusion
Several users didn’t immediately understand what the term referred to. To resolve this, we renamed it to a more action-oriented label and added a brief description beneath the section title to clarify its purpose.

Analytics Dashboard Overwhelmed First-Time Users
While users valued the data, they found the layout too dense and hard to scan. We restructured the page into digestible sections and used a stronger visual hierarchy to guide focus.

Practice Flow Felt Abrupt and Unprepared
Several users reported feeling caught off guard when the practice session began immediately after clicking “Start Practice.” There was no transition or opportunity to mentally prepare. In response, we introduced a confirmation pop-up screen that appears after clicking the button. This gave users a moment to get ready and reduced anxiety before entering the test environment.

These small but impactful changes were driven by user feedback and directly improved usability, reducing hesitation and boosting confidence across critical user flows.