Interligence

Interligence is an AI-powered SAT Math practice platform designed to replace fragmented, outdated study methods with a cohesive, adaptive, and intuitive solution. The project aimed to reimagine SAT prep from a design perspective—where user motivation, clarity of feedback, and adaptive learning are built into every screen. By focusing on Thai high school students preparing for international programs, we created a tool that feels approachable and effective, even under exam pressure.


This project was featured at ECTI-CON 2025, the 22nd International Conference on Electrical Engineering/Electronics, Computer, Telecommunications and Information Technology.

ROLES

Lead UX/UI Designer, Lead UX Researcher, Front-End Developer

TIMELINE

Nov 2024 – Jan 2025

PROBLEMS

Students weren't just struggling with math; they were struggling to even find the right tools to practice it properly.

Our research revealed that current SAT Math prep methods (textbooks, past papers, and online platforms) often fall short. Students faced fragmented resources and outdated formats, and lacked clarity on their weak spots. Instructors also noted the absence of accessible, structured learning tools aligned with the new digital SAT.



CHALLENGES

Creating an intuitive SAT prep experience meant untangling complex user needs, cognitive stress, and AI transparency—all through design.

Designing Interligence meant making advanced AI features feel intuitive and trustworthy. We had to present dense academic content in a clean, focused layout while avoiding cognitive overload. Visualizing progress without discouraging users, and clearly communicating feedback, required careful design choices to support learning and build confidence under exam pressure.

Research

Scattered Prep Tools and Resources Left Students Overwhelmed and Insecure

We began with in-depth qualitative interviews involving 15 participants: 4 high school students, 7 university freshmen, and 4 SAT instructors. The aim was to explore their study behaviors, tool usage, frustrations, and expectations.

What We Discovered:
  • Students juggled multiple sources without clear guidance

  • Many had taken practice tests but didn’t know what to do afterward

  • Instructors found it difficult to recommend any all-in-one solution

To deepen our understanding, we conducted a competitor analysis of four commonly used SAT prep tools.


To visualize how these tools compare to one another (and to our future solution), we plotted them on a Price vs. Ease of Use matrix.

Requirements Gathering

Filling in the Blanks: Prioritizing What Students Actually Need

To ensure the product addressed these pain points, we translated our user research into actionable product requirements, grounded in the recurring frustrations we uncovered around SAT Math preparation.

Key Requirements Identified
  • Realistic Test Simulation → Users needed SAT-style questions with a digital interface that mirrored the real exam

  • Progress Visibility → Students wanted to track improvement over time, not just see a static score

  • Mistake Review & Retention → There was no structured way for users to revisit and learn from past mistakes

  • Clarity & Simplicity in UI → Overcomplicated dashboards and content-dense layouts made other tools hard to stick with

Persona

To anchor our decisions around the target audience, we created a primary persona based on the most prominent user group: high school students preparing for the SAT. This persona guided our design priorities and helped us stay focused on user behavior, motivations, and challenges.

After identifying user needs, we translated them into potential features. To help us prioritize, we organized all ideas into an Impact–Effort Matrix. This allowed us to filter out low-impact or high-effort features and focus on those that delivered the most value within our timeline.

Prioritized MVP Features
  • AI-Powered Practice Test – for unlimited, adaptive test simulations (Major Requirement)

  • Analytics Dashboard – for visualizing progress and performance trends

  • Book of Improvement – for reviewing and retrying missed questions

Secondary features like streak tracking and topic-specific practice were scoped for future iterations based on their long-term value.

Executions

Recreating the SAT Experience—Turning Practice into Progress Through Clear, Familiar Design

We set out to design an experience that didn't just feel smart but also felt supportive. Our challenge was to mirror the structure and pressure of the SAT while reducing cognitive friction and keeping students focused.

Wireframes

We used wireflows and page-level wireframes to define clear, intuitive interactions across the Practice, Analytics, and Improve sections. Our wireflows helped us map out the complete user journey, highlighting key decision points like starting a new test, reviewing results, and tracking progress over time. These flows ensured that users could move through the app with minimal friction and maximum clarity.


We also designed detailed wireframes that explored content hierarchy, functional layout, and visual grouping. Each page was structured to support focused actions and quick comprehension, following principles like Fitts’ Law for tap targets and Hick’s Law to reduce decision fatigue.
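
For reference, these are the standard textbook formulations of the two principles (general HCI formulas, not measurements from this project), written in LaTeX notation:

  MT = a + b \log_2\!\left(\frac{D}{W} + 1\right)  % Fitts' Law: MT = predicted movement time, D = distance to a tap target, W = target width, a and b = empirically fitted constants
  RT = a + b \log_2(n + 1)                          % Hick's Law: RT = decision time, n = number of equally likely choices

In practice, this meant keeping primary tap targets large and close at hand, and limiting the number of competing actions visible on each screen.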


Visual Design (UI)

In the high-fidelity UI stage, we applied a token-based design system to maintain visual consistency and ease scalability. We used Figma variables for color schemes, spacing, and typography to streamline iteration and responsiveness.
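
As a rough illustration of what a token-based system looks like on the front-end side, the sketch below shows how Figma variables for color, spacing, and typography might map to a TypeScript token module. All names and values are hypothetical examples, not the actual Interligence palette.

  // Illustrative design tokens mirroring the Figma variables described above.
  // All names and values are hypothetical, not the production palette.
  export const tokens = {
    color: {
      primary: "#2563EB",      // primary actions and links
      surface: "#FFFFFF",      // card and page backgrounds
      success: "#16A34A",      // correct answers, mastery indicators
      danger: "#DC2626",       // incorrect answers, weak topics
      textPrimary: "#111827",
    },
    spacing: { xs: 4, sm: 8, md: 16, lg: 24, xl: 40 }, // px scale
    typography: {
      heading: { fontSize: 24, lineHeight: 1.3, fontWeight: 600 },
      body: { fontSize: 16, lineHeight: 1.5, fontWeight: 400 },
      caption: { fontSize: 13, lineHeight: 1.4, fontWeight: 400 },
    },
  } as const;

  // Components read from the token object instead of hard-coding values, e.g.:
  //   style={{ padding: tokens.spacing.md, color: tokens.color.textPrimary }}

Binding components to tokens rather than raw values is what makes a palette or spacing change a single edit during iteration, which kept the high-fidelity screens consistent as they evolved.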


Our visual design choices were shaped by three guiding principles:

  • Clarity: Use whitespace and color coding to highlight key scores and insights

  • Familiarity: Mimic the look and feel of the real SAT interface to reduce test anxiety

  • Encouragement: Reinforce progress and mastery through subtle animations, badges, and affirming copy


Key screens such as the Analytics Dashboard and Book of Improvement received extra attention to ensure the data they presented was both actionable and motivating. We used stacked visual indicators, progress bars, and color-coded topic groupings to support quick comprehension and informed study decisions.


Iterations from Usability Tests

To validate our design decisions, we conducted usability testing with 10 participants across our target user group. Using the think-aloud method, we observed how users navigated through key flows—taking a practice test, reviewing results, and analyzing performance.

The testing surfaced subtle but critical friction points that shaped our final design:

  • “Book of Improvement” Terminology Caused Confusion

    Several users didn’t immediately understand what the term referred to. To resolve this, we renamed it to a more action-oriented label and added a brief description beneath the section title to clarify its purpose.


  • Analytics Dashboard Overwhelmed First-Time Users

    While users valued the data, they found the layout too dense and hard to scan. We restructured the page into digestible sections, and used stronger visual hierarchy to guide focus.


  • Practice Flow Felt Abrupt and Unprepared

    Several users reported feeling caught off guard when the practice session began immediately after clicking “Start Practice.” There was no transition or opportunity to mentally prepare. In response, we introduced a confirmation pop-up screen that appears after clicking the button. This gave users a moment to get ready and reduced anxiety before entering the test environment.
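
Sketched below is one way that confirmation step could be implemented, assuming a React front end for illustration; the component, props, and copy are hypothetical rather than the production code.

  // Hypothetical confirmation step between "Start Practice" and the test itself.
  import { useState } from "react";

  export function StartPractice({ onBegin }: { onBegin: () => void }) {
    const [confirming, setConfirming] = useState(false);

    // First click opens a confirmation dialog instead of launching the test.
    if (!confirming) {
      return <button onClick={() => setConfirming(true)}>Start Practice</button>;
    }

    return (
      <div role="dialog" aria-modal="true">
        <p>You are about to begin a timed practice test. Ready to start?</p>
        <button onClick={onBegin}>I'm ready</button>
        <button onClick={() => setConfirming(false)}>Not yet</button>
      </div>
    );
  }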


These small but impactful changes were driven by user feedback and directly improved usability, reducing hesitation and boosting confidence across critical user flows.

Results

A Digital SAT Experience That Motivates Practice and Builds Confidence

By combining intelligent features with human-centered design, Interligence delivered more than a test prep tool—it created a guided, stress-reducing learning environment that students returned to. The platform didn’t just simulate exams; it gave students structure, clarity, and a sense of progress they could trust.


AI-Powered Practice Tests

Generated SAT-style questions in a realistic, distraction-free interface


Analytics Dashboard

Visualized user performance over time, breaking down topic mastery and highlighting trends


Book of Improvement

Archived all incorrect answers, allowing for focused review and retry tracking


Streak Tracker

Encouraged daily engagement and built momentum through habit reinforcement


Each feature directly mapped to a core user need identified in the research phase, making the final product both relevant and highly usable.


Usability Testing Outcomes

The platform achieved strong usability metrics that validated the effectiveness of our design decisions.

IMPACT

Bridging the Gap Between Practice and Progress

Interligence tackled a real and growing problem in standardized test preparation: students aren't lacking motivation; they're lacking structured, supportive tools that help them improve. By focusing on the user experience, we designed a platform that simplifies complex data, mirrors the SAT interface, and motivates learners through clarity and feedback.

This project is more than a senior project; it has the potential to evolve into a real-world solution for students in Thailand and beyond: scalable across subjects, adaptive to learning styles, and affordable enough to level the playing field. With the rise of AI in education, Interligence demonstrates how thoughtful design can humanize technology to meet actual student needs.

Lessons Learned

Throughout this project, I learned that great educational design isn’t about flashy features—it’s about reducing barriers to understanding and creating a system that supports emotional and cognitive focus. Testing with real users helped us uncover blind spots early and often, and guided critical decisions such as simplifying layouts, renaming features, and adding moments of pause within stressful flows.

I also learned the importance of designing not just for functionality, but for mindset. Students under pressure need clarity, control, and small affirmations that they’re on the right path—and good design can deliver that.

Next Steps

Building on this foundation, we envision Interligence growing into a fully scalable learning ecosystem:

  • Support more exams such as ACT, TOEFL, or Thai university entrance tests

  • Add topic-based learning flows with progress tracking and AI-driven study plans

  • Integrate detailed AI explanations for question solving, not just correct answers

  • Build a collaborative version where instructors can assign practices and track student analytics

  • Develop mobile support and offline modes to increase accessibility

By continuing to learn from real users and validating every step through research and testing, Interligence can expand from a prototype into a product that truly transforms the way students prepare—confidently and consistently.

© 2025 Kanis Surajarus