Role
Product Designer
Team
1 Product Manager, 2 Designers, 5 Developers, 1 Marketing Manager
Timeline
September - December 2024
Tools/Skills
Figma, Adobe Creative Suite, User Research, Market Research, User Testing
THE STORY BEHIND CU REVIEWS
Every semester at Cornell, students face the same daunting question: "Which classes should I take?" Cornell Digital Technology & Innovation (DTI) set out to solve this problem by launching CU Reviews, a platform where students could share their honest course experiences with peers. This solution has grown into a site with over 17,000 users, collectively contributing more than 5,000 course reviews.
THE CHALLENGE
While the student-facing side of CU Reviews was thriving, behind the scenes was a different story. When I joined the CU Reviews team, I discovered our admin dashboard hadn't been touched since its initial setup in 2018. The existing system for managing reviews was outdated and limited in functionality, leaving admin users confused. My task was to overhaul this experience while adding crucial features for managing admin users and tracking analytics.
USER RESEARCH
Understanding User Needs
I kicked off the project by interviewing five admin users from different roles – designers, developers, product managers, and marketers. What struck me immediately was how rarely anyone used the admin page. As one admin put it, "I barely remember the last time I used the page – there's nothing that's relevant to me."
Through affinity mapping of the interview responses, I uncovered several critical issues with the existing admin site that left our team feeling uncertain and uninformed.

1. Cluttered With Non-Functional Features
Despite multiple features being listed, only the "Manage Administrators" button worked – and even it failed to create proper admin profiles. Developers clarified that most of the buttons were test functions never intended for team use, leaving team members confused: "I never know if I should even try clicking those buttons."
2. No System Feedback
Users were most frustrated by the complete absence of feedback when interacting with buttons, which left the team second-guessing their actions: "Did that actually work?" This uncertainty made even simple tasks feel risky.
3. Lack of Meaningful Metrics
Users found that existing metrics left them in the dark, preventing our marketing and product teams from tracking platform growth and making data-driven decisions. As our PM shared, "There's nothing here that really lets me know how we're doing."

While different roles had varying needs, there was unanimous agreement that the current system fell short of expectations.

INFORMATION ARCHITECTURE / WIREFRAMES
My First Draft
Guided by these user needs, I defined content requirements for the new admin site. My first instinct was to consolidate everything onto a single page, so I sketched a comprehensive dashboard that included all features – review management, admin controls, and analytics.
A More Focused Architecture
However, during a discussion with developers, the Technical PM made a compelling point: "Our admin interface shouldn't try to do everything at once. It should be organized and focused." This led to a pivotal decision to split the functionality into distinct pages, each serving a specific purpose. I created a new information architecture and revised my plan to reflect it.
ITERATION AND FEEDBACK
Deciding On Button Layouts
As I moved into mid-fidelity designs, I faced an interesting dilemma with the review approval interface. Should the approve and reject buttons sit far apart to prevent mistakes, or close together for efficiency? To answer this question, I brought both versions to users for testing and quickly found that they preferred adjacent buttons – while separation might prevent errors, the efficiency gained from quick comparisons proved more valuable.
I also added universally recognizable icons alongside the text to ensure quick, intuitive understanding, and expanded beyond our design system's color palette to meet WCAG color-contrast standards, making the buttons clear and distinguishable for users of all visual abilities.
Crafting the Analytics Dashboard
The analytics page proved particularly challenging. I first wrestled with questions like "Should we use library components or custom graphs?" Although pre-built components would be faster to implement, custom graphs would give us the flexibility to evolve with our needs. After consulting with developers, I decided to proceed with designing custom graphs that could serve as references for either approach during implementation.
After creating several iterations of the data visualizations, I shared versions with my subteam for feedback. Through our discussions, we landed on three key elements that told the clearest story: a consolidated overview adjustable by date, a linear progress bar to visualize goal metrics, and categorized bar graphs that made data easy to understand.
The Power of Fresh Eyes
One of my most valuable insights came from an unexpected source – a new team member. While testing the interface for adding new admins, they questioned why this feature was built into the page header rather than being a separate modal. The rest of the team, myself included, had accepted this pattern simply because it existed in the old interface.

This feedback was later echoed during a design critique session, where other designers pointed out that the overhead bar was counterintuitive, as it resembled a search or filter bar rather than an admin management tool. They independently suggested the same solution – converting it to a button that would trigger a pop-up modal.

This parallel feedback from both a fresh team member and external designers served as a reminder of how outside perspectives help us see past our unconscious biases. Their suggestions helped me completely rethink the admin management flow, resulting in a more intuitive modal-based approach.

INTERACTION DESIGN
Designing for Clarity and Confidence
To address the lack of feedback in the original interface, I focused on adding both feedforward and feedback to every interactive element, building a more intuitive and reassuring user experience.
I spent time digging into Material Design principles on material.io to guide my decisions. For feedforward, I added hover states to show users that the system was ready to process their actions. For feedback, I introduced visual and contextual cues – loading states, color changes, and snackbars confirming successfully approved reviews. I also added safeguard pop-ups for critical decisions, like deleting reviews or removing admins, to help users double-check their choices and avoid mistakes.
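For developers picking up the handoff, the safeguard-then-feedback sequence can be sketched as a simple state model. This is a minimal, hypothetical illustration – the names (`UiState`, `approveReview`) are mine for the example and not from the actual CU Reviews codebase:

```typescript
// States an interactive element moves through during the approve flow.
type UiState = "idle" | "confirming" | "loading" | "success" | "error";

interface Feedback {
  state: UiState;
  snackbar?: string; // contextual cue shown once the action resolves
}

// Simulates the sequence: safeguard pop-up -> loading state -> snackbar.
function approveReview(confirmed: boolean, apiOk: boolean): Feedback[] {
  const steps: Feedback[] = [{ state: "confirming" }]; // safeguard pop-up
  if (!confirmed) return [...steps, { state: "idle" }]; // user backed out safely
  steps.push({ state: "loading" }); // feedback: action is in flight
  return apiOk
    ? [...steps, { state: "success", snackbar: "Review approved" }]
    : [...steps, { state: "error", snackbar: "Something went wrong" }];
}
```

The key design property is that every path – confirm, cancel, success, failure – ends in a visible state, so no action resolves silently.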
COHESIVE DESIGN
Unifying Patterns Across The Platform
As I was finalizing my designs, I made an important discovery: a fellow designer on a different CU Reviews project had created filter and sort patterns that included a reset button – a thoughtful detail I had missed in my own work. Even though their feature wasn't live yet, I knew we needed to be consistent, so I quickly updated my designs to match.
This moment stuck with me. It reminded me that good design isn't about getting everything perfect the first time – it's about staying open to improvements, even in those final stages. While it's easy to rush toward the finish line, taking a step back to align with other designers' work ultimately creates a better, more cohesive experience.
FINAL PROTOTYPES
Approving Reviews
Admins can now confidently sort through reviews with safeguard pop-ups and confirmation snackbars. They can also seamlessly navigate tabs to view recently approved reviews and reported reviews.
Viewing Analytics
Admins can finally monitor key metrics to understand how CU Reviews is performing. They can also select a specific time period, allowing them to see trends over time and measure the impact of marketing events.
Managing Admin Users
Users can filter, sort, and search to find specific admins in a centralized interface. They can also add, edit, or remove admin accounts.
Testing Developer Tools
Developers now have a place to test in-progress functions and easily view testing history. They can also quickly access GPT Costing information.
WHAT'S NEXT
As I wrap up my design work for CU Reviews, I’ve handed off detailed documentation to the developers to guide the next phase: implementation. While I’ll be shifting my focus to new design projects, I’m not stepping away completely. I’ll stay involved through check-ins to answer questions, clarify design choices, and ensure that the thoughtful decisions I made throughout the design process come to life as intended. After all, the success of this redesign will ultimately be measured by how well it serves our admin users in practice.
REFLECTIONS
Learning to Work Cross-Functionally 
This project was my first deep dive into cross-functional collaboration, and it reshaped how I approach design. I found myself reaching out to teammates across roles for their unique perspectives – my PM helped prioritize which metrics mattered most, our marketing manager shared ideas for effective data visualization, and our developers kept my goals grounded in technical realities.
One moment that stood out was when I initially suggested color-changing progress bars to indicate completion status. It seemed like a great idea to me, but the developers explained it would add unnecessary complexity for an internal tool. Conversations like these clarified my goals, helping me focus on what matters in the project’s context.
Feedback, Feedback, Feedback!
I made it a habit to seek feedback at every stage – whether through methodical user testing, formal critique sessions with other designers, or quick questions in our team Slack. Each conversation helped shape the rationale behind my design decisions. This constant feedback loop was essential in helping me spot areas for improvement, uncover perspectives I hadn't considered, and ultimately craft a well-rounded solution.
