
UTM Timetable Planner Website

The project addressed usability issues in the University of Toronto Mississauga timetable planner website, which students use to plan out their courses, and delivered a new, user-centered design for the site.

The redesign streamlined the course selection process in an engaging, user-centric way, raising the System Usability Scale rating and user satisfaction rate by 42% and cutting task completion time by 26.3 seconds.


Overview

UX Case Study for University Course: UX Design – Quantitative Methods

Objective

Explore and test different methods of increasing ease of use and stakeholder satisfaction with the current timetable planner site, and propose a new, statistically validated design for it.

Techniques Used

10 + 10 Ideation Method, User Flow Maps, Wireframing, Prototyping, Design Principles, 10 Usability Heuristics, Benchmark Testing, Guerrilla Testing, A/B Testing, Statistical Analysis

Tools Used

InVision, Figma, Mural, Google Survey, SUS Scale, Likert Scale

Responsibilities

Conducting user observation, deriving usability failures, ideating possible design solutions in Figma, drafting the user testing protocol, facilitating guerrilla, benchmark, and A/B testing, performing statistical analysis, and identifying limitations.

Duration

13 Weeks

The Problem

The previous version of the website offered a complicated, ambiguous experience. Courses and the schedule are shown independently: the schedule appears only after all available courses, at the very end of the page. Users cannot watch their courses being added to the schedule and receive no noticeable feedback, so they must scroll back and forth between the course options at the top of the page and their schedule at the bottom.

Problem Statement

How can we improve users' visibility of system status, provide consistent feedback, and support recognition rather than recall during the course selection process?

Our Aim

The Solution


1 Decrease Scrolling

The design offers a split view of courses and schedule, giving clear visibility of live system status to bridge the gulf of evaluation and limiting the scrolling users must do to plan their semesters.

2 Bridging Gulfs

The design offers an alternative hexagonal view of courses to use space efficiently and reduce excessive scrolling. Courses are colour-coded to distinguish the winter (blue) and fall (orange) semesters, giving a clear indication of system status.

3 Bridging Gulfs & Straightforward Process

The design offers a clear, streamlined course selection method, limiting the number of clicks needed to bridge the gulf of execution.

Try it Yourself

Try adding the CCT411H5F CCIT Internship II course to your schedule.

Empathize

Users

We identified our users as students of the University of Toronto, both novice and experienced. Novice users consisted of first- and second-year students as well as prospective future students; experienced users were upper-year students, third year and above.


Through user observation of these students, we derived usability failures and identified areas of user flow breakdown and frustration.

User Flow Map

Old User Flow

A user flow map helped us track the different paths of a user's experience on the site.

User frustrations were met at the stages outlined in orange.

 

We found that users had to do a lot of scrolling back and forth between the course selection and the timetable view, since the timetable appeared at the end of the page, after the overwhelmingly large list of courses.

 

Users had to scroll all the way back up and find the course again when returning from the timetable view to the displayed courses.

This created user frustration and breakdowns in flow (shown in orange) when users added or adjusted their schedules while resolving schedule conflicts.

Define

Usability Fails

We found our users struggled with the excessive scrolling between adding or adjusting courses and seeing their planned schedule.

The process of adding courses violated two of the 10 usability heuristics for UI design:

Recognition rather than Recall

Minimize the user's memory load by making elements, actions, and options visible. The user should not have to remember information required to use the design from one part of the interface to another.

Visibility of System Status

The design should always keep users informed about what is going on, through appropriate feedback within a reasonable amount of time.

Visibility of system status is not met, as users do not have a coinciding view of the timetable and the courses. We noticed users had to scroll all the way down to check whether a course was successfully added to the schedule.

Recognition rather than recall was also violated: if a course's lecture slot conflicted with a user's schedule, they had to scroll all the way back, find the course again, and hunt for another time slot that fit. This, too, stemmed from the non-coinciding views of the timetable and the displayed courses.

Ideate

Initial Concepts

With our aim to decrease scrolling in mind, our team began to create rough design sketches inspired by different real-world designs, using the 10 + 10 ideation method.

Through this method, we arrived at a split view of the courses and the schedule, with courses rendered as separate hexagons in a honeycomb layout to address the limited space and reduce the need for excessive scrolling.

Initial Wireframe

10 + 10 Ideation Method

The ideate portion of our process used the 10 + 10 method for brainstorming solutions. The technique generates early concepts via rough sketches, providing alternative solutions that address the users' pain points.

Below are some of the rough ideas our team conceptualized. 

Design

Initial Design

Our initial prototype was created in InVision and allowed us to guerrilla-test the early design.

Timetable Wireframe Initial Page
Timetable Wireframe Courses
Timetable Wireframe Course Selected
Course Selected
Course Added

New Design User Flow

New vs Old Wireframe

We mapped our new design's process into a new user flow to compare it with the old one. Our group focused on removing the obstacles and breakdowns in flow that users encountered on the old website.

Test & Iterate

Guerrilla Testing

Conducting guerrilla testing with our users allowed us to correct kinks in the early stages of design and revise some design decisions. We analyzed the user feedback from testing and incorporated it into improved mid-fi interactive prototypes in Figma.

Revised Course Selected

Added colour to differentiate winter (blue) and fall (orange) courses.

Blurred and darkened the background when a course is selected, to focus attention and limit the user's available actions.

Separated fall and winter schedule to only show one in screen.

Decreased each row from 5 hexagons to 3 to improve visibility and readability.

Test & Iterate

Benchmark

During benchmark testing we measured interactions with our site against target metrics, such as the number of clicks and time needed to locate and add courses to a schedule, and Likert-scale satisfaction ratings. The testing also produced data and feedback we used to iterate on the prototype design.

We were also able to identify our dependent, independent, and confounding variables and understand how they affect our results.
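The satisfaction ratings mentioned above include System Usability Scale (SUS) scores, which are derived from ten Likert items. A minimal sketch of the standard SUS scoring formula, with a hypothetical participant's responses:

```python
# SUS scoring: odd-numbered (positively worded) items contribute
# (response - 1); even-numbered (negatively worded) items contribute
# (5 - response). The sum is scaled by 2.5 to give a 0-100 score.
def sus_score(responses):
    """responses: list of 10 Likert ratings (1-5), items 1..10 in order."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = 0
    for item, r in enumerate(responses, start=1):
        total += (r - 1) if item % 2 == 1 else (5 - r)
    return total * 2.5

# Hypothetical, fairly satisfied participant:
print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 1]))  # → 85.0
```

Scores above roughly 68 are conventionally read as above-average usability, which is why SUS works well as a benchmark metric across design iterations.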

Revised Initial Screen
Revised Course Selected

Added an interactive advanced course search to our prototype, with more options and filters than the quick search.

Added a blurred background to unify the split view, as users said it looked like two different websites.

Added outlines to hexagons for colour blind accessibility.

Added additional information, such as prerequisites and a course description, when a course is selected.

Design & Evaluate

A/B Testing

The A/B testing portion of this project consisted of comparing an alternative Design B to our original hexagonal Design A with statistical tests.

 

Null and alternative hypotheses were constructed to compare the designs on comprehension (time taken to accomplish tasks) and satisfaction (measured via mental load and System Usability Scale ratings). Statistical tests, such as two-sample t-tests, were run for each dependent variable to determine which design was statistically supported.
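A two-sample comparison like the one described above can be sketched with Welch's t-test, which does not assume equal variances between groups. The scores below are hypothetical stand-ins, not the study's data:

```python
import math
from statistics import mean, variance

# Hypothetical SUS scores (0-100) from two independent participant groups.
design_a = [85.0, 90.0, 80.0, 95.0]
design_b = [60.0, 65.0, 70.0, 55.0]

def welch_t(a, b):
    """Welch's two-sample t statistic and degrees of freedom
    (unequal variances allowed)."""
    va, vb = variance(a), variance(b)          # sample variances (n-1)
    na, nb = len(a), len(b)
    se2 = va / na + vb / nb                    # squared standard error
    t = (mean(a) - mean(b)) / math.sqrt(se2)
    # Welch-Satterthwaite approximation for degrees of freedom:
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

t, df = welch_t(design_a, design_b)
# The two-tailed critical value for alpha = 0.05 at df = 6 is about 2.447;
# |t| above that rejects the null hypothesis of equal mean scores.
print(f"t = {t:.2f}, df = {df:.1f}")
```

In practice a library routine (e.g. SciPy's `ttest_ind` with `equal_var=False`) would also return the p-value directly; the hand-rolled version above just makes the arithmetic visible.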

Below is the Design B interactive prototype we created in Figma. Try adding CCT406 to your schedule!

Design & Evaluate

Final Design

Our statistical tests during A/B testing showed that our hexagonal design A was superior to the alternative card design B in terms of both comprehension and satisfaction, surprising even our team.

Next Steps

  • Refining the A/B testing protocol to include counterbalancing of designs.

  • Refining the testing protocol to be less tedious and to avoid surveyor bias during SUS (System Usability Scale) ratings.

  • Implement course/schedule conflict errors and resolution process.

  • Design and test mobile version of website.

  • Check and test accessibility of site with different user groups to ensure high usability for all user demographics.
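The counterbalancing mentioned in the first next step means alternating the order in which participants see the two designs, so that learning and fatigue effects are spread evenly across conditions. A minimal sketch with hypothetical participant IDs:

```python
from itertools import cycle

# Alternate presentation orders across participants: half see
# Design A first, half see Design B first.
orders = cycle([("A", "B"), ("B", "A")])
participants = [f"P{i}" for i in range(1, 9)]
assignment = {p: order for p, order in zip(participants, orders)}

for p, order in assignment.items():
    print(p, " → ".join(order))
```

With larger studies this idea generalizes to Latin-square designs, but simple alternation is enough for a two-condition A/B comparison.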

Reflection

Through this project and process I learned a few lessons along the way. Some key takeaways:

  • YOU ARE NOT THE USER.

  • Despite what you may assume with certainty, users and data may prove you wrong. Trust the need for a test.

  • Technical or logistical issues will arise during testing. Be prepared for the unpredictable and to improvise on the spot.

  • Always communicate with your team and share materials on what you are working on, in case a sudden stand-in is needed.
