👋🏼 About Me

<aside>

Hi, I’m Siddharth Prabhakar (Sid), a graduate student in Human–Computer Interaction at Indiana University.

My work sits at the intersection of accessibility, interaction design, and emerging technologies. I focus on research-through-design approaches that treat disabled people as experts in their own adaptations, not as edge cases to be corrected.

I’m particularly interested in how everyday products are repurposed to support independence, dignity, and variability, and in how these lived practices can inform more ethical, socially grounded accessibility design.

</aside>

Project Overview

Kuali Time is Indiana University’s timekeeping tool for hourly student employees, who rely on it to clock in, clock out, and correct missed punches in order to get paid. When my team (Siddharth Prabhakar, Tirthraj Rathod, Suhith Vasanth, Diva Shah, and Pratima Chaudhary) looked at the platform, we saw a mismatch between how simple the tool should be and how students actually felt using it: people were anxious about missed punches, confused by decimal time displays, and frustrated by unclear labels. We decided to dig deeper.

Research goals and questions

Our first step was to articulate what we wanted to learn. Drawing on the project brief and the interview plan, we defined a set of research goals.

These goals guided the research questions used in our interviews and observations. For example, we asked participants to rate their satisfaction, describe the most confusing part of Kuali Time and imagine one feature they would add.

Methodology

Participants and recruitment

We recruited ten students and hourly employees for semi‑structured interviews and ten participants for the SUS/think‑aloud sessions. Participants ranged from undergraduates to graduate students and staff, with varied familiarity with Kuali Time. Recruitment was done through peer outreach on campus (libraries, labs, student worker group chats and Discord channels). Each researcher conducted two interviews to ensure diversity of perspectives.

Data collection

  1. Semi‑structured interviews. Participants described their typical workday, devices used, frequency of missed punches and perceptions of trust in the system. Demographic questions captured age, gender and role.
  2. Think‑aloud usability tests. We asked users to perform core tasks such as clocking in/out, correcting a missed punch and checking time details. Success criteria were defined in advance (e.g., clock in within one minute). Participants verbalised their thoughts while we captured hesitations, navigation paths and quotes.
  3. System Usability Scale (SUS). After performing tasks, participants completed the ten‑item SUS questionnaire. Scores were calculated using the standard Brooke (1986) method.
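The standard Brooke scoring procedure mentioned above can be sketched as a short function. This is an illustrative implementation, not the exact script our team used: odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the sum is multiplied by 2.5 to yield a 0–100 score.

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 ratings.

    `responses` is an ordered list: item 1 through item 10.
    """
    if len(responses) != 10 or any(not 1 <= r <= 5 for r in responses):
        raise ValueError("SUS requires exactly ten ratings between 1 and 5")
    total = 0
    for item, r in enumerate(responses, start=1):
        # Odd items are positively worded; even items are negatively worded.
        total += (r - 1) if item % 2 == 1 else (5 - r)
    return total * 2.5  # scale the 0-40 sum to 0-100

# Example: all-neutral responses (3s) yield the midpoint score of 50.
print(sus_score([3] * 10))  # 50.0
```

Scores for a group of participants are then typically averaged, with values around 68 often treated as the benchmark for “average” usability.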

My Role

I conducted two semi-structured interviews and two think-aloud usability sessions, gathering both qualitative insights and performance data. I was also responsible for calculating and interpreting the System Usability Scale (SUS) scores to benchmark overall usability. Beyond hands-on research, I coordinated team communication and schedules and helped plan structured test procedures for both the interviews and think-aloud sessions, which kept sessions consistent and aligned with our research goals.