Slate Implementation Testing Help

Qualitative Feedback Notes

Introduction

This document captures qualitative feedback collected during the user testing phase of the Slate to Banner integration. Feedback was gathered from staff and student testers to understand their experiences, identify usability issues, and propose actionable improvements.

Feedback Collection Methods

  1. Survey Forms

    • Participants completed structured surveys after testing specific workflows.

  2. One-on-One Interviews

    • Selected testers shared detailed insights during follow-up interviews.

  3. Observation Notes

    • Test facilitators recorded observations during live testing sessions.

  4. Open Feedback Forms

    • Testers provided free-form feedback at the end of each session.

Feedback Themes

1. User Interface (UI) and Navigation

  • Positive: "The layout is clean and intuitive."

  • Concerns:

    • Dropdown menus were not always responsive.

    • "Back" button navigation was confusing on multi-page forms.

2. Accessibility

  • Positive: "Keyboard navigation works well for the most part."

  • Concerns:

    • Some screen reader users struggled to interpret error messages.

    • Inconsistent focus indicators for form fields.

3. Error Messaging

  • Positive: "Helpful prompts made correcting errors easier."

  • Concerns:

    • Error messages could be more specific (e.g., "Invalid date format" vs. "Invalid entry").

    • Overlapping error text on mobile devices.

4. Conditional Logic

  • Positive: "Fields appearing dynamically based on selections felt seamless."

  • Concerns:

    • Conditional fields sometimes disappeared too quickly, causing confusion.

5. Process Flow

  • Positive: "The steps are logical and easy to follow."

  • Concerns:

    • Confirmation email delays created uncertainty about submission status.

    • Dual enrollment questions were unclear for transfer students.

Detailed Feedback by Participant Group

Staff Testers

  • Comment: "The administrative tools are robust but need clearer instructions."

  • Recommendation: Include tooltips with brief descriptions for admin-facing features.

Student Testers

  • Comment: "The form was long, but I liked that it auto-saved my progress."

  • Recommendation: Provide a visible progress bar for better tracking.

International Students

  • Comment: "The visa and residency fields were straightforward."

  • Recommendation: Add examples or tooltips for document upload requirements.

Part-Time Applicants

  • Comment: "The application didn’t ask me about my work schedule preferences."

  • Recommendation: Add optional questions about work schedules for part-time applicants.

Recommendations

  1. Enhance Error Messages

    • Use specific, actionable language in error prompts.

    • Ensure error messages are compatible with screen readers.

  2. Optimize UI for Mobile Devices

    • Test for and correct overlapping elements on small screens.

    • Ensure touch targets are appropriately sized.

  3. Improve Accessibility

    • Add consistent focus indicators for all form fields.

    • Review and refine screen reader labels.

  4. Streamline Process Flow

    • Address email delivery delays.

    • Clarify complex fields, such as dual enrollment and transfer questions.

  5. Provide Feedback Mechanisms

    • Include an "Add Feedback" button on the application portal for live feedback.

Next Steps

  1. Prioritize fixes based on critical concerns identified in this document.

  2. Schedule a follow-up testing phase to validate resolved issues.

  3. Incorporate feedback themes into ongoing development and training materials.

Last modified: 13 January 2025