Context & Problem
About Planning & Self-Study
Watermark’s Assessment Plans tool within Planning & Self-Study enables institutions to manage and assess outcomes for every organization, year over year.
The Problem & Challenge
In 2023, multiple help desk tickets alerted us to a critical issue: users were unknowingly changing and losing important historical data. Outcome measures are reused across time in assessment plans so that institutions can see progress and opportunities for improvement within their departments and organizations. The original functionality only allowed users to Edit measure details, which changed the measure's details everywhere that measure was used, including in all past open and closed Assessment Plans.
By changing the details of a measure, whether unknowingly or knowingly, in all current and historical plans, users could distort their data or results and lose the ability to track and compare the measure over time. Our goal was to improve the usability of measures to better meet our users' needs, both by preventing accidental data loss and by providing an alternative that would let users track changes to their measures across time. We hypothesized that giving users a way to revise their measures, with a recorded history of each version, would help us meet that UX goal.
As a temporary solution, a dialog was implemented warning users of the implications for historical data when editing a measure.
My Role
For this project, I led research efforts and UI/UX design, and worked collaboratively with my product manager and development team to ensure quality and consistency in implementation. The project kicked off in November 2022 and was released in January 2023.
Main Responsibilities
Research
UI & UX Design
User Testing
Design Toolkit
Sketch + Abstract | Overflow | Mural
Research
Overview and Goals
We examined current Measures functionality and opportunities for reporting on Measures across time. Our goals were to better understand how users change Measures in each plan, how the same Measure might be used across programs for comparison (if at all), what insights users look for from Measures across time, and where we could enhance the usability of Measures to make our users' processes more effective and efficient. To accomplish this, we interviewed clients and subject matter experts throughout Q1 of 2023.
Methodology
User Interviews
SME Interviews
Data synthesis & Feature Prioritization
What We Learned
Our research plan around measures had a wider scope than just the Edit issue; however, we were able to validate the concern and gained support for our hypothesis about how we might solve it.
Additionally, we found that users generally had a positive sentiment toward measures but were experiencing some consistent pain points across the board. We hypothesized that improving the usability of measures and enhancing some of their features would greatly reduce these pain points and enable our users to mature their continuous improvement processes.
The Solution
The first iteration of the solution added a way to revise measures, in addition to the existing ability to edit them. The warning dialog was kept, and page-level description copy was added to inform users of the impact of editing a measure and help prevent data loss.
Revise Measure Parameters:
When a measure is revised, all results, past findings, actions, etc. for that measure are brought forward with the new version. A revised measure is still considered the same object through every revision, just with marked and recorded points of change.
Revising a measure is available only in current and future plans; editing is also available in past plans
When editing a measure in a past plan, users will receive a warning dialog about the impact of editing that measure
Any plan marked "Complete" should be locked against any edits
If a future plan's measure is revised, that revision becomes the current version once the plan year begins, since the effective date of a revision equals the reporting year of the plan in which it was made. Any revisions made in the current plan will appear in the future plan's measure revision history.
A plan is considered a "past plan" once the current date is past the last date in that plan's reporting year
A plan is considered a "current plan" if today's date falls within that plan's reporting year dates
A plan is considered a "future plan" if its reporting year dates are after those of the current reporting year (see the sketch after this list)
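To make the plan-state rules above concrete, here is a minimal TypeScript sketch of how a plan could be classified from its reporting-year dates, along with the effective-date rule for revisions. All type and function names here (`ReportingYear`, `classifyPlan`, `isRevisionInEffect`) are hypothetical illustrations of the logic described, not Watermark's actual implementation.

```typescript
// Hypothetical model of a plan's reporting year; for illustration only.
interface ReportingYear {
  start: Date; // first date of the plan's reporting year
  end: Date;   // last date of the plan's reporting year
}

type PlanState = "past" | "current" | "future";

// Classify a plan per the definitions above:
//   past    -> today is past the last date of the plan's reporting year
//   current -> today falls within the reporting year dates
//   future  -> the plan's reporting year has not started yet
function classifyPlan(year: ReportingYear, today: Date = new Date()): PlanState {
  if (today > year.end) return "past";
  if (today >= year.start) return "current";
  return "future";
}

// A revision's effective date equals the reporting year of the plan it was
// made in, so a revision made in a future plan becomes the current version
// once that plan year begins.
function isRevisionInEffect(effectiveYear: ReportingYear, today: Date = new Date()): boolean {
  return today >= effectiveYear.start;
}
```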
Edit Measure Parameters:
The Edit measure flow will still include the warning dialog implemented earlier this year
The Edit measure immersive will include a page-level description
Deleting a measure will delete **all** versions of the measure from **all in-progress** plans
Plans marked "Complete" will be essentially locked down: editing, revising, and deleting in open plans should have NO impact on a completed plan. If a plan is re-opened after an edit, revision, or deletion occurs, its measures will exist as they were when the plan was completed; nothing takes effect retroactively (sketched below).
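As a rough illustration of that locking behavior, a completed plan can be thought of as holding a frozen snapshot of each measure version it used, so later edits, revisions, or deletions in open plans never reach it. Again, these names (`MeasureVersion`, `Plan`, `completePlan`) are hypothetical; this is a sketch of the concept, not the product's data model.

```typescript
// Hypothetical sketch of measure versioning; not the product's actual data model.
interface MeasureVersion {
  measureId: string;
  version: number;       // incremented on each revision
  details: string;       // the measure's editable details
  effectiveYear: number; // a revision takes effect in its plan's reporting year
}

interface Plan {
  reportingYear: number;
  status: "in-progress" | "complete";
  // In-progress plans reference the live measure; completed plans keep a
  // frozen copy, so changes elsewhere never affect them.
  measureSnapshots: MeasureVersion[];
}

// Completing a plan freezes the measure versions as they exist at that moment,
// so re-opening the plan later shows the measures exactly as they were.
function completePlan(plan: Plan, liveVersions: MeasureVersion[]): Plan {
  return {
    ...plan,
    status: "complete",
    measureSnapshots: liveVersions.map(v => ({ ...v })), // copy, don't reference
  };
}
```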
First Release & Next Steps
Measure revisions were successfully released in January 2023 and were well received. Ongoing efforts to improve assessment plans and measures surfaced early feedback that some of the restrictions we had added to the revising functionality needed to be loosened to fit the individual needs of our clients. We had initially restricted revisions to current and future plans, but soon learned that most institutions do the bulk of their review of the previous year's assessment plans well into what we considered the current year. An update allowing revisions in past years was released in May 2023, and we have received positive feedback since alleviating that pain point.