Context & Problem
About Planning & Self-Study and Lightcast
Watermark’s Program Review solution offers a collaborative workspace where institutions can complete rigorous program review processes with all of the essential inputs in one place to evaluate program effectiveness and drive program progress. Through a strategic partnership with Lightcast, a global leader in labor market data, analytics, and expert guidance, our goal was to make the labor market data provided by Lightcast Analyst directly available to the end users charged with making critical decisions about program viability and competitive offerings.
The Challenge
So what’s the problem? Enrollment has been declining since the pandemic, so it is top of mind for institutions across the country. Institutions are responding in many creative ways, but one vital way to ensure that a university’s programs are worth students’ valuable time and money is to use accurate labor market data to confirm that programs align with current trends and in-demand skills and improve students’ employability after graduation. But how do we get this data to the people who need it?
Currently, up-to-date, accurate data is difficult for our users to acquire because sources are limited and costly. Watermark’s partnership with Lightcast presented an opportunity to put that key data at our users' fingertips. Our goal for this integration was to identify the most common and useful insights for a program’s external landscape scan and make them easy for users to reference while completing their program reviews.
My Role
For this project, I led research and UI/UX design, and worked closely with my product manager and development team to ensure quality and consistency in implementation. The project kicked off in June 2022.
Main Responsibilities
Project Management
Research
UI & UX Design
User Testing
Communication & collaboration between internal and partner teams
Design Toolkit
Sketch + Abstract | Overflow | Mural
Research
What We Wanted To Learn
Gain a better understanding of the external scan data points schools are looking for and using in their program reviews.
Determine which key insights would be ideal to surface.
Methodology
Discovery User Interviews
SME Interviews
Competitive & Industry Best Practices Analysis
Feature Prioritization
User Testing
What We Learned
Through our discovery user and SME interviews, we determined which insights would be most valuable to users and have the greatest impact. We tested our initial workflows and concepts, validating both the need for the feature and our assumptions about how to provide the data where it is needed most.
Working with our internal development teams and Lightcast’s solutions engineers, we determined which criteria we needed to collect from users in order to pull the appropriate data for each program and institution.
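To give a rough sense of the kind of criteria involved, here is a minimal sketch of a per-program setup record. The names, fields, and values are hypothetical and do not reflect Watermark’s or Lightcast’s actual schema or API.

```typescript
// Hypothetical sketch only: illustrative field names, not the real data contract.
interface ProgramInsightCriteria {
  institutionId: string;       // institution requesting the insights
  programCipCode: string;      // CIP code identifying the academic program
  awardLevel: "certificate" | "associate" | "bachelor" | "master" | "doctoral";
  regionRadiusMiles: number;   // geographic radius for the labor market scan
}

// Example: criteria for a bachelor's-level nursing program
const exampleCriteria: ProgramInsightCriteria = {
  institutionId: "inst-001",
  programCipCode: "51.3801",
  awardLevel: "bachelor",
  regionRadiusMiles: 100,
};
```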
Phasing & First Release
Phase I (Released early 2023)
Due to the scope of this initiative and API constraints, production had to happen in phases. For Phase I, our minimal marketable product, we prioritized six key insights based on user feedback, along with the setup flow users complete within each program.
Phase II (Released June 2023)
After a successful launch of Phase I, we conducted initial user feedback sessions. We learned that what institutions considered peer competitors varied from program to program, even within a single institution; our initial release had defined competitors solely by school size within a 100-mile radius of the institution. Users needed a way to specify their top competitors for each program in order to complete an accurate competitive analysis. We also learned that the fixed 100-mile radius, which served as a criteria input for many of the insights provided, did not suit every institution. To better serve smaller community colleges, larger universities, and online learning institutions alike, we designed a way for users to choose the radius that works best for their needs, in 50-mile increments from 50 to 250 miles.
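A minimal sketch of how those Phase II settings could be modeled is below. The types and values are illustrative assumptions, not the production implementation.

```typescript
// Hypothetical model of the Phase II configuration options; names are illustrative.

// Radius is selectable in 50-mile increments from 50 to 250 miles.
type ScanRadiusMiles = 50 | 100 | 150 | 200 | 250;

interface CompetitiveAnalysisSettings {
  radiusMiles: ScanRadiusMiles;        // Phase I used a fixed 100-mile radius
  competitorInstitutionIds: string[];  // user-selected peer competitors for this program
}

// Example: an online-focused program casting a wider net
const onlineProgramSettings: CompetitiveAnalysisSettings = {
  radiusMiles: 250,
  competitorInstitutionIds: ["inst-042", "inst-317"],
};
```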
Phase III (Expected February 2024)
Although our initial feature provided key insights right where program review contributors needed them, we still needed to give users a way to place those data points directly into the narratives they were writing to support their arguments. Phase III, set to launch in February 2024, does just this: with this iteration of the Lightcast + Watermark partnership feature, users will be able to add each insight to any part of their program review narratives.
Next Steps
After our next release, we plan to continue testing and feedback sessions to improve the offering. Additionally, by tracking business goal metrics, we hope to expand the offering and bring even more insights both to the Program Review module and throughout the rest of Watermark’s Planning & Self-Study product.