Panorama Education / 2024
Improving Student Outcomes with AI ✨
How my team at Panorama leveraged new AI capabilities to give educators streamlined workflows that save significant time.
Panorama Education is an education technology company helping schools and districts transform their approach to education. As Head of Product Design, I played an instrumental role in rallying and leading the Design team to successfully ship three AI-powered Beta features in just 8 weeks.
✨ Outcomes
Designed and shipped three AI-powered features in under 8 weeks
300+ school districts have opted in
Saved educators countless hours by supercharging our platform capabilities
Problem overview
Many companies across the ed-tech space have been experiencing challenges with growth and client retention due to district budget cuts and other factors. Our leadership team believed investing in innovative AI capabilities was critical to enable us to provide more value to our clients and to differentiate ourselves from competitors.
To achieve this, we were challenged to build and ship three AI features by the end of 2023.
Our approach & challenges
Motivating the team
Tech leadership realized that this could easily feel like a top-down mandate for team members. My approach included reiterating why this work was so critical for the company at this moment, and generating enthusiasm for the designers to have a real opportunity to explore AI. Ensuring the team understood why this mattered for us and for our clients was the first hurdle. Some were pumped from the get-go, while others were skeptical. We focused heavily on framing the narrative around where we believed AI could bring real, meaningful value to educators.
Rapid iteration and development
In lieu of user research (there just wasn’t any time), PMs and Designers looked at previous user feedback we’d collected to find some opportunities and pain points to consider. Designers and engineers worked very closely together to learn what was possible and how we might express things in the UI.
Rather than trying to create a standalone experience, the team strongly felt that we should experiment rapidly to enhance current capabilities in the platform. We wanted to ensure the features fit seamlessly into our existing products and amplified existing capabilities. We also decided to release these features in Beta to set clear expectations with users that the features would improve over time.
Setting the team up for success
Under my guidance, the team explored high-value problems to solve and rapidly iterated user-centric solutions that were small enough in scope to achieve in our timeline. To set the designers and their squads up for success, I helped ensure:
Support mechanisms: We empowered squads with flexible working arrangements, schedule autonomy, focus time, and direct access to leadership to help unblock issues quickly.
Close collaboration: Through rapid brainstorming sessions, each squad identified high-value user problems to address and ideated feasible solutions within our timeframe.
Reviews for alignment: To ensure coherence across the features, we conducted frequent design reviews within technology and cross-functionally across the company.
The solutions
With Power Search, educators can type a simple prompt to quickly filter students and identify patterns and trends.
The Students Overview dashboard displays rich information about each student's academic, behavioral, attendance, and social-emotional learning progress. There are powerful filtering options, but it can be cumbersome to navigate the massive list of filters to narrow the list of students. The team asked themselves: how might we enable educators to discover trends and identify students who need support as quickly and effectively as possible?
The team leveraged AI to supercharge how users identify students who meet certain criteria, essentially removing the need to manually use the filter panel. Educators can use prompts such as “show me students with declining attendance”, or “show me students with math interventions and improving math grades.” It’s now easier than ever—and more discoverable—to find students who need support.
FEATURE ONE
The search feature was designed to be prominently displayed at the top of the page, and provided example prompts to help users get started.
After interacting with the search prompt, users would see the filtered list of students who met the criteria based on their search.
Student Insights concisely summarizes a student’s challenges and strengths, providing educators with at-a-glance trends to help boost decision-making.
Educators often use Panorama during student support meetings to understand more about each student they are discussing. The problem is there's a LOT of data, and it's not clear what's most important to focus on. Students too often fall through the cracks, their needs going unidentified. How might we make the decision-making process faster and more effective for teams, so they can discuss more students and plan better supports?
With one click, Student Insights uses AI to generate a concise summary of a student’s key challenges and strengths, providing educators with at-a-glance trends and suggestions for potential next steps to help boost decision-making. This empowers educators to have more productive and efficient case management discussions.
FEATURE TWO
The insights summary was also positioned prominently on the student profile page. Users click Uncover Insights to initiate summary generation.
The team landed on surfacing three insights per student, so as not to overwhelm users. Once generated, users can show or hide the summary, and copy it to easily share externally.
Signal sorts hundreds of thousands of survey responses to identify sensitive content, enabling educators to take immediate action for students in need.
When districts run student surveys about school climate, belonging, relationships with adults and peers, etc., there are often free response questions where students can voice anything on their minds. The challenge is that it's not easy for staff to sift through those responses to see if students are having really negative experiences, expressing thoughts of self-harm, or even worse. This manual process is time intensive and has a high potential for human error. Simply put, it takes too long to discover if students are seeking help.
The team asked themselves: how might we enable staff to rapidly analyze free response survey data to identify students who need urgent support and ensure their needs are met as quickly as possible?
This feature leverages AI to rapidly sort hundreds of thousands of survey responses to identify sensitive content. Free responses that have been flagged with a key concern (e.g., Self Harm, Abuse, Bullying) are displayed in a simple table for review. The data can be easily exported for further analysis if desired, enabling educators to take immediate action for students in need.
FEATURE THREE
“The beta features I was able to enable this evening are fantastic! I have much more playing to do, but I wanted to compliment Panorama’s commitment to continuously improving data dashboards that support the work!
Thank you so much.”
— Asst. Superintendent from Ohio
Outcomes
Ultimately, the team successfully shipped all three features in Beta on time.
Our clients were given the option to opt in to have the Beta features turned on for their district. Within the first 3 days, 58 clients opted in. As of June 2024, over 300 clients have opted in and are actively using these features.
58 clients
Opted in by day 3, Dec 2023
334 clients
Opted in as of July 2024
We continue to see interest in the betas across districts of all sizes, and the clients to date represent over $18MM in ARR. While we of course cannot attribute this ARR to the betas themselves, they give us concrete examples of product innovation across our product lines to point to in upcoming renewal conversations, which we hope will support our retention goals.
Feedback & iteration
We’ve been able to gather feedback from our clients through a variety of means, but one of the fastest methods is hearing directly from our account managers, customer experience team, and the sales team. We’ve learned a ton through these Beta features to date.
We've gathered feedback and usage data and acted on it. Two specific examples: for Power Search, we found users were simply typing in a student's name to find them, which returned no results because we hadn't initially designed for that; a simple fix enabled name lookups. Engineering also spent time refining Signal's language model to drastically improve the accuracy of what was flagged and what was missed.
What’s next?
We are continuing on our AI journey with several prototypes in early stages, and hope to bring those features to Beta in the coming months. We're also figuring out how to involve design in experimentation much earlier in the process, so we can collaborate with engineering from the start.
Another AI feature currently in the works