MCIEA SQM Dashboard
A reimagined performance dashboard designed to help school leaders look beyond standardized test scores
The Massachusetts Consortium for Innovative Education Assessment (MCIEA) is a partnership of eight MA public school districts and their local teacher unions, joined together to create a fair and effective accountability system that offers a more dynamic picture of student learning and school quality than a single standardized test. MCIEA seeks to increase achievement for all students, close prevailing opportunity gaps among subgroups, and prepare a diversity of students for college, career, and life.
MCIEA approached VMware Tanzu Act in 2021 for help reimagining the existing School Quality Measures (SQM) dashboard, which leverages an evidence-based SQM framework intended to create a comprehensive picture of school performance. With limited resources and technical experience, the MCIEA team wanted to move quickly while learning as much as they could about modern software development practices.
The team comprised 2 Product Managers (1 client & 1 Tanzu); 2 Product Designers (one who helped kick off the project and handed it off to me post-Discovery); 2-3 Developers (1 client & 2 Pivotal); and 1 SQM subject matter expert.
Before I joined, the team had aligned on the key goals, the theory of change (i.e. an encapsulation of what impact the organization would like to have with the product over time and how they would measure that impact), and the key problem area opportunities they hoped the new and improved SQM Dashboard would address.
The problem area opportunities in particular were identified from a round of research that included speaking to six District Leaders (DLs), nine Instructional Leaders (ILs), and several subject matter experts (SMEs) at MCIEA. The questions revolved around experiences with the existing dashboard and possible areas of improvement.
The team prioritized three DL-oriented opportunities since DLs were determined to be most likely to leverage the data as well as help the consortium expand beyond its initial set of schools in Massachusetts:
Need for data to support positive action and improvement in order to impact school quality
Need for data comparison capabilities (e.g. student demographics) in order to identify inequalities
Need to see data over time to determine whether schools are improving or if their plans are working
With prioritized problems in hand, the team quickly moved to determine an MVP solution. To do this, they:
Wrote a scenario mapping out how Carrie, the DL persona, would benefit from the improved dashboard
Reviewed other public-facing educational dashboards (e.g. ProPublica, Urban Institute, EdBuild, Oakland Unified School District, and the Rennie Center) to get inspiration
Held a design studio to surface the team's early thinking about how the dashboard might look and work
This is approximately when I joined the team. My goal was to get up to speed on the product direction, resolve any unanswered questions, and iterate through the design toward a successful MVP.
After a few handoff conversations with the previous designer, I spent some time understanding the initial design direction and visualizing the information architecture in order to clarify how a DL would move through the dashboard content.
Given the limited time we had together as a large team, we moved quickly to validate the main dashboard and content browse views via team design critiques, SME reviews, and DL / IL research. Here's an example of findings from a round of DL / IL research with additional SME feedback layered onto it:
The SME reviews in particular proved to be very helpful in distilling the MCIEA ethos into the designs. Naturally DL & IL feedback drove most of the direction, but SME feedback ensured we hewed closely to the theory of change. Two examples:
We moved away from the legacy chart color scheme (a spectrum from red/pink to green) as it actively reinforced the concept of "good" and "bad" results in each category—something MCIEA's framework does not support. I opted instead for a visual binary of gold and purple to indicate category achievement because they did not have the same connotations and were complementary to the MCIEA brand colors.
In a similar vein, we moved away from letter-based SQM framework zone indicators since they weren't clear to either user group and carried the connotation of "letter grades". I iterated us toward Harvey balls instead, which, in their interim state, did not visually celebrate the accomplishment of reaching "Approval"—not the highest zone, but itself a very good achievement that MCIEA wanted schools to feel. Ultimately, this led me to using iconography to indicate that "Approval" was an achievement and "Ideal" was exemplary.
Eventually we felt confident in the direction of the landing page, main dashboard, and browsable content, so I was able to lead the team in determining how we might incorporate data analysis into the product. This consisted of scenario writing, a design studio (for initial team sketching), design exploration, SME feedback loops, and DL / IL research sessions.
At this point the dashboard had really taken shape, and we had gotten far enough ahead of our engineers to keep a good buffer of validated stories to build. Cognizant that the MCIEA team would be left without a designer after my eventual roll-off, I worked to make it easy for a future designer to pick up where I'd left off. This took two forms.
First, a design system. I had already been working on a basic set of components in parallel with day-to-day product work, so I had a general idea of the types of elements to include. I eventually created a fairly robust system in Figma that included both high-level design principles and usable components.
Second, some design contractor hiring tips. The MCIEA team had not hired designers before and weren't well-versed in determining a good fit for their work. In concert with some folks on the VMware Tanzu Labs delivery leadership team, I helped draft a "cheat sheet" of sorts that eventually became this public-facing guide: Hiring a Design Contractor or Vendor. Our main MCIEA contact was very appreciative.
Ultimately, we were able to design and build a robust MVP that allowed MCIEA to provide a significant amount of value to their consortium school districts. Here's a taste of what the SQM Dashboard is envisioned to become.
The VMware team rolled off in late 2021 after approximately 2.5 months of work but left the team set up for success with a working app deployed to both beta and production environments and a path toward a public launch. After some additional development work (including reshaping historical data to fit the new visualizations) by the MCIEA team, the first version of the reimagined MCIEA SQM Dashboard launched publicly in spring 2022. As of this writing it is still under active development.
• • •
If you're interested in learning even more about the work, please check out these in-depth resources:
Using Education and Software Practices to Change the Conversation Around School Quality (VMware Tanzu Blog)
Measuring school quality beyond standardized test results (VMware Tanzu Act case study)
And if MCIEA's work has inspired you, I'm sure they'd love for you to follow them on Twitter.