[Image: teacher dashboard cover]

Dashboard Design

// Design Research, UI //

60-second snapshot

The goal

Adaptive tutors, which let students move at their own pace, produce data that is often difficult for teachers to interpret. I worked on two dashboards, Real-Time and Luna, which helped teachers put data from adaptive tutors to use.

The users

The primary users were middle school teachers who used adaptive tutors a couple of times a week to supplement their teaching.

The team

I joined an ongoing research project at CMU’s Human-Computer Interaction Institute led by Kenneth Holstein and Françeska Xhakaj, under the direction of Vincent Aleven and Bruce McLaren.

Real-Time: research methods


[Image: card sorting session]

Card sorting: superpower question

"If you could have any superpowers to help you teach, what would they be and how would you use them?"

With this question, I wanted to communicate that the interview would be exploratory, collaborative, and fun. Since many of the teachers had recently participated in interviews evaluating the Luna interface, I felt it was important to set the tone right away.


[Image: “Wall” speed-dating scenario]

Speed-dating: vagueness

“How do you feel about this type of dashboard?”

Both the speed-dating scenario and the prompt are intentionally vague in order to draw teachers into discussion. If a teacher asked for more details, I would often prompt them to fill in the details themselves with their best-case scenario. Similarly, I encouraged teachers to respond to scenarios with how they felt rather than focusing on exact details.


Luna: interface work

Showing student progress in the tutor against a calendar helps teachers ground time spent online in real classroom time. This filter groups students working on similar problem sets, so teachers can easily pull together groups of students for mini-lessons, a teacher behavior I uncovered in research.
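As a rough illustration of the grouping filter, here is a minimal sketch; the Student shape and its currentProblemSet field are assumptions made for the example, not Luna’s actual data model.

```typescript
// Hypothetical per-student record; field names are assumptions,
// not Luna's actual data model.
interface Student {
  name: string;
  currentProblemSet: string; // e.g. "Combining like terms"
}

// Group students working on the same problem set so a teacher can
// pull an entire group aside for a mini-lesson.
function groupByProblemSet(students: Student[]): Map<string, Student[]> {
  const groups = new Map<string, Student[]>();
  for (const student of students) {
    const group = groups.get(student.currentProblemSet) ?? [];
    group.push(student);
    groups.set(student.currentProblemSet, group);
  }
  return groups;
}
```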


 

Process

Real-Time: process

My objective was to understand what needs a dashboard must fulfill in order for a teacher to circulate effectively throughout a computer lab.

Initial research

My first step after joining the project was listening to previously conducted teacher interviews and working out how the use cases of the two dashboards related to each other. I mapped how each dashboard might be used across a school week to understand its use in real time.

Testing hypotheses

Based on previously conducted research, I wanted to test potential ways to reduce non-ideal teacher behaviors. Addressing these without being overly paternalistic was difficult. For example, this dashboard tells the teacher to ignore John Smith’s raised hand because of its “triage algorithm”.

Final concept

I designed almost all of the interview materials myself and conducted three solo interviews with teachers.

Card sorting

"If you could have any superpowers to help you teach, what would they be and how would you use them?"

After teachers wrote down their superpowers, I had them organize the cards using the “think aloud” protocol. Then I showed them what past teachers had written and asked them to reorganize all of the cards. Their commentary was just as interesting as the cards themselves.

Directed storytelling

"While annotating the computer lab seating chart, how would you usually spend your time?"

After this project, I was dubious about the effectiveness of directed storytelling, as many of the behaviors teachers described were inconsistent with those my colleagues had observed through contextual inquiry. However, I came to see teachers’ idealized versions of their own behavior as a future the dashboard could help them realize.

Speed-dating

"What are your gut reactions to the idea behind these hypothetical 'teacher dashboards'?"

I tested 10 different scenarios with teachers.


[Image: “Watch” speed-dating scenario]

With this scenario I hoped to test two questions: first, how receptive teachers were to new technology (they were), and second, how much they trusted a simplified dashboard that gave no rationale for its decisions (they didn’t).


[Image: “Drone” speed-dating scenario]

Including intentionally absurd scenarios served two main purposes: determining what teachers deemed unequivocally inappropriate and building rapport through humor.



[Image: “Vibration” speed-dating scenario]

The degree to which this bothered teachers was honestly a surprise, but it marked an important boundary I learned not to cross. It seems obvious in hindsight: a dashboard that directly or indirectly doubts a teacher’s intuition is perceived as an attack.


 

Luna: process

My objective was to advance the visual and feature design of the Luna dashboard to help teachers assess their class’s progress in the tutor.

Notification threshold feature

Research showed that whether the tutor was used effectively depended heavily on trust. To build a teacher’s trust in the tutor, I proposed a variable threshold for notifications: for example, the point at which the tutor flags a student as needing help could be tuned to match the teacher’s mental model.

[Images: two notification threshold iterations]
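As a rough sketch of the idea, the threshold could be a single teacher-adjustable number; the SkillEstimate shape and the 0-to-1 mastery scale below are assumptions for illustration, not the tutor’s actual model.

```typescript
// Hypothetical per-skill estimate from the tutor; field names and the
// 0..1 mastery scale are assumptions for this sketch.
interface SkillEstimate {
  student: string;
  skill: string;
  mastery: number; // tutor's confidence that the skill is mastered, 0..1
}

// A teacher-adjustable threshold: raising it flags more students,
// lowering it flags fewer, letting alerts match the teacher's own
// sense of who needs help.
function studentsNeedingHelp(
  estimates: SkillEstimate[],
  masteryThreshold: number
): string[] {
  const flagged = estimates
    .filter((e) => e.mastery < masteryThreshold)
    .map((e) => e.student);
  return [...new Set(flagged)]; // deduplicate students flagged on several skills
}
```

A skeptical teacher might set the threshold low so only clear-cut cases surface, while a trusting teacher might raise it and let the tutor flag more students.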

Feature iterations

Each feature I designed was directly connected to a specific teacher action outlined in either design or academic research. Some speed-dating scenarios were unintentionally helpful in highlighting which features made little sense to teachers; in this case, sparklines.

In this iteration, “Errors” appear more accurate than “Skills” because they are rendered at a higher degree of fidelity. However, the statistical model behind “Skills” is actually much stronger; the visuals of the dashboard needed to reflect this difference.

[Image: low-fidelity dashboard iteration]

Final concept

I split the final dashboard into a “class level” and an “individual level” to accommodate the varying amounts of time teachers were willing to spend using the dashboard. Interestingly, I intentionally avoided notifications (e.g., flagging students who need help), since this would require choosing a notification threshold, a decision outside the project’s focus.

Class level: skills

The class level affords the most common actions I saw in research: quickly assessing class-wide understanding of skills for lecture planning, and finding which students are still having trouble with a skill most of the class has mastered. Note the updated visual weighting between the “Skills” and “Errors” pages, which reflects their differing statistical strength.

Class level: errors

Errors are less actionable for lecture planning and are thus displayed at a more granular level. Including pop-ups with actual student problems (2x + 3 = 10) was important so the teacher could become familiar with the language used by the tutor (“combine unlike terms to make variable”). In addition to increasing a teacher’s adeptness with the tutor, this shared language would build trust.

Feature: student progression

The filters shown (time spent, problem-set grouping, hint use, alphabetical) each reflect either a feature commonly requested by teachers or a current teacher practice.
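A minimal sketch of these four orderings follows, assuming hypothetical per-student fields (minutesInTutor, problemSet, hintsRequested) rather than Luna’s real data.

```typescript
// Assumed per-student fields backing the four filters; names are
// illustrative only.
interface StudentRow {
  name: string;
  minutesInTutor: number;
  problemSet: string;
  hintsRequested: number;
}

type SortKey = "timeSpent" | "problemSet" | "hintUse" | "alphabetical";

// One comparator per filter shown in the dashboard.
const comparators: Record<SortKey, (a: StudentRow, b: StudentRow) => number> = {
  timeSpent: (a, b) => b.minutesInTutor - a.minutesInTutor,
  problemSet: (a, b) => a.problemSet.localeCompare(b.problemSet),
  hintUse: (a, b) => b.hintsRequested - a.hintsRequested,
  alphabetical: (a, b) => a.name.localeCompare(b.name),
};

// Return a newly sorted copy so the underlying roster is untouched.
function sortStudents(rows: StudentRow[], key: SortKey): StudentRow[] {
  return [...rows].sort(comparators[key]);
}
```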

 

References

Vincent Aleven designed these two graphics.

EventDrops: http://marmelab.com/EventDrops/