Microsoft Whiteboard in-app teaching

Microsoft Whiteboard is a real-time collaborative, free-form canvas that I helped create. This is a vignette of one of Whiteboard’s major components. For more information on Whiteboard as a whole, as well as other major aspects that I designed, see my Whiteboard overview page.

When we released the early versions of Microsoft Whiteboard, we knew we’d built an easy-to-use app, but we were also introducing a huge audience to a relatively new category of software: an always-connected, collaborative canvas where keyboard and mouse were not the primary input methods and the artifact was not a file attachment.

Problem

Several of the features in Whiteboard weren’t as self-explanatory as we would have liked. Too many new concepts were introduced at once, and some users were missing key aspects of the value proposition.

User research

Over a period of a few months, we brought users in for weekly testing sessions to observe their behavior and how successful they were (or weren’t) in using Microsoft Whiteboard. We found a few particularly interesting things:

  • When presented with a digital stylus (like a Surface Pen), many users didn’t know how to use it: drawing directly on the screen, flipping the pen over to erase, holding the button for lasso select, and resting a palm on the screen
  • Our goal was for every Whiteboard to be a collaborative Whiteboard, but even with a First Run Experience (FRE), marketing messaging around the concept, and a highly prioritized collaboration experience, many users didn’t expect their Whiteboards to be shareable
  • When presented with a touch screen, a mouse, and a pen, users weren’t sure which device to use for which task

Initial explorations

Video: Triggered callouts appear when the user has or hasn’t taken some action over a period of time.

So we implemented a set of basic triggered callouts. These guide the user over a period much longer than the typical First Run Experience, and they were an early attempt to address the findings above.

When triggered callouts like these first started appearing in applications, they did quite well at grabbing users’ attention and teaching them what they needed to know. But as they became more commonplace over time, they grew less noticeable and less effective.
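As a rough illustration of the pattern (a minimal sketch, not Whiteboard’s actual implementation), a triggered callout can be modeled as a rule that fires when the user has, or hasn’t, taken a given action within a time window. The names below (CalloutRule, CalloutScheduler, markAction) are hypothetical.

```typescript
// Hypothetical sketch of a triggered-callout scheduler.
// A rule fires when the user has taken an action (e.g. inked for the first time)
// or has *not* taken it within a time window (e.g. never shared a board).

type CalloutRule = {
  id: string;                               // stable id so a callout is shown at most once
  action: string;                           // the user action this rule watches
  fireWhen: "performed" | "not-performed";
  afterMs: number;                          // for "not-performed" rules: how long to wait before nudging
  show: () => void;                         // renders the callout UI
};

class CalloutScheduler {
  private lastSeen = new Map<string, number>(); // action -> timestamp of last occurrence
  private shown = new Set<string>();            // rule ids that have already fired

  constructor(private rules: CalloutRule[]) {}

  // Call from app event handlers, e.g. markAction("ink"), markAction("share").
  markAction(action: string): void {
    this.lastSeen.set(action, Date.now());
  }

  // Poll periodically (or on app events) to decide which callouts to surface.
  evaluate(sessionStart: number, now = Date.now()): void {
    for (const rule of this.rules) {
      if (this.shown.has(rule.id)) continue;

      const performed = this.lastSeen.has(rule.action);
      const windowElapsed = now - sessionStart >= rule.afterMs;

      const shouldFire =
        rule.fireWhen === "performed"
          ? performed
          : !performed && windowElapsed; // user never did it and enough time has passed

      if (shouldFire) {
        this.shown.add(rule.id);
        rule.show();
      }
    }
  }
}

// Example: nudge users who haven't shared their board after two minutes.
const scheduler = new CalloutScheduler([
  {
    id: "share-nudge",
    action: "share",
    fireWhen: "not-performed",
    afterMs: 2 * 60 * 1000,
    show: () => console.log("Did you know this whiteboard is shareable?"),
  },
]);

const sessionStart = Date.now();
setInterval(() => scheduler.evaluate(sessionStart), 10_000);
```

The “not-performed” case is what lets a callout nudge users toward value they haven’t discovered yet, such as sharing a board.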

Since the triggered callouts shipped, I’ve done several explorations around guiding the user through the app much like the opening of a video game: a few primary tasks, with an AI recognizing when each has been accomplished.

Further explorations

Video: “BoardBot” was an exploration of a digital companion that teaches the user about Whiteboard.

BoardBot was one attempt at guiding the user through a few simple steps. But it also quietly conveyed that other people could pop in and out of Whiteboards. With BoardBot, we’re able to show the collaborative aspects of the app from the user’s first interaction with the canvas.

Status

Triggered callouts have been in Whiteboard for some time and have seen some success. Explorations such as BoardBot are still ongoing.