We came across some interesting insights. Users performed micro-interactions like making side notes, "underlining" important points, and marking content for later review using different symbols and markers. They supported note-taking with other tools like calculators, converters, graphs, and scales. They didn't want to think too much while taking notes. They also used their non-preferred hand while interacting with paper: one hand to move (reposition) the paper and the other to write on it.
Use of other tools
Use of non-preferred hand
Based on these insights, we created scenarios to generate ideas. Among numerous scenarios, two stood out: using a second surface to keep notes actionable, and using gestures to perform tasks such as adding reminders and highlighting.
A tool-centered interface
Second Surface: an actionable space for your static notes
Augment the capability of a note-taking app by providing advanced features through pen (stylus) gesture-based interaction.
We proposed a tool-centered interface that reused existing, familiar components of the note-taking application. This interface was extended with a Second Surface, which would function as an actionable space for your static notes. The Second Surface leveraged the proposed interaction paradigm, which uses pen-based gestures to replicate the experience of taking notes on paper on a digital canvas.
Gap in current market: Most applications use multi-touch or marking menus. None of the popular applications we surveyed used any form of gestural language. Additionally, they didn't go beyond simple functions.
Novelty: The application differed in the way the user interacts with the system to perform common tasks. The gestures were adopted from users' everyday note-taking behavior.
Second Surface: Avoiding Faulty Detection
When using other writing apps, users encountered many problems with faulty detection, which led to a poor experience. Thus, we designed the Second Surface as an actionable mode: users write with ease in one mode (Note Mode) and perform smart functions in the other (Second Surface).
Basic Interaction: Change Mode -> Gesture
Users can write anything in Note Mode, and a long press takes them to the Second Surface, where performing a gesture triggers an action. Some functions work similarly to smart search on the Google search engine. For example, writing "Convert 1.2 minutes to degrees" would convert the value to degrees.
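A conversion command like this could be parsed along the following lines. This is a minimal sketch, assuming the recognized handwriting arrives as plain text; the `smart_convert` helper and the unit table are illustrative assumptions, not the project's actual implementation:

```python
import re

# Illustrative conversion table: arc units expressed in degrees.
TO_DEGREES = {
    "minute": 1 / 60.0,    # arcminutes
    "second": 1 / 3600.0,  # arcseconds
    "degree": 1.0,
}

def smart_convert(command: str):
    """Parse a handwritten command like 'Convert 1.2 minutes to degrees'.

    Returns the converted value, or None if the text is not a
    conversion command (so other gesture handlers can try it).
    """
    match = re.match(r"convert\s+([\d.]+)\s+(\w+)\s+to\s+(\w+)",
                     command.strip(), re.IGNORECASE)
    if not match:
        return None
    value = float(match.group(1))
    src = match.group(2).lower().rstrip("s")   # normalize plural forms
    dst = match.group(3).lower().rstrip("s")
    if src not in TO_DEGREES or dst not in TO_DEGREES:
        return None
    return value * TO_DEGREES[src] / TO_DEGREES[dst]
```

Returning `None` for unrecognized text matters here: it lets the Second Surface fall back to other gesture interpretations instead of producing the faulty detections the design set out to avoid.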
Perform Action on certain text: Change Mode -> Select -> Gesture
Users can mark certain parts of the text so that they can refer to them later. They can select that text and perform actions like highlighting, calculation, etc. In this example, the user marks a formula for later reference.
To refer to the marked item, the user can:
1. perform the star gesture without selecting any text,
2. tap the star icon next to the text, or
3. tap the Marked button in the contextual menu.
Multiple access points make the application intuitive and easy to navigate.
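Architecturally, several access points triggering one action can be sketched as a small command router. This is a hypothetical illustration (the class and names are assumptions, not the project's code): the star gesture, the inline star icon, and the contextual-menu button all dispatch the same command.

```python
class CommandRouter:
    """Maps command names to handlers shared by several UI entry points."""

    def __init__(self):
        self._commands = {}

    def register(self, name, handler):
        self._commands[name] = handler

    def trigger(self, name, source):
        # 'source' records which access point fired the command
        return self._commands[name](source)

router = CommandRouter()
router.register("show_marked",
                lambda source: f"marked items opened via {source}")

# The three access points from the list above dispatch the same command:
for source in ("star_gesture", "star_icon", "contextual_menu"):
    router.trigger("show_marked", source)
```

Because every entry point routes through one handler, the marked-items behavior stays consistent no matter how the user reaches it.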
What if the user is in a meeting and the group decides on the next meeting time, or the user is in a class and wants to refer to a graph? The user can perform advanced functions like plotting a graph, setting up a calendar event, or sharing the whole note using the Second Surface.
FUTURE WORK AND LEARNINGS
Future Work: The efficiency of such an interaction model can only be confirmed by testing it against applications with marking menus or touch-and-tap-only interfaces. If successful, it could open immense opportunities in the way users "naturally" interact with a system using the pen as an input modality.
Learning: My biggest takeaway was working with stakeholders. There was a lot of discussion around whether this would be useful for Microsoft and how it matches up with existing Microsoft products. The other major learning was that the "devil is in the details" when designing interactions: keeping them simple while making sure the probability of false positives stays low.