Surface Duo

An exploration project that leverages dual-screen interactions for data visualization.


My Role 

Research, Exploration, Low-fi and Hi-fi Mockups, Product Design, Concept Development, 
Physical Hardware Prototyping (HW UX).


Laser Cutter, Adobe Illustrator, Figma, Adobe Photoshop


12 weeks


Mehul Shah, Olga Khvan


Human-Computer Interaction + Design,
University of Washington


Jan 2020 - Mar 2020


This is a design concept that makes data visualization more compelling by taking advantage of dual-screen displays. I propose an extended set of gestures that creates a more intuitive and easier way of exploring visualized data.

While I led conceptualizing the idea, building physical prototypes, and following an iterative design process toward the final design solutions, my teammate Olga helped me build the data set.

Multitude of possibilities to interact with foldable devices

Target Audience

The target audience for this project is industry professionals, engineers, and executives who have access to dual-screen displays and need to interact with information and visualizations. That said, I see this concept's potential at all consumer levels.

Data visualization has become an inherent part of our daily lives - whether it is viewing the latest information on a map, a company's current stock price, or current site analytics. As people shift to smartphones as their primary source of information, we are faced with exploring new visuals and interfaces that are optimized for small screens and constrained input modalities.


I've taken cues from a series of patents filed by Microsoft on hinge mechanics, dual-screen interactions, and experience, as well as prospective developments in operating systems, such as Windows 10X, Android for foldables, and the Fluent Design System. This design concept is meant to be consistent with Microsoft's current strategy while offering something radically new.


Communicate complex ideas with clarity, precision, and efficiency.

The size of a regular smartphone screen and the existing gestures are not enough to fully explore the presented data, especially when a user wants to compare multiple graphs or communicate a selection across multiple graphs. It is a lot easier to show 10,000 data points on a 30” desktop monitor than it is on a 4” smartphone.

Most smartphones and tablets offer only a single display for users to interact with. If users could work with double-sided foldable displays, this might enable new, more efficient ways of interacting with their personal devices.

Hardware Prototyping to understand multiple device states

Through research, I found that within the next few years more than 50 percent of users will turn to a tablet or smartphone first for all online activities. In addition, users' total activity on smartphones and tablets accounts for an outstanding 60 percent of digital media time in the US, already surpassing the desktop. It is clear that mobile devices are becoming the primary way users interact with digital content. With the possibility of multiple display states, users can interact with visualized data in different ways.

Folded and Unfolded states of the device

Key Focal Point

Multiple Perspectives

Reveal the data at numerous levels of detail and offer multiple viewing perspectives.

Encourage Spanning 

Encourage exploration and comparison of different data pieces.

Interact with Clarity

Serve a clear and reasonable purpose. (description, exploration, decoration, etc.)

Version 1.0

Visualizing large data sets

I designed the end-to-end experience for this concept which focused on visualizing data on a relatively new medium. In version one of the concept app, users could use their mobile phones to view large data sets coherently.


Before jumping into the design, I familiarized myself with the existing SDKs to understand how two symmetric screens work together in unique ways to provide productivity in a flexible form factor. The main limitation was that existing smartphones have a single screen for visualizing information. I decided to focus on envisioning new interaction techniques for flexible screens.

Interaction Modes

Interaction for visualization is characterized as a goal-oriented activity that carries semantics; it is sequential, incremental, and iterative, and preserves the following properties:

Levels of interaction

  • Micro-level: screen touch

  • Macro-level: hypothesis generation

  • In-between: sort, filter

Micro-level and Macro-level: Using dual screens by placing the app canvas on one screen and using the other to hold tools that manipulate the content of the canvas. 

Continuous vs Discrete

  • Continuous interactions involve a sequence of intermediate visualization states between an initial state and a goal state.

  • Discrete interactions mean that action and reaction occur in a distinct manner.

Discrete Interactions - Interactions that can take place actively or passively.

Continuous Interactions - Interactions that take place in the main focus, wherein a user chooses to span the application across both screens.

Direct vs Indirect

  • Direct interaction is interacting with the data itself.

  • Indirect interaction is interacting with the data using widgets or queries as a medium.

Taking advantage of the two distinct screens and snapping to the natural boundary, we could use one screen to show the "items" list and the other to show details of the selected item.
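As a rough illustration, this list/detail split could be modeled as a small state container that decides what each screen renders. The names here (`DualScreenLayout`, `Pane`, `Item`) are hypothetical and not part of any real SDK; this is a sketch of the pattern, not an implementation.

```typescript
// Hypothetical master-detail model spanning two screens:
// the left pane shows the "items" list, the right pane shows
// details of the currently selected item.

type Pane = "left" | "right";

interface Item {
  id: number;
  label: string;
  details: string;
}

class DualScreenLayout {
  private selectedId: number | null = null;

  constructor(private items: Item[]) {}

  // Selecting an item in the list drives the detail pane.
  select(id: number): void {
    this.selectedId = this.items.some((i) => i.id === id) ? id : null;
  }

  // What each physical screen should render.
  content(pane: Pane): string[] {
    if (pane === "left") {
      return this.items.map((i) => i.label);
    }
    const sel = this.items.find((i) => i.id === this.selectedId);
    return sel ? [sel.details] : ["Select an item"];
  }
}
```

Snapping the split to the hinge means neither view ever straddles the physical seam, which is the design choice this sketch encodes.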

Design Hypothesis

Portrait Compact Mode

In this mode, the device is folded along the hinge mechanism and held vertically in the hands. This position of the foldable displays reminds users of conventional modern touch-screen smartphones.

In this compact portrait mode, users observe visualized data and perform graph-based explorations just as they are used to doing on their existing smartphones.

With the help of the gyroscope sensor, the device identifies the side that faces the user and activates that display; the opposite side turns off and becomes inactive.
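A minimal sketch of that decision, assuming the orientation sensor reports a roll angle in degrees; the function name and the 90-degree threshold are my own illustrative assumptions, not a real sensor API:

```typescript
// Illustrative only: given a roll angle from the gyroscope,
// decide which of the two back-to-back displays stays active.

type DisplaySide = "front" | "back";

function activeScreen(rollDegrees: number): DisplaySide {
  // Normalize the angle into [0, 360).
  const roll = ((rollDegrees % 360) + 360) % 360;
  // The front display faces the user while the device is within
  // +/-90 degrees of upright; otherwise the back display does.
  return roll < 90 || roll > 270 ? "front" : "back";
}
```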

This concept could potentially offer a new way to scale up graphs through the gyroscope sensor, achieved through a tilting motion.

A two-finger touch and drag up/down lets the user view a scaled-up version of the graph while also scrolling through it.

Force-touch on one of the bars invokes a tooltip with detailed information so users can further inspect the data set, while also giving them the choice to perform further actions.
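The gestures above could be sketched as a single reducer over the chart state. The gesture names, thresholds, and clamping ranges below are my own assumptions for illustration; they are not taken from any shipped SDK.

```typescript
// Hypothetical gesture dispatcher for the compact portrait mode.

type Gesture =
  | { kind: "tilt"; degrees: number }          // gyroscope tilt scales the graph
  | { kind: "twoFingerDrag"; dy: number }      // two-finger drag scales and scrolls
  | { kind: "forceTouch"; barIndex: number };  // force-touch invokes a tooltip

interface ChartState {
  scale: number;             // vertical zoom factor of the graph
  scrollY: number;           // scroll offset in pixels
  tooltipFor: number | null; // bar index whose tooltip is shown
}

// Keep the zoom factor within an assumed sane range of 1x-4x.
const clampScale = (s: number): number => Math.min(4, Math.max(1, s));

function applyGesture(state: ChartState, g: Gesture): ChartState {
  switch (g.kind) {
    case "tilt":
      // Tilting scales the graph up (45 degrees of tilt = +1x zoom).
      return { ...state, scale: clampScale(state.scale + g.degrees / 45) };
    case "twoFingerDrag":
      // Dragging both scales the graph and scrolls through it.
      return {
        ...state,
        scale: clampScale(state.scale + g.dy / 200),
        scrollY: state.scrollY + g.dy,
      };
    case "forceTouch":
      // Force-touch on a bar shows its tooltip.
      return { ...state, tooltipFor: g.barIndex };
  }
}
```

Modeling each gesture as a pure state transition keeps the two screens easy to keep in sync, since both can re-render from the same `ChartState`.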

Portrait Notebook

There's more to come!

I'm working late hours to collect all my artifacts and display them here! Until then, stay tuned and check out the report below.

Version 2.0

More control while spanning

There's more to come!

I'm working late hours to collect all my artifacts and display them here! Until then, stay tuned and check out the report below.
