Smart Screenshots: designing screenshots as a contextual interface

Redesign of the iOS screenshot function; self-initiated one-week side project, early 2017

Tools: Principle, Sketch; Time: 1 week; Context: self-initiated project

Challenge: Different intentions, same outcome

People take screenshots to remember a song or an event, to gather inspiration, or simply to send them to friends. Intentions for taking screenshots vary, but they always end up in the same place: the Photos folder (on iOS until early 2019).

As an Apple advocate, I have often wondered why this functionality lags so far behind other iOS features. So I simply decided to redesign it.

How might we redesign the iOS system (as of early 2017) to support people's individual intentions when taking screenshots?

Outcome: Predicting user behavior

Based on the screenshot's content and the user's recent behavior, the algorithm proposes a course of action. The contextual system provides recommendations by anticipating the user's next step. The user can accept, alter, or ignore the suggestion without interrupting their current activity.

Some examples of Smart Screenshots

Change a suggestion
Add events
Save routes
Save to folder
Share with friends
Add song to playlist

The process of redesigning screenshots

The design process was human-centered and iterative, with small prototypes and quick tests playing a crucial role. Completing the project in a very short period of time meant I was able to fail fast and move on.


Seeing where a person's hand rests while taking a screenshot proved very insightful for the rest of the process.

Learning about people's screenshot behavior

People sent me hundreds of screenshots, which I sorted by what they did with them. To get a clearer understanding of particular cases, I interviewed two people in more detail about their screenshot habits.

Sorting my screenshots
Learning about other people's screenshot behavior

Iteration and validation 🔁

After validating those insights, I created various prototypes and tested multiple versions of the feature to find out which interaction felt the most intuitive and user-friendly.

Usability testing

Current screenshot function vs. screenshots as a contextual interface

Contextual user interfaces

Contextual user interfaces provide the right kind of data at the moment the user needs it, without forcing the user to ask for it. This way, users aren't forced to interrupt their current activity and can continue it as intended.

Contextual design is data-driven by nature. In this case, the system can draw on the person's recent smartphone behavior, the app they are currently using, and the content of the screenshot itself (via image recognition) to make a prediction.
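To make the idea concrete, here is a minimal sketch of how those three signals could be combined into a single suggestion. Everything in it is hypothetical: the label names, the actions, and the weights are illustrative assumptions, not part of any real iOS API or the actual prototype.

```python
from collections import Counter

# Hypothetical mapping from recognized content / active app to actions.
# All names and weights below are illustrative assumptions.
LABEL_TO_ACTION = {
    "map": "save_route",
    "song": "add_to_playlist",
    "event": "add_event",
    "chat": "share_with_friends",
}

def suggest_action(screenshot_labels, active_app, recent_actions):
    """Score candidate actions from three signals:
    image-recognition labels, the foreground app,
    and the user's recent screenshot behavior."""
    scores = Counter()
    for label in screenshot_labels:
        if label in LABEL_TO_ACTION:
            scores[LABEL_TO_ACTION[label]] += 2  # content is the strongest signal
    if active_app in LABEL_TO_ACTION:
        scores[LABEL_TO_ACTION[active_app]] += 1
    for action in recent_actions:
        scores[action] += 1  # habit: favor what the user did recently
    if not scores:
        return "save_to_photos"  # fall back to today's default behavior
    return scores.most_common(1)[0][0]

# A screenshot of a song, taken inside a chat app, by a user
# who recently added songs to playlists:
print(suggest_action(["song"], "chat", ["add_to_playlist"]))
# → add_to_playlist
```

The key design choice the sketch illustrates is that the system only ever *suggests*: when no signal is confident, it falls back to the current behavior of saving to Photos, so the user is never blocked.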


What I have learned

• Balancing constraints and freedom in order to avoid losing focus or limiting the creative process too much.
• Staying confident in deciding how to continue with the process.
• Getting comfortable with being solely responsible for a project.


Google Maps: In August 2021, I noticed a similar feature in Google Maps when taking a screenshot.

Spotify: In 2018, I noticed that Spotify had implemented, or at least tested, a screenshot feature for sharing songs with friends on social media.