iOS 10 Concept

MacStories April 2016

UI Design / Motion Graphics / Video Production


In March 2016, I was commissioned by MacStories to produce a wish list concept video for the upcoming release of iOS 10. Head over to MacStories to find out more about the concept.

The final iOS 10 concept video produced for MacStories.


In February 2016, I put together an iOS 10 concept video to demonstrate how a customisable Control Centre could work. At the time, I had planned to produce a series of videos throughout the spring in the run-up to WWDC 2016. Having seen my concept, Federico Viticci, Founder and Editor-in-Chief of MacStories, reached out and proposed working together on an all-encompassing wish list for iOS 10, complete with visual mockups and a concept video. The collaboration sounded like a perfect fit; I immediately said yes and we got to work.

Control Centre

Control Centre was introduced with iOS 7 in 2013 and, since then, it has benefited only from minor visual tweaks and the addition of a Night Shift toggle in iOS 9.3. In future updates, Control Centre could gain more hardware and system toggles, along with the ability for users to customise which toggles they require and where they are positioned. An enhanced Control Centre could also add support for 3D Touch, exposing additional options for each toggle.

Swap Toggles

A user could long-press a toggle to enter edit mode. The toggles would ‘jiggle’ in the familiar behaviour already seen when editing app icons on the home screen. Toggles could then be lifted and dragged into a new position.
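Under the hood, the reordering interaction amounts to moving an item within an ordered list. Here is a minimal Swift sketch of that model; the toggle names are hypothetical placeholders, not part of the concept or any Apple API:

```swift
// Hypothetical placeholder toggles, in the order the user has arranged them.
var toggles = ["Wi-Fi", "Bluetooth", "Do Not Disturb", "Rotation Lock", "Night Shift"]

// Move the toggle the user lifted to the position where it was dropped.
func moveToggle(_ toggles: inout [String], from source: Int, to destination: Int) {
    let toggle = toggles.remove(at: source)
    toggles.insert(toggle, at: destination)
}

// Drag "Night Shift" from the last slot to the first.
moveToggle(&toggles, from: 4, to: 0)
// toggles now begins with "Night Shift"
```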

3D Touch Integration

3D Touch was introduced in 2015 with the iPhone 6s and rolled out system-wide to enable shortcuts to many frequently used actions. Control Centre could benefit from 3D Touch by letting users press a specific toggle to quickly access additional options from a contextual menu.

Dark Mode

There is a growing trend among third-party apps, such as Twitter and Tweetbot, to offer users the choice between a light and dark mode. Apple has been thinking more about device display settings in recent months with the inclusion of Night Shift in iOS 9.3 and True Tone Display with iPad Pro. Other apps, such as iBooks, automatically shift to a night mode at a certain time of the day.

It would be great to give users the choice to change the display mode at a system level, or to provide scheduling features that switch between modes automatically, for instance in the evening and morning. Dark Mode could live in Settings > Display & Brightness, but could also be accessed via a shortcut from Control Centre, perhaps next to the new Night Shift option.
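The scheduling rule itself is simple, with one wrinkle: a typical schedule crosses midnight. A small Swift sketch of that logic, assuming times are expressed as minutes since midnight:

```swift
// Decide whether a scheduled dark mode should currently be active.
// Times are minutes since midnight; the window may wrap past midnight
// (e.g. 22:00 to 07:00). This is a sketch, not an Apple API.
func isDarkModeActive(now: Int, start: Int, end: Int) -> Bool {
    if start <= end {
        // Window stays within a single day.
        return now >= start && now < end
    } else {
        // Window wraps past midnight.
        return now >= start || now < end
    }
}

let start = 22 * 60   // 22:00
let end = 7 * 60      // 07:00
isDarkModeActive(now: 23 * 60, start: start, end: end)  // true
isDarkModeActive(now: 12 * 60, start: start, end: end)  // false
```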


Messages

Messaging apps are everywhere: WhatsApp, Snapchat, Facebook Messenger and many more third-party apps are building upon the standard features of the Messages app, offering users a compelling alternative with fun and engaging features. Viticci wanted to show how Messages could be taken to the next level, with richer previews and faster emoji input.

Richer Previews

Content shared to Messages from other applications could take on the styling and basic functionality of that particular app. For example, a playlist shared in Messages could inherit some of the UI and interactivity of Apple Music and enable the recipient to add and like songs.

Faster Emoji Input

Emoji have become a powerful addition to conversation for millions of people worldwide. I considered adding favourites and search to the existing emoji view within the keyboard. Whilst this approach was perfectly fine, and still a credible option, I didn’t feel it would make the user experience as intuitive as possible. Viticci suggested we include an emoji shortcut in the QuickType bar. This was a much smarter solution, and I created a couple of early drafts that incorporated a small popover attached to the keyboard. Creating this UI was fun; it felt like a really compact design with genuinely useful controls as and when the user required them. I’d definitely like to see an implementation of something like this in future versions of iOS.
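The shortcut implies a word-to-emoji lookup behind the QuickType bar. Here is a toy Swift sketch of that idea; the dictionary is a hypothetical stand-in for the system’s real emoji keyword index:

```swift
// Hypothetical keyword index mapping typed words to candidate emoji.
let emojiIndex: [String: [String]] = [
    "pizza": ["🍕"],
    "love": ["❤️", "😍"],
    "dog": ["🐶"]
]

// Return emoji candidates for the word currently being typed,
// ignoring case so "Pizza" and "pizza" behave the same.
func emojiSuggestions(for word: String) -> [String] {
    return emojiIndex[word.lowercased()] ?? []
}

emojiSuggestions(for: "Pizza")  // ["🍕"]
```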

Document Picker

Apple has provided basic file management with iCloud Drive and a document picker introduced in iOS 8. More advanced features could be borrowed from the Finder on OS X, allowing users to view files in a multi-column layout, tag files and manage file versions. A more versatile document management tool would help transform iOS into a highly capable operating system and enable the iPad to become an even better productivity device.

Multi-Column Layout

Provides a more flexible layout to quickly access files.

Tag Files

Users can assign custom-defined tags to files and browse by tag.

Version History

Ability to view and restore previous versions of a file.

An example of a typical user flow using the new document picker interface.


Multitasking

With iOS 9, Apple introduced 'Slide Over' and 'Split View', a step in the right direction towards realising the true potential of multitasking on the iPad. I think Apple can build upon this foundation in iOS 10 with an all-new interface for copying data between apps and improved interactions for the split-screen view.

Drag & Drop

The system clipboard has served iOS well, but I think it could be time to introduce the next layer of interactivity between apps. A user could select a piece of content in one app, for example an image in Safari or a paragraph in Notes, and then transport that content to another app. Viticci and I felt that this interaction should live within the copy/paste menu: when a user long-presses on text, images or any other compatible content, a new 'drag handle' icon appears to begin the drag and drop.

An example of content being dragged from one app and dropped into another.
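One way to model the concept is a typed payload that the destination app either accepts or rejects. The types and names below are hypothetical illustrations, not an Apple API:

```swift
import Foundation

// Hypothetical payload for the item a user lifts with the drag handle.
enum DraggedContent {
    case text(String)
    case image(Data)
}

// Hypothetical description of what a destination app will accept.
struct DropTarget {
    let acceptsText: Bool
    let acceptsImages: Bool

    func canAccept(_ content: DraggedContent) -> Bool {
        switch content {
        case .text: return acceptsText
        case .image: return acceptsImages
        }
    }
}

let notes = DropTarget(acceptsText: true, acceptsImages: true)
notes.canAccept(.text("A paragraph dragged out of Safari"))  // true
```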

App Picker

Finding and swapping apps within the Split View interface is a little cumbersome; currently there is no facility to search for apps or pin frequently used and favourite apps. Viticci and I proposed a denser layout that accommodates more apps on screen. We also included a search bar and segmented the app list into recents and favourites.


Proactive

New proactive features were introduced with iOS 9 to provide useful shortcuts, news and information tailored to the user. Apple could enhance this experience without compromising user privacy. Based on the time of day, the proactive search screen could suggest upcoming calendar events, traffic conditions or transit directions.

By recognising patterns in the user's location and context, iOS could present relevant information, for example surfacing a workout playlist from Apple Music when the user typically visits the gym at that time, or a meeting-related email during the hours the user is usually at work.
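A crude sketch of the pattern-matching idea in Swift: count how often the user has been at a place at the current hour, and only surface a suggestion once the pattern repeats. The place names and the threshold are hypothetical, chosen purely for illustration:

```swift
// One remembered visit: an hour of day (0–23) and a place label.
struct Observation {
    let hour: Int
    let place: String
}

// Suggest content for the place the user most often visits at this hour,
// but only if the pattern has repeated at least three times (an assumed
// threshold for this sketch).
func suggestion(now hour: Int, history: [Observation]) -> String? {
    let matches = history.filter { $0.hour == hour }
    let byPlace = Dictionary(grouping: matches, by: { $0.place })
    guard let (place, visits) = byPlace.max(by: { $0.value.count < $1.value.count }),
          visits.count >= 3 else { return nil }
    switch place {
    case "gym": return "Workout playlist"
    case "office": return "Meeting-related email"
    default: return nil
    }
}
```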


Siri

A Siri API and the ability to hold textual conversations could greatly enhance the current experience. By allowing third-party developers to hook into Siri, iOS, and by extension the Apple Watch, would become a more powerful tool for quickly and reliably retrieving information from any app. For situations where talking to Siri is not possible, a textual interface would help to boost the functionality of the assistant in all environments.
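Purely as a thought experiment, a third-party hook could be as simple as a protocol an app adopts to answer queries. Everything below is a hypothetical sketch, not an Apple API:

```swift
// Hypothetical protocol an app might adopt to answer Siri queries,
// whether they arrive as speech or as typed text.
protocol SiriQueryHandler {
    /// Return an answer for the query, or nil if this app can't handle it.
    func handle(query: String) -> String?
}

// An imaginary weather app adopting the hook.
struct WeatherApp: SiriQueryHandler {
    func handle(query: String) -> String? {
        guard query.lowercased().contains("weather") else { return nil }
        return "It's 18°C and sunny."
    }
}

WeatherApp().handle(query: "What's the weather like?")  // "It's 18°C and sunny."
```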