Paul Henry Smith
AI Product Designer

iOS App to Make Playing Music More Fun


Sonation, Inc.
MassChallenge Finalist
Harvard Innovation Lab
Apple Managed Developer


Product Lead
UX/UI design
Interaction/motion design
User testing
Content production


iOS, macOS



The Problem

Playing music is social. Just like basketball or ballet, we love doing it with other people. Unfortunately, learning to play an instrument involves hours of solitary practice, punctuated by weekly solitary lessons with a teacher. Imagine being expected to stay interested in basketball if you had to spend 80% of your time shooting hoops all by yourself.

Opportunities for collaborative music-making are rare, and that’s the main reason ninety percent of kids quit music by the time they're sixteen years old.

Could we create an experience that would make practicing music as engaging and fun as playing with sensitive, great musicians who listen to you and respond to your personal expression? If so, would that keep kids engaged longer so they could reap the proven benefits of making music?

From university research to a “Best New App”

In 2012 I came across a compelling AI research project with great promise for musicians. The researcher, who was also a musician, had created a method using predictive algorithms and machine learning for a computer to listen to a musician and accompany them with a real audio recording, in real time.
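The heart of such a system is real-time score following: continuously aligning what the musician just played against a reference score so the accompaniment can predict where to be next. Below is a minimal, hypothetical sketch of the alignment idea using classic dynamic time warping over note pitches. The actual research system used predictive machine-learning models on audio; every name here is illustrative only.

```python
def dtw_align(reference, live):
    """Align a live note sequence to a reference score with classic DTW.

    reference, live: lists of MIDI pitch numbers.
    Returns the 0-based reference index matched to the final live note,
    i.e. the follower's best guess of "where the player is" in the score.
    """
    n, m = len(reference), len(live)
    INF = float("inf")
    # cost[i][j] = best alignment cost of reference[:i] vs live[:j]
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(reference[i - 1] - live[j - 1])   # pitch distance
            cost[i][j] = d + min(cost[i - 1][j],      # score note skipped
                                 cost[i][j - 1],      # extra/repeated live note
                                 cost[i - 1][j - 1])  # both advance together
    # pick the reference position that best explains all live notes so far
    last_col = [cost[i][m] for i in range(1, n + 1)]
    return last_col.index(min(last_col))
```

Because the distance is accumulated over the whole phrase, a single wrong note does not throw the follower off: `dtw_align([60, 62, 64, 65, 67], [60, 61, 64])` still lands on reference index 2, which is why this family of techniques can tolerate the mistakes and restarts mentioned later in this case study.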

By 2015 our team at Sonation had created a consumer product that became a Best New App in over 20 countries, with over 8,000 monthly active users.

We won the support of Yo-Yo Ma and Harvard University. We were invited to Cupertino to show the technology and discuss strategy with senior managers and developers at Apple.

The Product Team

Our go-to-market strategy was to create an app that would make you feel like a god when you play your instrument. It would be geared toward musicians for whom temporal expressiveness is central to their success, and it would run on iOS. The product team I put together consisted of:

  • Two iOS engineers with deep signal processing, AI, and music experience. The lead engineer was Bruce Cichowlas, co-founder (with Ray Kurzweil and Stevie Wonder) of Kurzweil Music Systems.

  • Content producers from Berklee and New England Conservatory

  • A visual designer from Harvard GSD

We created an “intelligent karaoke” iOS app, called Cadenza, that freed musicians from being subservient to a frozen recording so they could explore and develop their own expressive voices. That ability to be free proved powerful, especially to teenagers.

The potential to create products beyond that application was apparent at the beginning. We ultimately created user-controlled movies that modified themselves to fit your singing, and AR music-making where a 360-degree band surrounds you and follows your spontaneous musical lead.

Cadenza—Your AI Orchestra in an App

The laboratory application created by a Sonation co-founder.

Starting Point: an Inscrutable Desktop Application

The university lab application created by the original inventor was not easy for musicians to use. It was designed to be operated by someone else, while a music student simply played their instrument.

Our initial user tests showed that musicians did not understand what this application was, nor how to use it. When they tried to use it, they found it intimidating. Most said they wouldn't use it or want it. We needed to transform this into a product, simplifying the experience as much as possible.

Our strategy was to create an app for iPad and iPhone and make it easy for a musician to have a great musical experience.

Watching and Learning

New England Conservatory student shows how to train Cadenza just by playing with it.

We moved into a user research goldmine

Where could we have unlimited access to users we could talk to, listen to, and observe?

Our company started as a venture-in-residence at Harvard Innovation Lab. Like many tech-oriented co-working spaces, it housed many small startups in one large, open area, all working hard on code, design, hardware, and business. Unfortunately, if your startup has to make sound, such an environment quickly becomes a hindrance to progress and an annoyance to the other teams.

Fortunately, the president of New England Conservatory invited us to move our startup to office space in their main building, Jordan Hall, in Boston. We moved in among 300 music students, most of whom were exactly the target market segment for Cadenza.

Now we could test and learn about almost any issue simply by stepping into the hallway and asking someone to help us out. Weekly office hours for students turned out to be a goldmine of information. Regular users of the app often approached us to share their thoughts. Faculty and parents played a big role, too.

For our part, we could observe users in their own setting. We saw how people really used the app. That knowledge led us to create solutions and make improvements that proved immensely popular among the world’s classical music students.

The “Happy Path”


Testing with the prototype macOS application showed us that musicians were confused. They could not even figure out how to actually play music with the prototype, yet that is the core experience they wanted from the app. The large spectrographic display with red lines on it was meaningless and off-putting. Users did not understand what it was or why it was being shown to them after they played their instrument. The fact that it conveyed fascinating information did not matter to someone who was there just to play music.


Reorganize and re-design the experience to make it as clear and inviting as possible to play music, and to make users feel like they were in a musician-friendly space.


  1. Move all elements not part of the happy-path flow to secondary screens or, better, subsume them completely into the app's operation.

  2. Get to “playing music” in no more than three taps (open app, tap musical selection, tap play).

  3. Design user onboarding to lead to understanding and success on first-time use.


The happy path UX Flow

Launch the app. Select the music to play. Start playing your instrument.


An unusual situation for an app.

We realized that this app would be most successful when users don’t look at the screen or touch a keyboard. This horn player completely ignored the Mac prototype, preferring to read from printed music on a music stand instead.


Cadenza onboarding

First iOS Version

Starting out: Choose your instrument

Most people play only one instrument, so instrument selection happens only on first launch of the app.

Main steps

  • Select some music to play

  • Start playing

Secondary actions

  • Save or delete the automatic recording

  • Decide if the app should train itself based on what you just played

  • Play again, with more accurate, more sensitive accompaniment from the newly “educated” AI accompaniment

All the complexity of the AI predictive engine, the machine learning, the audio cancellation, and the time-stretching was removed from the UI and hidden.
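One representative piece of that hidden machinery is mapping the player's tempo to a time-stretch rate for the accompaniment recording. Here is a toy sketch of the idea, assuming a simple exponential-moving-average tempo tracker; this is not Sonation's actual engine, and all names are hypothetical.

```python
class TempoFollower:
    """Toy tempo tracker: smooths the player's observed beat durations and
    maps them to a playback rate for the accompaniment recording."""

    def __init__(self, reference_bpm, smoothing=0.3):
        self.reference_bpm = reference_bpm  # tempo the recording was made at
        self.smoothing = smoothing          # EMA weight for new observations
        self.bpm = reference_bpm            # current estimate of player tempo

    def on_beat(self, seconds_since_last_beat):
        observed_bpm = 60.0 / seconds_since_last_beat
        # exponential moving average keeps the accompaniment from jittering
        # when the player's timing fluctuates beat to beat
        self.bpm += self.smoothing * (observed_bpm - self.bpm)

    def playback_rate(self):
        # 1.0 means play the recording as-is; below 1.0 means stretch it
        # slower to stay with a player who is under the reference tempo
        return self.bpm / self.reference_bpm
```

For example, if the recording is at 120 BPM and the player settles in at 100 BPM (0.6 s per beat), the rate converges toward 100/120 ≈ 0.83, and a time-stretching stage would slow the audio accordingly without the user ever seeing a number.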

How well did Cadenza work?

Watch a wide range of musicians play with Cadenza. Each one of them is accompanied by exactly the same “track,” yet each performance is unique and personal—right down to the mistakes and restarts, which don't faze Cadenza at all.

The Surprising New Feature that Drove Adoption

An 1889 patent drawing for a music page-turning machine.

Page-turning is a 400-year-old problem for musicians

There is a centuries-old pain point for musicians called “page turning.” It's so much a part of being a musician that it informs how publishers design printed music layouts. At some concerts, “page turner” is an actual job: a person sits quietly beside the pianist, standing only to turn the page at just the right moment.

The reason it’s such a pain is that it requires the musician to take their hand off the instrument, turn a page, and get back to playing—literally without missing a beat.

Page turning has been a problem for so long that there are dozens of patents on file for mechanical solutions. Many of them are fascinating examples of overwrought complexity. Yet none had supplanted good, old-fashioned speediness of hand.

What does this have to do with Cadenza?

We observed over and over that page turning was marring the experience of using Cadenza, just as it had been marring the experience of using printed music for centuries.

To the musicians on our team, it was hardly noticeable; we were all so used to it, and no one expected the problem to be solved. But during user testing, I couldn't help noticing that when people paused to “turn” the page, our app could mistake the pause for expressive intent and try to react to the mis-cue.

That made the page-turning problem stick out like a sore thumb. This age-old problem was now marring our app experience, even though there were no physical pages to turn.

It would be great if our users did not have to take a hand off their instrument to tap a screen. They could use Bluetooth foot pedals, but that meant yet another gadget to buy, carry, and hope works.

What if Cadenza simply turned the page for the user at the right time?

We solved the problem, and then…


Cadenza shows the next page at the right time.
No tapping. No foot pedals.

Using the microphone and predictive algorithms, Cadenza simply does the right thing.
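Once a score follower supplies a position and tempo estimate, the turn itself can be sketched as simple scheduling: flip the page a couple of beats before its last note so the player's eyes can jump ahead. A hypothetical illustration of that idea, not Cadenza's code:

```python
def schedule_page_turn(current_beat, page_end_beat, bpm, lead_beats=2.0):
    """Return seconds from now until the page should flip.

    current_beat:  estimated score position from the follower
    page_end_beat: last beat shown on the current page
    bpm:           current tempo estimate
    lead_beats:    how far ahead of the page end to turn (assumed value)
    """
    beats_remaining = page_end_beat - current_beat - lead_beats
    seconds_per_beat = 60.0 / bpm
    # never schedule in the past; if we're already inside the lead
    # window, turn immediately
    return max(0.0, beats_remaining * seconds_per_beat)
```

At beat 28 of a 32-beat page at 120 BPM, the turn fires one second from now: (32 − 28 − 2) beats × 0.5 s/beat. Because both position and tempo come from listening, the turn tracks the player's rubato rather than a fixed clock.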

… app usage took off!
Chart: monthly active users and year-over-year growth, annotated with Apple in-store events.

Paul Smith and his daughter, Frances, with Yo-Yo Ma (left) and co-founder Ann Shen (right) at Harvard Innovation Lab after winning seed funding for Sonation in a university-wide entrepreneurship challenge (2013).