Voice Commands Make Tapping on Your Phone’s Screen Seem so 2008
Voice-controlled, hands-free A.I. assistants were the brainchildren of Hollywood and sci-fi writers, not Silicon Valley. Some were good, like the Computer aboard every iteration of the U.S.S. Enterprise. Others, like HAL 9000, were not so good (sorry, Dave). But the technology sector has caught up, with powerful voice-controlled A.I. like Amazon Alexa, Google Assistant, Apple Siri, Microsoft Cortana, and the newest entry, Samsung's Bixby. In fact, voice control now surpasses graphics- and touch-based interfaces when it comes to discovering and consuming content. There may be no better example of this progress than using the combination of our intelligent OV Bluetooth headphones and Amazon Alexa to find and play music.
I remember the days when the cassette tape player dominated the portable music scene. I still love the look of the classic yellow Sony Sports Walkman. Then along came portable CD players, and they were soon overshadowed by digital music players with their smaller form factors and scroll-and-click wheels.
The modern smartphone's tactile and visual interfaces, which first appeared with the 2007 unveiling of the original iPhone, are leaps and bounds ahead of all those devices when it comes to discovering, purchasing and listening to music. Today, a decade after Steve Jobs' famous presentation, the maturation of voice-controlled A.I. assistants represents another significant step in that evolution.
Consider the stark difference in the user experience between finding and playing a specific song – let’s pick “Layla” – using your phone’s tactile interface vs. the voice-controlled OV smart headphones.
Using your phone first requires you to pull it out of your pocket or bag, unlock it, open the music app, and either type "Layla" into the search bar or scroll through your collection of artists, albums and playlists. When you find the song, you may have several versions to choose from, including the Derek and the Dominos original, the acoustic version from Clapton's "Unplugged" album, or a live version from the concert album he released last year, "Live in San Diego with Special Guest J.J. Cale." Then you tap on the song, and you're finally listening.
If you’re wearing the OV headphones, you never touch your phone. Just press the dedicated Alexa button, say “Play Layla on Eric Clapton’s ‘Live in San Diego’ album,” and that famous guitar riff starts.
Even better, you don't need an encyclopedic knowledge of your music collection. Just say "play Eric Clapton" and your Amazon Music app will start playing his most popular songs. Or say "play Eric Clapton radio" and you'll hear a station on TuneIn or iHeartRadio. Or just say "play music" and your apps will draw from your collection and surface new artists and tracks their algorithms select based on your likes and interests.
I could dedicate an entire post to all the music search and playback commands Alexa offers across multiple apps. Instead, I'll direct you to the exhaustive list on Amazon's website: https://www.amazon.com/gp/help/customer/display.html?nodeId=201601830
Music is just one use case where voice controls have proven easier and more capable than touch. With more than 10,000 Alexa Skills now available, you can do virtually anything you can think of.
Let's say you're wearing the OV headphones on your commute home from work. Just ask Alexa to turn on your smart lights, open the garage door, start playing music on the living room speakers, and order a pizza timed to arrive shortly after you get home. All without ever touching, or even looking at, your phone.
Then tap the button on the earbud to activate Google Assistant or Siri and ask it to call your spouse's or roommate's mobile number, so you can alert them that you're on your way home, and so is the pizza.
To learn more about the OV intelligent headphones and purchase a pair, please head over to this page on our website.