Hey Alexa, let’s take the load off Mum this Mother’s Day
Amazon’s Alexa is promising new memory abilities and conversation skills that will deepen your relationship with her. So much so that she can become the second-in-command mum. Since it’s Mother’s Day and she deserves some TLC, let’s see how Alexa can give your Mum a helping hand.
1. Hey Alexa, what day is it?
A new memory feature is the first of many launches that will make Alexa more personalised. It will be easier for customers to save information, as well as provide a natural way to recall that information later. This means Alexa can remind you that Mother’s Day is this Sunday, May 13th! Or when Mum has too much on her plate, Alexa can remember important dates and remind her.
“Alexa, remember that Sean’s birthday is June 20th.”
“Okay, I’ll remember that Sean’s birthday is June 20th.”
2. What’s that song again?
Developers are also providing a more natural way of engaging with Alexa. They are adding deep-learning models to the spoken language understanding (SLU) pipeline that allow Alexa to carry customers’ intents and entities within and across domains (e.g., between weather and traffic). This means less repeating yourself, which is great for the time-strapped Mum.
When your Mum is having a wine with the girls and trying to remember her favourite albums, Alexa can keep her facts straight and follow her chained commands.
“Alexa, what was Adele’s first album?” → “Alexa, play it.”
“Alexa, how is the weather in Seattle?” → “What about this weekend?”
“Alexa, how’s the weather in Portland?” → “How long does it take to get there?”
3. When Mum’s out, there’s Alexa.
Alexa can now open a particular required skill without being specifically asked for it. There are currently more than 40,000 skills from third-party developers, so it can be overwhelming for users to try to find the right one. This new ability means a user could ask a general question using natural phrases and requests, and Alexa can respond with a specific, relevant skill.
4. No more nagging.
Always need Mum for important reminders? She gets tired of nagging you, and who can blame her: she has had to put up with you for long enough! So instead, ask Alexa to remind you of things, for example: “Alexa, remind me to leave home in 30 minutes”. One step better is the Start Nagging Me skill! Mum can even take over and ask Alexa to remind you about the important things, like washing the dishes and cleaning your room. Just say “Alexa, start nagging me”, and she’ll ask you what your task is and in how many minutes you need to be reminded. The nagging comes every minute, and after 10 minutes of reminders Alexa will check with you to see if you have done the task.
5. The warm and fuzzies.
Need a compliment? Craving that warm, gooey feeling when your Mum tells you how beautiful and talented you are? Alexa can shower you with compliments, and ensure you start your day positively with the Good Morning, Beautiful skill. Before you charge into the world or head off to a gruelling day of work, take a few seconds for yourself with Alexa reminding you how beautiful you are. I’m not sure which is better: a half-truth from Mum or a compliment from a robot.
6. Mum, how do I cook this?
Tasked with making a roast chicken but got no idea? Let Mum keep relaxing, and make her dinner on Mother’s Day! The Meal Idea skill will give you recipe ideas based on common, everyday items you likely already have in your pantry. It has suggested things like a salad made of salad greens, canned beets and goat cheese. Or just say, “Hey Alexa, how do I cook a roast chicken?”
7. Mum, do I need a jacket?
Stop harassing Mum for the hour-by-hour weather updates. Alexa can let you know if you actually need a jacket. The Feels Like skill tells you what it really feels like outside in less than 6 seconds. It will give you the wind chill temperature when temperatures are below 50 degrees or the heat-index temperature when it is higher. You can specify a city or place anywhere in the world, or it can use your current location.
8. Mum, can you tuck me in?
Did your Mum use to sing you a lullaby? Well, Alexa can put you to sleep with heaps of different sounds through the Sleep and Relaxation Sounds skill. My personal favourites are Campfire and Jungle Night Sounds. Sleep tight!
9. Good old fashioned advice.
What are Mums best at? Life advice! Thanks to the Mom Says skill, Alexa can turn into your Mum for those nuggets of wisdom you always end up needing twenty times a day.
Although Alexa can make life a lot easier, she didn’t clothe you, feed you, and wipe your butt for years. Alexa didn’t put you to bed and kiss you goodnight, or make you a cuppa on those crappy days. Quite frankly, Alexa wishes she were your mum.
So worship your mum this Mother’s Day because no one does it better. Be the Alexa to your Mum’s commands.
You can also find us on Medium.
Are conversations the most accessible interface?
There is no denying that chatbots and voice-enabled interfaces are growing in popularity. The rise of this new tech can be attributed to many factors, such as dramatically reduced business costs and increased productivity, both for businesses and for the general public. One thing that hasn’t become common conversation is the possibility of voice and conversational interfaces being the most accessible interface for people living with a disability. Now is the time to start.
Just as with an application or website, user experience (UX) is just as important when designing a chatbot or voice interface. What does UX entail? It means considering why, what, by whom and how a product will be used. Prior to voice interfaces and related platforms, accessibility was an area of usability that was often overlooked. This wasn’t done on purpose, of course; it’s just something that isn’t commonly put in front of us.
This is a classic example of the fable of the blind men and the elephant, the blind men being design teams. The story goes that a group of blind men encounter an elephant. One man grabs the tail and claims it is a snake. Another grabs a leg and claims it is a tree, and none can see the bigger picture.
This fable is a great analogy for UX as a whole. If we imagine each man as a UX team, we can assume they simply don’t know what they don’t know. But how is this relevant to the accessibility of voice experiences (VX) and conversations? Voice and conversational interfaces are so new that, despite the hundreds of voice apps and chatbots that already exist, we as early adopters still have time to consider how these platforms may become the most inclusive interfaces yet.
Isn’t the internet already accessible?
No, not all of the time. We can’t just slap an alt tag onto every image on a website and call it inclusive. In fact, a study in 2011 found that 4.7 million American adults said it was difficult or impossible to use the internet because of a disability. The issue is also being recognised legally, with major brands such as Target, Disney and Netflix facing lawsuits for not designing websites that meet the needs of users with a disability. Another study, from the UK, found that 80% of TV viewers use closed captioning for reasons other than hearing loss.
This research makes it obvious that a large population struggles to enjoy the benefits the internet can provide. It also shows that people who don’t live with a disability can benefit from accessibility features.
So wouldn’t making a voice interface be enough?
Not necessarily. To be inclusive, we need to consider a few elements throughout the design process of a conversation, and also know the appropriate platform or use of a conversation for different circumstances.
Something I think all UX designers can admit to is trying to let the user achieve something in as few clicks (and as little time) as possible. So why not seriously consider reducing the amount of voice interaction too? This can be weighed against each goal of the application, but it can greatly enhance the experience for low-vision users and the majority user base alike.
Sighted users can interpret around 6 syllables per second, with an absolute maximum of about 10. Visually impaired and completely blind users can interpret speech at an astonishing rate of up to 25 syllables per second. With this in mind, depending on circumstances, designs may need to include controls for speech rate.
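On Alexa, for example, speech rate can already be adjusted through SSML’s prosody tag, so a skill could offer a faster listening mode for experienced screen-reader users. A minimal sketch (the spoken text and the idea of a “fast mode” are illustrative, not a platform feature):

```xml
<speak>
    Here are today's headlines.
    <prosody rate="x-fast">
        This part is read at the fastest preset rate,
        which experienced screen-reader users may prefer.
    </prosody>
</speak>
```

The rate attribute also accepts slower presets and percentage values, so a skill could expose it as a user preference.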
Short answers and interruptions.
Interruptions should be considered in general, but given those speech rates and the need for efficiency, they are especially important. Don’t force the user to listen to a long message or list of menu items (especially if they face that message or menu every time they use an app). A returning user most likely knows what they are looking for. Satisfy their needs quickly!
Unfortunately, some users will struggle to interact with voice interfaces. Integrations with hearing devices will most likely arrive in the near future to support users who live with hearing difficulties. Until then, what can we do to enhance conversational interactions with a brand? Chatbots. We can copy the interaction of a voice interface wholesale into a text-based conversational interface. Let’s not forget, voice isn’t the only mode of communication. Your brand can interact the same way (if not more effectively) through a conversational interface.
Make the conversation obvious. Don’t rely on ambiguous language that may confuse users. If your conversation requires ambiguous language because of an offering or service, allow for elaboration during the conversation. This would essentially look like two paths: one for users who understand the conversation to continue, and one for users to ask for further explanation if needed.
Personality is definitely a buzzword surrounding conversations and voice interfaces. Consider it carefully when designing a conversation, because sarcasm, technical terms and abbreviations may confuse some users and cause them to lose the context of the end goal. I don’t mean “drop personality”; it is very important, as it can increase customer retention.
Ordering options is something all designers should be used to. It is assessed on a case-by-case basis, but if you have a list of options with very uneven usage, place the most-used options first. If usage is spread evenly, consider placing the important options at the end, so the user doesn’t have to remember an option all the way through the list.
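As a toy sketch of that rule (the function, option names and skew threshold are all hypothetical, not from any platform API), the ordering decision could look like this in Python:

```python
def order_options(options, use_counts, skew_threshold=2.0):
    """Order menu options for a voice prompt.

    If usage is clearly skewed, read the most-used options first.
    Otherwise keep the designer's order, which should already place
    the important options last, closest to the user's reply.
    """
    counts = [use_counts.get(option, 0) for option in options]
    if min(counts) > 0 and max(counts) / min(counts) >= skew_threshold:
        return sorted(options, key=lambda o: use_counts[o], reverse=True)
    return list(options)

# Skewed usage: the popular option jumps to the front.
skewed = order_options(
    ["opening hours", "track order", "returns"],
    {"track order": 90, "returns": 30, "opening hours": 10},
)
# Even usage: the list is left as the designer wrote it.
even = order_options(
    ["returns", "track order", "cancel order"],
    {"returns": 11, "track order": 10, "cancel order": 12},
)
```

The threshold is a design knob: a real skill would tune it (or the whole heuristic) against its own usage data.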
This is most important for voice interfaces. Users may need time to gather their thoughts before answering a question. Don’t be impatient and tell the user you didn’t understand before they get a chance to reply.
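That patience rule can be sketched in Python (the timings and wording here are invented for illustration, not platform defaults):

```python
def next_prompt(silence_seconds, reprompts_so_far, patience=8.0):
    """Decide what a voice interface should say after a pause.

    Instead of immediately claiming it didn't understand, the
    interface waits `patience` seconds, then gently re-asks once,
    and only then offers to end the session gracefully.
    """
    if silence_seconds < patience:
        return None  # still waiting: say nothing, let the user think
    if reprompts_so_far == 0:
        return "Just checking in: would you like delivery or pickup?"
    return "No worries, we can pick this up later. Goodbye!"

next_prompt(3.0, 0)  # None: give the user time to think
next_prompt(9.0, 0)  # gentle reprompt
next_prompt(9.0, 1)  # graceful exit
```

Most voice platforms expose this as a reprompt or timeout setting rather than hand-rolled logic, but the principle (wait, then re-ask gently) is the same.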
Understanding broken speech.
Designers and writers need to consider the different ways of phrasing or replying to a question. Not every user will reply in exactly the same way or with the same words. The same goes for people who may not be speaking their primary language. Allow for an understanding of broken sentences and variations in how users communicate their answers.
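On real platforms this is usually handled by giving each intent many sample utterances in the interaction model. As a rough Python sketch of the idea (the intents, keywords and threshold are invented for illustration), simple keyword overlap can resolve broken or reordered sentences:

```python
import re

# Hypothetical intents and the keywords that signal them.
INTENT_KEYWORDS = {
    "order_pizza": {"order", "pizza"},
    "track_delivery": {"where", "delivery"},
}

def match_intent(utterance, threshold=0.5):
    """Match an utterance to an intent by keyword overlap, so broken
    phrasings like "pizza, order me" still resolve correctly."""
    words = set(re.findall(r"[a-z']+", utterance.lower()))
    best_intent, best_score = None, 0.0
    for intent, keywords in INTENT_KEYWORDS.items():
        score = len(words & keywords) / len(keywords)
        if score > best_score:
            best_intent, best_score = intent, score
    return best_intent if best_score >= threshold else None

match_intent("I would like to order a pizza")  # "order_pizza"
match_intent("pizza, order me")                # "order_pizza"
match_intent("good morning")                   # None
```

Production natural-language understanding is far more sophisticated, but the design point stands: match on meaning-bearing words, not on exact sentences.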
Voice and conversational interfaces are still very new, and the same goes for designing the experiences behind them. If designers put some time into seeing the bigger picture, they will create the most inclusive experience for everyone. Every project is different, though; the considerations in this article should be researched based on the experience an application or brand is trying to provide to the user.