Artificial Intelligence (A.I.) Is Changing Our Lives Through Our Smartphones
Love it or hate it, Artificial Intelligence (A.I.) is an increasing part of our lives. Some might say it is becoming omnipresent, thanks to its ability to watch over our lives and stay on hand to guide us. Many people are thankful for these developments. A.I. assistants have changed the way we search for information, check the weather, and stay on top of our calendars. For many, this is an acceptable level of control: the A.I. bot responds to our commands, and we have a good idea of what it is doing.
But 2018’s new smartphones and A.I. advancements take this technology to new levels. There is a greater push for these artificial intelligence programs to work in the background and make decisions without human initiation. Some provide alerts about things they deem useful. Some can make realistic-sounding calls on a user’s behalf.
A.I. Technology Developments Mean Even Smarter Phones In The Future
Improvements in A.I. in mobile tech tend to centre on the assistant we call upon for our tasks and reminders. Many smartphone owners are quite used to this virtual assistant that sits in their back pocket and helps them out. There is also an argument that we rely on them too much. The intelligence of these assistants, which for some reason all tend to default to female voices, is impressive. But the only way companies can continue to impress us with these phones is through continued advancements.
Siri and the New Apple A.I. iPhone
Naturally, the first company we need to look to when considering the future of smartphone A.I. is Apple. Apple’s recent Worldwide Developers Conference (WWDC 2018) saw a big focus on Siri and how new updates could improve the experience for users.
In fact, the phone carries the name A.I. phone, just in case there was any confusion over where Apple plans to go with this tech. The difference between this Siri and the old version is its ability to understand more about you. That means data on where you are and any information that might be relevant to that situation.
So What Can Siri Do Now?
An obvious feature is the ability to look at the calendar and inform you of important dates and appointments. This is something we already see on many phones. It helps people stay on top of their schedules and allows for template texts to the right people if you are running late. Then there are the location-based suggestions that Siri can use to try to improve your experience.
Your phone knows where you are via its GPS sensors and other data links. Therefore, it knows whether you are at the cinema, in the gym, at the library or at the supermarket. It can then offer targeted, relevant reminders and suggestions. A.I. functionality includes options to start workout playlists, silence the phone in quiet places or perhaps bring up shopping lists.
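To make the location side of this concrete, here is a minimal sketch of how a place-based suggestion could be triggered in principle. Everything in it is hypothetical: the saved places, coordinates, radius and suggestions are stand-ins, and Apple has not published how Siri actually does this.

```python
# A minimal, hypothetical sketch of location-triggered suggestions.
# Apple's actual implementation is not public; the places, coordinates
# and 150 m radius below are illustrative stand-ins only.
from math import radians, sin, cos, asin, sqrt

# Hypothetical saved places: name -> ((latitude, longitude), suggestion)
SAVED_PLACES = {
    "gym":         ((51.5072, -0.1276), "Start your workout playlist?"),
    "library":     ((51.5246, -0.1340), "Silence the phone?"),
    "supermarket": ((51.5155, -0.0922), "Open your shopping list?"),
}

def distance_m(a, b):
    """Approximate great-circle distance between two (lat, lon) points, in metres."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(h))

def suggestions(current_location, radius_m=150):
    """Return the suggestion for every saved place within radius_m of the user."""
    return [
        tip
        for coords, tip in SAVED_PLACES.values()
        if distance_m(current_location, coords) <= radius_m
    ]

# Standing just outside the "gym" -> ['Start your workout playlist?']
print(suggestions((51.5073, -0.1277)))
```

A real assistant would presumably also weigh time of day, movement and past behaviour, but the core pattern of matching a location against saved contexts is the idea this illustrates.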
Notifications appear on the lock screen, so there is no chance of missing them. Some of these notifications are also time-sensitive. Siri will learn your routine and set reminders when it thinks you should be doing something. This could be helpful for those with strict routines, but annoying for those who slept in.
Another update is the Siri Shortcuts system. This voice-activated feature lets the user trigger a whole routine with a couple of words. The demo at the WWDC event showcased this brilliantly. The phone was able to communicate with the user’s home thermostat, text her room-mate, create directions in Apple Maps and start a driving playlist, all because she said she was “heading home”. This has great potential for those with smart homes.
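As an illustration of the “one phrase, several actions” idea, here is a small sketch of how such a routine could be wired together. The action functions and the “heading home” phrase are hypothetical placeholders; this does not use Apple’s actual Shortcuts tooling.

```python
# A toy sketch of a phrase-triggered routine in the spirit of Siri Shortcuts.
# The action functions below are hypothetical stand-ins, not Apple APIs.
def set_thermostat(temp_c):   print(f"Thermostat set to {temp_c}°C")
def send_text(to, message):   print(f"Text to {to}: {message}")
def start_directions(dest):   print(f"Directions to {dest} started")
def play_playlist(name):      print(f"Playing playlist: {name}")

# The user records a phrase once and attaches an ordered list of actions to it.
SHORTCUTS = {
    "heading home": [
        lambda: set_thermostat(21),
        lambda: send_text("room-mate", "On my way home"),
        lambda: start_directions("Home"),
        lambda: play_playlist("Driving"),
    ],
}

def run_shortcut(phrase):
    """Run every action attached to the spoken phrase, in order."""
    for action in SHORTCUTS.get(phrase.strip().lower(), []):
        action()

run_shortcut("Heading home")
```

The point of the design is that the user defines the bundle once, and from then on a single spoken phrase runs the whole list.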
Concerns About Using Virtual A.I. Assistants On Your iPhone
The biggest issue here is the idea of predicting needs. There is something about the intuitive nature of these A.I. assistants that can be a little creepy. That desire to help and be ever-present in our lives could start to feel clingy after a while. There is also the risk of invasions of privacy. This is a grey area because phone users have given their A.I. system permission to help them; they let it use phone data to improve their experience and time management. Still, one wonders how far this may go.
The difference will come down to suggestions versus actions. At the moment, Apple says that Siri will suggest that users take certain actions on their phone, giving them the option to act on or ignore the notification. This is an important feature because it keeps execution in the user’s hands, even if the notifications become frequent. If phones start assuming what you want and doing it for you, there could be problems.
Much of the concern comes from that idea of learning routines and behaviours. This happens because Siri is always there, listening in the background of apps. There are Siri integrations built into all kinds of Apple apps, which immediately relay information to your A.I. aid. This could be a problem for Apple in the long run. If users cannot turn Siri off with ease, or if she gets too invasive and “helpful”, they may abandon Apple altogether.
An Alternative Option From Google: Duplex
Apple is not the only one that wants to get ahead of the game when it comes to smart A.I. systems. Google is also keen to ensure that its smartphones are as smart and helpful as possible. The company recently demonstrated its new tech at its annual developer conference (Google I/O 2018). This was the perfect stage to showcase Duplex to the world and highlight the independent processes of this A.I. system.
The team did so by allowing the phone to make calls and schedule appointments. For example, the phone had to make an appointment at a salon, where Duplex had complete control over the conversation. The system dialled the number, asked for an appointment and gave all the right responses to the questions asked.
The impression was that the salon staff had no idea they were dealing with a bot rather than a human. This was largely down to the speech patterns and the natural-sounding responses and fillers. Duplex sounded as though it was thinking and waiting for a response.
Source: Ben Thompson
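Google has not published how Duplex works internally, so the snippet below is only a toy script to show what “complete control over the conversation” might look like at the simplest level. The keyword matching, canned answers and fillers are purely illustrative; the real system relies on speech recognition and learned language models, not hand-written rules.

```python
# A purely illustrative, hand-scripted stand-in for an automated booking call.
# Duplex's real internals are not public; this only shows the general shape
# of "hear a question, give a plausible answer with natural-sounding fillers".
import random

FILLERS = ["Um, ", "Mm-hmm, ", ""]  # hesitations that make replies sound human

CANNED_ANSWERS = [
    (("what time", "when"),          "would 10 a.m. tomorrow work?"),
    (("name",),                      "it's for Lisa."),
    (("which service", "what for"),  "just a haircut, please."),
    (("anything else", "all set"),   "no, that's everything, thank you!"),
]

def reply(salon_line):
    """Pick the canned answer whose keywords appear in the salon's question."""
    line = salon_line.lower()
    for keywords, answer in CANNED_ANSWERS:
        if any(k in line for k in keywords):
            return random.choice(FILLERS) + answer
    return "Sorry, could you say that again?"

print(reply("Sure, what time were you thinking?"))
print(reply("And what's the name, please?"))
```

Even this crude version shows where the worry comes in: once the bot is allowed to answer on its own, the owner no longer sees each reply before it is spoken.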
This was all great in the demonstration and gave Google plenty of space in the headlines. Yet there are some questions over the situation. The first is a natural wariness over the full implications of this system. How much control would Duplex have over scheduling and responses? If the user said “make an appointment for tomorrow morning”, would it stick to that directive, or would it negotiate with the salon and book an afternoon slot just to get a result? What else could it do or say without the phone owner being aware of it? Then there is the question over the legitimacy of the demonstration. Some felt that Google had staged the calls, as there was no background noise to indicate that this was a salon at all. Was this a scripted version of a possible conversation rather than a real interaction?
Where Should These Tech Giants Draw The Line?
The other question to ask here is just how far these tech giants can really go with these advancements. The desire for new and better tech means a never-ending cycle of updates and new products. There has to be a limit to what these phones can do, from both a technical and an ethical standpoint. There is already a sense of discomfort and distrust among some smartphone users. Many fear an invasion of privacy, or a world where technology can think for itself a little too much.
The reminders and aids of these current updates cover so many elements of a user’s life. In the desire to be more effective and helpful, those notifications could become a little intrusive and personal. There is also the risk of over-reliance on these suggestions and reminders.
This is where we reach an odd point in current developments – the use of A.I. to fool technology.
There is a strange paradox where we don’t trust computers and intelligent software for fear of data mining, yet we turn to A.I. to fix the problem. Rather than turn against smartphones and intelligent tech, we want to find ways it can improve the situation. A great example of this is a new privacy filter for photos online. Researchers at the University of Toronto created an algorithm that applies subtle distortions to photos, disrupting the processes of facial recognition software. The basic idea is that human users can still go online and see your holiday photos, but the underlying bots and software can’t use those images to learn anything more about you. This feels like a good compromise for those worried about internet security and privacy. They don’t have to stay off social media altogether, but they do have a barrier in place.
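The Toronto filter itself is learned adversarially against a face-recognition network, which is well beyond a short snippet, so the sketch below only illustrates the underlying idea: a small, bounded change to every pixel can be enough to unsettle a recognition model while the photo still looks essentially the same to a person. The epsilon value and the random noise are illustrative simplifications, not the researchers’ method.

```python
# Toy illustration of the "tiny pixel changes" idea behind photo privacy filters.
# The real Toronto algorithm learns its perturbation adversarially against a
# face-recognition network; random bounded noise is used here only to show
# how small the change to each pixel can be.
import numpy as np

def perturb(image, epsilon=8):
    """Shift every pixel by at most +/- epsilon on the 0-255 scale."""
    noise = np.random.randint(-epsilon, epsilon + 1, size=image.shape)
    return np.clip(image.astype(int) + noise, 0, 255).astype(np.uint8)

# A stand-in 64x64 RGB "photo" made of random pixels.
photo = np.random.randint(0, 256, size=(64, 64, 3), dtype=np.uint8)
filtered = perturb(photo)

# The maximum per-pixel change stays within epsilon, far too small for a
# person to notice, yet a carefully chosen perturbation of this size can
# push a recognition model towards the wrong answer.
print("max per-pixel change:", int(np.abs(filtered.astype(int) - photo.astype(int)).max()))
```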
These sorts of barriers are helpful for those who fear privacy issues but still want A.I. and home help. This includes Alexa users. Alexa is another important A.I. system, one that is currently more at home in the home than on a phone. We already know that Alexa is hackable, and prone to some odd responses, so security has to remain a priority. That said, there are plans to make Alexa a little more available on smartphones. At the moment, some phone users have the option to continue their connection through Alexa apps. This provides a sense of choice and control that consumers appreciate, and it can also help with customization. Amazon also announced a plan to add physical Alexa buttons to new Prime Exclusive phones. This should make it even easier to call on her as needed.
So Where Do Microsoft and Its Cortana A.I. System Fit Into This Current Climate?
There are clearly some companies taking bolder steps than others in the development of A.I. for smartphones. Apple probably leads the way in terms of the capabilities and appeal of its system. Google, on the other hand, could win the most prizes for realism. But some developers have different views. Microsoft is a great example. Anyone who has a Windows-powered laptop is aware of the potential of the Cortana assistant. Assistant really is the operative word here, because it is on hand to help users as needed, not to poke them with reminders. In fact, it is easy to go through a working day without Cortana bothering you. Many people like this “call you when I need you” approach. It isn’t as intrusive, and there isn’t the sense of a bot watching over us.
The recent demo of Google Duplex led to suggestions that Microsoft was falling behind the pack, because its Cortana system cannot compete in terms of capabilities and approach. Yet those in charge of this A.I. aren’t bothered by that analysis. Instead, they seem to disapprove of Duplex’s methods. Javier Soltero, the vice president in charge of Cortana, liked the realism but sees Cortana as a different beast. He wants to focus on ensuring that Cortana is a more “symbiotic” tool: one that helps people with their tasks rather than operating on their behalf.
Which Type of Assistive Technology Is The Future For A.I. Phones: Symbiotic or Predictive?
The supportive-role approach from Microsoft is encouraging. It shows that consumers still have a choice when it comes to the products and services they use. To some extent, the same is true of Alexa. Users can choose to add Alexa apps to their phone for extra help, instead of an A.I. that sinks its claws into the operating system. There is a divide between assistive, symbiotic technology and the super-intelligent, predictive system. The former is manageable, familiar and appears to respect boundaries. Time will tell whether users are won over by the alerts and processes of Duplex and the new Apple A.I. phone. It might be a leap in the right direction or a step too far.