Siri Needs an Attitude Adjustment

You would think that a successful company like Apple would want to learn what ticks off its customers about its products and services, and then fix those problems. You would think it would have learned that its negative-thinking artificial intelligence assistant, known as “Siri,” is ticking off users.

Few people want to hear the opinion of a computer, especially when it contradicts their own opinion in a negative way. However, Apple keeps going its merry way without paying any attention to its products’ users.

Opinions are funny things. They are subjective, not objective. When you ask Siri a question, you want an accurate and objective answer. When you ask Siri about the weather, what you are looking for is the temperature and the forecast. Period. You will make your own subjective determination whether the forecast and temperature are good or bad, based on the way you think.

Why Does Siri Deliver Opinions Instead of Facts?

It appears that Siri is designed to keep you from thinking. It is designed to tell you the objective facts and then add its subjective take on those facts. That’s bad enough, but what you get is not even the computer’s opinion… if a computer could have an opinion. What you hear is the result of Siri’s programmed design.

That’s right, it’s the programmer’s opinion. You wouldn’t take anyone else’s opinion at face value. Why should you take it from a computer? That’s what Apple expects its users to do.

When you ask the temperature, Siri should give you the temperature. Period — not the software developer’s negative opinion of the temperature.

When you ask Siri the temperature, it will say something like, “Brrr, it’s cold. Temperature, 45 degrees.” The 45 degrees is the objective number you asked for. That was good. But the subjective commentary that suggests 45 degrees is cold is over the top and uncalled for. Especially when, from your perspective, it’s wrong. It’s just somebody’s opinion.

Yesterday, it was 35 degrees, so at 45 degrees, today feels great — but not to Siri. Apple’s AI doesn’t seem to want you to feel good. It always finds the worst part of the forecast to focus on.

Here’s another example: When asked for the forecast, Siri says it’s cold and wet. However, when you actually look at the forecast on the screen, what you see is clear, sunny and warm weather for the next week — except for one day in the middle of next week.

You probably would look at that forecast and say it was great. Why then does Siri deliver the negative “cold and wet” assessment? It throws cold water on the warm heart of every user.

4 Ways Apple Can Fix Siri’s Negativity Problem

First, Siri should not share the software programmer’s opinion, especially when it leans toward negative thinking.

Second, Siri should give users the ability to read the forecast on their own rather than waiting for it to stop reading aloud. Why is Siri so quick to tell you the negative side of the weather forecast, but can’t seem to utter a word when you are looking up something else? Very frustrating indeed.

Third, if Apple thinks Siri should give its opinion — which is very wrong, but just for the purpose of discussion — then at the very least, it should give users the ability to select a positive or negative outlook on life, instead of forcing every user to deal with negativity.

Fourth, Apple should give its users the choice to turn off the subjective opinion feature and opt for the objective information only.
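That fourth fix amounts to a simple user preference. As a purely illustrative sketch — Siri’s actual implementation is not public, and every name below is invented — a weather response gated by an opt-in “opinions” setting might look like this:

```python
# Hypothetical sketch of an opt-in commentary setting for a voice assistant.
# None of this reflects Apple's real code; it only illustrates the column's
# suggestion: objective facts by default, opinions only if the user asks.

from dataclasses import dataclass


@dataclass
class Forecast:
    temperature_f: int
    condition: str  # e.g. "sunny", "rain"


def weather_response(forecast: Forecast, opinions_enabled: bool = False) -> str:
    """Return the forecast, adding subjective commentary only if the
    user has opted in to it."""
    facts = (f"Temperature, {forecast.temperature_f} degrees. "
             f"{forecast.condition.capitalize()}.")
    if not opinions_enabled:
        # Objective information only -- the default the column argues for.
        return facts
    # Opt-in commentary: still somebody's opinion, but now the user's choice.
    remark = "Brrr, it's cold. " if forecast.temperature_f < 50 else "Nice out! "
    return remark + facts


print(weather_response(Forecast(45, "sunny")))
# Temperature, 45 degrees. Sunny.
print(weather_response(Forecast(45, "sunny"), opinions_enabled=True))
# Brrr, it's cold. Temperature, 45 degrees. Sunny.
```

The design point is that the default delivers only what was asked for; the editorializing is behind a switch the user controls.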

What Gives Apple the Right to Dictate Our Attitudes?

What gives Siri the right to dictate our moods and attitudes? It’s a machine. It’s not human. And it’s misleading more often than not. This really ticks off countless users. Is that really what Apple wants to do?

Apple can fix this problem. It needs to stop Siri from delivering such negative, subjective and often misleading opinions. This feature sours the user experience for this product.

Apple Should Let Users Choose Positive or Negative Siri

Why does Apple continue to bathe us in negative subjectivity? Why can’t it look at life as a glass that is half full instead of half empty? At the very least, it should allow the user to give Siri an attitude adjustment. When you set up Siri, you can choose whether it’s a female or male voice. You can select an American or English accent, among others.

Apple should let users select a positive or negative Siri attitude. That would at least make Siri more acceptable to the human mind. That would be better than having Siri rub us the wrong way every stinking day.

Since this is AI, which is just a computer, there should be all sorts of options to let users select the experience they want. If Apple is such a successful company, why can’t it understand that it’s ticking off its users? Something is very wrong in Cupertino. Just saying.

Jeff Kagan

Jeff Kagan has been an ECT News Network columnist since 2010. His focus is on the wireless and telecom industries. He is an independent analyst, consultant and speaker.


