> "It was slow, when it worked at all," Williamson said. "The software was riddled with serious bugs. Those problems lie entirely with the original Siri team, certainly not me."
Wow. If those were his actual words, I don't doubt for a second that this guy bears responsibility for Siri being left behind.
Conversely, you get handed a steaming pile of shit they call legacy code (no documentation, and variable names that hint at no meaning), and they tell you it's entirely your fault that you couldn't get it to work properly. Getting handed a huge amount of technical debt is a very real problem.
The industry does not lack programmers who can clean up technical debt or outright rewrite a legacy system. At least, I’m pretty sure that Apple doesn’t lack such programmers.
A VP’s job is to find those people and give them the right incentives. The VP has the authority to hire new people and restructure teams. The VP should use that leverage.
At the VP level, how a particular codebase or team is doing simply doesn’t matter, because the VP has all the leverage needed for organizational change. Blaming a codebase or team only makes sense if the whole company cannot produce quality code and it’s impossible to hire new employees who can. I am pretty sure that this is not the case for Apple.
A hopefully adequate comparison would be this: you are a technical lead at a company, you get really, really high pay, and you have a lot of fellow engineers under your command. You get handed a legacy system, and there are all kinds of problems with it:
1. The system uses a database that is not replicated.
2. The system has no monitoring.
3. The web frontend is written in PHP.
What do you do?
To fix 1, replicate the freaking database.
To fix 2, add some freaking monitoring.
3 is presumably the hardest to fix. But remember that you get really high pay and you have a lot of engineers. What did the engineers at Facebook do? They fixed PHP.
Do you blame the previous team that made this horrible system? No. No. No. You own the system now. There are valid engineering approaches to fixing it. So fix it. It can take a while, but eventually you can fix it. In the worst case, you will have to rewrite. But that's still a feasible thing to do.
I agree with this; the one thing I'll add is that you may also have to contend with the competing priorities of every other VP. Specifically, the Business Development VP may be saying, "We don't have time to do any of those things! We need X done now, otherwise we won't get our customer!" And the technical VP's response might be, "If we don't fix this now then we won't have _any_ customers." But in the end the financial state of the company will make the decision for you. If you're on the edge and immediately need more customers to survive, then you're going to forgo doing 1 and 2 and instead implement the X that BD was requesting, hoping that you can get enough breathing room in the near future to actually do 1 and 2.
Does Apple have the resources for this project in particular? I really don't know. However in the general context, I think you lack understanding of two things:
1. You cannot simply put your standard programmer (or any number of them) with undergrad-level knowledge of combinatorial algorithms (even superb knowledge) in the former position of someone with a graduate degree focused on algorithms engineering who didn't give a damn about even the smallest aspects of software engineering, like version control or documentation. You have even less ability to do this on an embedded device with numerically sensitive algorithms. I've been in this position (except with a far broader background in mathematics and algorithms engineering) at least a couple of times with other engineers with graduate degrees in the requisite areas, and (I'm not exaggerating) it is truly a fucking nightmare.
2. Just because you are a VP doesn't mean your managers are willing to approve the amount of money it takes for a team of algorithms engineers to fix or rewrite a product that just barely functions well enough. I am incredibly close to a couple of people who are VPs at companies with ~10,000 people, and you would be surprised how little power they can have over hiring sometimes. Companies are not bottomless pits of money, and the people at the top usually treat them accordingly.
As an engineer who reported to Williamson, I’ve got to say I was far more intimidated by him than I ever was by Forstall. He rarely smiled in a meeting, and his old-school, heavy-handed management style always took the fun out of the work.
Honestly, I don’t know the full details, but I still hold that Forstall took the fall for the poor execution of Williamson in the Maps debacle.
>Honestly, I don’t know the full details, but I still hold that Forstall took the fall for the poor execution of Williamson in the Maps debacle.
Well that's not true because Eddy Cue fired Williamson [1]. Maps might have been the straw that broke the camel's back, but Forstall was fired also because he couldn't work well with Bob Mansfield (who came back out of retirement after he was gone [2]) and Jony Ive.
Then please set up a new account and tell us about it. I really liked Forstall's direction, and I don't think Maps was that much of a problem, given that it is still pretty poor after he left.
"@Jessicalessin This statement, wholly false, was made by the architect and head of the biggest launch disaster in Apple history, Apple Maps. In reality Siri worked great at launch but, like any new platform under unexpectedly massive load, required scaling adjustments and 24 hour workdays."
So Forstall and his direct report were responsible for two botched product launches, Siri and Maps. This is a big blow to the camp suggesting firing Forstall was a mistake.
Siri may have stalled now, but it definitely wasn’t a botched launch. And Apple Maps may not work great in all cases, but it did succeed in forcing Google to provide decent maps on iOS. Before Apple Maps, vector maps were an Android exclusive.
Apple Maps is like Bing: almost as good as the #1 service, but almost is not enough to make anyone care.
Apple Maps is remarkably good today, doing nearly all of the important things well. Here in Australia, it's on par with Google Maps for driving (slightly worse for novel route planning, slightly better for guidance) and when travelling to major world cities I've found Apple Maps is frequently better than Google Maps for navigating on foot and public transport.
In reality Google Maps is farther ahead of its competitors than most people realize.
Google Maps now provides wholly personalized points of interest and attention affordances in its UI. It's also well on its way to 3D scanning and annotating POIs in every structure in the world, including very large pedestrian-only interiors. It also has realtime transit and traffic feeds, and more pedestrian trails and venue detail than anyone else. All of these are areas where its competition is not just behind, but falling further behind. But the personalization combined with superior POI search really illustrates how Google's competitors don't even have the platform to properly conceptualize the type of product Google is building, much less build it.
It still feels years behind any dedicated maps app though (like TomTom), at least in the countries where I've tried it (UK, Spain, Poland, Germany). It's just consistently wrong about which roads are connected with each other; it doesn't know that certain roads are one-way, or that you can't make a left/right turn out of a certain street. I've reported dozens of mistakes and they just don't get fixed. Every single morning on my way to work it tries to route me left at a no-left-turn intersection. I've literally filed a map correction report for this 3 or 4 times, still nothing.
TomTom just doesn't have this issue; I've been using it for 5 years now and it's never done anything like that. Also, TomTom Traffic is vastly better than Google Traffic, consistently beating it for time and route prediction (I've been driving with Android Auto and a TomTom 5000 set up side by side for a while, and if TomTom suggests a detour it's usually a good one; take the one Google suggests and you're getting stuck on some small street that Google thinks you can drive through but you actually can't).
I don’t know if most of these things matter enough at the end of the day. Just to cherry-pick: Apple has 3D, interiors, traffic, transit, and pedestrian data. Not everywhere, but it’s a problem which time and money can fix, both of which Apple has in abundance.
Also anecdotally I’m not sure if I can agree that Google’s POIs are vastly better, mostly just different but of a similar low quality (which is to say neither pleases me).
You might be right, but most of those reasons don’t actually matter. To be frank, the only things I miss when not using Google maps is their highlighting of high activity areas.
Both maps have similarly good cartography where it matters. In most countries, both maps are similarly accurate.
When I'm in major cities in China, my experience is that Google Maps is almost unusable (not due to the censorship, as I use a roaming SIM which avoids those problems, but rather due to wildly incorrect data, missing metro lines, etc.), while Apple Maps is at least on par with the quality that Google Maps achieves in other countries, and often much better.
I find it interesting that Google doesn't care enough to add the most recent several metro lines in (for example) Beijing or Shanghai, considering that many people try to use Google Maps in China when visiting and usually seem to end up switching to Apple Maps as a result of the complete trash the Google Maps experience is there. I wonder if they switch back when they go home.
When you set hard deadlines for software release dates because you want tentpole features for iOS releases, and you mostly refuse to do widespread beta tests because then it won't be a surprise, you're creating the Maps debacle. Now I'm sure Maps was harder than it looked, but I'd also bet lots of people on the inside knew that. They should have let the date slip.
This is all moot with respect to Jobs. He resigned due to health reasons 20 months before Apple Maps was launched and died almost a year before it was launched.
They couldn't let the date slip, because the deal with Google was set to expire and not having a built-in maps application at all was not an option.
The question you need to ask is, did a pushy management style make Apple Maps better than it would have been otherwise or worse. I think it's hard to argue that pushing the team hard didn't produce a better product than they would have had otherwise.
I honestly think Jobs thought that no problem is hard enough that you can't cajole or threaten staff into just working harder. You see it all through his career.
We are hitting territory where some problems are so hard that the software simply takes time to create. He never got that.
I know, I couldn't believe the chief of a major software product at Apple would talk smack about the previous team. It's what a newbie might do when he has screwed something up. Classic dodging.
Something always makes me scratch my head. Is there any proof that actual people use and value Google and Amazon assistant more than Siri? Or is that perceived leadership decided only by self-proclaimed experts and funky ad-hoc metrics?
I totally confess that I use Siri for only 3 things:
- Deactivating all my wake-up alarms during my morning poo
- Setting timers while baking
- Sending an « I’m coming » text to my loved one before riding my bike home
Yes, this is a very dumb and limited usage. But I don’t see the Android users around me doing incredible things with their voice assistant that I could be jealous of. Am I missing something? Or maybe it’s because I’m not in the US and we still have to catch up with the trend of using voice assistants more?
Google's voice recognition is excellent. I trust Assistant to help me throughout my day with a number of things now:
- Cooking & laundry timers
- Reminders (remind me to do X when I get home)
- Directions (Take me to the nearest X, take me home, etc)
- Checking hours of business (what time does X close today)
- Initiating phone calls (Call X)
- Checking weather (is it going to snow today? How cold is it right now?)
- Conversions (what's X ounces to ml)
- Quick stock checks (what's X stock price)
- Playing music around the house (play X on the kitchen speakers)
Additionally, I've found myself using Android's voice recording capabilities to dictate text all the time. It's just so much faster than typing or swiping.
I haven't had an iPhone for a few years, so I can't compare with what Siri is capable of today.
Checking business hours works well; the Yelp integration handles that pretty well.
I've actually been super impressed with Siri. You can have it call you an uber, get the leading scorer from your favorite hockey team, contextually ask followup questions about a person you were searching, etc. I haven't used Siri in a long time (besides timers, weather, etc) and it has come so far since I did.
I use it for these things, they're ok. I occasionally try to get it to control an app, it usually fails, but that's my dream state: I'd like to be able to say, "Hey Google, open app X and do Y task."
But I don't know if there's enough incentive for apps to add an assistant interface. How easy is it to add commands on your app that it will accept via assistant? Are there common APIs or structures for this to shift the incentives, allow it to proliferate app by app, rather than waiting on the assistant developers to manually do each one?
After that we need to go back to Aza Raskin's project, uh, Ubiquity? Anyone could build little scripts, verbs, that helped build a natural language input from the ground up. That's how we get to stuff like, "Hey Google, take a screenshot of news.ycombinator.com and email it to Joe."
I can open apps and ask for actions via Google Assistant on my Pixel phone. I use things like:
- "Play <song> on Spotify"
- "Send message <message> to <someone> on WhatsApp"
- "Take a selfie or screenshot"
Maybe app actions are only available on Pixel devices.
No, they are available on all devices (at least on my Nexus 6P too). It's only a matter of which apps are implementing the "Actions on Google Assistant" and for which operations.
I want every app to start thinking about voice ux, but that's probably down to the individual apps, and you're right that some do. I can play some things on Stitcher, say, though I can't remember if I have to use podcast names specifically or if I can use my playlist names.
Some Google apps don't seem to support it though.
Ok Google, log my weight on Google fit as X? Search results page.
Ok Google, play (my playlist name) on Google Play: plays unrelated radio station inspired by my playlist name, with sometimes comical results.
> Are there common APIs or structures for this to shift the incentives, allow it to proliferate app by app, rather than waiting on the assistant developers to manually do each one?
I think once somebody finds a good API for that, it will be a watershed moment.
Amazon Alexa has apps called "skills", but they expose their own voice interface which is always inconsistent.
I always thought that Android's intents could be a good API for voice commands. It actually surprised me that we never really got much deep app integration with voice on Android phones given that API.
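To make the inconsistency concrete, here's a rough sketch (in TypeScript, using Amazon's ask-sdk-core Node SDK) of what a single Alexa skill handler looks like. The intent name, utterance, and spoken response are made up for illustration; the point is that each skill defines its own intents and sample phrases in its own interaction model, which is why the voice interface ends up different from skill to skill.

```typescript
// Hypothetical sketch of an Alexa custom skill handler using ask-sdk-core.
// "ToggleLightIntent" and the response text are illustrative, not a real skill.
import * as Alexa from 'ask-sdk-core';

const ToggleLightIntentHandler: Alexa.RequestHandler = {
  canHandle(handlerInput) {
    // Claim only requests that match this skill's custom intent.
    return Alexa.getRequestType(handlerInput.requestEnvelope) === 'IntentRequest'
      && Alexa.getIntentName(handlerInput.requestEnvelope) === 'ToggleLightIntent';
  },
  handle(handlerInput) {
    // A real skill would call its own backend or smart-home API here.
    return handlerInput.responseBuilder
      .speak('Okay, toggling the light.')
      .getResponse();
  },
};

// Lambda entry point: each skill wires up its own handlers, and its interaction
// model defines its own sample utterances, so there is no shared vocabulary
// across apps the way a common intent API could provide.
export const handler = Alexa.SkillBuilders.custom()
  .addRequestHandlers(ToggleLightIntentHandler)
  .lambda();
```

That per-skill wiring is exactly the watershed-API gap mentioned above: until there's a shared structure, every app's voice interface is bespoke.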
Sigh. That’s the main reason I won’t move to the s9. I already have an iPhone and Siri just sucks. Why would I want another sucky assistant? It’s like hiring the wrong person... twice.
I really wish the manufacturers would only have a few features, like weather or timer or location (home, work, school) that were truly bulletproof in execution. It feels like they are trying to do too much most of the time when I just want them to not suck consistently so I can actually use them reliably.
Interesting. I guess I thought that discussing how to model a software problem would be an acceptable and interesting conversation on HN. Do any of the downvoters care to contribute in a productive way? I'd be curious to hear what people found disagreeable in my question.
Nest doesn't have to be accurate to the minute to be effective. A reminder keyed to "before I leave work" would have to be.
For instance, if I don't know what time I'm getting off work and I want a reminder to call my wife when I get to the car, it could do that completely accurately based on the phone connecting to the car. It could only guess about doing something before I left work.
Which will, of course, trigger the reminder as you walk down the street away from home, forcing you to go back and get the change of clothes that you forgot.
Android's voice typing has been weird for me. It used to work pretty well offline, then a few months ago all the punctuation stopped working. Now it's working again and the overall recognition is better than ever.
This is strictly anecdotal, but a friend of mine is an exec at Facebook and we were recently discussing phones. He had an iPhone for a few years and has now had an Android for a couple of years. He said that by far the most drastic difference between the two platforms is Siri vs. Assistant (or whatever Google's assistant is called on the phone). He said it is no contest how much more useful and functional Assistant is compared to Siri. We did not dwell on this, and I otherwise find him an objective, unbiased user of the two platforms, so there you have it.
Can't speak for the other person, but as someone who's used iPhones exclusively since the original, I find Siri makes mistakes frequently enough that it's subjectively just bad. I can't quantify what percentage of requests fail to be understood correctly, but it's high enough to be annoying. It's even worse when using the Apple Watch (Series 3).
Unexpectedly, this can sometimes be a positive too. "Hey siri, turn my lights <pink/orange/purple/blue/red/white>" has on multiple occasions been interpreted as "Hey siri, turn my lights", which then results in the lights being turned off. This is hugely entertaining when you're demonstrating your new Philips Hue bulbs over FaceTime, and the person on the other end now just sees your camera go black instead.
I often want to set a timer for exactly 50 minutes so I know when my laundry is done.
But Siri always interprets "fifty" as "fifteen". It makes the same mistake pretty much every single time, no matter how clearly or slowly I try to pronounce it. I have to say "Hey Siri, set a timer for five zero minutes" to get it to pick it up accurately.
I'm a native English speaker, though whether I have an accent depends on your perspective! I can do either American or British somewhat convincingly, and I usually don't have any trouble being understood by Siri.
I suspect the problem with "50" is that whatever ML underpins it is heavily tuned/biased towards "15" because a lot of people probably set 15 minute timers all the time. 50 minute timers are much less common, I suspect?
Americans think I sound British.
British think I sound American.
Siri fails less often when set to British instead of American for me, but still fails too often regardless.
Walt Mossberg has a funny "series" on his Twitter where he screencaps all the blatant failures he has with Siri. Many times, he or a follower will attach a screenshot of Assistant completing the action. It's not scientific by any means, but it's fun to go through and provides some concrete examples. His biggest issue seems to be that Siri is unreliable: there are queries that work, but only sometimes, and that makes the experience of using the product terrible.
If Apple engineers who work on Siri were required to do all of their corporate communication with Siri exclusively, Siri would rule inside a year.
That is, assuming they were trusted to work on it instead of reporting to status meetings, or meetings with their dotted-line manager, etc. Apple was built around Steve. It’s going to take a while.
But it's not personified to the degree Siri is. I noted this elsewhere in the thread.
Doing that bit of theatre, with some humor, pop-culture references, even just good responses, appears to matter more to people than raw "it can do it" and "it actually did it" hit rates.
OK Google needs a name.
Google is this big, smart thing. People get that.
But, "Ask Siri" has less friction and appears easier to identify with than giving an order to a corporation is, "OK Google."
Maybe just, "Hey Google" would make a subtle difference.
And, in typical Apple fashion, "Ask Siri" is all you need to know to attempt it.
Further, discovery has error. If that can be best guess, combined with some fun, people will play.
"OK Google" lacks these human friendly connotations.
Adopting a name now would be a follower move, hard for Google, but... if it were me, I would flat out own that, pick a name, and run ads: "We hear you... meet [a name that's not common, familiar-sounding, friendly, easy to say]."
There has to be either an old world name with good connotations, or a variant that is catchy, fun, gender fluid.
Lol, use goog. Mostly kidding, but a tiny bit serious.
Better, much better is out there.
Or, let the user name it. Why not?
"OK Google, can I call you inigo?"
"Sure, I will answer to inigo. And OK Google when I'm not properly introduced, OK?"
I'm not a huge fan of Siri's 'jokey' personality, but I could live with it if it actually worked. As it is, it just feels extra frustrating to have a 'personal assistant' that is extremely dumb and cracks jokes.
Supposedly these devices have low-power chips that can only recognize certain pre-configured names (like "OK, Google"), and then wake up the main CPU to do the rest.
How about continuity of communication between Google Home and the phone?
In my kitchen, I can ask Google Home "how far is X" and it responds with a distance/time and pops up a notification on my phone to initiate Google Maps navigation to the target. Useful if I'm getting ready to go there.
Does Apple's HomePod have any such continuity with an iPhone?
This is what I experienced years ago when I last had an iPhone. Google's voice recognition is exceptional. I can mumble words and it still seems to contextually figure out what I was probably trying to say.
Another one is when Siri tries to update an existing reminder when I say “create reminder”
I have Apple Music and I’ll often use my little Echo Dot when I want to play a song just because I can simply talk to Alexa from across the room and it will hear me.
I’ve started using Siri to control music when I drive. “Hey Siri” followed by “pause music”, “next track”, “go back”, “play music” all work as expected.
I've yet to meet anyone who got Siri to do anything, let alone anything useful, on the 5th try. I can mumble microcontroller part numbers to Google, or other odd non-standard words/phrases, and 90% of the time it's correct on the first try; half the time I'm astonished that it was even able to figure out what I said.
It's how reliable it is when it does it. I use both iOS and Android, and Google Assistant is so much more reliable that I find I use it. With Siri I find I end up having to intervene so often it isn't worth it.
I do have an Australian accent, so maybe that is a factor.
All of those Siri can do maybe half the time. In my experience, it's a completely unreliable service. Sort of embarrassing when your voice assistant can't perform the most basic possible task on an iPhone: locking the screen.
I ask Siri for the closing time of different stores all the time. "Hey Siri, when does X close?" works for me. On Siri most things don't work for me but that does.
So far unmentioned in this discussion is the only reason I use Android voice assistant: Android Auto. When I'm driving my car I obviously can't type on my phone, so I use my voice for everything: changing the navigation destination, calling for music to be played, dialing the phone and sending text messages, amusing children by asking it stupid questions, etc.
I carry an iPhone on my person but I use Android in my car. I don't ever find myself using Siri, but I use Google Assistant a lot. I tried using CarPlay a few times but the voice recognition just wasn't there. I distinctly remember trying to get it to navigate to a restaurant called Da Nang Quan and it just was stumped. Google can recognize these kinds of things. I've also noticed it has no problem recognizing the spoken commands of my young children which generally speaking Siri can't figure out at all.
There's an Ike's Sandwiches like literally 30 feet from my office in San Jose, and I always like to ask Siri when it closes. To this day, I've probably tried on a dozen occasions and it has never once gotten it right. I'm a native English speaker with no accent.
I tried again just now, this is what I got:
Me: "What time does Ike's sandwich close?"
Siri heard: "What time does Eichs sandwich close"
Siri responded: "Hmmm... I don't see any hours for Rick's Sandwich Shop on Elkins St, Boston"
Things like this are why I've all but stopped using Siri.
OT but what do you mean by "English speaker with no accent"? Clearly every person has an accent, and every system that understands language must be able to understand the various accents that people have.
In the US, people say "no accent" to mean what the rest of the world calls an American Accent (even though American Accents change with region, just like any country). It's a very US-centric way of looking at things, but means no harm.
I've lived here in the US for 25 years and I notice the phrase every single time. :-)
Right, you caught me :) what I really meant to say is American English speaker with no strong regional markers (i.e. Southern, New York). Maybe slightly Midwest, but I haven't been back in decades so it's pretty weak.
More importantly, I was distinguishing because while Siri should be able to understand various accents, it has struggled with that. When it launched, there were only two English variants (US and UK), so if you were a native Indian or Australian English speaker (for example), your experience would suffer. Now Siri supports 9 different English variants, so they've clearly come a long way, but I remember it wasn't always the case.
There is no such store as "Ike's Sandwiches". There's "Ike's Place", and there's "Ike's Love and Sandwiches". You might have better luck if you use its actual name.
That was part of my point though. Siri should be smart enough to understand my meaning (and Apple heavily marketed that aspect of it). If I put "Ike's Sandwiches" in Google (or even Apple Maps) they both know what I mean. Siri somehow thinks I'd be interested in a restaurant over a thousand miles away.
We bought an Amazon device to use as a kitchen timer. It sits right next to where everyone charges their iPhones. Nobody uses Siri. It’s sad. I want to love her, but I honestly can’t remember the last time she was useful, she only works within Apple’s ecosystem (and I refuse to use Maps; it’s tried to kill me on multiple occasions), and I can’t honestly remember the last time I was impressed with her progress forward as a tech.
If the sum of 20 years of voice technology is a glorified stopwatch, you've got to wonder...
So I just tried to replicate the query for the restaurant Da Nang Quan (which is in Oakland California) and this time it actually makes a sensible interpretation of my question ("Danang Quan") once out of five times, and got close another time ("Danang Kwan") and got complete nonsense the other three times. Unfortunately all three of the results it tried to give me for the nearly-right questions were in Vietnam which again is not anything like the answer I needed.
I recently switched from being an iPhone user to a Pixel 2, and I can say that the difference between Siri and Google Assistant is significant. Enough so that I rarely used Siri, and I use Google Assistant every day.
It isn't that Google Assistant is more feature-packed, more that it just works much better, especially with slightly more complex queries.
I use Google Assistant all day, and know people who do the same. It's actually amazing, and although Apple was the first to ship a decent voice assistant, Google was the first that understood my accent at all ("Indian English"). Once we had that, it felt like a shame to even ask Siri a question. Now that Siri has an Indian English version as well, I still compare it all the time with my friend's latest iPhone, and it still sucks, while Assistant understands me every time.
> "Is there any proof that actual people use and value Google and Amazon assistant more than Siri?"
This is a common confusion among iPhone users. I suspect the reason is that if all you're used to is Siri (and its myriad disappointments), it's hard to conceive of the quantum leap that Google Assistant represents in product polish and usefulness.
First and foremost is the near-flawless voice recognition. I, even with my Indian accent, can just say things to my Android device (and now my new Google Home) and it just... gets it. It's spooky.
Then, after it recognizes the question/request correctly, it is very likely to take the correct action, with a high enough probability that it's now almost a surprise to hear "Sorry, I don't know how to do that yet" when it does happen.
Outside of the active queries, there are things it does automatically that are at the level of what a particularly attentive human assistant might do for you.
It'll check your calendar and remind you that you should start driving for your 4pm appointment because the traffic is unusually heavy on 101 today.
Or that your flight tomorrow morning has been cancelled. It knows about your flight because of the confirmation email in your gmail account.
Remember, you have to take no action for these sorts of reminders. You just have to opt in when Assistant is setting itself up on your new phone.
There are myriad other such examples. It has saved my bacon numerous times. We're truly living in the future.
You nailed it. People think this technology is not that useful because they have only used Siri and Alexa. Use the Google Assistant and you will see it is very helpful.
I have used all three and can say from personal experience the Google Assistant is head and shoulders in front of the other two in UX.
Siri is something that's occasionally useful, if you think to use it.
Alexa and now Google Home can integrate themselves into your whole life. Alexa is the primary interface for the home automation at my home and my girlfriend's. Her 9 y/o kids can use it. Our 67 y/o mothers can use it. We've got a baker's dozen of Alexa-enabled speakers between us, covering all of our living spaces and half the bathrooms.
They get used constantly.
Many of our use cases are "trivial" but having a voice assistant everywhere instead of tied to phones / tablets greatly changes their utility.
I've had similar confusion based on the anecdata: none of the Android users I know (including myself) ever use Google Assistant. Or if they do, they don't do it in public. Most iPhone users I know use Siri at least a little bit. So is Siri actually better? Or is there something else going on that makes it more usable even if it's technically not as "good"?
And the quantity of articles complaining about Siri would seem to back up the idea that Siri is getting used a lot more. Google Assistant is certainly not perfect; it's got some glaring flaws [1]. If people were actually using it, they'd be complaining. The lack of press coverage about Google Assistant's weak points is not a good thing.
I have a related anecdote. After being on Android since version 1.5, I recently switched over to an iPhone (8) due to privacy and security concerns. That's when I began using voice assistants for the first time. For me, it was a trust issue: I simply had an easier time trusting Apple's Siri with my voice. On Android I would disable every non-essential Google feature that I could. I don't feel like I need to do that on iOS.
I use Google voice input a ton, for dictating messages while in the car and for setting reminders and calendar notifications while walking. It probably adds up to around an hour of speaking a week.
I'd say your friend is unusual. The people I know who have Google Assistant use it all the time, myself included. I also carry an iPhone and never use Siri. Like, never.
The things you listed are my primary uses of Google Assistant, but here's some others. What time is it in [location]? What's the temperature outside? Change the temperature inside (nest). Turn off all the lights.
Really the only thing you can't do with Siri that I'm aware of is home automation. But I could be wrong about that.
Siri can do home automation with HomeKit and it actually works really well. The disadvantage here is that HomeKit has less supported accessories because, as I understand it, it's less open/requires certification of the accessory.
That said, for engineers in the crowd I know that I and others get around it by making a HomeKit RPi adapter for things that don't support it...
I thought Apple required HomeKit devices to go through a certification process that includes an audit of device security controls (things like secret management, crypto, default passwords, etc) — and very few of these low-cost IoT devices worry about security.
Apple won’t certify them, but there’s not yet a mass market for secure IoT devices (which would necessarily cost more due to the extra engineering required). It’s a bit of a chicken-and-egg problem.
I recently got a certified smart lock, and it's been superb. Any communications outside my network are handled through Apple/HomeKit and I trust them a lot more than some random Chinese IoT supplier.
> That said, for engineers in the crowd I know that I and others get around it by making a HomeKit RPi adapter for things that don't support it...
Yep, they're using HomeBridge. [0]
I'm planning on doing this for my garage door (with a little WiFi dev board) and for a video doorbell (with a RPi with camera). Others have used HomeBridge with a cheap $20 Chinese IP camera to do the same thing, although you have to install an alternate firmware if you don't want the camera to phone home to China... :O
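For anyone curious, a Homebridge accessory plugin is not much code. Here's a rough sketch in TypeScript of a single fake switch exposed to HomeKit/Siri, based on Homebridge's documented plugin API; the names and the relay-toggling part are made up for my garage-door case, so treat it as a starting point rather than a working plugin.

```typescript
// Hypothetical minimal Homebridge accessory: exposes one switch to HomeKit so
// Siri can toggle it. Device-specific bits (e.g. driving a relay pin) are stubbed.
import {
  API,
  AccessoryConfig,
  AccessoryPlugin,
  CharacteristicValue,
  Logging,
  Service,
} from 'homebridge';

class GarageSwitchAccessory implements AccessoryPlugin {
  private readonly switchService: Service;
  private readonly informationService: Service;
  private isOn = false;

  constructor(private readonly log: Logging, config: AccessoryConfig, api: API) {
    const { Characteristic } = api.hap;

    // A plain HomeKit Switch service, named from the plugin's config block.
    this.switchService = new api.hap.Service.Switch(config.name);
    this.switchService
      .getCharacteristic(Characteristic.On)
      .onGet(() => this.isOn)
      .onSet((value: CharacteristicValue) => {
        this.isOn = value as boolean;
        // This is where you'd actually pulse the relay / call your device.
        this.log.info(`Garage switch set to ${this.isOn}`);
      });

    this.informationService = new api.hap.Service.AccessoryInformation()
      .setCharacteristic(Characteristic.Manufacturer, 'DIY')
      .setCharacteristic(Characteristic.Model, 'RPi Relay');
  }

  getServices(): Service[] {
    return [this.informationService, this.switchService];
  }
}

// Plugin entry point: Homebridge calls this with its API object.
export = (api: API) => {
  api.registerAccessory('GarageSwitch', GarageSwitchAccessory);
};
```

Once the bridge is paired in the Home app, Siri should pick it up like any other accessory, so "Hey Siri, turn on the garage switch" ought to just work.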
I have a Kwikset Kevo that I enjoy. If I have guests they just install the Kevo app and I can send them temporary keys that expire. I don't use the integration, but apparently it's integrated with Alexa.
Sorry, slow to reply. It's a Schlage Sense. The major selling point for me was that it has a keypad, so I can give non-tech-savvy folks an easy way in (plus a backup in case of a flat phone). If you don't want that, there are nicer-looking ones.
> Is there any proof that actual people use and value Google and Amazon assistant more than Siri?
I found at least one measurement report: As of July 2017 Verto Analytics showed Siri on top[1] with 41.4M unique users per month (but falling), and Alexa with only 2.6M (but rising fast).
Well from my experience my wife never uses Siri on her iphone and always tells me to ask Google on mine even when she has her iphone in hand. I always tell her just ask Siri and her response is always, "you know Siri sucks". I honestly know this but only tell her to ask Siri to annoy her for interrupting me to ask Google something on my phone.
Honestly my own experience makes me wonder how many of those "users per month" either accidentally invoked it by holding down the button on their phones or had Siri just pop up and start yakking because its hotword detection went wrong.
My company had to switch from Microsoft to iPhones. I hear Siri every day, usually followed by swearing.
When I got mine, I was trapped in some Siri -> lock screen -> Siri loop. "Siri, go away" and "Siri, show me the home screen" didn't help either. I just wanted to get to the home screen, but since there is no back button... well, it's disabled now.
One of my favorite things that Alexa can do, which Siri still can't, is integrate with my Logitech Harmony and SmartThings hubs. This lets me control all my AV devices and Lutron Z-Wave lights using voice commands.
I hear there's now an unofficial bridge (it runs as a Node.js service) called Homebridge which might get Harmony working with Siri, and Lutron has newer lights, not using Z-Wave, that are HomeKit-compatible, but when I was shopping for all this stuff several years ago, Siri/HomeKit support was abysmal.
It also helps that the Echo's microphones and speech recognition are much better at far distances, and with background noise, than "Hey Siri" (assuming I don't have my phone in my pocket). At this point, I pretty much only use Siri when I'm in the car, and that's the environment where it seems to perform the worst (long delays to start listening, frequent misunderstandings, request failures).
SmartThings is the platform to beat for Apple. It's simple, easy, and incredibly powerful. I can't understand why Apple doesn't apply their magic to that model and just knock it out of the park. They have all of the pieces and the UX expertise to make it exceptional.
Personally, I get the most mileage out of Google's voice assistant when I'm driving.
It's a situation when I can't (well, shouldn't) use the phone's UI, so it's great to ask it to:
-find a place and check its hours of operation
-navigate to said place
-find a gas station along the way
-etc.
On occasion, I've asked it to add things to a calendar.
As a lazy person, I prefer to do things with the minimal number of steps. Scheduling things in a calendar requires an inordinate amount of clicks. On the other hand, it's something that a CLI could be well-suited for. For such things, voice control is natural.
As far as I can tell, the only advantage Siri has over Google Assistant is its integration with Wolfram Alpha. If you ask Siri "What is the angular diameter of Mars?" it will tell you the current angular diameter of Mars. Google Assistant can't do that. Apart from the Wolfram Alpha connection, Siri is inferior in every way, IMHO.
Agree completely. Siri works near perfectly for my basic uses: Alarms, timers, reminders, weather, stock prices, music, calls, texts etc. I concede that Google and Amazon's assistants are more advanced, if the measure of advanced is the number of gimmicky joke responses and Easter eggs they can say during a social gathering.
I see us at the beginning of a path that grants magic powers, controlling everything with our voice, or even our intention.
That requires machine learning, which requires data, which requires data collection, and the closest one to Google there isn't Apple or Amazon, it's Facebook.
My wife clicks the shutter on her iPhone and, without touching another button, can later walk into our family room, ask for fine detail in her photos, and have them appear in 4K.
We also have a 4K Chromecast and a Google Home.
Just buy, plug in, and log in. She was already using Google Photos.
Most ordinary people I know favor Siri despite its shortcomings.
The "character", theatre, framing, if you will, with Siri is superior.
Few people in my circles use the Amazon device at present.
Somehow, people like Siri. Siri is an agent in their minds and that is how they treat Siri.
The "OK Google" is one too, but not as well personified.
If I were to hazard a gut guess, based on an anecdotal sample, personification, as well as relevance to pop culture are big draws and can make up for technical gaps.
Personally, it's no contest. I use Google voice constantly. Works well enough for me to continue to invest in the experience.
You can probably point to a lot of quality failings at Apple as starting with Steve Jobs's death. Apparently he failed to teach and instill a culture of high quality and instead just focused on micro-managing everything himself. Steve Jobs may go down as both the best and worst CEO Apple ever had.
Not true at all. Look up the story of Apple University. Jobs can't run Apple from his grave. He tried his best to leave Apple with the right culture and leadership in his view. And for the most part, Apple has done really well since he passed. No one is going to get 100% of things right 100% of the time, and Jobs is no different; things would have inevitably gone wrong under his leadership, too.
Exactly this—a particular anecdote from the biography describes his motivation. Apparently when the HP Touchpad failed, Steve Jobs viewed it as a tragedy more than a triumph:
> "Hewlett and Packard built a great company, and they thought they had left it in good hands, but now it's being dismembered and destroyed. It's tragic. I hope I've left a stronger legacy so that will never happen at Apple."
In creating Apple University he was really trying to instill a legacy that would outlast him. That said, when pulling out the "Steve Jobs never would have let this happen" card, it's also important to remember that MobileMe happened on Steve's watch, as did the iPod HiFi.
I still have a working HiFi in my home gym. It’s a solid piece of engineering, especially for its time. I don’t know why people love to hate it so much.
I feel like sentiment toward the iPod HiFi is less "hate" and more "why does this exist". It was a perfectly solid piece of hardware, but Apple tried to enter a market that was already saturated (iPod docks) with a far more expensive product that didn't offer any significant advantages.
Jobs screwed up plenty of times. Look at the whole history of NeXT Computer for example. Software so far ahead of everything else at the time, but totally screwed up getting it into the market.
How was NeXT a failure when it was acquired by Apple and is the basis of macOS, iOS, watchOS, and tvOS? It's currently running on over one billion devices.
One thing I feel nearly certain about is that, had Steve Jobs been alive, Apple would never, under any circumstances, have released a flagship phone and a flagship laptop simultaneously that, out of the box, had no way of connecting to each other. I still can't believe that happened.
I disagree completely. By the time they shipped with incompatible cables / ports, wireless backup with iCloud, wireless sync with iTunes (who even "syncs" at all anymore...), and even wireless deployment and debugging had all been implemented.
Steve wouldn't have wanted you to connect your phone to your computer in the first place. That's why iTunes got wireless sync back before everything was in the cloud. The only really remaining piece was developers needing to plug in for deployment and debugging, but who cares - they're developers, they'll manage.
Yet, so many people plug their phone into their computer for the simple reason of charging it. Now perhaps wireless charging is coming, but the Apple cable fiasco sure inconvenienced anyone with that habit.
I don't see why. The Power Mac G5 still came with ADC for a year after they switched all their displays to DVI. (And the ADC-DVI adapter was $149.) Admittedly they weren't released simultaneously.
I dunno, Apple’s had some pretty terrible CEOs. Jobs might have been a dreadful micromanager and possibly a bit of an arse, but he did disrupt at least four major industries and lay the foundation for the most valuable public company ever.
Whereas Sculley, Spindler and Amelio were a parade of failure from start to finish.
I doubt very much he would have lost his job today for any of his behavior.
Look at what the U.S. President has and continues to do, and he still has his job, and a much more important and public one too.
Politics aside, Jobs' turnaround of Apple into the most valuable company in the world (for awhile, recently) would have ensured he remained at the helm for as long as he wanted.
What behavior are you talking about that would cost him that position today? Jobs wasn't physically assaulting employees. It isn't illegal to be mean or rude to people. At worst it might cost Apple a few workplace lawsuits/settlements.
Comparatively, I think we can safely say that Microsoft’s transition from a retiring Bill Gates to Steve Ballmer didn’t go as smoothly as what happened at Apple.
Wow, so much Siri bashing in here. I use it daily and am, I suppose, an outlier working on a now-aging 6S+, but Siri has been very reliable and generally fast outside of what I’d call “harsh” environments, like outside on a busy street or with poor reception.
I’m able to consistently perform:
- Reminders with or without location tags “add eggs to grocery list” “remind me to check oil when I get home”
- Calendar events “create event about mr Smith tomorrow 2pm” “create meeting with bob March 12 noon” (“with” spawns an invite, I use “about”) “move my 2pm meeting to 4pm” “what’s going on tomorrow”
- weather “what’s the weather like this week” “what’s it going to be like in Manhattan beach on Saturday”
- Calls, by name, number or direct to speaker “call my brother” “call 213-555-1212” “call my dad on speakerphone”
- FaceTime “FaceTime my mom”
- Email “Email my brother” Subject? “take out” Message? “Grab me a burger” ready to send? “Yes”
- Messages, one liner or conversationally “tell my brother I’m running late” ready? “Go” or “message my sister” message? “Call me.” Ready to send it? “Affirmative”
- Math “what’s 128 times 10 million?” “What’s 10 days from next Tuesday?”
- Time “what time is it?” “What time is it in Florida?”
- Dictation “blah period new line I swear exclamation mark new paragraph furthermore comma open parentheses really close parentheses all caps cats are all caps really all caps really cool question mark” I frequently do complicated technical dictation using Siri, it’s not a problem in fact this particular sentence was composed entirely with Siri with zero post correction.
All of these work consistently and reliably, for me, using either PTT style hold and speak, long-press and wait for the beep, or “Hey Siri” invocation.
Seriously, I use all of this all day every day on an old iPhone, I’m not blind but if I was I’d consider it reliable enough for those purposes.
I don’t do a lot of location, navigation, or searches with Siri because generally, performance is not optimal and you’re not able to focus the intent domain clearly enough with consistent verbiage.
Neither, nor. The loading bar, or whatever it’s called, that appears after the text is recognized while it’s waiting for a response just never finishes. And yes, I’m online.
Call me old-fashioned, but I find it neither smart nor acceptable to require reception for should-be-simple things like telling the time, reminders, or math/calculations.
The last one always irks me in browsers too, when I try to calculate or convert something in the address bar and it doesn't do anything because my Wi-Fi dropped.
I also use it daily and loathe Siri. I went from Android to iOS with the 7 and Siri is my only regret.
Siri gets about one in six of the timers I set wrong, and the texts I try to send (short messages like 'on my way home') are almost a coin flip at this point. My biggest annoyance is that Siri just doesn't handle natural language cadence like Alexa or Assistant do. Every time I mess up a Siri query, I end up having to repeat it slowly and loudly for her to (maybe) get it right. Google Assistant required no changes in my speech to understand me.
In my family we call this the stalker app. It's very good for finding out specifically where someone is, even in malls and other places where you might not expect GPS location to work all that well. (Thanks, surveil– I mean, wifi!)
It is a system where you have to give permission for people to see your location, and you can revoke it at any time should you wish, but it's still real creepy. The good for us (knowing where the immediate family is at pretty much any time) outweighs the bad, but I wouldn't dream of sharing my location like this with a friend – even a very close one – let alone strangers.
I'm using google maps location sharing for that. It's built-in and easy to toggle.
It's pretty useful for knowing when your partner will arrive if they're driving a long way, or for things like "going on a long bike ride, if I break down / go missing, you know where to look".
The cofounder/CEO of Siri contests the depiction of the scalability issues in the original Information article:
> This statement, wholly false, was made by the architect and head of the biggest launch disaster in Apple history, Apple Maps. In reality Siri worked great at launch but, like any new platform under unexpectedly massive load, required scaling adjustments and 24 hour workdays.
Siri is very dumb. I use it for reminders and alarms.
I speak fr-ca as opposed to fr-fr, and some conversations go like this:
Siri, call X on cell
Sorry, can’t call X
Siri, call X on mobile
Calling X on cell
In fr-ca, we talk about cell phones. In the contact app, the mobile phone is registered as cell phone, but in fr-fr, they use mobile.
I also sometimes have this conversation:
Siri, rappelle-moi de mettre des mouchoirs sur la liste de matériel du camp
(Siri, remind me to put tissues on the camp’s equipment list)
Do you want me to create a list?
I only have one list, on purpose. I have no way to tell Siri that I never want to create new lists.
It appears to me as if some words have more importance than others, such as the word « list ».
Those are my biggest annoyances with Siri. I will continue to speak and use my iPhone in fr-ca, to promote my language and improve the software by providing data points about which languages are used.
Eh, I'm not sure. The percentage of iPhone users worldwide who can speak some English is probably pretty high. As a non-native speaker myself, I'd certainly prefer a reliable assistant in English over an unreliable one in my native language.
FWIW the assistants probably would not recognize most of these people's accents. Not to speak of whether people would be inclined or not to make the effort to speak a foreign language to their assistant.
I'm French, have a good level in English, all my OSes are set in English. I used to use Siri in English but stopped cause it was weird. The ability to select a different language for Siri than the system language is a blessing.
This is my same experience. The only things I can get Siri to do reliably is to route me home, or to do simple math problems. Also very simple short term reminders.
The title seems poorly worded to me. I read it as "Why Siri underlies Google Assistant and Alexa," as if Siri were the backing technology. Should say something more like "Why Siri is lagging behind Google Assistant and Alexa."
Siri's amnesia is the biggest pain point for me. The lack of context, and the compounding errors when I can't extend an initial sentence, make it entirely unusable.
Anything useful requires speaking a longer sentence. Siri will interpret most of it but get one (key) word wrong, and Siri won't let me clarify my statement or add to it without starting all over again. So I can't use Siri, on top of it not understanding my accent (I've tried Australian, American, and English accents and using "G'day Siri").
I know there is an NLP engine in the background deriving entity relations - just let me add some more to the same context and remember it for 30 seconds and I'll use Siri a lot more.
E.g. "Hey Siri, search the web for minumum cut graph algorithms."
Siri: "Now searching for minumum cut Graaff I'll go rhythms."
Me: "Graph, not graaff."
Siri: "Hello, I'm a web server!"
Me: throws phone out of window
After switching from Android to iOS one of the things I missed most was Google Now in comparison to Siri. Before I was far more likely to say "Okay Google" and get surprisingly contextual results.
I will believe that someone is trying to improve Siri when the implementation is updated so that the time between when I push the button and when I can start speaking is reduced to the minimum possible, with only a short, reliably deterministic delay.
Siri's failure to give me any way to know when I can start talking (waiting for that horrible beep is just insanely terrible) completely breaks the experience. The fact that no one seems to have noticed this flaw enough to fix it suggests to me that the people developing the project do not have sufficient understanding of the interaction model to ever improve it ...
There should be zero network requests that need to be complete before I can start talking. If Siri is unavailable for network reasons tell me after (or during) my query — but don’t slow down every single interaction and add ridiculous amounts of uncertainty to the responsiveness of the interface for the sake of the uncommon cases ...
You don't have to wait for Siri to respond and you haven't had to for a while. Just say "Hey Siri" followed by the command with no pauses and it will handle it just fine.
This, I think, was poorly communicated. I found it out by accident; prior to that it seemed like Siri had taken a nosedive. "Hey Siri..." waiting... followed by the cancel tone, or sometimes "Yes?" and then the cancel tone.
I might try "Hey Siri" again now that I'm on an iPhone X. I'm not particularly a fan of always-on mics, and the experience in the past was never sufficient for me.
(Side benefit: as a result of this conversation I have discovered that it's possible to turn off the "activate Siri via side button" mis-feature. HURRAY! I can count on one hand the number of times I invoked Siri this way on purpose, and on 100 hands the times it's happened by accident. Yes, that does mean I almost never intentionally invoke Siri.)
This isn't true. On an iPhone 6, running the latest iOS version, with a brand new battery, there is a significant delay before you can speak. Instead, you have to hold the home button down.
Team dysfunction seems to be a part of it. The 'head' guy for Siri, Dag Kittlaus, left Apple to start Viv Labs within about a year of Siri's release. Their assistant, Viv, looked pretty okay[1], and was since acquired by Samsung[2].
I feel if he had stayed on, Siri would have been better.
Samsung seems to be trying pretty hard to get people to use it[3]. It's relatively new so I'm interested to see where it goes, especially since it has the lowest market share.
Not to mention Williamson shit-talking the Siri team to this reporter:
"Williamson denies the accusations that he slowed Siri development down and instead cast blame on Siri's creators.
'It was slow, when it worked at all,' Williamson said. 'The software was riddled with serious bugs. Those problems lie entirely with the original Siri team, certainly not me.'"
I remember that Viv got positive press when it launched. But how good did it actually become? It seems to me we'd be hearing more about Samsung's voice assistant if Viv really did live up to the hype.
Siri: I've noticed a massive slowdown in the speed of Siri and an actual deterioration in voice recognition (but I've just assumed that this is part of the general deterioration of iOS).
I would say that Siri has not only fallen behind but gone backwards.
I'm still running an iPhone 5s ;) because I don't like the direction Apple has taken certain things (and Android terrifies me with the sheer volume of personal data collection).
Yes, the data collection volume is pretty massive. I run a Raspberry Pi at home running Pi-hole to block DNS requests, and the amount of metrics and dialling home that devices do is pretty staggering.
That being said, my Windows 10 and Apple devices all do a similar insane amount of dialling home all the time so I don't think it is unique to Android.
In fact, you should probably run O&O ShutUp10 on Windows to disable all the telemetry (does it really need to know how you're using your keyboard?????), plus GlassWire or NetLimiter (I haven't tried them), and install Little Snitch for Mac (I use this and observe much dialling out).
>>"It was slow, when it worked at all," Williamson said. "The software was riddled with serious bugs. Those problems lie entirely with the original Siri team, certainly not me."
As Steve Jobs said, once you become a VP you can’t make excuses anymore. You have crossed the Rubicon and need to start taking responsibility for the failures of your team.
My impression of Siri is that it was effectively R&D early on by the team that made it. To propose that Apple would purchase a company, and then expect an R&D product to support their level of scale without serious infrastructure changes, is a bit absurd.
To say what cheald said in a more concrete way: "now we know why Siri was so dumb for so long" is definitely the title that I am currently seeing. The entire concept that there is a single title that somehow should match across sites is a weird concept made up by Hacker News that doesn't exist in the real world and probably causes more harm than it does good.
It's sort of sad to read a story about a story, and not be able to figure out which problems were created in the source vs. the retelling.
Mashable: "Right now, it looks like Siri won't be blown up and a [sp] rebuilt". Yeah, well, that'd most likely be a huge mistake[1] depending on your definition of "blown up" and "rebuilt", and also Siri's already been rebuilt twice.[2]
The Information: "One of Apple’s original plans was to launch its speaker without Siri included, according to a source." Of course — there are always contingency plans. To me, this is a much more damning comment about the Siri team than Apple's "vision".
Regarding [1], the move to Mesos was not a rewrite. Some core components had to be improved, but [2] was primarily an infrastructure upgrade that happened to make Siri considerably faster by ditching hypervisors. It was a massive feat that was accomplished by relatively few engineers.
There were always many components under review for reengineering; keep in mind that Siri was released before WebSockets was a thing, let alone an RFC.
The snowball effect of Siri's failures originated from iOS Program Management pushing us, from the beginning, to expand into as many other languages as possible, as soon as possible, thus diverting resources from implementing or polishing original "Siri, Inc." features and various "nice to have" improvements to en-US usage.
Source: I worked on Siri from the moment it was acquired, and no longer work at Apple. Richard Williamson's recent comments are pathetic.
Yeah, claiming that moving to Mesos was rebuilding Siri is stretching it a bit. The beauty of the Mesos transition was that most services needed only minor modifications to be moved to the new scheduler, and still get the benefits of better resource utilization.
Regarding building a speaker without Siri, the origin story I read from Bloomberg was that the idea of a great-sounding speaker that would adapt to room acoustics was dreamed up by Mac audio engineers, before Echo was even announced, so a speaker with a smart assistant built-in probably wasn't even on their radar at that point. I didn't necessarily interpret it as a lack of faith in the Siri team to deliver.
How, and for what, do people actually use Siri or Alexa?
I mean, I can't think of having seen anyone use them in a long time for anything but testing them out for curiosity's sake or demoing their new Alexa-controlled drapes.
Hopefully this wasn't too off-topic. Maybe HN should have the ability to create a poll and attach it to a thread. ;-)
Having two newborns has made me come to appreciate Alexa in a whole new way: my hands are almost always full with one (or both) babies and being able to use my voice to control the lights, play music for them, set reminders or alarms for myself, has been a godsend.
While my wife was bedridden after her C-section surgery, she loved being able to use the Echo's ability to call other Echos in the house (or outside the home) when she needed something. Kind of like an intercom system. The problem with the iPhone is that it was usually either on the charger, or the battery was dead, or it was left on vibrate. The Echo was always 100% reliable for her, which is really critical when you can't even get out of bed.
I use an Echo Dot on my nightstand to control my LifX lightbulb, adjusting its brightness between 1% and 100%, as my wife and I prefer extremely dim lights right before bed. When my daughter was younger, I also used the same Dot to set timers when we were trying to give her a few minutes to get herself back to sleep when she'd wake up crying.
I use a full-size Echo in the kitchen to set kitchen timers, to populate shopping lists, and to check the weather occasionally as we're going out the door. I also sometimes play music on it.
Recently, I've started using the Google Assistant on my phone to start music in the car (so as to avoid physically touching my phone while driving), and I've decided that it's easier to say, "Hey Google, navigate to 123 main st" than to use the touchscreen, even when touching it is an option.
One thing my wife and I use the Google Home for constantly is grocery shopping. We had previously used an app called OurGroceries that synchronizes a grocery list between multiple people. There's an integration for it on Google Home, so you can just say things to it and it adds them to your synchronized grocery list. We use it all the time when we notice we've run out of something.
In my opinion, the problem with these assistants is that interacting with them using voice gives you the impression that they're intelligent, but it's just a trick.
When it works, it can be useful for certain tasks, but when it doesn't work, you lose trust in the system pretty quickly, especially if you're in a situation where you expect and need the assistant to work, like when driving.
Are we surprised? Apple is not a software company, and it has never been good at software. Apple is a design company; they make pretty things. Their hardware and software look pretty. They figured out how to make their hardware robust. Their software, not so much.
It's what happens at large orgs, especially ones with leadership problems - they can't move fast to improve products, even when something is obviously not working.
The only thing I use Siri for is starting apps on the desktop - I've hooked it up to a keyboard shortcut. I find it's a bit faster than fumbling around Spotlight or the Dock. Alfred would possibly be even faster.
Google assistant has the best knowledge graph for generic questions, and much better machine learning models for voice recognition and language parsing. I primarily use it for route finding, finding out opening times, currency conversions, sometimes calling people.
Alexa I use almost only for music and timers, since it's hooked up to my kitchen speakers. It's terrible for general knowledge questions.
What? I think you drank the Kool-Aid, and then poured it all over yourself and took a bath in it.
I carry both an iPhone and a Pixel 2 XL, and there is no comparison between Siri and the Google Assistant. I'm a bit of an Apple fanboy, but I can be realistic.
> And when she works, like driving directions, people aren't even aware they're using Siri.
What does that actually mean? When people talk about Siri, they're talking about the thing that you talk to. I don't think anyone thinks they're not using Siri when they actually are.
Clearly a fanboy, I see. Numbers don't mean anything, technically. Siri is imposed on every iPhone owner. That should account for the numbers.
Also, you trust Siri for directions over Waze-integrated Google Assistant? Good for you player!
As for voice comprehension? I pity the fool who dares compare Siri to Alexa. But what do I know? You probably have a MacBook, an iPhone, and an Apple Watch, and couldn't imagine anything better in your life.
All the known techniques for NLP and NLU are mediocre. From Apple's perspective, it makes a lot of sense not to invest so much money for so little gain. Google Assistant is supposedly better, but I don't notice/care. Google and Amazon pour a lot of money into their personal assistant tech, but the end result is hardly noticeable or useful.
Say what you will about Siri, but it was the watershed product that unleashed a new generation of mass-market voice assistants, including late bloomers such as Alexa, Google Assistant, etc. Yes, it was (is?) relatively rudimentary technology, but the founding team was visionary. As for the mismanagement, Apple's bureaucracy is to blame.
I haven't used Siri, so I am guessing here: perhaps a major advantage of Google Assistant is that Google has more aggressively worked towards recognising and speaking regional accents. Assistant in India quite reliably understands Indian accents and speaks in an Indian 'News' accent.
And Google was working in that space really early. Did anyone here use GOOG-411? It was a service you could call way back in 2007, before most people had smartphones, and ask natural questions and get search results back.
Obviously that was early Google Assistant prototyping, and also a huge wealth of voice data to mine.
Not being able to flag requests that Siri failed to parse tells me that the Siri team isn't serious about making Siri better.
Just adding a small "That isn't what I asked" button or letting me say something like "Hey Siri, you got my last request wrong" would give Apple a mountain of usable data.
There's actually been a way to do this for a long time. In older versions, you'd see a wavy underline on certain words Siri was less confident about, letting you choose alternatives. The current version has a "Tap to edit" link.
The problem is discoverability. And a mountain of "this request was wrong" reports, without knowing what the answer should have been, would require too much manual sifting.
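To make that last point concrete, here's a minimal, entirely hypothetical sketch (none of this is Apple's actual API) of the kind of record such a "you got that wrong" button would need to capture: the bare negative flag still requires manual sifting, but pairing it with the user's correction turns it into a labelled example directly.

    from dataclasses import dataclass, field
    from datetime import datetime, timezone
    from typing import Optional

    @dataclass
    class AssistantFeedbackRecord:
        """Hypothetical record for a 'that isn't what I asked' report."""
        heard_transcript: str                  # what the assistant thought it heard
        interpreted_intent: str                # the action it actually took
        user_correction: Optional[str] = None  # what the user says they meant
        reported_at: datetime = field(
            default_factory=lambda: datetime.now(timezone.utc))

        def is_directly_usable(self) -> bool:
            # A bare "this was wrong" flag needs a human to sift through it;
            # a correction makes the record a labelled example on its own.
            return self.user_correction is not None

    # A flag with a correction attached is immediately useful...
    with_fix = AssistantFeedbackRecord(
        heard_transcript="call mom",
        interpreted_intent="play the song 'Mom'",
        user_correction="phone my mother",
    )
    # ...whereas a bare flag still needs manual review.
    bare_flag = AssistantFeedbackRecord(
        heard_transcript="call mom",
        interpreted_intent="play the song 'Mom'",
    )
    assert with_fix.is_directly_usable() and not bare_flag.is_directly_usable()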
If the Siri infrastructure is so bad, I don't understand why Apple hasn't already had several skunkworks teams trying to create a fresh alternative over the past few years.
Apple fell behind in voice on Jobs' watch and prior to the Siri acquisition. In fact, it's probably more accurate to say that Apple was behind on iOS progress and facing a ho-hum 4s launch. Hence, the quick 8-figure Siri buy and hurried integration that put them behind the 8-ball on voice ever since.
The broader context is that Apple doesn't seem to get: "It's the 'AI,' stupid!"
As an aside (OT), it's interesting the way Mashable essentially republished paywall-protected content. If there was new reporting there, I missed it. But that's the nature of modern media.
So the takeaway here is that it was dumb because it was poorly built?
Also, how different is that "Information" report behind the paywall? It would not surprise me to see Mashable rewriting an article but to do so with paid content seems inappropriate.
I’m going on the record as not actually liking the name Siri. Pithy though that sounds, it’s the Sharon or Ralph name in digital assistants and I don’t even like saying it. Try saying “hey Siri” in a quiet room sometime and feel the awkward.