Given Tesla's abysmal track record on keeping their promises, I feel it is justified to dismiss them, at the risk of being surprised if they do make it.
* Elon has been making wildly exaggerated and over-optimistic claims for a decade and continues to do so
* Tesla has recently made huge strides in capability and has a clear path to full autonomy
And to be fair, many other car companies also promised self driving cars, e.g. Audi in 2014 promising driverless cars by 2016 [1]. It's just that Tesla is still executing on the promise whereas many other carmakers have fizzled out on their ambitions. As the Rodney Brooks article itself mentions,
> As a reminder of how strong the hype was and the certainty of promises that it was just around the corner here is a snapshot of a whole bunch of predictions by major executives from 2017.
I had that laptop and it is the worst computer I have ever owned. As soon as you booted, the fans would start spinning. There were sometimes kernel panics when plugging or unplugging Thunderbolt devices.
I have an M1 Max MBP now and it has been absolutely perfect.
My favorite and most painful issue was a bug in USB charging. Sometimes it would fail to charge from my monitor (USB-C) yet believe it was connected. The battery would eventually run down to zero and the machine would shut off without warning. No low-battery warning was shown because it believed it was charging, even though it was not. Resolved with my M3.
Also fun with that generation is that you can’t plug in a dead laptop and start using it right away. Takes about ten minutes of charging before you can power it on.
Also fun, it would not establish power delivery with my monitor in this state. I’d have to plug it in with a regular charger to bootstrap it. Also resolved with my M3.
Now that it’s aged, the super capacitor for the clock no longer holds charge and the time is usually wrong on cold boot. I wish that was serviceable.
The laptop it was replacing was a terrible Asus computer that literally started falling apart and delaminating after about six months. It felt like it would break a bit more every time I touched it. Asus themselves was wholly unwilling to do anything about it, and they acted like I had been juggling with the damn thing when all it ever did was live on my desk or next to my bed.
It was the third Asus computer I had owned that broke way earlier than it had any right to, and I swore a blood oath that I would not buy another Asus product.
Point is, considering how terrible that laptop was, “annoying thermal throttling” was still a considerable upgrade, so I loved it in spite of it.
The reason for minification is not to hide the source code (which is impossible), but to reduce the payload size served to clients. Web pages (even web apps) are documents fully available to clients, where users can choose to view, inspect, and even modify their source code.
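The point can be illustrated with a toy minifier (a sketch, not how real tools work internally; Terser, esbuild, and similar tools also rename local identifiers and rewrite syntax). Note the output is smaller but still perfectly readable text that every client receives:

```python
import re

# A JavaScript snippet, as an author would write it.
source = """
// compute total price with tax
function totalPrice(items, taxRate) {
    let sum = 0;
    for (const item of items) {
        sum += item.price;
    }
    return sum * (1 + taxRate);
}
"""

def toy_minify(js: str) -> str:
    """Toy minifier: strip line comments, collapse whitespace runs."""
    js = re.sub(r"//[^\n]*", "", js)   # drop // comments
    js = re.sub(r"\s+", " ", js)       # collapse whitespace
    return js.strip()

minified = toy_minify(source)
print(f"{len(source)} -> {len(minified)} bytes")
```

The payload shrinks, but `totalPrice` and the whole logic are still right there for anyone who opens the dev tools, which is the commenter's point.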
It's simply amazing. I was looking for a ~$6000 USD 14-inch laptop with good specs. NOTHING compares to what Apple has right now. I looked at Framework, some gaming laptops, ThinkPads, Dells, and most of them would require 16+ inches to get specs similar to a MBP 14 Ultra with 128GB unified RAM and 8TB disk. ...
Apple has done an amazing job integrating all that hardware. And I say this as someone who was looking to buy a notebook to install Linux, as it's my favorite OS.
So what I'm doing is putting Ubuntu Server ARM + kde-desktop in VMware and using it as my main dev env.
Tiger and Snow Leopard in particular were very solid releases.
Heck, the aluminum Macbooks from that era are still the foundation of Apple's laptop design. And they didn't have the butterfly keyboard fiasco!
But this is a bit of an irrelevant distraction. Apple under Jobs wasn't loved for the quality of its hardware; it was loved for telling a better story about the progress of personal computing. From the iMac's "make it simpler by going back to basics, but future-looking basics" to "easier to manage, funner to use" music players, through showing how smartphones and then tablets could be far more functional and usable than MS', Palm's, or Nokia's visions. The watch is the next best category-definer since then, and the iCloud cross-device stuff still generally feels better done than competitors', but otherwise... refine, refine, refine, and slowly add more ads and upsells. Microsoft or anyone else could run that playbook, in a way that they never could match the Apple playbook from 1997 to 2011.
(One side question here is "are there new segments out there waiting to be invented?" which I don't know the answer to. But even so, "becoming just another upsell-pushing, ad-driven, software-subscription-service provider" wasn't a necessary path.)
Snow Leopard eventually became a solid release. At launch it had many bugs, including some that lost customer data.
It’s tempting to compare one’s memory of an old late-cycle OS, after all the UI changes have been accepted and the bugs squashed, to the day-1 release of a new OS today, when UI changes seem new and weird and there are tons of bugs they knowingly shipped to hit the launch date (just like with Snow Leopard). But it’s not really a fair comparison.
iOS and iPadOS have gotten massively better over the years. The gap between them and macOS has been slowly closing. Still a lot to go, but so much has improved.
Apple's classic Mac GUIs were beautiful and discoverable, with clear, visible controls/affordances.
Running Apple's "Macintosh" screen saver reminds me that Apple used to care about every pixel. Now even basic user interface elements like the menu bar are clunky, with things like the Window menu not aligning properly (even on a wide display where there is more than enough space). Menus getting lost behind the notch is another annoying problem.
It seems like Microsoft learned from Apple's original approach somewhat, at least for Windows 95 through Windows 7 (though I think for a while there was a dead zone below the start menu, a fairly obvious mistake), but Apple seems to have strayed from the path with an invisible, gestural interface.
From a UI standpoint, I agree. There’s nothing like the classic Mac interface and its associated Apple Human Interface Guidelines for GUI software. I love Jobs-era Mac OS X, but the classic Mac and its ecosystem of applications were something special.
However, when it comes to UX, stability is a major component, and this is where Mac OS X is vastly superior to cooperative multitasking, lack-of-memory-protection Mac OS 9 and below. I prefer the classic Mac UI, but Mac OS X had a better UX.
Love Atomic Shrimp. There is so much variety in his content, and always a focus on curiosity, trial and error, and learning things. One of the most positive channels on YouTube.
Does it really need to think about the song contents? It can just cluster you with other people that listen to similar music and then propose music they listen to that you haven't heard.
That's one method they use, but "just cluster" is doing a lot of heavy lifting in that sentence. It's why Erik Bernhardsson came up with the Approximate Nearest Neighbors Oh Yeah algorithm (ANNOY for short).
> We use it at Spotify for music recommendations. After running matrix factorization algorithms, every user/item can be represented as a vector in f-dimensional space. This library helps us search for similar users/items. We have many millions of tracks in a high-dimensional space, so memory usage is a prime concern.
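The setup described in that quote can be sketched with an exact brute-force search over stand-in vectors (everything below is random data invented for illustration; the dimensions and track count are tiny compared to "many millions of tracks", which is exactly why ANNOY approximates this instead of computing it exactly):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for matrix-factorization output: one f-dimensional vector
# per track. Real systems learn these from listening data.
n_tracks, f = 1000, 8
item_vecs = rng.normal(size=(n_tracks, f))

def nearest_tracks(query_idx: int, k: int = 5) -> list[int]:
    """Exact cosine-similarity search; ANNOY trades exactness for speed."""
    q = item_vecs[query_idx]
    sims = item_vecs @ q / (np.linalg.norm(item_vecs, axis=1) * np.linalg.norm(q))
    sims[query_idx] = -np.inf  # never recommend the query track itself
    return np.argsort(-sims)[:k].tolist()

print(nearest_tracks(42))  # indices of the 5 most similar tracks
```

Note this touches every track vector per query; an approximate index avoids that full scan, which is the whole reason the library exists.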
I guess what I was getting at is that they do not technically even need to know genres to recommend songs. In practice, though, they probably have to know them anyway for playlists, but I assume they can have the song owners provide that when the songs are uploaded, and artists specify it when they create their profiles.
I was at a security conference recently and one of the presentations had some TLP:RED slides in it.
I couldn't help but find that pointless. The conference is open to the public, the only barrier to entry being a small amount of money to purchase a ticket. How would that prevent bad actors from signing up to access the sensitive information?
It absolutely makes sense when used within an organization where access/membership is properly vetted, but in this case I feel there was no point.
You're completely right: if it's not an invite-only or vetted conference (those exist), this is just a marketing gimmick to grab people's attention. People who do that either don't understand what you feel intuitively, or do this attention-grabbing thing intentionally. Just like "no media" presentations that then post their slides online later anyway.
You're right that it doesn't make sense. It suggests a failure in data handling (who can I share this with?).
A lot of these practices are borrowed from the US government, where prosecution is a relatively effective way to get compliance with such policies; but, and I'll take some license here, they get copied by unsophisticated players outside of that context in order to appear sophisticated.
That original Glacier API was infamous for being extremely cheap to write to but prohibitively expensive to read from. Something like 10 cents per list-objects request, or something ridiculous like that. I can't remember the specifics, but I do remember reading blog posts from people who wanted to restore a couple of files and had to pay several thousand dollars to do so.
I believe that they did alter the pricing at some point. Regardless, the move to just a storage class on S3 made everything much simpler.