> If I start calling "bananas" "apples" then I devalue the meaning of the word "apple". You can't differentiate which I'm referring to.
In French, potatoes are called what translates to English as "apple of the earth". Nobody confuses a pomme de terre with an apple, because nobody calls a potato an apple without the adjective attached.
That's what the additional adjective in the title is for. Apples and potatoes are vaguely related in that they're both plant-based food, but are otherwise entirely different; likewise, turning "software engineer" into a compound term with the extra word specifically differentiates it from the expectations attached to the term without it.
Software engineering is legitimately engineering by the etymological meaning of the word: creation through ingenuity. It isn't engineering by some of the other (mostly orthogonal) requirements we've layered onto the term in many contexts over the years, but it has as much claim to the word as part of its title as any other usage does.
> And the FAA won't become involved unless you're pointing them skyward.
The point here is that 'skyward' is where the laser's beam goes when you're trying to aim it at a camera up on a pole. It's practically impossible to point a non-fixed-position laser at something a non-trivial distance above you without spilling a large amount of the beam into whatever happens to be behind your intended target, which is very often the sky.
The best solution for dealing with AI content slop flooding your eyeballs is to hang out in places small enough to be a community -- like a local area mesh network.
AI slop thrives in anonymity. In a community that's developed its own established norms and people who know each other, AI content trying to be passed off as genuine stands out like a sore thumb and is easily eradicated before it gets a chance to take root.
It doesn't have to be invite-only, per se, but it needs to have its own flavor that newcomers can adapt to, and AI slop doesn't.
You can still find the essence of community on the traditional internet in places like invite-only discords, smaller mastodon instances, traditional forums, and spaces similar to Lobsters and Tildes.
...and not on Hacker News. Too many pseudo-anonymous jerks, too many throwaways, too much faith placed in gamified moderation tools.
Potentially, but those spaces are also increasingly being leveraged to identify and profile people for targeting -- see the latest Discord scandal, for example.
Depends what you're actually storing. There are plenty of cases where the timezone is not metadata; it defines how the datetime should be interpreted.
For example: your local mom and pop corner store's daily opening and closing times. Storing those in UTC is not correct because mom and pop don't open and close their store based on UTC time. They open and close it based on the local time zone.
You're conflating different concepts here. The actual moment of opening and closing can be stored in UTC, because it's a proper point in time. A schedule, though, is an algorithm, not a time. You can use a time-like DSL to encode that algorithm, but being a DSL, it can only go so far.
You don't need to store the timezone anywhere, you just need to know the current local timezone when the stored UTC time is used. And that's why storing in UTC is better, because it only takes one conversion to represent it in some arbitrary local time.
If you stored it as a local time (i.e. with a TZ), then if it's ever later translated to a different local time (a different TZ), you're now dealing with the quirks of two different time zones. That's a great way to end up off by some multiple of 15 minutes, or even by a day or two!
Heck, even if it's the same exact location, storing in local time can still require conversion if that location observes daylight saving time! You're never safe from needing to adapt to time zones, so storing datetimes in the most universal format is pretty much always the best thing to do.
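To make the DST point concrete, here's a minimal sketch in Node using the built-in Intl API. The America/New_York zone, the 9 AM opening time, and the sample dates are arbitrary choices for illustration: the same wall-clock opening time corresponds to different UTC instants depending on the date.

```javascript
// Format a UTC instant as the local hour in New York.
const nyHour = (instant) =>
  Number(new Intl.DateTimeFormat('en-US', {
    timeZone: 'America/New_York',
    hour: 'numeric',
    hour12: false,
  }).format(instant));

// In January (EST, UTC-5), 9:00 AM local is 14:00 UTC...
const winterOpen = new Date(Date.UTC(2024, 0, 15, 14, 0));
// ...but in July (EDT, UTC-4), 9:00 AM local is 13:00 UTC.
const summerOpen = new Date(Date.UTC(2024, 6, 15, 13, 0));

console.log(nyHour(winterOpen)); // 9
console.log(nyHour(summerOpen)); // 9
```

Both instants render as 9 AM on the New York wall clock even though they sit at different UTC hours, which is exactly the conversion you can't escape.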
It doesn't to me. It should be obvious that there are plenty of valid uses of dates and times which implicitly refer to either an exact instant in time, or the expression of a time in a certain reckoning.
A birthday doesn't have a time zone because a birthday is about the date on the calendar on the wall, not any universally agreed single instant in time; so what matters most when it comes to your birthday is where you are. Your birthday doesn't move to the day before or after just because you travel to the other side of the globe.
A deadline has a time zone because when your boss says he wants the project done by 4PM, he means 4PM wherever you both currently are -- and the specific instant in time he referred to doesn't change if you get on a train and move a time zone over before that 4PM occurs.
And it may in fact be a time zone, not just UTC with an offset: if your boss wants a certain report on his desk by 4PM every day, then when your local time zone goes into daylight saving time, that doesn't suddenly mean the report is due at 5PM instead.
In the first of these cases, the date itself has no time zone and is valid in whatever context it's being read from. In the second, the instant in time might be expressed in UTC with or without a specific offset. In the third, each of the successive instants in time may shift around with respect to UTC even while it continues to be referred to by one constant expression.
None of these are subjective interpretations. They're a consequence of the fact that as humans we've overloaded our representation of date/time with multiple meanings.
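The three cases can be sketched as three different data shapes. All names and sample values below are invented for illustration, and the brute-force resolver is just a sketch of the idea, not production code:

```javascript
// 1. A calendar date: no zone, no instant. Valid wherever it's read.
const birthday = { month: 6, day: 14 };

// 2. An exact instant: store as UTC (milliseconds from the epoch).
const projectDue = Date.UTC(2024, 2, 1, 21, 0); // 2024-03-01T21:00:00Z

// 3. A recurring wall-clock time: a rule, not an instant. It needs the zone,
//    because the matching UTC instant shifts when DST starts or ends.
const dailyReport = { hour: 16, minute: 0, zone: 'America/New_York' };

// Resolve a wall-clock rule to a UTC instant on a given day by brute-force
// searching candidate UTC hours until one formats to the wanted local hour.
function resolve(rule, year, monthIndex, day) {
  const fmt = new Intl.DateTimeFormat('en-US', {
    timeZone: rule.zone, hour: 'numeric', hour12: false,
  });
  for (let h = 0; h < 48; h++) {
    const candidate = new Date(Date.UTC(year, monthIndex, day, h, rule.minute));
    if (Number(fmt.format(candidate)) === rule.hour) return candidate;
  }
  return null;
}

console.log(resolve(dailyReport, 2024, 0, 15).getUTCHours()); // 21 (EST, UTC-5)
console.log(resolve(dailyReport, 2024, 6, 15).getUTCHours()); // 20 (EDT, UTC-4)
```

The same "4PM every day" rule resolves to two different UTC hours across the DST boundary, which is why a bare UTC offset isn't enough for case 3.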
idk about you, but I can get a happy birthday at the hour of my actual birth from people in the know, yet I never literally prepare things for the exact hour of a deadline. It's more of a whole-day sort of thing.
That's absurd. The system should be able to update itself without fear that it's going to break anything. The user should not be expected to have to set aside time to babysit an update.
Windows isn't perfect in this respect by any means, but it sure seems to handle updates a lot better than the distros mentioned in this thread: Windows at least examines your hardware first and doesn't apply updates when something is known to have fallen out of support.
Windows also, apparently unlike these mentioned distros, maintains a Last Known Good configuration, so if an update does start causing failures, it can automatically roll itself back (or, at worst, can manually be rolled back). There are some distros that do something similar, particularly immutable distros; but honestly this sort of thing should be table stakes, to the point that a distro which doesn't do it should be laughed out of the room.
There is absolutely no acceptable excuse for any operating system to break itself in an update.
The major JavaScript engines already have the concept of a type system that applies at runtime. Their JITs learn the 'shapes' of objects that commonly pass through hot-path functions and compile against those shapes, with bailout paths to slower dynamic implementations in case a value with an unexpected 'shape' shows up instead.
There's a lot of lore you pick up with JavaScript once you start getting into serious optimization, and one of the first things you learn is to avoid changing the shapes of your objects, because doing so invalidates JIT assumptions and makes your code run slower -- even though it's 100% valid JavaScript.
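A small sketch of what 'shape' means in practice. The performance effect itself lives inside engines like V8 and can't be asserted from plain JavaScript, so this only demonstrates the patterns; all names here are invented:

```javascript
// Objects created with the same properties in the same order share one
// shape (hidden class), so this factory keeps every point shape-identical.
function makePoint(x, y) {
  return { x, y };
}

// This function stays monomorphic if every argument it ever sees has the
// same shape, which is the fast case for the JIT.
function lengthSquared(p) {
  return p.x * p.x + p.y * p.y;
}

const a = makePoint(3, 4);
const b = makePoint(6, 8);

// By contrast, these are all 100% valid JavaScript, but each produces a
// DIFFERENT shape. Feeding them to a hot function makes it polymorphic,
// pushing the JIT toward its slower bailout paths:
const c = { y: 4, x: 3 };      // same fields, different creation order
const d = { x: 3 }; d.y = 4;   // field added after creation
const e = makePoint(6, 8);
delete e.y;                    // deleting a property also changes the shape

console.log(lengthSquared(a)); // 25
console.log(lengthSquared(c)); // 25 -- same answer, different shape
```

The lore, then, is mechanical: initialize every field up front, in one consistent order, and don't add or delete properties on objects that flow through hot code.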
Totally agree on JS, but it doesn't offer the same easy same-language comparison that you get from compiled Lua vs LuaJIT. I suppose you could pre-compile JavaScript to a binary with e.g. QuickJS, but I don't think that's as apples-to-apples a comparison as compiled Lua vs LuaJIT.
There's not some conspiracy that's stopped it from happening. Nobody, anywhere, has ever said "DOM access from WASM isn't allowed". It's not a matter of 'allow', it's a matter of capability.
There are a lot of prerequisites that need to be in place before there can be usable DOM access from within WASM, and those are steadily being built and added to the WASM specification: things like value references, typed object support, and GC support.