We went from "review any service and its interactions, without touching the local system or network" to "defend local and remote logic, created on the fly, that mangles the local file system, and why that's a good thing" ...
That's not a productivity boost. That's a rapidly growing cognitive tax you're offloading for later, and as the review backlog builds up, you lose more and more control over what the code actually does...
Yes, if this selected piece is the best that was available to use as a showcase, it's immediately off-putting: the pronunciation is distorted and mangled.
I've been using Osmo oils, both their TopOil and their butcher block oil. They say it's food safe, but would it be fine for utensils that may get exposed to cooking temperatures? Say, stirring soup or a stir fry?
Osmo TopOil is actually mostly what it says on the can: wax + oil. The wax part will melt/degrade very quickly at cooking temps; the oil portion will not.
If you are exposing it to cooking temps and want something very natural, I'd just use a plain oil and not a "hardwax". The wax part isn't going to buy you anything.
"Hardwax" is just a made-up term that means nothing real: some of them use harder waxes (carnauba), some of them don't.
In any case, none of them will survive heat, because the wax won't.
Hold up. When I used C or a similar language to access a database and wanted to clamp down on memory usage, so I could deterministically control how much I'd allocate, I would explicitly limit the number of rows in the query.
There never was an unbounded "select all rows from some table" without a "fetch first N rows only" or "limit N".
If you knew the design was rigid, why not leverage the query to actually enforce it?
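To make the practice concrete, here is a minimal sketch in Python with sqlite3 (the table name and cap are hypothetical, just to illustrate putting the limit in the query itself rather than trusting the table to stay small):

```python
import sqlite3

# Hypothetical setup: an in-memory table with more rows than we budgeted for.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany("INSERT INTO t (payload) VALUES (?)",
                 [(f"row-{i}",) for i in range(1000)])

MAX_ROWS = 60  # explicit, deterministic allocation budget decided up front

# The cap lives in the query, so the database can never hand back
# more rows than the caller planned to hold in memory.
rows = conn.execute("SELECT id, payload FROM t LIMIT ?", (MAX_ROWS,)).fetchall()
print(len(rows))  # 60
conn.close()
```

The same idea is spelled `FETCH FIRST 60 ROWS ONLY` in standard SQL dialects; the point is that the bound is part of the query, not an assumption about the data.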
Because nothing forced them to and they didn't think of it. Maybe the people writing the code that did the query knew that the tables they were working with never had more than 60 rows and figured "that's small" so they didn't bother with a limit. Maybe the people who wrote the file size limit thought "60 rows isn't that much data" and made a very small file size limit and didn't coordinate with the first people.
Anyway, regardless of which language you use to construct a SQL query, nothing obligates you to put in a max row count.
I imagine there are numerous ways to protect against this, and protection should've been added by whoever decided on this optimization. In the data layer, create some kind of view that never returns more than 200 rows from the base table(s). In code, use some kind of iterator. I'm not a Rust guy, just a C defensive-practices type of dude, but maybe they just missed a biggie during code review.
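Both guards from the comment above can be sketched in a few lines of Python with sqlite3 (table, view, and batch size are hypothetical names, not anyone's actual schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE base (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany("INSERT INTO base (payload) VALUES (?)",
                 [(f"row-{i}",) for i in range(1000)])

# Data-layer guard: a view that can never return more than 200 rows,
# no matter what the application asks for.
conn.execute("CREATE VIEW capped AS SELECT id, payload FROM base LIMIT 200")

# Code-layer guard: iterate in fixed-size batches instead of fetchall(),
# so peak memory stays bounded even if the view's cap is ever raised.
cur = conn.execute("SELECT id, payload FROM capped")
total = 0
while True:
    batch = cur.fetchmany(50)  # at most 50 rows held in memory per step
    if not batch:
        break
    total += len(batch)
print(total)  # 200
conn.close()
```

Belt and suspenders: the view caps what the database will ever emit, and the iterator caps what the client holds at once, so neither team's assumption about "small" tables has to be right.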