A lot of open source projects suffer from “pet project” syndrome. Some maintainers have very strong opinions about how/what things should be done regardless of actual user demand or feedback, and there’s no one with the power to rebuff them, because the product owners deciding what to implement and the engineers doing the implementation work are one and the same. Commit right makes might.
In a (healthy) company, you have PMs and executives who will tell the overly opinionated engineers to STFU and actually implement things that move the needle and solve problems users are facing.
This is also why most open source projects have terrible UI/UX: any designer who attempts to help and improve things finds themselves ignored, with no means to actually carry out any decision, and walks away soon after.
In fact, larger open source GUI projects also tend to suffer from the opposite problem: with multiple equally powerful maintainers, a good number of contributors, and no one to rally them behind a coherent design, you get a messy bag of poorly documented features spread all over the place. Design by programmer is usually bad enough; design by a loosely connected web of programmers working almost independently is even worse. Also, a lot of the developers/users are ideologically motivated and heavily invested; when you point out how bad the software is for the average user, instead of reflecting on it they often get defensive.
> “and there’s no one with the power to rebuff them”
Except for those that discover the fork button. /s
In all seriousness I think that what you mean is that people rarely have the power to fork a project and drag all the users and development expertise over to their version. Especially if they are simply users of the system and not contributors. Users of commercial software can get companies to do stuff for them with their wallets.
> any designer who attempts to help and improve things finds themselves ignored, with no means to actually carry out any decision, and walk away soon after.
Ah yes, remember how long it took before image thumbnails appeared in the GTK file picker? Finally in 2022, but that was after countless forks and patches submitted that implemented them.
It stems from a bias: developers are power/expert users, and their interface habits match, which means they are heavily reliant on keyboard shortcuts (think Blender 1.x and 2.x, which leaned heavily, and at times exclusively, on shortcuts).
This gives them the idea that their software is very powerful (and probably is) but this creates a cliff to the beginner and intermediate level users.
A lot of examples can be found in “About Face: The Essentials of Interaction Design” by Alan Cooper.
Well, Resolve suffers from the same attitude in places. While a good and usable product, it has many design gaffes that go unaddressed year after year, denying users functions that are expected in any similar product.
The render-job UI is pretty shambolic (as is the treacherously inaccurate timeline on the same page), for example. And BMD refuses to add simple functions like "match timeline properties to clip." Instead they have a prompt that offers to match only frame rate, in contrast to just about every other media-editing app I can think of. Fixing that should be easy; and if it isn't, the codebase must be hopelessly tangled.
The integrations of Fusion and Fairlight are buggy and exhibit nonsensical and misleading UI behavior. Resolve has a node view... oh wait, now it has two node views, which are not integrated with each other or with the editor. Fixing that should be priority one, and could create the truly excellent hybrid product that some of us have been asking for for years.
And now they've ported the thing to the iPad. Really? I realize that from the outside we don't know how engineering people are allocated, but nonetheless it seems absurd to spend resources on that while the desktop product suffers from serious defects that have languished for a long time.
Honestly, the real mystery is why, in 2023, so many people are still confused about “why, despite the universe seeming so vast, there are no signs of other alien civilizations.”
We have all the elements of an answer if we focus on what we know and forget the hand-wavy sci-fi speculation.
1) Complex life is rare.
2) Reaching a space faring stage is even rarer. (We meet the most minimal definition of “space faring” you could come up with, and even then we got really lucky with so many things.)
3) The universe is huge. It’s like, the hugest thing there is, man. And except for some little bits of interesting dust here and there, it’s mostly empty. As empty as it is huge.
So, does life - in any form - exist elsewhere in the universe? Almost certainly.
Are/were there life forms elsewhere in the universe that escaped their home planet gravity to go explore their moon or other planets in their solar system? Seems quite probable.
Is there any shot we are sufficiently close in space/time to encounter such another advanced life form? Almost certainly not.
Is there even any reason to think interplanetary life is easy? We have a lot of stories saying so, and we intend to do it, but we don't even have a loose plan of what that would look like.
Creating a closed self-sustaining ecosystem capable of supporting large animal life & cut off from the earth's resources is not something we've been able to do even at a proof-of-concept level.
We're very confident in ourselves but idk. It's not preposterous that life is a planetary expression and that it's simply not possible to expand an instance of it beyond the planet that birthed it. We assume we aren't subject to this constraint but we haven't demonstrated it at all.
I meant that, among all the intelligent species that could be living in our universe, I can see interplanetary life being rather easier to achieve for some. You don't have to think far outside the box for this.
I can imagine some solar systems having multiple liveable planets for a single lifeform, which would make it something we could even do right now.
Or some transpermia onto planets that now host different but communicating lifeforms that are cooperating.
We can easily imagine faster than light travel but it simply isn't possible regardless. There are constraints on us other than our capacity for imagination.
We have never seen another living planet so we don't know what to expect from one. I touched on this originally but I'll be explicit now: the fact that we can't create a closed ecosystem even given a working example and diverse raw materials is a powerful indicator of our ignorance about the contours and possibilities of life.
I'm not saying it is impossible, but I'm saying that any confident assertions about its possibility, much less how "easy" it is, are fantasy if not pure hubris. It's quite possible that no level of advancement will allow for even merely interplanetary life.
Out of the trillions of galaxies out there and the trillions more solar systems, it is very likely that at least some of them have multiple planets with habitats suitable to whatever intelligent life evolves there. In that case, an interplanetary civilization would merely require reaching the other habitable planet, not trying to change its environment.
> it is very likely that at least some of them have multiple planets with habitats suitable to whatever intelligent life evolves there.
This assumption is exactly what I am challenging: that there is such a thing as "general suitability" to life.
We should be open to the possibility that a living system is only compatible with the specific circumstances it emerges from. We've seen nothing to indicate either way, of course, not having seen any other life. But we should not so comfortably assume it's a transferable process.
2 and 3 seem plausibly correct to me (although 3 is a mixed bag - the hugeness increases the number of dice rolls on 1 and 2 - not only the difficulty of connecting after the fact), but what is the basis for 1?
As far as I know, we have observed complex (and extremely robust) life on every temperate, wet planet that we know about. Batting a thousand.
Life appeared on Earth almost as soon as it was possible, but it took almost 2.5 BILLION years for it to make the jump to multicellular (a.k.a. complex life). It then took another billion-plus years for life that had the slightest bit of intelligence.
The universe is probably teeming with life, but sadly, most of it is probably just goo.
There _are_ consistent signs that something has an understanding of physics and inertia that violates our understanding of such things, which implies our assumptions about the distances of things in the universe are wrong.
Those vehicles' appearances, combined with their lack of interaction and lack of destruction, imply something worse: that we are probably still such a primitive civilization that we're treated like zoo animals.
These conversations about the last century of the search for alien life in the light of David Grusch's testimony to Congress and the various and numerous UAP videos are... interesting, to say the least.
> which implies our assumptions about distance of things in the universe are wrong.
There are two easy interpretations. One of these is imagining a new entity: aliens. The other is to imagine that humans are, essentially, human and fallible, and are chasing lens flares and radar glitches.
One of these interpretations is much more likely than the other.
I don't doubt the existence of aliens. I just doubt that UFOs are aliens.
"The Dark Forest" by Liu Cixin is sci-fi, but I don't think it's too hand-wavy. It presents a pretty logical answer to the Fermi Paradox, based on a couple obvious axioms.
- Attacking a planet that may or may not eventually become a threat is a much more expensive, noisy, and slow endeavor than a shot in the dark in a forest.
(You can also think about all the technical hurdles such a thing would involve; remember that the faster something goes, the harder it is to maneuver and to stop at the destination. And don't underestimate all the kinds of noise such a thing would create, noise that can be traced back to its origin.)
- No species on Earth carries out that kind of unprovoked attack, unless it is very cheap, quick, and risk-free.
So no, I think there are many better explanations than simply the dark forest one.
I think that's where you're wrong. For a sufficiently advanced civilization, destroying another that's around our level would only entail destroying our planet. Hell, destroy the whole solar system to be sure. A "simple" way to do that would be to make the sun go supernova, or destroy it somehow.
How would you do that? There's a tremendous amount of energy already in it; it's releasing it gradually for now, but what if there were ways to make it release it all at once? One potential approach would be to throw something at it at relativistic speeds (you'd imagine accelerating things to near-lightspeed would be a pretty obvious milestone in the tech tree).
For an advanced civilization, this is pretty easy; they'd probably be able to do it routinely from a mobile spaceship so they don't have to give away their home star's position.
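For scale, here's a back-of-envelope comparison (the projectile mass and speed below are illustrative assumptions, not from any source) of a relativistic projectile's kinetic energy against the Sun's gravitational binding energy:

```python
import math

C = 2.998e8       # speed of light, m/s
G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30  # solar mass, kg
R_SUN = 6.957e8   # solar radius, m

# Gravitational binding energy of a uniform-density sphere, (3/5) G M^2 / R.
# (The real Sun is centrally condensed, so its true value is a few times larger.)
binding_energy = 0.6 * G * M_SUN**2 / R_SUN  # roughly 2.3e41 J

def relativistic_ke(mass_kg, v_over_c):
    """Kinetic energy (gamma - 1) * m * c^2 of a projectile moving at v = (v/c) * c."""
    gamma = 1.0 / math.sqrt(1.0 - v_over_c**2)
    return (gamma - 1.0) * mass_kg * C**2

# A hypothetical 10^9 kg projectile (small-asteroid scale) at 0.99c.
ke = relativistic_ke(1e9, 0.99)
print(f"projectile KE:        {ke:.2e} J")
print(f"solar binding energy: {binding_energy:.2e} J")
```

Even at 0.99c, the projectile's kinetic energy falls short of the Sun's binding energy by roughly fifteen orders of magnitude, so raw impact energy can't unbind a star; such a strike would have to trigger the star's own energy release, which is exactly the comment's point about the energy already being in there.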
It's a reasonably logical answer but it's not the simplest answer. The Dark Forest hypothesis assumes that life is so common in the universe that encountering and being existentially threatened by other life is a serious threat. But what reason is there to believe that life is that common? For Liu Cixin's books this assumption makes sense because it makes the story possible, but in real life there's simply no evidence to justify such an assumption.
Well, we have no idea of the values for most variables in the Drake equation, so of course there will be some assumptions. The Dark Forest is an answer to the Fermi Paradox in the case where intelligent life isn't rare.
If you read the books, you'll notice that they talk about "hiding gene" and "cleansing gene"; these are traits that civilizations acquire as they evolve.
I think that this has real-world implications in terms of how we conduct our SETI (search for extra-terrestrial intelligence); we should be careful of our radio emissions and things like sending a powerful signal / message to another star should probably be avoided.
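The Drake equation mentioned above is just a product of those unknowns, which is why the assumptions dominate; a tiny sketch (every parameter value below is a placeholder, not a measurement) shows how wildly the result swings with the unconstrained terms:

```python
def drake(R_star, f_p, n_e, f_l, f_i, f_c, L):
    """N = R* * fp * ne * fl * fi * fc * L: the expected number of
    detectable civilizations in the galaxy (Drake's original form)."""
    return R_star * f_p * n_e * f_l * f_i * f_c * L

# Same equation, two sets of placeholder guesses for the biological terms.
optimistic = drake(R_star=2, f_p=1.0, n_e=0.4, f_l=1.0, f_i=0.5, f_c=0.5, L=1e6)
pessimistic = drake(R_star=2, f_p=1.0, n_e=0.4, f_l=1e-3, f_i=1e-3, f_c=0.5, L=1e3)
print(f"optimistic:  {optimistic:.0e}")   # ~2e5 civilizations
print(f"pessimistic: {pessimistic:.0e}")  # ~4e-4 civilizations
```

Nine orders of magnitude between the two outputs, from changing only the terms we have no data on, which is why both "teeming universe" and "empty universe" answers remain on the table.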
Shit entertainment channel masquerading as educational content. If you think you learned something from that video then why haven't you explained it yourself?
The simple fact of the matter is that there is no empirical evidence for life outside of Earth. My guess is that it exists out there somewhere, but there's no good basis for believing that life is as common as the Dark Forest hypothesis requires.
We have listened in on signals from the entire visible universe, but signals from further away are also from further back in time, so any aliens there would have had to develop earlier. This also has an advantage: we could still hear from aliens that went extinct and whose signals are still propagating through the universe.
So the better way of thinking about this is as a 75-year-thick slice through spacetime, containing the edge of the visible universe shortly after the Big Bang at one end and the solar system over the last 75 years at the other: the volume between the past light cones of Earth now and of Earth 75 years ago.
Exactly this. Even to this day, Nintendo is very smart about almost never discounting their flagship games - and when they do, it’s by a small percent, sometimes with strings attached, not the crazy “90% off” tactics many other developers/publishers follow.
Playstation games being in the $15 bin while desirable N64 games never seemed to go on sale was a big reason kid me had a ton of games compared to my friends with N64s.
I still see this with my kids and their Switch. PS2 and GameCube seemed to follow the same pattern.
Yes. And it teaches the consumers that there's no point in waiting. When you want their game, you just buy it at full price on launch date, because you know that waiting for discount would take years and even then it would be 20-30% off at best.
Although to continue the argument from GP: copious 90% off steam sales haven't killed PC gaming, enough people still preorder PC games at full price. It's literally a meme that people have more games than they can ever play but they still keep buying new ones.
In the '80s the games were mostly being purchased by adults for children, while today PC games are mostly purchased by adults for themselves. (I also remember being encouraged to spend my allowance on clearance-bin games when my parents wouldn't buy them new.)
Commercial design needs to hit a set of two contradictory goals:
1) be as boring as possible so people can make sense of it quickly and efficiently in a world where there are countless other things competing for your time and attention
2) stand out as much as possible to gain your attention in the aforementioned busy world
IME this explains a lot of the nature of the trends that design goes through.
It might well be intentional. Scientists like those weird double-entendre titles; it's especially noticeable in academic paper titles, which often follow the format "project name: punny phrase vaguely explaining the project".
It's frustrating because as a grad student I was explicitly taught to avoid using local slang, informal sayings and expressions, humor, etc. in my academic writing to make it as understandable and unambiguous as possible.
> I think a simple piece of advice (stolen from Tufte) is to not use colors unless you really need to.
Funny, I read your comment as my copy of Tufte's VDQI is open to page 154:
"Color often generates graphical puzzles. Despite our experiences with the spectrum in science textbooks and rainbows, the mind's eye doe not readily give a visual ordering to colors, except possibly for red to reflect higher levels than other levels [...] Attempts to give colors an order result in those verbal decoders and the mumbling of little mental phrases [...] Because they do have a natural visual hierarchy, varying shades of gray show varying quantities better than color."
The classical optical camera does not capture anything. It is a light-sealed box with a pinhole for a lens. As an optical system, it interacts with the electromagnetic waves that pass through it; that's the only 'reality' you can really care about.
What captures an image is an imaging surface; traditionally a chemical emulsion on a piece of film, now a complex array of digital sensors.
This imaging surface is of human design, it therefore images what its designers designed it to image. But don't forget that it is a sampling of reality; by definition always partial, and biased (biased to the 400~700 nm range, for starters).
> But don't forget that it is a sampling of reality; by definition always partial, and biased (biased to the 400~700 nm range, for starters).
This does not matter in any way. What matters is that what comes out the other end of the filtering and bias is highly correlated with what came in, and carries information about the imaged phenomenon.
This is what both analog films and digital sensors were designed for. The captured information is then preserved through most forms of post-processing, also by design. Computational photography, in contrast, is destroying that information, for the sake of creating something that "looks better".
Not sure what the Brazilian bureaucracy landscape is like, but if it's anything like Western Europe's, I'd be wishing for tools that push us to write fewer ordinances. That is: there frequently isn't a lack of ordinances, but a lack of capacity to do something meaningful with the high volume of documents, and a disconnect between paperwork and reality.
I'm a photographer, and for years I've been pixel peeping at photos taken on phones with "portrait mode"; many years after the first introduction of the feature, regardless of the implementation, results still look crummy to my eye.
Looking at fine elements like hairs (never mind curly hair) is a disaster, especially when you're used to fine classic German/Japanese optics that accurately reproduce every subtle detail of a subject while having extremely aesthetically pleasing sharpness falloff/bokeh.
I've had to swallow the pill though: No one (end users; pros are another story) cares about those details. People just want something that vaguely looks good in the immediate moment, and then it's on to the next thing.
I suspect it'll remain the same for AI generated visuals; a sharp eye will always be able to tell, but it won't really matter for consumption by the masses (where the money is).