I would look into Kaleidescape if you want Blu-ray-quality viewing in your home with the convenience of streaming (although you have to download ahead of time). The only downside is the price and the need for their proprietary player + server.
$9k ($4k player + $5k server) for a basic system is hardly an "only" downside, and it seems kind of pathetic that you can't even access it from a regular desktop.
You could already do this before the port change, you just need a USB-C to Lightning cable (which has been included in the box for a few years now). Being able to use the same cable is nice and all, but in practice I want two or more cables anyway to charge multiple devices at the same time. If one of those cables happens to have a different connector on one end, it isn't the end of the world.
> I'd suggest that the people who make "full use" of vsc-over-ssh are satisfied with vscode, so it would be unwise to target the full featureset.
Remote SSH + Dev Containers and their seamless integration (even stacking one on the other) are the only features that keep me using VS Code. I would love to see a full implementation of these in an editor as fast and lightweight as Zed.
How exactly do these features fit into your workflow? I honestly struggle to think of when I'd ever use them.
Fwiw, I do a lot of infrastructure-as-code, full-stack, and systems programming.
I usually have a split screen (editor | terminal) or two terminals on the side, and either exec into a container or use devenv.sh.
If I _really_ need to modify files in the container as I dev and a "make" doesn't cut it, I usually just run podman with a -v mount. Similarly for remote machines w/ sshfs, though I try not to.
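
For concreteness, roughly what that looks like (the image name and paths are just examples):

    # Bind-mount the working tree into a throwaway container:
    podman run --rm -it -v "$PWD":/src -w /src debian:stable bash

    # Or mount a remote machine's tree locally over SSH:
    sshfs user@host:/home/user/project ./project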
Devcontainers are amazing for getting a consistent environment set up on multiple computers on multiple operating systems. I was in this situation recently with a new colleague who was using a very locked-down Windows computer, and it was really convenient for the "you must install" list to be Docker and VSCode only. It's definitely not ideal - it adds overhead on Windows and Mac, there are occasional networking issues, and I don't have all the creature comforts of my usual shell - but it's very convenient for what it is.
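
For reference, a minimal devcontainer.json sketch; the image and extension here are just illustrative, not what we actually used:

    // .devcontainer/devcontainer.json (JSONC, so comments are allowed)
    {
      "name": "project-dev",
      "image": "mcr.microsoft.com/devcontainers/base:ubuntu",
      "customizations": {
        "vscode": {
          "extensions": ["ms-python.python"]
        }
      }
    }

With that file in the repo, VSCode offers to reopen the project inside the container, so the whole toolchain lives in the image rather than on each person's machine.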
Similarly, editing code in-place over SSH rather than rsyncing back and forth is very useful for certain types of embedded development. I worked for a while on sensor systems where a lot of the logic was handled by a Python webserver that couldn't really run without access to the sensor data it was using. Developing entirely locally was therefore difficult, but developing on the machine was also painful because it didn't have the right tools. So we'd work locally, and then copy the Python files over every so often and restart the server. At the time, I don't think VSCode's remote stuff worked as well, but I believe it's a lot better now and could have handled that situation: edit everything in-place, run it immediately, but still have the power of your local development machine available to you.
Both the git log and git bisect commands accept the --first-parent flag, which follows only the first parent of each merge commit and so hides the merged-in side-branch history.
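
For example (the bad/good revisions are placeholders):

    # Walk only the first parent of each merge, hiding the
    # merged-in side-branch commits:
    git log --first-parent

    # Bisect along the first-parent chain (supported since Git 2.29):
    git bisect start --first-parent <bad> <good>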
Could I ask real quick why you dislike W3Schools? I def learned SQL syntax and basic HTML/CSS (I needed to use CSS selectors but had basically no knowledge) through them and actually thought to myself they did a great job at keeping things need-to-know for immediate use.
I have a feeling I know your answer - as a financial professional I initially used Investopedia in a pinch but over the years realized their accuracy has an inverse relation with the specificity of the topic. After seeing incorrect formulas and descriptions a handful of times, I eventually stopped using it completely.
There are individuals who pay ~$4/month for GitHub Teams, plus other services (Copilot, CI, LFS, Packages, etc.). They may not have public repos, but that shouldn't matter.
I doubt GitHub is going to close the account of an actively paying user. Rather, the fact that they are paying for it constitutes proof that the account is actively owned.
I think part of this stability stems from the upstream repos being more stable themselves (CI is cheaper and easier than ever to run, tooling has improved, etc.). It's not often that upstream packages break their stable releases, or, heck, even develop/master.
It also seems like the firehose of upgrades that is Sid/Experimental has slowed down a bit. Even Sid is still on nodejs 18 rather than 20 or 21, the same version as both Testing and Stable.
I feel at home with Debian Testing; not as bleeding edge as Arch for sure, but I think that is more of a blessing than a curse. The Debian maintainers seem to know what they are doing, and the distribution has a sense of maturity that some others lack. I just hope an update won't break GRUB on me :)
The only downsides to running Debian Testing are that it doesn't get security updates with the same speed as Sid or Stable, and third parties don't exactly put together and distribute pre-built packages for Testing specifically. I understand why, but it makes me wonder if things will break due to some ABI compatibility issue down the line. Maybe once Debian 13 is released I'll just stick to Stable.
You're absolutely right, but that's one side of the coin. Debian has also matured its processes a lot, and they have a much bigger infrastructure now, so they patch tons of packages invisibly and build them reproducibly, and I think some maintainers run their own test suites.
The Debian 12 release is relatively new. Also, we're in fall now. Package upgrade speed will pick up over the next 12 months, reaching its top speed in about 8 months if they keep to the usual schedule. I can already run two upgrades in a day and get ~400 package updates in a couple of weeks. So Testing is rolling pretty solidly now.
It's a bummer that the security service for Testing is discontinued now, but it's only natural. The project got too big, and attacks are evolving faster than ever; the security team only has so much time and staffing to handle this.
I haven't experienced any ABI compatibility problems with closed-source or external software on Debian Testing. I'm running Pagico, InSync, StarUML, etc., which are .deb packages all built against Debian Stable or Ubuntu. They work without any problems. Well, Pagico needs libjpeg, but that's another thing, unrelated to ABI.
> But nothing about upscaling, nor frame generation, sounds like something that requires game engine support to me. The graphics card should have enough info to do that by having all the pixels.
Modern upscaling solutions (e.g. TAA-based upscalers) require not just the pixel outputs but various inter-frame metadata such as depth, normals, motion vectors, and the last frame. DLSS is more or less the spiritual successor to TAA with machine learning sprinkled in. Integration with the game engine is needed because that non-color metadata has to be provided to DLSS, and how each engine computes those things differs a bit.
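
A toy sketch of the temporal-accumulation idea (illustrative names only, nothing DLSS-specific), just to show why motion vectors and a history buffer are needed on top of the final pixels:

    import numpy as np

    # color_lr:   this frame's low-res render, shape (h/2, w/2, 3)
    # motion_hr:  per-pixel motion vectors at output res, shape (h, w, 2)
    # history_hr: accumulated high-res result from prior frames, (h, w, 3)
    # Real implementations also use depth/normals to reject stale history
    # (disocclusions); that's omitted here.
    def temporal_accumulate(color_lr, motion_hr, history_hr, blend=0.1):
        h, w, _ = history_hr.shape
        ys, xs = np.mgrid[0:h, 0:w]
        # Motion vectors say where each pixel's surface was last frame;
        # gather the accumulated history from there (nearest-neighbor).
        py = np.clip(np.rint(ys - motion_hr[..., 1]), 0, h - 1).astype(int)
        px = np.clip(np.rint(xs - motion_hr[..., 0]), 0, w - 1).astype(int)
        reprojected = history_hr[py, px]
        # Naive 2x upsample of this frame's low-res render.
        upsampled = np.repeat(np.repeat(color_lr, 2, axis=0), 2, axis=1)
        # Blend a little new information into the reprojected history,
        # so detail accumulates across frames rather than in one frame.
        return (1.0 - blend) * reprojected + blend * upsampled

None of those inputs can be recovered from the final swapchain image alone, which is why the driver can't just bolt this on without the engine's cooperation.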
That sounds more like Google's captcha system. I've never seen Cloudflare Turnstile do this on any of my systems/browsers.