
The addendum at the bottom includes a link to a 2020 one by the same person that's still operational!

> Gaming was supposed to be one of the best drivers for 8K adoption.

While the step from 1080p/1440p to 4K is a visible difference, I don't think going from 4K to 8K would be visibly different, since the pixels are already invisible at 4K.

However the framerate drop would be very noticeable...

OTOH, afaik for VR headsets you may want higher resolutions still due to the much larger field of vision


I usually still play at 1080p on my Steam box because my TV is like nine feet away and I cannot tell a difference between 1080p and 4k for gaming, and I would rather have the frames.

I doubt I’m unique.


AAA games have been having really bad performance issues for the last few years while not looking much better. If you wanna game in 8K you are gonna need something like a NASA supercomputer.

AAA games are struggling for a lot of reasons, and consoles are struggling as well. PC gamers tend to use a more traditional monitor setup and won't buy a gigantic television. At least, not for gaming.

Even with a supercomputer it'd be difficult to render the frames in time with low latency.

We can’t render modern games at decent frame rates at 4k without going down the path of faking it with AI upscaling and frame generation.

There was no hope of actual 8k gaming any time soon even before the AI bubble wrecked the PC hardware market.

Attempting to render 33 million pixels per frame seems like utter madness, when 1080p is a mere 2 million, and Doom/Quake were great with just 64000. Let's have more frames instead?

(Such a huge pixel count for movies while stuck at a ‘cinematic’ 24fps, an extremely low temporal resolution, is even sillier)
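A quick back-of-the-envelope sketch of those pixel counts (using the standard 16:9 resolutions and Doom's original 320x200 mode), plus the per-second shading load 8K would imply at a few frame rates:

```python
# Pixels per frame for the resolutions mentioned above, plus the per-second
# shading load at a few frame rates.
resolutions = {
    "Doom (320x200)":    320 * 200,
    "1080p (1920x1080)": 1920 * 1080,
    "4K (3840x2160)":    3840 * 2160,
    "8K (7680x4320)":    7680 * 4320,
}

for name, pixels in resolutions.items():
    print(f"{name:<20} {pixels:>12,} pixels/frame")

for fps in (24, 60, 120):
    print(f"8K @ {fps:3d} fps: {resolutions['8K (7680x4320)'] * fps:>14,} pixels/second")
```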


Yeah, not only the huge required jump in raw fill rate, but to get the most out of a 4K TV you need higher detail models and textures and that means you also need a huge jump in VRAM, which never materialised.

The frame buffers/render targets alone for 8K are massive.

Basically 400MB for 12 bytes/pixel (64bit HDR RGBA + 32bit depth/stencil)

vs the 64000 bytes that Doom had to fill...
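For anyone who wants to check the arithmetic, here's a rough sketch (assuming a single 64-bit HDR color target plus a 32-bit depth/stencil buffer, as in the comment above, and ignoring the extra render targets a real engine would allocate):

```python
# Back-of-the-envelope frame buffer size at 8K vs Doom's 320x200 mode.
def framebuffer_bytes(width, height, bytes_per_pixel):
    return width * height * bytes_per_pixel

# 8K: 8 bytes FP16 RGBA (64-bit HDR color) + 4 bytes depth/stencil = 12 bytes/pixel.
eightk = framebuffer_bytes(7680, 4320, 12)
print(f"8K color + depth: {eightk:,} bytes (~{eightk / 1e6:.0f} MB)")

# Doom: 320x200 at 1 byte/pixel (8-bit palettized) = 64,000 bytes.
doom = framebuffer_bytes(320, 200, 1)
print(f"Doom frame buffer: {doom:,} bytes")
```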


I don't see a future in which we play at 4K at top settings either without AI upscaling/interpolation. Even if it were theoretically possible to do so, the performance budget the developers have going forward will be assuming that frame generation and upscaling is used.

So anyone who wants only "real frames" (non-upscaled, non-generated) will need to lower their settings or only play games a few years old. But I think this will become so natural that no one even thinks about it. Disabling it will be like someone lowering AA settings or whatever: something only done by very niche players, like the CS community does today, where some are playing at 4:3, lowering AA settings for maximum visibility, not fidelity, and so on.


In most cases you don't need anti-aliasing at 4K.

VR headsets won't use the same panels that a TV would use. Any growth in the XR headset space won't help the TV industry.

> While the step from 1080p 1440p to 4K is a visible difference

I even doubt that. My experience is that on a 65" TV, 4K pixels become indistinguishable from 1080p beyond 3 meters. I even tested that with friends on the Mandalorian show; we couldn't tell 4K and 1080p apart. So I just don't bother with 4K anymore.

Of course YMMV if you have a bigger screen, or a smaller room.
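That lines up with the usual ~1 arcminute visual-acuity rule of thumb. A rough sketch, assuming a 65-inch 16:9 panel, a 3 m viewing distance, and 20/20 vision:

```python
import math

# Angular size of one pixel on a 65" 16:9 panel viewed from 3 m, compared
# against the ~1 arcminute resolving limit usually quoted for 20/20 vision.
diagonal_in = 65
width_m = diagonal_in * 25.4e-3 * 16 / math.sqrt(16**2 + 9**2)  # ~1.44 m
distance_m = 3.0

for name, horizontal_pixels in [("1080p", 1920), ("4K", 3840)]:
    pixel_pitch_m = width_m / horizontal_pixels
    arcmin = math.degrees(math.atan2(pixel_pitch_m, distance_m)) * 60
    print(f"{name}: one pixel subtends ~{arcmin:.2f} arcmin at {distance_m} m")

# 1080p comes out around 0.86 arcmin and 4K around 0.43 arcmin, i.e. both are
# already at or below the ~1 arcmin acuity limit at that distance.
```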


If your Mandalorian test was via streaming, that's also a huge factor. 4K streaming has very poor quality compared to 4K Blu-ray, for instance.

Which is a point in itself: bitrate can matter more than resolution.

For reasonable bitrate/resolution pairs, both matter. Clean 1080P will beat bitrate-starved 4K, especially with modern upscaling techniques, but even reasonable-compression 4K will beat good 1080P because there's just more detail there. Unfortunately, many platforms mess with this relationship, like YouTube requiring 4K uploads to get better bitrates, when for many devices a higher-rate 1080P would be fine.

I'm curious: for the same megabits per second, how does the viewing quality of 4K compare to 1080p? I mean, 4K shouldn't be able to have more detail per se in the stream given the same amount of data over the wire, but maybe the way the scaling works and how the artifacts end up can alter the perception?

If everything is the same (codec, bitrate, etc), 1080P will look better in anything but a completely static scene because of less blocking/artifacts.

But that’s an unrealistic comparison, because 4K often gets a better bitrate, more advanced codec, etc. If the 4K and 1080P source are both “good”, 4K will look better.
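As a rough illustration of how thin a fixed bitrate gets spread at 4K (the 15 Mbps and 30 fps figures are arbitrary example values, not any particular service's settings):

```python
# How many compressed bits each pixel gets at a fixed stream bitrate.
def bits_per_pixel(bitrate_mbps, width, height, fps):
    return bitrate_mbps * 1e6 / (width * height * fps)

bitrate_mbps = 15   # arbitrary example bitrate
fps = 30            # arbitrary example frame rate
for name, (w, h) in [("1080p", (1920, 1080)), ("4K", (3840, 2160))]:
    bpp = bits_per_pixel(bitrate_mbps, w, h, fps)
    print(f"{name}: {bpp:.3f} bits/pixel at {bitrate_mbps} Mbps, {fps} fps")

# 4K gets exactly a quarter of the bits per pixel of 1080p at the same bitrate,
# so the encoder has to discard detail (or block up) much more aggressively.
```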


Yeah, I have a hard time believing that someone with normal eyesight wouldn't be able to tell 1080p and 4k blu-rays apart. I just tested this on my tv, I have to get ridiculously far before the difference isn't immediately obvious. This is without the HDR/DV layer FWIW.

Try comparing a 4K vs 1080p that were created from the same master, like a modern Criterion restoration.

Without HDR the differences are negligible or imperceptible at a standard 10' viewing distance.

I'll take it one step further: a well-mastered 1080p Blu-Ray beats 4K streaming hands down every time.


10 feet is pretty far back for all but the biggest screens, and at closer distances, you certainly should be able to see a difference between 4K and 1080P.

The MagSafe cord on a MacBook charger is 6'. It's not as far as you think.

For the 30 to 40 degree FoV as recommended by SMPTE, 10ft is further back than is recommended for all but like a 98in screen, so yes, it’s too far back.
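A rough sketch of that viewing-distance calculation, assuming a 16:9 screen (the specific diagonals are just example sizes):

```python
import math

# Viewing distance (in feet) that gives a target horizontal field of view
# for a 16:9 screen of a given diagonal size (in inches).
def viewing_distance_ft(diagonal_in, fov_deg):
    width_in = diagonal_in * 16 / math.sqrt(16**2 + 9**2)
    return width_in / (2 * math.tan(math.radians(fov_deg) / 2)) / 12

for diag in (65, 77, 98):
    d40 = viewing_distance_ft(diag, 40)
    d30 = viewing_distance_ft(diag, 30)
    print(f'{diag}" screen: ~{d40:.1f} ft for 40 deg FoV, ~{d30:.1f} ft for 30 deg FoV')

# At 10 ft, hitting the 40-degree end of the range takes roughly a 98-100 inch
# panel; smaller screens want you noticeably closer.
```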

It very much depends on the particular release. For many 4K releases you don't actually get that much more detail because of grain and imperfect focus in the original film.

There are so many tricks you can do as well. Resolution was never really the issue; sharpness and fidelity aren't the same as charming and aesthetically pleasing.

The person was referring to gaming where most PC players are sitting closer than 3 metres from their screen.

> While the step from 1080p 1440p to 4K is a visible difference

It really isn't.

What you are likely seeing is HDR which is on most (but not all!) 4K content. The HDR is a separate layer and unrelated to the resolution.

4K versions of films are usually newly restored with modern film scanning - as opposed to the aging masters created for the DVD era that were used to churn out 1st generation Blu-Rays.

The difference between a 4K UHD without HDR and a 1080p Blu-Ray that was recently remastered in 4K from the same source is basically imperceptible from any reasonable viewing distance.

The "visible difference" is mostly better source material, and HDR.

Of course people will convince themselves what they are seeing justifies the cost of the upgrade, just like the $200 audiophile outlet and $350 gold-plated videophile Ethernet cable makes the audio and video really "pop".


I know the thread is about tvs, but since gaming has come up, worth noting that at computer viewing distances the differences between 1080p/1440p and 4k really are very visible (though in my case I have a 4k monitor for media and a 1440p monitor for gaming since there’s 0 chance I can run at 4k anyway)

I can confirm that on a PC monitor, 1080p and 4K are very easy to tell apart.

I missed the part where this was about gaming. Most people don't sit 10' away from their monitor, but it's standard for TV viewing.

For tv maybe, but you're replying to gaming, and it's definitely on a monitor, laptop or handheld

A lot of gaming is done on a TV in the living room

The same person might be both kinds of users, depending on the topic or just the time of the day


It's almost as if categorization is often an oversimplification.


Every now and then I play Quordle, Octordle, and once a thousand-word variation (which, gameplay-wise, breaks down to just getting every letter at every spot).

A bit of reuse of the same word in the one-word version can't hurt I think


I thought this was going to be about nomograms. TIL that monograms, nonograms and nomograms exist


Made this as an exercise a while back:

https://news.ycombinator.com/item?id=35528155


Do those updates matter?

Not for me, at least usually (an exception might be something like an RPG game expanding the world); apps nagging to get updated is annoying in fact.


> apps nagging to get updated is annoying in fact.

There is no nagging. Apps auto-update on iOS, and have for years. I had 15 apps update in the last week. There was no nagging or notifications. It just happens.

My only gripe is that they seem to want to update right after I take it off the charger in the morning, instead of at night. But I only actually notice this once or twice per year, if I go to use an app that’s in the process of installing within the first few minutes of waking up.


Apps also auto update on Android. Frequently, though, the updates reduce functionality or make it more annoying (basics like messages, calculator, photos, calendar, etc. have been 'done' for a decade+ and can only really be made worse), so personally I've turned that off for most apps (and I suppose the other poster has too). Of course, Google being aggressive assholes, some of their apps then start showing popups every time you open them telling you to update, when the entire point was to not change the functionality and not introduce that sort of thing.


Most online RPGs (Genshin for example) check for world updates every time you log in; it's not tied to app updates.


I was thinking Andor's Trail :)


> It would go on to run Windows 3.0, Windows 95, early Linux

That feels like a stretch :) Maybe it indeed ran on it, but the Pentium was available when Windows 95 was released, and it was probably far more likely to be sold along with new Pentium multimedia machines than bought by someone for their old 386. But Windows 3.11 was its exact match!


As somebody that was around at the time, this is not at all a stretch.

First, Linux was created FOR the 386. Linus Torvalds had one and wanted to unlock its power.

As you say, Windows 3.0 is certainly no stretch.

That leaves only Windows 95. The minimum spec at launch was a 386 with 4 MB of RAM. Realistically, you needed 8 MB to do anything.

Here is an article from 1993 saying that manufacturers are beginning to drop the 386 from their product lines. That is, this is when people stopped being able to buy 386 machines brand new.

https://books.google.com/books/about/InfoWorld.html?id=2zsEA...

The 486 was the dominant chip in 1993, but there were still a lot of 386 machines being sold up to that point.

When Windows 95 shipped, people would certainly have been trying to run it on those machines.

When Windows 95 was released, people famously lined up to buy it like they were getting tickets to a rock concert. It was not just sold with new hardware. Back then, it was normal for people to pay money to buy a new operating system to run on hardware they already owned.

Of course Windows 95 certainly helped sell Pentiums. Pentium would have dominated new sales but a typical PC in service in 1995 would have been a 486 and there were still plenty of 386 machines in use.


A lot of (older than me) enthusiasts I knew got an MSDN subscription even though they weren't really developing apps for Windows. This gave them a steady stream of OS releases on CD-ROMs, officially for testing their fictional apps. So, they were often upgrading one or more systems many times rather than buying new machines with a bundled new OS.

Personally, yeah, I had Windows 3.0/3.11 on a 386. I think I may have also put an early Windows NT (beta?) release on it, borrowing someone's MSDN discs. Not sure I had got value from it except seeing the "pipes" software OpenGL screensaver. Around '93-94, I started using Linux, and after that it was mostly upgrading hardware in my Linux Box(en) of Theseus.

I remember my college roommate blowing his budget upgrading to a 180 MHz Pentium Pro, and he put Windows 95 on it. I think that was the first time I heard the "Start me up!" sound from an actual computer instead of a TV ad.

After that, I only encountered later Windows versions if they were shipped on a laptop I got for work, before I wiped it to install Linux. Or eventually when I got an install image to put Windows in a VM. (First under VMware, later under Linux qemu+kvm.)


> Maybe it indeed ran on it, but Pentium was available when Windows 95 was released and it was probably far more likely to be sold along with such new Pentium multimedia machines, than someone getting it for their old 386.

I was definitely running Linux on a 486. I even had a big bulky laptop back then but I don't remember what CPU it had: I was running Linux on it too. And my 486 was sharing its (dial-up, 28.8 or 33.6) Internet connection with the laptop using the PLIP protocol (IP over a parallel cable): I set that up and my brother and I were discovering the Web at the same time. Fun memories.

The jump was not 386 to Pentium. The 486 had its glory days.


Windows 95 was Microsoft's biggest commercial hit at that point, selling 40m copies in its first year.

There's no doubt that it went in as an upgrade on plenty of 386s/486s until the owners upgraded their hardware.


You needed at least 12 MB RAM to run Windows 95 smoothly. There were plenty of 8 MB systems that really really struggled. Even booting up was a swap fest.

I remember immediately upgrading to 12 MB. 8 MB was painful.

Not all 386 class systems could be upgraded to 12 MB or more.


I did a ton of upgrades from 3.1 to Windows 95 in late 1996 and early 1997 on 386 and 486 machines with 4MB of RAM. I still have a "mark" from the tedium. Some of the machines didn't have a large enough hard drive to store a copy of the setup files (the "CAB files") so until the company issued me a ZIP drive I had to do "the floppy shuffle" with 20-ish disks.

It ran like crap with 4MB of RAM but it did run. Opening anything much resulted in paging.


Your mileage may vary. I remember 4 MB booting, but being absolutely useless.

8 MB was still swapping all the time, you couldn't really run much beyond some simple software.

12 MB was finally enough to do something productive, but definitely nothing luxurious.


I was just trying to give a bit of historical context, but apparently I need to be more precise next time! The 386 is the beginning of 32-bit. But it's mainly the Pentium and 486 that ran Windows 95.


If true, this would mean more websites with genuine content from the "old" internet won't show up (since many personal websites won't have this), while more SEO-optimized content farms that of course do put up a robots.txt will...


It also fits Google's plan to create a surrogate web.

- AI was the first step (or actually, among the first five steps or so). CHECK.
- Google search has already been ruined. CHECK.
- Now robots.txt is used to weed out "old" websites. CHECK.

They do too much evil. But it is also our fault, because we became WAY too dependent on these mega-corporations.


> I'm not a fan of the bias towards "Gears are old tech, and that makes them bad"

If the gears don't at least require an app with a subscription and regular updates to use, they must be old tech

/sarcasm


When there's some big ongoing thing in the news, there'll be many articles on that same topic on news websites, and sometimes you can't even find the original one that tells what actually happened. Wikipedia's article on it is usually a great summary.

