
> It seems bizarre to me that they'd have accepted such a high cost

They’re not the ones bearing the cost. Customers are. And I’d wager very few check the hard disk requirements for a game before buying it. So the effect on their bottom line is negligible while the dev effort to fix it has a cost… so it remains unfixed until someone with pride in their work finally carves out the time to do it.

If they were on the hook for 150GB of cloud storage per player this would have been solved immediately.



The problem they fixed is that they removed a common optimization to get 5x faster loading speeds on HDDs.

That's why they did the performance analysis and referred to their telemetry before pushing the fix. The impact is minimal because their game is already spending an equivalent time doing other loading work, and the 5x I/O slowdown only affects 11% of players (perhaps less now that the game fits on a cheap consumer SSD).

If someone "takes pride in their work" and makes my game load five times longer, I'd rather they go find something else to take pride in.


> The problem they fixed is that they removed a common optimization to get 5x faster loading speeds on HDDs.

Not what happened. They removed an optimization that, in *some other games* that are not their game, gave a 5x speed boost.

And they are changing it now because it turned out all of that was bogus: the speed boost wasn't as high for loading the data itself, and a good part of level loading wasn't even waiting on the disk, but on terrain generation.


A 5x space saving is going to be hard to beat, but one should always be careful about hiding behind a tall tent pole like this. I/O isn't free, it's cheap. So if they could generate terrain with no data loading, it would likely be a little faster. But someone might find a way to speed up generation and then think it's pointless/not get the credit they deserve, because then loading is the tall tent pole.

I’ve worked with far too many people who have done the equivalent in non game software and it leads to unhappy customers and salespeople. I’ve come to think of it as a kind of learned helplessness.


> If someone "takes pride in their work" and makes my game load five times longer, I'd rather they go find something else to take pride in.

And others who wish one single game didn't waste 130GB of their disk space, it's fine to ignore their opinions?

They used up a ton more disk space to apply an ill-advised optimization that didn't have much effect. I don't really understand why you'd consider that a positive thing.


By their own industry data (https://store.steampowered.com/news/app/553850/view/49158394...), duplication causes a 5x performance increase when loading data from an HDD. There's a reason so many games are huge, and it's not because they're mining your HDD for HDDCoin.

The "problem" is a feature. The "so it remains unfixed until someone with pride in their work finally carves out the time to do it" mindset suggests that they were simply too lazy to ever run fdupes over their install directory, which is simply not the case. The duplication was intentional, and is still intentional in many other games that could but likely won't apply the same data minimization.

I'll gladly take this update because considerable effort was spent on measuring the impact, but not one of those "everyone around me is so lazy, I'll just be the noble hero to sacrifice my time to deduplicate the game files" updates.


> In the worst cases, a 5x difference was reported between instances that used duplication and those that did not. We were being very conservative and doubled that projection again to account for unknown unknowns.

That makes no goddamn sense. I’ve read it three times and to paraphrase Babbage, I cannot apprehend the confusion of thought that would lead to such a conclusion.

5x gets resources to investigate, not assumed to be correct and then doubled. Orders of magnitude change implementations, as we see here. And it sounds like they just manufactured one out of thin air.
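To make the objection concrete, here is the projection chain with invented numbers (the 10-second SSD baseline is made up for illustration):

    ssd_load = 10.0     # assumed SSD load time in seconds, invented for illustration
    worst_case = 5      # worst HDD slowdown reported in the industry data
    unknowns = 2        # their extra doubling for "unknown unknowns"

    projected_hdd = ssd_load * worst_case * unknowns
    print(projected_hdd)  # 100.0 -- a 10x penalty, built on an outlier, never measured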


Perhaps this is a place where developers can offer two builds.

HDD and SSD, where SSD is deduplicated.

I'm sure some gamers will develop funny opinions, but for the last 8 years I have not had an HDD in sight inside my gaming or work machines. I'd very much rather save space if the load time is about the same on an SSD. A 150GB install profile is absolute insanity.


I mean, when you optimize assets for a single read on mechanical drives, size blows up pretty quickly, but the single I/O read greatly reduces latency. That said, it only makes sense on drives with high I/O latency.
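A sketch of that layout strategy in Python, with made-up file names: write each level's assets into one per-level bundle so the level loads as a single sequential read, duplicating anything shared across levels:

    from pathlib import Path

    # hypothetical manifest of which assets each level needs
    levels = {
        "level01": ["forest_terrain.bin", "tree.mesh", "rifle.mesh"],
        "level02": ["desert_terrain.bin", "rock.mesh", "rifle.mesh"],
    }

    for level, assets in levels.items():
        bundle = Path("bundles") / f"{level}.pak"
        bundle.parent.mkdir(parents=True, exist_ok=True)
        with bundle.open("wb") as out:
            for name in assets:
                # shared assets (rifle.mesh here) are written into BOTH bundles:
                # one sequential read per level at load time, paid for in disk space
                out.write((Path("assets") / name).read_bytes())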


It's not a "feature" for the 89% of users whose SSD capacity was being wasted.


Seems to me that most of these situations have an 80/20 rule and it would be worth someone’s time to figure out what that is.

Getting rid of 80% of that duplication for a 2x instead of a 5x slowdown would be something.
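Back-of-the-envelope, with invented duplicate groups: sort by wasted bytes and dedupe only until ~80% of the waste is reclaimed, leaving the long tail duplicated for seek locality:

    # invented duplicate groups: (asset, bytes wasted by its extra copies)
    groups = [("city_block.bin", 40e9), ("terrain_atlas.bin", 25e9),
              ("mech.mesh", 20e9), ("ui_icons.bin", 0.5e9), ("sfx.bank", 0.3e9)]

    total = sum(wasted for _, wasted in groups)
    reclaimed = 0.0
    for name, wasted in sorted(groups, key=lambda g: g[1], reverse=True):
        if reclaimed >= 0.8 * total:
            break  # leave the long tail duplicated for seek locality
        reclaimed += wasted
        print(f"dedupe {name}: reclaims {wasted / 1e9:.1f} GB")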


I expect better from HN, where most of us are engineers or engineer-adjacent. It's fair to question Arrowhead's priorities but...

    too lazy
Really? I think the PC install size probably should have been addressed sooner too, but... which do you think is more likely?

Arrowhead is a whole company full of "lazy" developers who just don't like to work very hard?

Or do you think they had their hands full with other optimizations, bug fixes, and a large amount of new content while running a complex multiplatform live service game for millions of players? (Also consider that management was probably deciding priorities there and not the developers)

I put hundreds of hours into HD2 and had a tremendous amount of fun. It's not the product of "lazy" people...


Leaving an 85% disk size reduction with minimal performance impact on the table is negligent by the standard of professional excellence.

But that's also par for the course with AA+ games these days, where shoving content into an engine is paramount and everything else is 'as long as it works.' Thanks, Bethesda.

Evidenced by the litany of quality of life bug fixes and performance improvements modders hack into EOL games.


The only sane way to judge "professional excellence" would be holistically, not by asking "has this person or team ever shipped a bug?" If you disagree, then I hope you are judged the same way during your next review.

In the case of HD2 I'd say the team has done well enough. The game has maintained a player base after nearly two years, including on PC. This is rare in the world of live service games, and we should ask ourselves what this tells us about the overall technical quality of the game - is the game so amazing that people keep playing despite abysmal technical quality?

The technical quality of the game itself has been somewhat inconsistent, but I put hundreds of hours into it (over 1K, I think) and most of the time it was trouble-free (and fun).

I would also note that the PC install size issue has only become egregious somewhat recently. The issue was always there, but initially the PC install size was small enough that it wasn't a major issue for most players. I actually never noticed the install size bug because I have a $75 1TB drive for games and even at its worst, HD2 consumed only a bit over 10% of that.

It certainly must have been challenging for the developers. There has been a constant stream of new content, and an entirely new platform (Xbox) added since release. Perhaps more frustratingly for the development team, there has also been a neverending parade of rebalancing work which has consumed a lot of cycles. Some of this rebalancing work was unavoidable (in a complex game, millions of players will find bugs and meta strategies that could never be uncovered by testing alone) and some was the result of perhaps-avoidable internal discord regarding the game's creative direction.

The game is also somewhat difficult to balance and test by design. There are 10 difficulty levels and 3 enemy factions. It's almost like 30 separate games. This is an excellent feature of the game, but it would be fair to say Arrowhead perhaps bit off more than any team can chew.


> They used up a ton more disk space to apply an ill-advised optimization that didn't have much effect.

The optimization was not ill-advised. It is, in fact, an industry standard and is strongly advised. Their own internal testing revealed that they are one of the supposedly rare cases where this optimization did not have a noticeably positive effect worth the costs.


23 GiB can be cached entirely in RAM on higher-end gaming rigs these days. 154 GiB probably does not fit into many players' RAM when you still want something left for the OS and game. Reducing how much needs to be loaded from slow storage is itself an I/O speedup, and HDDs are not so bad at seeking that you need to go to extreme lengths to avoid it entirely. The only place where such duplication to ensure linear reads may be warranted is optical media.


They used "industry data" to make performance estimations: https://store.steampowered.com/news/app/553850/view/49158394...

> These loading time projections were based on industry data - comparing the loading times between SSD and HDD users where data duplication was and was not used. In the worst cases, a 5x difference was reported between instances that used duplication and those that did not.


Instead of, y'know, running their own game on an HDD.

It's literally "instead of profiling our own app we profiled competition's app and made decisions based on that".


They started off with the competitors' data, and then moved on once they had their own data, though? Not sure what y'all are complaining about.

They made an effort to improve the product, but because everything in tech comes with side effects, it turned out to be a bad decision, which they rolled back. Sounds like highly professional behavior to me by people doing their best. Not everything will work out 100% of the time.

And this might finally reverse the trend of games being >100GB, as other teams will be able to point to this decision as a reason not to implement this particular optimization prematurely.


They didn't actually fix this until a couple of months after they publicly revealed that this was the reason the game was so big and a lot of people pointed out how dumb it is. I saw quite a few comments saying that people put it on their storage HDD specifically because it was too big to fit on their SSD. Ironic. They could have got their own data quite a bit earlier during development, not nearly two years after release!


If I’m being charitable, I’m hoping that means the decision was made early in the development process when concrete numbers were not available. However the article linked above kinda says they assumed the problem would be twice as bad as the industry numbers and that’s… that’s not how these things work.

That’s the sort of mistake that leads to announcing a 4x reduction in install size.


But if I read it correctly (and I may be mistaken), in actual practice any improvement in load times was completely hidden by level generation happening in parallel, which made this performance optimization not worth its cost.
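A toy model with invented numbers: when asset streaming and level generation run in parallel, total load time is roughly the max of the two, so making I/O faster than the generation step buys nothing visible:

    gen_time = 30.0   # terrain generation on the CPU (invented number)
    io_dup = 10.0     # asset streaming with duplicated, seek-friendly data
    io_dedup = 28.0   # the same streaming ~3x slower after deduplication

    print(max(io_dup, gen_time))    # 30.0 seconds
    print(max(io_dedup, gen_time))  # 30.0 seconds -- no user-visible difference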


>In the worst cases, a 5x difference was reported between instances that used duplication and those that did not.

Never trust a report that highlights the outliers before even discussing the mean. Never trust someone who thinks that is a sane way to use statistics. At best they are not very sharp, and at worst they are manipulating you.

> We were being very conservative and doubled that projection again to account for unknown unknowns.

Ok, now that's absolutely ridiculous and treating the reader like a complete idiot. "We took the absolute best case scenario reported by something we read somewhere, and doubled it without giving it a second thought, because WTF not? Since this took us 5 seconds to do, we went with that until you started complaining."

Making up completely random numbers on the fly would have made exactly the same amount of sense.

Trying to spin this whole thing into "look at how smart we are that we reverted our own completely brain-dead decision" is the cherry on top.


Are you a working software engineer?

I'm sure that whatever project you're assigned to has a lot of optimization stuff in the backlog that you'd love to work on but haven't had a chance to visit because bugfixes, new features, etc. I'm sure the process at Arrowhead is not much different.

For sure, duplicating those assets on PC installs turned out to be the wrong call.

But install sizes were still pretty reasonable for the first 12+ months or so. I think it was ~40-60GB at launch. Not great but not a huge deal and they had mountains of other stuff to focus on.


I’m a working software developer, and if they prove they cannot do better, I get people who make statements like GP quoted demoted from the decision making process because they aren’t trustworthy and they’re embarrassing the entire team with their lack of critical thinking skills.

When the documented worst case is 5x, you prepare for the potential bad news that you will hit 2.5x to 5x in your own code. Not assume it will be 10x and preemptively act, keeping your users from installing three other games.


Well, then I'd like to work where you work. Hard to find shops that take performance seriously. You hiring?

In my experience it's always been quite a battle to spend time on perf.

I'll happily take a demotion if I make a 10x performance goof like that. As long as I can get promoted eventually if I make enough 10x wins.


I would classify my work as “shouting into the tempest” about 70% of the time.

People are more likely to thank me after the fact than cheer me on. My point, if I have one, is that gaming has generally been better about this but I don’t really want to work on games. Not the way the industry is. But since we are in fact discussing a game, I’m doing a lot of head scratching on this one.


They claim they were following an industry-standard recommendation.

Or, you know, they just didn't really understand industry recommendations or what they were doing.

"Turns out our game actually spends most of its loading time generating terrain on the CPU" is not something you accidentally discover, and should have been known before they even thought about optimizing the game's loading time, since optimizing without knowing your own stats is not optimizing, and they wrote the code that loads the game!

Keep in mind this is the same team that accidentally caused instantly respawning patrols in an update about "balancing how often enemy patrols spawn"; the same group that couldn't make a rocket launcher lock on for months while blaming "raycasts are hard"; that released a mech that would shoot itself if you turned wrong; that spent the early days insisting "the game is supposed to be hard" while players struggled with enemy armor calculations that punished you for not shooting around armor whose position was being computed incorrectly; and that shipped tons of other outright broken functionality that has made the game hard to play at times.

Not only do Arrowhead have kind of a long history of technical mediocrity (Magicka was pretty crashy on release, and has weird code even after all the bugfixes), but they also demonstrably do not test their stuff very well, and regularly release patches that have obvious broken things that you run into seconds into starting play, or even have outright regressions suggesting an inability to do version control.

"We didn't test whether our game was even slow to load on HDD in the first place before forcing the entire world to download and store 5x the data" is incompetence.

None of this gets into the utterly absurd gameplay decisions they have made, or the time they spent insulting players for wanting a game they spent $60 on to be fun and working.


Which describes the PS2, PS3, PS4, Dreamcast, GameCube, Wii, and Xbox 360. The PS4 had a 2.5" SATA slot, but the idiots didn't hook it up to the chipset's existing SATA port; they added a slow USB 2.0-to-SATA chip instead. So since the sunset of the N64, all stationary gaming consoles have been held back by slow (optical) storage with even worse seek times.

So many game design crimes have a storage limitation at their core, e.g. levels that are just a few rooms connected by tunnels or elevators.


And it IS loading noticeably faster now for many users, thanks to caching. That said, I have to imagine many of those gaming directly off an HDD are not exactly flush with RAM.


According to the post, "the change in the file size will result in minimal changes to load times - seconds at most."

It didn't help their game load noticeably faster. They just hadn't checked if the optimization actually helped.


The actual source (https://store.steampowered.com/news/app/553850/view/49158394...) says:

> Only a few seconds difference?

> Further good news: the change in the file size will result in minimal changes to load times - seconds at most. “Wait a minute,” I hear you ask - “didn’t you just tell us all that you duplicate data because the loading times on HDDs could be 10 times worse?”. I am pleased to say that our worst case projections did not come to pass. These loading time projections were based on industry data - comparing the loading times between SSD and HDD users where data duplication was and was not used. In the worst cases, a 5x difference was reported between instances that used duplication and those that did not. We were being very conservative and doubled that projection again to account for unknown unknowns.

> Now things are different. We have real measurements specific to our game instead of industry data. We now know that the true number of players actively playing HD2 on a mechanical HDD was around 11% during the last week (seems our estimates were not so bad after all). We now know that, contrary to most games, the majority of the loading time in HELLDIVERS 2 is due to level-generation rather than asset loading. This level generation happens in parallel with loading assets from the disk and so is the main determining factor of the loading time. We now know that this is true even for users with mechanical HDDs.

They measured first, accepted the minimal impact, and then changed their game.


> They measured first,

No, they measured it now, not first. The very text you pasted is very clear about that, so I'm not sure why you're contradicting it.

If they had measured it first, this post would not exist.


But this means that before they blindly trusted some stats without actually testing how their game performed with and without it?


Yes, but I think maybe people in this thread are painting it unfairly? Another way to frame it is that they used industry best practices and their intuition to develop the game, then revisited their decisions to see if they still made sense. When they didn't, they updated the game. It's normal for any product to be imperfect on initial release. It's part of actually getting to market.


To be clear, I don't think it's a huge sin. It's the kind of mistake all of us make from time to time. And it got corrected, so all's well that ends well.


FWIW, the PC install size was reasonable at launch. It just crept up slowly over time.

    But this means that before they blindly trusted 
    some stats without actually testing how their 
    game performed with and without it?
Maybe they didn't test it with their game because their game didn't exist yet; this was a decision made fairly early in the development process. In hindsight, yeah... it was the wrong call.

I'm just a little baffled by people harping on this decision and deciding that the developers must be stupid or lazy.

I mean, seriously, I do not understand. Like what do you get out of that? That would make you happy or satisfied somehow?


Go figure: people are downvoting me but I never once said developers must be stupid or lazy. This is a very common kind of mistake developers often make: premature optimization without considering the actual bottlenecks, and without testing whether theoretical optimizations actually make any difference. I know I'm guilty of this!

I never called anyone lazy or stupid, I just wondered whether they blindly trusted some stats without actually testing them.

> FWIW, the PC install size was reasonable at launch. It just crept up slowly over time

Wouldn't this mean their optimization mattered even less back then?


    premature optimization
One of those absolutely true statements that can obscure a bigger reality.

It's certainly true that a lot of optimization can and should be done after a software project is largely complete. You can see where the hotspots are, optimize the most common SQL queries, whatever. This is especially true for CRUD apps where you're not even really making fundamental architecture decisions at all, because those have already been made by your framework of choice.

Other sorts of projects (like games or "big data" processing) can be a different beast. You do have to make some of those big, architecture-level performance decisions up front.

Remember, for a game... you are trying to process player inputs, do physics, and render a complex graphical scene in 16.7 milliseconds or less. You need to make some big decisions early on; performance can't entirely just be sprinkled on at the end. Some of those decisions don't pan out.

    > FWIW, the PC install size was reasonable at launch. It just crept up slowly over time

    Wouldn't this mean their optimization mattered even less back then?
I don't see a reason to think this. What are you thinking?


> One of those absolutely true statements that can obscure a bigger reality.

To be clear, I'm not misquoting Knuth if that's what you mean. I'm arguing that in this case, specifically, this optimization was premature, as evidenced by the fact it didn't really have an impact (they explain other processes that run in parallel dominated the load times) and it caused trouble down the line.

> Some of those decisions don't pan out.

Indeed, some premature optimizations will and some won't. I'm not arguing otherwise! In this case, it was a bad call. It happens to all of us.

> I don't see a reason to think this. What are you thinking?

You're right, I got this backwards. While the time savings would have been minimal, the data duplication wasn't that big so the cost (for something that didn't pan out) wasn't that bad either.


If this is a common issue in industry why don't game devs make a user visible slider to control dedup?

I have friends who play one or two games and want them to load fast. Others have dozens and want storage space.


Any developer could tell you that it's because that would be extra code, extra UI, extra localization, extra QA, etc. for something nonessential that could be ignored in favor of adding something that increases the chance of the game managing to break even.


Having a smaller install size makes me more likely to buy the game if I have a large library.

Game size is a problem in every new triple A release.


> The problem they fixed is that they removed a common optimization to get 5x faster loading speeds on HDDs.

Maybe, kinda, sorta, on some games, on some spinning rust hard disks, if you held your hand just right and the Moon was real close to the cusp.

If you're still using spinning rust in a PC that you attempt to run modern software on, please drop me a message. I'll send you a tenner so you can buy yourself an SSD.


The minimum requirement for new games is an SSD, true.

However, a lot of people have TINY SSDs. Think 500 gigabytes.


So, big enough for a 25GB game but not a 150GB game? I will be amused if we get stats in the coming month showing that the percentage of users installing the game on an HDD has decreased from 11% to like 3% after they shrunk it.


Fun story: I've loaded modern games off spinning rust for almost all of the past decade, including such whoppers as Siege, FS2020, CoD, and tons of poorly made indie titles. My "fast data" SSD drive that I splurged on remains mostly empty.

I am not the one who loads last in the multiplayer lobbies.

The entire current insistence about "HDD is slow to load" is just cargo cult bullshit.

The Mass Effect remastered collection loads off of a microSD card faster than the animation takes to get into the elevator.

Loading is slow because games have to take all that data streaming in off the disk and do things with it. They have to parse data structures, build up objects in memory, make decisions, pass data off to the GPU etc etc. A staggering amount of games load no faster off a RAM disk.

For instance, Fallout 4 loading is hard locked to the frame rate. The only way to load faster is to turn off the frame limiter, but that breaks physics, so someone made a mod to turn it off only while loading. SSD vs HDD makes zero difference otherwise.

We live in a world where even shaders take a second worth of processing before you can use them, and they are like hundreds of bytes. Disk performance is not the bottleneck.

Some games will demonstrate some small amount of speedup if you move them to SSD. Plenty won't. More people should really experiment with this; it's a couple clicks in Steam to move a game.
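It's also easy to measure where the time actually goes for any one game's data. A rough sketch of the read-vs-parse split; the asset path and JSON format are stand-ins for whatever a given game actually uses:

    import json
    import time

    path = "assets/level01.json"  # stand-in: any large JSON file works

    t0 = time.perf_counter()
    with open(path, "rb") as f:
        raw = f.read()            # the part a faster drive speeds up
    t1 = time.perf_counter()
    data = json.loads(raw)        # parsing / object building: pure CPU
    t2 = time.perf_counter()

    print(f"read : {t1 - t0:.3f}s for {len(raw) / 1e6:.1f} MB")
    print(f"parse: {t2 - t1:.3f}s")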

If bundling assets together to reduce how much file system and drive seek work you have to do multiplies your install size by 5x, your asset management is terrible. Even the original PlayStation, with a seek time of 300-ish ms, a slow-as-hell drive, and more CD space than anyone really wanted, didn't duplicate data that much, and you could rarely afford any in-game loading.

I wish they gave any details. How the hell are you bundling things to get that level of data duplication? Were they bundling literally everything into single bundles for every map? Did every single map file also include all the assets for all weapons and skins and all animations of characters and all enemy types? That would explain how it grew so much over time, as each weapon you added would actually take sizeOfWeapon * NumMaps space, but that's stupid as fuck. Seeking an extra file takes at most one frame longer than loading the same amount of data as one file.
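For a sense of how that layout compounds (numbers invented):

    # invented numbers: if every map bundle embeds every weapon's assets,
    # each new weapon costs sizeOfWeapon * NumMaps of disk across the install
    size_of_weapon_mb = 40
    num_maps = 60
    print(size_of_weapon_mb * num_maps / 1000)  # 2.4 GB of install growth per weapon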

Every now and then Arrowhead says something that implies they are just utterly clueless. They have such a good handle on how games can be fun, though. At least when they aren't maliciously bullying their players.


If 5x faster refers to a difference of "a few seconds" as the article says, then perhaps 5x (relative improvement) is the wrong optimization metric versus "a few seconds" (absolute improvement).


I think we should remember the context here.

They're using the outdated Stingray engine, which was designed for the days of single- or dual-core computers with spinning disks. They developed their game with this target in mind.

Mind you, spinning disks are not only a lot more rare today but also much faster than when Stingray 1.0 was released. Something like 3-4x faster.

The game was never a loading hog and I imagine by the time they launched and realized how big this install would be, the technical debt was too much. The monetary cost of labor hours to undo this must have been significant, so they took the financial decision of "We'll keep getting away with it until we can't."

The community finally got fed up. The SteamDB chart keeps inching lower and lower, and I think they finally got worried enough about permanently losing players that they conceded and did this, hoping to get those players back and avoid a further exodus.

And let's say this game is now much worse on spinning disks. At the end of the day, AH will choose profit. If they lose the ~10% of spinning-disk players who won't tolerate the few seconds of change, the game will still please the other players, making sure it lives on.

Lastly, this is how it's delivered on consoles, many of which use spinning media. So it's hard to see this as problematic. I'm guessing that for consoles, MS and Sony said no to a 150GB install, so AH was invested in keeping it small. They were forced to redo the game for consoles without this extra data. The time and money there was worth it for them. For PC, there's no one to say no, so they did the cheapest thing they could until they no longer could.

This is one of the downsides of open platforms. There's no 'parent' to yell at you, so you do what you want. It's the classic walled garden vs open bazaar type thing.


Eh? Hard drives for gaming and high-end workstations are thoroughly obsolete. SSDs are not optional when playing any triple-A game. It's kinda wild to see people still complaining about this.


It is a trade-off. The game was developed on a discontinued engine; it has had numerous problems with balance and performance, and generally there were, IMO, far more important bugs. Super Helldive difficulty wasn't available because of performance issues.

I've racked up 700 hours in the game, and I didn't care about the storage requirements.


Oh wow, what is the story with the engine?


Somehow they chose to build their very complicated live service game with the Autodesk Stingray engine, which was discontinued in 2018! Helldivers 2 was released in 2024.

https://www.autodesk.com/products/stingray/overview


Development of Helldivers 2 started in 2016, so they would have been 2 years into development with that engine when it was discontinued. They would have had to effectively start again in another engine.


Wow, that is a very long development cycle. It really shows in the level of polish this game has.


That's really strange. Perhaps they had a set of engineers that knew it very well or had built an engine in it for another game. Still... strange.


> They’re not the ones bearing the cost.

I'm not sure that's necessarily true... Customers have limited space for games; it's a lot easier to justify keeping a 23GB game around for occasional play than it is for a 154GB game, so they likely lost some small fraction of their playerbase they could have retained.


That is a feature for franchise games like CoD.


> I’d wager very few check the hard disk requirements

I have to check. Your assumption is correct. I am one of very few.

I don't know the numbers, and I'm going to check in a sec, but I'm wondering whether the suppliers (publishers or whoever is pinning the price) haven't screwed up big time by driving up prices and requirements without thinking about the potential customers they are going to scare away terminally. Theoretically, I have to assume that their sales teams account for these potentials, but I've seen so much dumb shit in practice over the past 10 years that I have serious doubts most of these suits are worth anything at all, given that grown-up working-class kids--with up to 400+ hours of overtime per year, 1.3 kids on average, and approx. -0.5 books and news read per any unit of time--can come up with the same big tech, big media, economic and political agendas as have been in practice in both parts of the world for the better part of our lives, if you play "game master" for half a weekend and become best friends with all the kiosks in your proximity.

> the effect on their bottom line is negligible

Is it, though? My bold, exaggerated assumption is that they would have had 10% more sales AND players.

And the thing is that at any point in time when I, and a few people I know, had the time and desire to play, we would have had to either clean up our drives or invest the game price + SSD price for about 100 hours of fun over the course of months. We would gladly have gotten a taste for it, but no industry promises can compensate for even more of our effort than enough of us already put in at work. As a result, at least 5 buyers and players lost, and at work and elsewhere you hear, "yeah, I would, if I had some guys to play with" ...


I do not think the initial decision-making process was "hey, screw working-class people... let's have a 120GB install size on PC."

My best recollection is that the PC install size was a lot more reasonable at launch. It just crept up over time as they added more content over the last ~2 years.

Should they have addressed this problem sooner? Yes.


Gamers are quite vocal about such things, people end up hearing about it even if they don’t check directly.

And with this being primarily a live-service game drawing revenue from microtransactions, especially a while after launch, and base console drives still being quite small to encourage an upgrade (does this change apply to consoles too?), there's probably quite an incentive to make it easy for users to keep the game installed.


Studios store a lot of builds for a lot of different reasons. And generally speaking, in AAA I see PlayStation being the biggest pig, so I would wager their PS builds are at least the same size if not larger. People knew, and probably raised alarm bells that fell by the wayside, because it's easier/cheaper to throw money at storage solutions than it is at engineering.


I only skimmed through this; I have no real information on the particular game, but I think the console versions could be much smaller as less duplication is necessary when the hardware is uniform.


I mean, it's not really a cost to anyone. Bandwidth is paid for by Valve, games can be deleted locally, etc.


Taking up 500% of the space that is necessary is a cost to me. I pay for my storage; why would I want it wasted by developer apathy?

I'm already disillusioned and basically done with these devs anyways. They've consistently gone the wrong direction over time. The game's golden age is far behind us, as far as I'm concerned.


Putting something on your hard drive temporarily does not use it up? You don’t lose anything. Nothing is wasted.


On my high-performance SSD for games and other data that requires it. If I pay for 1 or 2 TB of high-performance storage, I want to use that extra space for other things. Not to mention that I don't want my storage to fill up too much, because that affects general performance of the drive. Also, more data written and rewritten with game updates means more unnecessary wear on the drive.


insufferable


yes, being a contrarian and acting like games using more storage space does not result in higher costs for the user is insufferable.

If a user only installs 10 games the required storage space would go from 1.5TB to only 250GB.

Currently a 250GB SSD costs $300, 2TB is $1300+....


So uninstall some games, you’re not playing 10 at once. Plus, they fixed the issue and you’re still mad.


yeah, just let me delete those 100GB+ games that most of the world would take days to redownload, not to mention the times you're offline, just because it makes you feel better about your megacorp bootlicking.


Which goes to show that they don't care about the user, only about the user's money.


No - because most users also don't check install size on games, and unlike renting overpriced storage from a cloud provider, users paid a fixed price for storage up front and aren't getting price gouged nearly as badly. So it's a trade that makes sense.

Both entrants in the market are telling you that "install size isn't that important".

If you asked the player base of this game whether they'd prefer a smaller size, or more content - the vast majority would vote content.

If anything, I'd wager this decision was still driven by internal goals for the company, because producing a 154GB artifact and storing it for things like CI/CD is still quite expensive if you have a decent number of builds/engineers. Both in time and money.


So guide me through this thought process:

You are saying that most users don't check the install size of their games. Which I am not convinced of, but it might even be true. Let's assume it is for the moment. How does this contradict what I stated? How does users being uninformed or unaware of technical details make it so that cramming the user's disk suddenly counts as "caring" instead of "not caring"? To me this does not compute. Users will simply have a problem later, when their TBs of disk space have been filled by multiple such disk space wasters. Wasting this much space is user-hostile.

Next you are talking about _content_, which most likely doesn't factor in that much at all. Most of that stuff is high-resolution textures, not content. It's not like people are getting significantly more content in bigger games. It's a graphics craze that many people don't even need. I am still running around with 2 full-HD screens, and I don't give a damn about 4K textures. I suspect a large number of users don't have the hardware to run modern games fluently at 4K anyway.


The general thrust of my argument is this:

"There is a limited amount of time, money, and effort that will be spent on any project. Successful enterprises focus those limited resources on the things that matter most to their customers. In this case, disk usage in the ~150gb range did not matter much in comparison to the other parts of the game, such as in-game content, missions, gameplay, etc."

We know this, because the game had a very successful release, despite taking 150gb to install.

I'm not saying they should have filled that 100 extra gb with mission content - I'm implying they made the right call in focusing their engineering manpower on creating content for the game (the ACTUAL gameplay) and not on optimizing storage usage for assets. That decision gave them a popular game which eventually had the resources to go optimize storage use.


It's not even about graphics, it's about load time on HDDs. Which, it turns out, didn't benefit all that much. I can see customers being much more annoyed at a longer load time than at a big install size, as the latter has become pretty common.


154GB is still A LOT.

I mean... a few years ago, 1TB SSDs were still the best buy, and many people still haven't upgraded. Wasting 15% of your total storage on just one game is a pain for many.



