Hacker News

It's all about the story that's sold to the higher ups. The higher you go up the corporate ladder, the vaguer the understanding of the technology. The big boss hears from a Microsoft salesman that AI = you can fire 20% of your workforce, but never questions exactly how that works. They probably never got sold static analysis in that way. That was just some kind of tool that somehow helps with that mumbo jumbo that developers spend all day typing. There's no story there that inspires a manager. AI = cut costs is music to the ears of the board. So then pressure gets applied to those lower down.

Something similar was going on with cloud a few years ago. The story was if you get cloud you can get rid of those expensive infrastructure people and it will all be so much more reliable. So the big boss gets a cloud strategy and foists it on those lower down. There's also pressure to be an on-trend boss. If all the other bosses are getting into it, then you need to as well.



I think it is worth really deeply understanding that the bosses hate us. Capital has only begrudgingly involved labor when forced to. It is no surprise to me that genai hype happened after the largest increase in general labor power in recent memory (post 2020 labor market) and a decade long increase in the labor power among software engineers.

The bosses have seen pay and benefits go up and up and up. They've seen people jump between companies, taking institutional knowledge with them. They need the job market to crater so they can re-exert control in the relationship. LLMs are fucking catnip to this belief system. "You mean I don't need to deal with those people that I have to hire and train and pay? I hate those guys! Awesome!"


> bosses hate us

As a person who has worked at different layers of corporate structures, I find this statement mildly offensive and mostly inaccurate. In terms of emotions expressed by those being reported to, I've witnessed many, ranging from various feelings of satisfaction and control to an air of superiority and, at times, even contempt. But hate was quite unusual and would more often go the other way around, together with envy.


Given how many times the bosses have shot workers with machine guns, I dunno.


They don't "hate" us. It's just that the people who own the business and the people who work at the business have conflicting self-interests.


I don't get the confusion in these comments. Unless we're talking about micro companies, the people who own the business don't run it, and even those who run it rarely have direct contact with engineers.


We'd be slaves if it were legal.


I’m seeing this second hand at the Fortune 500 my spouse works for.

They are an enterprise SaaS company in SV. The machine learning software they have been selling for more than a decade has all been rebranded as AI. That's fair enough from a sales perspective, I guess. What's odd is that their C-suite and SVPs are pressuring everyone to use LLMs everywhere, for pretty much everything, and none of them seem to understand why it's only that level of employee that's seeing any benefit or expressing any interest. My spouse has reported that the running joke across the company is that the executives have jobs that can be done by LLMs, but no one else does. The ICs could not be less interested, and even if they were, legal promulgated a policy against actually putting anything confidential into any LLM other than Copilot in Azure, which the whole workforce reportedly only really uses for summarizing the increasing number of meetings that are perceived as a waste of IC time. A lot of those meetings are "let's use AI".

It’s absolutely insane.


The bit about legal is critical. I'm specifically working on "how do we use LLMs in a data-safe way?". Most of the huge gains require exposing information of at least a semi-sensitive nature. For now, that's probably going to shake out as "Copilot or self-host models". I believe that the infrastructure available to self-host at an enterprise level will start growing rapidly due to this trust problem.
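As one illustration of what "data-safe" can mean in practice (my own sketch, not something described above): a redaction pass over prompts before they leave the network boundary and reach any hosted model. The patterns and labels below are hypothetical placeholders; a real deployment would use a proper PII-detection service rather than hand-rolled regexes.

```python
import re

# Hypothetical pre-filter: scrub obvious identifiers from a prompt
# before it is sent to any externally hosted LLM. These two patterns
# are illustrative only, not a complete PII taxonomy.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def redact(text: str) -> str:
    """Replace each matched identifier with a bracketed label."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

print(redact("Contact jane.doe@example.com, SSN 123-45-6789"))
# prints: Contact [EMAIL], SSN [SSN]
```

The same filter could sit in front of a self-hosted model too, as defense in depth, since "semi-sensitive" data often doesn't need to reach the model verbatim to be useful.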

Copilot gets the legal nod because Microsoft already has a very, very long history of being trusted with essentially all of an organization's most sensitive information; they have a lot to work with there when it comes to convincing execs that they can safely handle sensitive data in the cloud. (I am not convinced of that personally.)


I agree. Especially with Microsoft: no sooner had Satya Nadella promised that security was their first priority and that he'd stake his compensation on it than they got distracted once again.

If you keep leaving the barn door open, eventually your staying power won’t be enough. On the other hand, I’ve stopped thinking to myself “this must be the big one” when it comes to their more egregious recent failures.

> I believe that the infrastructure available to self-host at an enterprise level will start growing rapidly due to this trust problem.

I thought it would go in this direction more rapidly. With the way even the hyperscalers have had trouble securing enough GPUs for existing demand, I’m left wondering if regardless of technical merits, the bubble is going to burst because of basic real world logistical challenges. I’ve actually heard it said in hushed tones that (paraphrased) “the money isn’t the problem. There just aren’t enough GPUs.”

For a while I've assumed that coprocessors or leveraging GPUs at the local level will be the only viable course long-term for widespread adoption. Most actual needs for AI/ML technologies aren't "consume the whole Internet" scale.



