As someone already said, parents used to be concerned that kids wouldn't be able to solve maths problems without a calculator, and it's the same problem here. But there's a difference between solving problems _with_ LLMs and having LLMs solve them _for you_.
Well, the scope is much broader with an LLM than with a calculator. Why should I hire you if an agent can do it? With an LLM, every job becomes the calculator and can be replaced. Spotify CEO stated on X that before asking for more headcount they have to justify not being able to do the job with an agent. So for all the students who let the LLM do their assignments and learn basically nothing, what value do they offer a company looking to hire? The company will just use the agent as well, and already is …
An agent can't do it. It can help you like a calculator can help you, but it can't do it alone. So that means you've become the programmer. If you want to be the programmer, you always could have been. If that is what you want to be, why would you consider hiring anyone else to do it in the first place?
> Spotify CEO stated on X that before asking for more headcount they have to justify not being able to do the job with an agent.
It was Shopify, but that's just a roundabout way to say that there is a hiring freeze due to low sales (no doubt because of tariff nonsense seizing up the market). An agent, like a calculator, can only increase the productivity of a programmer. As always, you still need more programmers to perform more work than a single programmer can handle. So all they are saying is "we can't afford to do more".
> The company will just use the agent as well, and already is …
In which case wouldn't they want to hire those who are experts in using agents? If they, like Shopify, have become too poor to hire people – well, you're screwed either way, aren't you? So that is moot.
So, arguably, before calculators people did calculations by hand, and there were rooms full of people who did nothing but calculate. That's gone now, thanks to calculators. But the analogy scales up by an order of magnitude: now fewer people can "do" the job of many, so perhaps less hiring, and not just for "doing calculations by hand" but in almost every field where software is used.
Where will all those new students find a job if:
- they did not learn much because the LLM did the work for them
- there are no new jobs because we are more productive?
Never in the history of humans have we been content with stagnation. The people who used to do manual calculations soon joined the ranks of people using calculators and we lapped up everything they could create.
This time around is no exception. We still have an infinite number of goals we can envision a desire for. If you could afford an infinite number of people you would still hire them. But Shopify especially is not in the greatest place right now. They've just come off the COVID wind-down and now tariffs are beating down their market further. They have to be very careful with their resources for the time being.
> - they did not learn much because the LLM did the work for them
If companies are using LLMs as suggested earlier, they will find jobs operating LLMs. They're well poised for it, being the utmost experts in using them.
> - there are no new jobs because we are more productive?
More productivity means more jobs are required. But we are entering an age where productivity is bound to be on the decline. A recession was likely inevitable anyway and the political sphere is making it all but a certainty. That is going to make finding a job hard. But for what scant few jobs remain, won't they be using LLMs?
> Spotify CEO stated on X that before asking for more headcount they have to justify not being able to do the job with an agent.
Spotify CEO is channeling The Two Bobs from Office Space: "What are you actually doing here?" Just in a nastier way, with a kind of prisoner's dilemma on top. If you can get by with an agent, fine, you won't bother him. If you can't, why can't you? Should we replace you with someone who can, or thinks they can?
You as the employer are liable. A human has real reasoning abilities and real fears about messing up; the likelihood of them doing something absurd like telling a customer that a product is 70% off, and keeping their job afterwards, is effectively nil. What are you going to do with the LLM, fire it?
Data scientists and people deeply familiar with LLMs, to the point that they could fine-tune a model to your use case, cost significantly more than a low-skilled employee, and depending on liability, just running the LLM may be cheaper.
Take an accounting firm (one example from above): as far as I know, in most jurisdictions the accountant doing the work is personally liable. Who would be liable in the case of the LLM?
There is absolutely a market for LLM-augmented workforces, but I don't see any viable future, even with the SOTA models available right now, for flat-out replacing a workforce with them.
I fully agree with you about liability. I was advocating for the other point of view.
Some people argue that it doesn't matter if there are mistakes (it depends which, actually) and that with time it will cost nothing.
I argue that if we give up learning and let the LLM do the assignments, then what is the extent of my knowledge, and my value to be hired, in the first place?
We hired a developer and he did everything with ChatGPT: all the code and documentation he wrote. At first it was all bad, because out of the infinity of possible answers ChatGPT doesn't pinpoint the best one in every case. But did he have enough knowledge to understand that what he did was bad? We need people with experience, who have confronted hard problems themselves and found their way out. How else can we challenge and critique an LLM's answer?
I feel students' value is diluted, leaving them at the mercy of the companies providing the LLMs, and we might lose some critical knowledge and critical thinking from students in the process.
I agree entirely with your take on education. I feel like there is a place where LLMs are useful without impacting learning, but it's definitely not in the "discovery" phase of learning.
However, I really don't need to implement some weird algorithm myself every time (ideally I'm using a well-tested library), but the point is that you learn so that you are able to, and also so that you can modify or compose the algorithm in ways the LLM couldn't easily do.
>As someone already said, parents used to be concerned that kids wouldn't be able to solve maths problems without a calculator
Were they wrong? People who rely too much on a calculator don't develop the strong math muscles that can be used in more advanced math. Identifying patterns in numbers and seeing when certain tricks can be used to solve a problem (versus when they just make a problem worse) is a skill that ends up being beyond their ability to develop.
Yes, they were wrong. Many young kids who are bad at mental calculations are later competent at higher mathematics and able to use it. I don't understand what patterns and tricks you're referring to, but if they are important for problems outside of mental calculations, then you can also learn about them by solving these problems directly.
Almost none of the cheaters appear to be solving problems with LLMs. All my faculty friends are getting large portions of their class clearly turning in "just copied directly from ChatGPT" responses.
It's an issue in grad school as well. You'll have an online discussion where someone submits 4 paragraphs of not-quite-eloquent prose with that AI "stink" on it. You can't be sure but it definitely makes your spidey sense tingle a bit.
Then they're on a video call and their vocabulary is wildly different, or they're very clearly a recent immigrant who struggles with basic sentence structure, such that there is absolutely zero chance their discussion-forum persona is actually who they are.
This has happened at least once in every class, and invariably the best classes in terms of discussion and learning from other students are the ones where the people using AI to generate their answers are failed or drop the course.
> there's a difference between solving problems _with_ LLMs and having LLMs solve them _for you_.
If there is a difference, then fundamentally LLMs cannot solve problems for you. They can only apply transformations using already known operators. No different than a calculator, except with exponentially more built-in functions.
But I'm not sure that there is a difference. A problem is only a problem if you recognize it, and once you recognize a problem then anything else that is involved along the way towards finding a solution is merely helping you solve it. If a "problem" is solved for you, it was never a problem. So, for each statement to have any practical meaning, they must be interpreted with equivalency.
There is a difference between thinking about the context of a problem and "critical thinking" about the problem or its possible solutions.
There is a measurable decrease in critical thinking skills when people consistently offload the thinking about a problem to an LLM. That is where the primary difference lies between solving problems with an LLM and having them solved for you by an LLM. And that is cause for concern.
Two studies on the impact of LLMs and generative AI on critical thinking:
For me, any kind of interface in a vehicle should tend toward audio anyway. Anything that takes your attention away from the road, whether it's tactile or a touchscreen, is a potential distraction.
Voice control seems the obvious solution, but there are probably better ideas, especially for someone whose accent confuses all but the best, or best-trained, recognition software. I end up talking to my car in an "American" accent ... but then I do enjoy pretending I'm Michael Knight.
I did something very similar, but for the price of wood from sellers here in the UK, and instead of Playwright, which I'd never heard of at the time, I used NodeRED.
You just reminded me, it's probably still running today :-D
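For anyone curious, here's a rough sketch of what that kind of scrape might look like in Playwright today; the merchant URL and CSS selectors are purely made-up placeholders, and my actual setup was a NodeRED flow rather than code like this:

```typescript
// Hypothetical sketch only: the merchant URL and selectors below are
// placeholders, not a real site. The real thing was a NodeRED flow.
import { chromium } from 'playwright';

async function scrapeWoodPrices(): Promise<void> {
  const browser = await chromium.launch();
  const page = await browser.newPage();

  // Load the (made-up) sheet-materials listing page.
  await page.goto('https://example-timber-merchant.co.uk/sheet-materials');

  // Pull a name/price pair off each product card.
  const prices = await page.$$eval('.product-card', cards =>
    cards.map(card => ({
      name: card.querySelector('.product-title')?.textContent?.trim() ?? '',
      price: card.querySelector('.product-price')?.textContent?.trim() ?? '',
    }))
  );

  console.log(JSON.stringify(prices, null, 2));
  await browser.close();
}

scrapeWoodPrices().catch(console.error);
```

(In NodeRED the same job is roughly an inject node on a timer feeding an http request node and an html node that picks out the price elements, which is why it's happy to keep running unattended.)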
This headline is misleading. According to the article, schools CAN still use Chromebooks, but they cannot enable the tracking functionality on them.
"From 1 August 2024 onwards, Danish schools will no longer be allowed to enable Chromebooks and Google platforms to collect students' personal data for processing."
And how many times has AP avoided accidents the driver wouldn't have? The stats aren't that simple any more, are they?
There are plenty more news articles, especially in local news, about fatalities involving non autopilot cars, but they're not as interesting so don't get shared as often.
Such stats are never simple. But when important data is withheld they definitely become irrelevant. They become more of a PR tool.
Statistically speaking 100% of people are able to pilot a plane with their eyes closed and not crash it... as long as a qualified pilot is ready to take over at any time. Such a stat is only valuable if you want to show the standby pilot is useful.
What would you base your assumption that AP saved even 1 life on when that data is not provided? All we know is that it crashed a number of times exclusively when it wasn’t supervised as required. The pragmatic conclusion is that it works because it’s supervised. I imagine Tesla would release more data if it supported the performance of the AP.
Here is a cool video [0] showing a Model 3 dodging the car in front of it after it got rear-ended, causing it to jut forward. The owner says it wasn't him but Autopilot that did it.
I'm surprised FIFA never gets scrutinised in this way; it's an £80 game each year, and to make any kind of impression you need to buy packs.
I watched my son once: he literally spent what I'd given him on packs, looked at what he'd "won", and sold them all instantly for in-game currency. When I asked why he didn't use any of them, he said they were no good. He then repeated the same process on the rest of the packs, keeping maybe one player.
It reminds me of the behaviour of people addicted to slot machines, literally pumping in coins, recycling their winnings until they have nothing left.
After the Battlefront 2 debacle, EA said they were abandoning loot boxes and wouldn't do pay to win any more. They appear to have avoided it with Battlefield V, but FIFA still stands as the Daddy when it comes to pay to win and it needs to be destroyed.
I downloaded FIFA 15 on PS3 years ago, and I've only played it a couple of times for this reason. They have completely gimped the out-of-the-box features in terms of what you can do as a single player. The majority of gameplay revolves around Ultimate Team, which requires packs.
Makes me long for the days of FIFA 96-09, where you didn't have to buy a damn thing other than the game itself to play in tournaments with your favourite team.
The "recurring revenue" model has really soured the experience of playing video games.
> The "recurring revenue" model has really soured the experience of playing video games.
Yeah, and the worst part is that many video game companies would rather kill decent single-player franchises with low-quality games that nobody likes, in order to get multiplayer capability and loot-boxes baked into everything.
Ten years ago I spent quite a lot of money on video games. Now, I just can't be bothered anymore. Most of the games I bought ten years ago have more replay value than the ones on the market today. At the same time the market for gaming seems to have exploded, so I guess it's working out for them.
Most of my friends who are religious about FIFA stopped playing online a couple of years ago for the exact reason you described. FIFA is terrible about how they nerf players: if someone has a low stat, it's like they're having a stroke when given the ball, and half the time they lose control, despite the fact that the player is based on a professional soccer player who can dribble the ball down the field blindfolded. RNG really has no place in competitive video games.
It's not really the start of the protest; ORG have been doing some great stuff recently with Blocked.org.uk (which is where the data for this campaign comes from).
If you're coming to the UK, don't limit yourself to just London, there are equally vibrant tech scenes in other cities like Sheffield (where I live), Manchester, Leeds, Newcastle, Liverpool and many more.
Actually no, I'm painfully unaware of these things! I usually go to Dorkbot when it's on. One of my colleagues goes to the testing meetup in the Rutland. What are some other good ones?
I don't see the former as that much of a problem.