I believe we're currently seeing AI in the "mainframe" era, much like the early days of computing, when a single machine occupied an entire room and consumed massive amounts of power, yet offered less compute than what now fits in a smartphone.
I expect rapid progress in both model efficiency and hardware specialization. Local inference on edge devices, using chips designed specifically for AI workloads, will drastically reduce energy consumption for the majority of tasks. This shift will free up large-scale compute resources to focus on truly complex scientific problems, which seems like a worthwhile goal to me.
The CPU development curve is often thrown around, but it very seldom fits anything else in reality. It was a very rare and extraordinary set of coincidences that got us here. Computation using silicon turned out to have massive growth potential for a variety of lucky reasons, but battery tech, say, is not so lucky, nor is fusion, nor is quantum computing.
The low-hanging fruit has been plucked by that same silicon development process, and while remarkable improvement in AI efficiency is likely, it is highly unlikely to follow a similar curve.
More likely is a slow, incremental process taking decades. We cannot just wish away billions of parameters and the need for trillions of operations. It's not like we have some open path of possible improvement like we had with silicon. We walked that path already.
I don’t understand the “chips designed for AI workloads” sentiment I hear all the time. LLMs were designed using GPUs. The hardware already exists, so what will make it use less energy in a world where GPUs over the last decade have only become bigger, hotter, more power-hungry hardware? If we could develop LLMs on anything less, we probably would have shifted back to CPUs already.
It sure seems like that to me. I was pretty impressed by how easily I could run small Gemma on a 7-year-old laptop and get a decent chat experience.
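For anyone curious, something along these lines is enough to try it. This is just a minimal sketch using llama-cpp-python; the model filename, context size and thread count are placeholders, not a record of what I actually ran:

```python
# Minimal local chat with a small quantized model via llama-cpp-python.
# The GGUF filename and settings below are illustrative placeholders.
from llama_cpp import Llama

llm = Llama(model_path="./gemma-2b-it.Q4_K_M.gguf", n_ctx=2048, n_threads=4)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Give me three tips for keeping an old laptop useful."}],
    max_tokens=128,
)
print(out["choices"][0]["message"]["content"])
```

A 2B-class model at 4-bit quantization only needs a couple of GB of RAM, which is why an older CPU-only laptop can still give a usable chat experience.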
I can imagine that doing some clever offloading to normal programs and using the LLM as a sort of "fuzzy glue" for the rest could improve the efficiency of many common tasks.
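Roughly what I have in mind, as a minimal sketch: the model only turns fuzzy text into a structured value, and ordinary deterministic code does the exact work. The `ask_local_llm` helper below is a hypothetical stand-in for whatever small local model you run:

```python
# "Fuzzy glue" sketch: the LLM extracts structure, normal code does the work.
from datetime import date

def ask_local_llm(prompt: str) -> str:
    # Hypothetical stand-in for a call to a small local model.
    # A real implementation would send `prompt` to the model and return its reply;
    # here it returns a canned answer so the sketch runs as-is.
    return "2025-12-24"

def days_until(iso_date: str) -> int:
    # Exact, cheap computation done by normal code, not by the model.
    return (date.fromisoformat(iso_date) - date.today()).days

def handle(user_text: str) -> str:
    # The model's only job: map fuzzy input to a structured value (an ISO date).
    iso = ask_local_llm(
        f"Extract the date mentioned below as YYYY-MM-DD and output nothing else:\n{user_text}"
    ).strip()
    return f"{days_until(iso)} days to go."

if __name__ == "__main__":
    print(handle("how long until christmas eve 2025?"))
```

The point is that the expensive part (the model) only runs on the genuinely fuzzy step, while the arithmetic costs essentially nothing.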
I mean... cute conspiracy, but it doesn't correspond with reality. Just look at what Google is releasing: they are trying to make these things fit on consumer hardware.
Well, maybe when the next pandemic hits, there will be no more remote work, since you do not believe it is an option. If it worked then, why wouldn't it work now?
The same could be said about a person stabbing someone to death. You could call it an incident, but if there was clear intent it is called murder. Definitions do matter. In the above example you could argue that a judge first has to declare someone guilty, and I agree. In this particular case I "believe" it is statistically unrealistic to rule out bad intent, and thus it fits the definition. It is not a matter of opinion; it is purely stating the fact that the act fits the description of terrorism, which I would argue isn't subject to the semantics of the vague aspects of the definition. Murder is still murder no matter who did it.
> So actually it is a stance of the BBC not to condemn it.
I think you missed the whole point.
The article states in no uncertain terms that "Our business is to present our audiences with the facts, and let them make up their own minds."
This goes against the use of loaded terms like "terrorist" because "Terrorism is a loaded word, which people use about an outfit they disapprove of morally."
Not using a loaded term to describe something does not mean "they condone it".
There was a time when journalists were expected to present you with the facts and leave you to think for yourself and make up your own mind. It seems you have something against the latter.
People taking a stance based on the presented facts, or having an opinion based on them from complete or incomplete information, is also fine. But it defeats the purpose of having the definition in the first place. I am referring to the use of violence against non-combatants, mostly civilians, with the aim of spreading fear. Never have I said anything about being pro whomever, only condemning the act with the intention of spreading fear. Just as there is fear of the US dropping bombs on Afghan houses, there is fear of planes flying into buildings. It is the behavior, not the party who does it.
I'll not argue your point, but I think this does illustrate why the BBC is absolutely right to avoid using the term. Various governments, organisations and people try to define terrorism, but always meet the response "well, didn't the USA do that in Vietnam" or "didn't the British do that in Ireland", and they did. So now it is generally left undefined, or defined by diktat (see "it is a fact of law" mentioned above). In other words it is a yah-boo word; it means "the use of violence that I disapprove of", and it has no place in reporting.
> Never have I said anything about being pro whomever, only condemning the act with the intention of spreading fear.
You've tried to accuse a journalism organization of condoning terrorism just because they explained why, in keeping with the organization's founding principles, they refuse to use loaded words in general.
You've repeatedly missed their point, and unwittingly you're making their point as well.
OK, perhaps I did; maybe I was wrong with the example of the BBC. Other news outlets consistently make use of superlatives, and after browsing other articles on their site I must admit I didn't find any. And in my initial post I didn't say they condoned it; I said it is almost like they condone it. This careful use of words is there to leave the possibility of saving face, and it is the same for journalism. The fact is that most countries, including the UK, had designated Hamas as a terrorist organisation even before these events. These events fit the general definition of terrorism, do they not? I understand they are also more careful now with their words because of the fine balance involved. I only argue that I see it differently and contest their framing.
The spokesman of the ACM just said on national TV that Apple tried to keep it quiet by going to court, and luckily the judge denied the request. Yet another dick move by Apple against open market competition.