That's the only idea that was actually tested successfully. Accidentally, of course. There was a brief pause in global warming right before we banned sulfur from our fuels.
It's not that there was a brief pause prior; it's that after the ban temps started rising way more than expected. The fuels were "hiding" the extent of global warming.
Between the 40s and the 80s, global temperatures pretty much oscillated around the same value, as our sulfur emissions roughly balanced our carbon emissions. But due to local damage from acid rain we reduced sulfur emissions (or at least reduced their growth), while carbon emissions rose happily unrestricted.
Given how critical this is to national security interests, perhaps the US government should have brought the stick of heavy taxes and penalties for not having a US-based plant instead of the carrot of more corporate subsidies.
This is still a US-based company, with US leadership.
The US government could pass a new rule requiring that all chips in all military hardware (or even all hardware used by any government agency), down to the smallest IoT shit, be made in the US by a US firm.
I doubt Intel would need any money or other incentive if the orders start coming in.
This is also why nobody ever wants to cut the DoD budget; it's basically the backbone of American manufacturing for everything. All the stuff our company makes for the military needs to be US-made, despite us being able to get the same stuff from China for 1/10 the cost.
It's also a kick-your-door-down-and-go-to-prison offense if you try to skirt this. My company knows that first hand (unintentional, but the DoD doesn't care).
The military may not be a large enough customer to make such demands. Or perhaps a handful of companies would do it at 100x the cost of similar civilian products. It would turn into another $10k toilet seat fiasco.
The supply chain for military hardware is already under heavy scrutiny. I don’t think the CHIPS Act is about security; it’s about onshoring production so the US can avoid caring about Taiwan.
Purchase orders like this happen years out. Before the deliveries dry up, Intel's cash flow will dip and investors will riot until Intel is ready to do business in the US.
TSMC is looking to bail completely on the second fab, and even Intel is now looking at delaying. The problem we're running into is that a lot of DEI language was included in the CHIPS Act, which is making it hard for TSMC (and Intel) to comply. It's not a good PR move to say this outright, so they'll just give more generic "labor shortages" etc. as the official reason.
The Japanese plant is less ambitious, using an older process on a smaller scale. The fact the Japanese plant was finished first says little about manufacturing in the US vs Japan.
I think you meant “aren’t manufacturing” and you are correct. It’s not cheap labor, except for the lowest end manufacturing. It’s that you can’t build anything in America anymore.
The government needs to address that instead of subsidizing one off efforts.
> it's selling out Americans in favor of cheap labor.
Perhaps American workers. But an alternate take is that American consumers got cheaper chips and PCs - which resulted in more being sold, more companies and people using them, greater efficiency and new ideas and perhaps a more productive economy.
It will be interesting to see whether, in 5-10 years, the US needs to tax or block the sale of cheaper and possibly better components built outside the US.
Intel, like any other successful company, prices its goods based on what the market will pay, not what it costs to make them. Lower costs do not mean lower prices for customers; they mean more profits for Intel.
> Intel, like any other successful company, prices its goods based on what the market will pay, not what it costs to make them
That may be true for Intel's CPUs, but when products aren't produced by monopolies, the producers with the cheapest costs, competing against others, often lower their prices to compete where the competitor can't. For example: AMD, graphics cards, and motherboards.
Well, I take it that it's always true, and it's a good lesson for anyone trying to set the price for their own product (like a hacked-together side project one of us might be working on :) ).
My understanding is that the price should never be set based on production cost but on the value it brings to the customer. Thus, a simple program you or I could hack together in a month might have cost only $10k in labor, but if it saved the customer millions each year, they would pay a million for it.
So, you set the price based on how much they are willing to pay, which depends on how much they would profit or save.
If another competitor offers a similar solution, you just undercut them by 10%, or invest another $10k in superior features.
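A minimal sketch of that pricing logic in Python (the capture rate, the numbers, and the function names are all made up for illustration, not a real pricing model):

    # Value-based pricing: anchor the price to the customer's value,
    # not to what the product cost to build. All figures are illustrative.

    def value_based_price(annual_savings, capture_rate=0.3):
        """Charge a fraction of what the solution saves the customer per year."""
        return annual_savings * capture_rate

    def competitive_price(our_price, competitor_price, undercut=0.10):
        """If a competitor shows up, undercut their price by a fixed margin."""
        return min(our_price, competitor_price * (1 - undercut))

    labor_cost = 10_000            # what it cost us to build (irrelevant to the price)
    customer_savings = 1_000_000   # what it saves the customer each year

    price = value_based_price(customer_savings)   # 300,000, regardless of the 10k cost
    price = competitive_price(price, 280_000)     # competitor at 280k -> we charge 252k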
Furthermore, up until at least a few years ago (I don't know how it is now), Intel chips were more expensive than AMD's, especially on the server side, even accounting for performance per watt, and Intel adopted practices such as locking motherboards to just one chipset. The situation might be different now, though.
I believe a $77B market is a fairly good reason for investors and other companies to set up shop and invest in the US semiconductor market. According to a random market-statistics site, it is also currently growing by 10% each year.
Is that market so poor that subsidies are really needed on the basis of profitability?
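For a sense of scale, 10% annual growth compounds quickly; a quick sketch (both figures taken from the comment above, not independently verified):

    # Compound growth of a ~$77B market at 10%/year
    market = 77.0  # $B
    for year in range(1, 8):
        market *= 1.10
        print(year, round(market, 1))
    # after 7 years: ~150 $B, i.e. the market roughly doubles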
A $77B market that's growing by 10% each year doesn't mean much when the government turns against you and demands you make a bunch of expensive investments for nationalist reasons. See also: all the investors fleeing China because they were spooked by the tech crackdown and covid lockdowns. "Punishing a company because it's selling out Americans in favor of cheap labor" is basically the same thing with a different coat of paint.
Intel is a product of Shockley, which was a product of Bell Labs, which was a product of “punishing” AT&T for “being American”.
AT&T would have zealously guarded the transistor patent and never licensed it out to Motorola, TI, etc. if it only cared about maximizing shareholder value.
We're always bending over backwards for a handful of companies instead of promoting a healthy market. We should have created a race with multiple companies to build the best chip factories.
> We should have created a race with multiple companies to build the best chip factories.
That's insanely expensive. Today the only companies that make leading-edge chips are TSMC, Samsung, and Intel. GlobalFoundries (headquartered in the US) dropped out a few years ago because it was too expensive. How on earth are you going to fund "a race with multiple companies to build the best chip factories"? It's going to cost multiples of whatever the Intel subsidy is.
Multiples of $8 billion? Oh no, where would we find the money? If this is truly a national security issue, perhaps some of the DoD’s ~$150 billion R&D budget could go towards it.
Accelerated consolidation was policy for decades. The Clinton administration's Department of Defense pushed it really hard, especially for defense contractors. To reduce costs, make our national champions more competitive internationally, make our hair more luxuriant, and cure rickets. Because reasons.
Now the pendulum might finally be swinging back towards pro-competition and healthy open markets.
Wouldn't it be easier for the US government to buy Intel shares rather than subsidize it? It should actually be mandatory for the government to own part of a company that is so critical to national security interests.
It doesn't matter who wrote it; it got picked up, made a good argument, and affected market opinion. The execs now need to respond to it.
Humans also don't grasp that things can improve exponentially until they stop improving exponentially. This belief that AGI is just over the hill is sugar-water for extracting more hours from developers.
The nuclear bomb was also supposed to change everything. But in the end nothing changed, we just got more of the same.
> The nuclear bomb was also supposed to change everything. But in the end nothing changed, we just got more of the same.
It is hard for me to imagine a statement more out of touch with history than this. All geopolitical history from WWII forward is profoundly affected by the development of the bomb.
I don't even know where to begin to argue against this. Off the top of my head:
1. What would have happened between Japan and the US in WWII without Hiroshima and Nagasaki?
2. Would the USSR have fallen without the financial drain of the nuclear arms race?
3. Would Israel still exist if it didn't have nuclear weapons?
4. If neither the US nor Russia had nuclear weapons, how many proxy wars would have been avoided in favor of direct conflict?
The whole trajectory of history would be different if we'd never split the atom.
The whole trajectory of history would have been different if a butterfly hadn't flapped its wings.
The bomb had effects, but it didn't change anything. We still go to war, eat, sleep and get afraid about things we can't control.
For a moment, stop thinking about whether bombs, AI, or the printing press do or do not affect history. Ask yourself what the motivations are for thinking that they do.
"nuclear weapons are no big deal actually" is just a wild place to get as a result of arguing against AI risk. Although I guess Eliezer Yudkowsky would agree! (On grounds that nukes won't kill literally everyone while AI will, but still.)
Nuclear weapons are uniquely good. Turns out you have to put guns to the collective temples of humanity for them to realize that pulling the trigger is a bad idea.
It's too early to say definitively, but it's possible that the atomic bomb dramatically reduced the number of people killed in war by making great-power conflicts too damaging to undertake.
I'd actually guess those casualties would be far lower than in WW2. As tech advanced, more sophisticated targeting systems advanced with it. There's no need to waste shells and missiles on civilian buildings, plus food and healthcare tech would continue to advance.
Meanwhile, a single nuclear bomb hitting a major city could cause more casualties than all American deaths in WW2 (~400k).
That's really only true for the Americans; the Russians still don't seem to care about limiting collateral damage, and undoubtedly the Americans wouldn't either if their cities were getting carpet-bombed by Soviet aircraft.
A single software engineer writing an influential paper is often how an exec or product leader draws conclusions, I expect. It worked that way everywhere I've worked.
The NIH and the CDC have objectively failed, on pure data. Autism rates were 1 in 2,500 in 1980; now they're 1 in 44. Peanut allergies have tripled. Autoimmune diseases are up 50-100%. Obesity rates have tripled in the past 50 years.
At this point, it's either complete incompetence by health regulators, or malice. There are clearly environmental factors at play!
While climate change has always existed, there are now at least two important differences that will make this warming much more dangerous for terrestrial plants and non-human animals.
One difference is that the rate of warming is very high now, probably much higher than ever before. At least where I live, in Europe, the climate has changed dramatically in less than one human lifetime, and it is now extremely different from how it was when I was young.
The second difference is that all larger wild animals and wild plants will no longer be able to react to climate as they did before, by migrating south or north depending on how the climate evolved.
Now the terrestrial part of the Earth is mostly occupied by humans, crops, and domestic animals, while the remaining wild plants and animals live mainly in "islands" scattered over the land. This makes a gradual retreat of wildlife towards a more appropriate climate impossible, and animals cannot make plans like humans can, e.g. decide to travel 100 km through inhospitable land because a suitable biotope lies at the end.
Of course, among terrestrial wildlife, the best chances of surviving a climate change belong to those that can cover long distances through the air, over the man-made obstacles: plants with airborne seeds, birds, bats, insects, spiders, and other very small living beings that can be carried by the wind, or by birds or insects.
In any case, it is pretty certain that this warming will be much more destructive for the wildlife than any other before.
whose conclusions are that, by mass, humans alone are many times greater than all wild terrestrial vertebrates together, while domesticated animals obviously exceed humans a few times over, both in number and in mass.
Of course, the number and mass of animals are only partially correlated with the area they occupy, because the larger wild animals are, the lower their surface density is; except for animals as small as rodents or small birds, their surface density is much lower than that of humans in cities.
There are a few countries with relatively low population density, e.g. most of North America, Russia, Australia, and the northern European countries (Finland, Sweden, Norway).
I am not familiar with the land status in these countries, but they are the only places on Earth where there is a chance for areas occupied by wildlife to form a connected mesh allowing slow migration; even there, the mesh must be cut by many roads and fences that may discourage animals from migrating away from the places they have inhabited since birth.
Other low-human-density areas on Earth, i.e. the tropical forests and the deserts, do not count, as they will not be destinations for animals or plants seeking lower temperatures. Of the remaining areas with low human density, only in the high mountains would migration be possible, with the animals and plants adapted to high altitudes and lower temperatures dying out and being replaced by animals and plants coming from lower altitudes. Except that in many countries most of the wildlife is already in the mountains, so there may be very few wild animals and plants left at lower altitudes, ready to replace the former inhabitants of the high altitudes, which will never have any way out.
On the other hand I am more familiar with the status in Western and Eastern Europe and in some parts of Asia, where I have traveled frequently through several countries.
Here, it is enough to fly a plane over several European countries on a sunny day, and you will see only agricultural land, villages, and cities, with only isolated and scattered remains of forests, lakes, or uncultivated land, or isolated parts of mountains that remain wild.
There is already a great difference between how some places were when I traveled to them in my youth and how they are now. Several decades ago there were relatively large connected wild areas in some mountains, but since then roads have fragmented the mountains, owners have built fences around land parcels, some forests have been cut, and so on, so where there was one larger wild area there are now many disconnected smaller ones.
That doesn't mean conditions will be to our liking under those regimes. We built cities along coasts assuming a certain mean sea level. We built farms assuming certain weather patterns.
If we cause large changes to these patterns in a relatively short period of time, that's bad. The long-term trends you are talking about happened very slowly: slowly enough that, in theory, a city could move back from the shoreline without anyone noticing it was happening; slowly enough that species can migrate or adapt; slow enough that natural selection can produce individuals more heat- or cold-adapted.
Climate change isn't going to cause humans to go extinct or anything like that. But it is going to have large and sometimes unpredictable effects, most of which are not useful or helpful to us. Many of which are actively detrimental.
Since the oil, coal, and natural gas will eventually run out anyway, and the world depends on them so much that they give leverage to people who really, really want to hurt us, we might as well just deal with it and move away from burning things as much as possible.
> There were millions of years without ice caps even
In case anyone was curious, this is a significant understatement. It is currently believed that the Earth had no ice caps for its first 2 billion years, before the Huronian glaciation, and then again no ice caps for another ~1.5 billion years.
Why bother? There weren't humans millions of years ago to record climate data or grape-picking dates. Writing is 5,500 years old. Climate change could have ended the Roman Empire. In broad terms we already know what will happen, because it has happened before.
At the end of a glacial period the temperature changes rapidly. For example, at the end of the Younger Dryas some 12,000 years ago, it took only 100 years for the land ice to retreat from the Eifel range in Germany (at 50°N) to above Oslo in Norway (at 62°N), a distance of 1,200 km, i.e. an average retreat of about 12 km per year.
Gifted kids all differ in their specific strengths. That said, for one with a well-expressed nerd gene, we started at age 6. I would start with an iPad app (a slightly simpler container than a browser), maybe one that gamifies the whole thing a bit with some easy on-ramps. For us that was Tynker and a bit of Hopscotch. It's pretty easy to tell whether you've got any sparks of interest on your hands; otherwise there's no point in pushing it.
HCQ was an interesting thought in early March 2020 (my brother, an ICU doc, was considering getting some for our parents). But data very quickly showed it was essentially worthless, and certainly nowhere near as useful as other cheap, off-patent medications such as steroids, which the UK's medical-industrial complex rapidly demonstrated to be highly effective at cutting mortality.
People who acted in bad faith on data that was demonstrably bad were rightly defenestrated by the medical and scientific community. However, many of those same people found themselves right at home in a new cheer squad of people who are more than happy to invest in beliefs over data. The mop-up of the shitty evidence is drawing to a conclusion, with events such as the recent NEJM publication of the ivermectin study. But stupid knows no bounds, and the scientific method is going to be mopping up the damage from this one for a long time.
People suggesting ivermectin might be interesting were not wrong. It could plausibly have had some effect, based on the biochemistry.
It just turned out not to work. I don't know of any evidence that trying it, when we had literally nothing else, did anybody any harm. All the harm came from people using it instead of, later, doing the things that did turn out to actually work. Probably a few people even cleared up a chronic worm infection they didn't know about.
What I did not see, and expected to see, was a study of relative infection rates in people already on ivermectin for an on-label use vs. people who were not. Such studies came out pretty early vis-à-vis the chloroquines, showing that people who had been taking them got COVID-19 just like anybody else. Maybe periodic ivermectin dosing wasn't a thing...
No one was going after researchers running proper studies with informed consent on, e.g., ivermectin's impact on Covid. However, doctors randomly prescribing stuff because they believe in it is quite different from a medical study; we have good historical reasons why we regulate medical experiments on humans, and why we don't allow the medical and pharmaceutical industries to "just try it" unless certain conditions are met.
Which doctors specifically are you referring to? There really wasn't any scientific controversy about the effectiveness of these drugs, it was mostly social media controversy by non-scientists.