It depends on the product. But software has stronger ecosystem effects and data lock-in than any previous technology. Think Facebook, YouTube, Windows, Office, or the Apple ecosystem of products. They share two market characteristics: 1. People need to be on the platform that others are on. 2. Once you are in, it's hard to leave. Both create a tendency toward winner-takes-all economics. And I think many people know this; it's part of the reason software companies have carried higher valuations than other industries since the 90s.
Environmental regulations, like those on air quality and water treatment, are a lot tougher these days. Air and water quality have been improving for the past 7 years, and a lot of people approve of the improvements in environmental quality, though much work remains to be done. I feel people outside don't understand how China's political system works at all. We support democracy; it's just that we think there can be different governance structures to achieve democracy and more effective governance. A lot of people, like environmental scientists, are in positions of power to make changes to regulations and policy. At the same time, we are doing things to mitigate the negatives of climate regulations, such as added hassle and higher costs to businesses, the things climate regulation deniers use as excuses. It's very comprehensive.
Where I'm from used to be communist... I have a good idea how it works... There's a way the country presents itself to its people and to the outside world, and the reality is most often very different.
Yeah, this was pretty much the norm. But over the past 4 years, this kind of "using public money for private events" has become much less common. There are much tighter curbs on "dinners", "using government-owned cars", and "red pocket gifts", and there is a lot of auditing, approval, and documentation required for doing anything. Source: my cousin works in a government department. His co-worker was fired for hosting an expensive dinner for a potential collaboration partner. Before, people wanted a government position for the "nice perks". Nowadays, a government position is no longer a "you can do whatever you want" job; there are actually tons of rules. Example: my brother is not allowed to buy a drink with more than 8% alcohol content if the meal is paid for with government money, and there is a spending cap per meal. He is mid-ranked. And people can report evidence of breaking the anti-corruption rules, so he is very careful with them.
Yep, the introduction written by Jordan Schneider was really confusing and seemed to want to portray Xi's crackdown solely as a way to weed out dissidents. That might have been part of it, especially in high-profile cases, but in general the drive to crack down on corruption and make the officialdom more service-minded is definitely genuine, and people have taken positive note of it. As a result, a lot of former officials have simply resigned and gone into the private sector to make more money.
Nvidia's big market is servers and big training projects that train on tens or hundreds of GB of data or more. In almost all of these environments the data is stored in Hadoop, cloud storage, or big SQL servers. No one is going to hook up a Mac to the network and make it part of the data processing pipeline; the cloud is the natural place where this happens. Until Apple releases a dedicated GPU/neural-network processor on a PCIe card, Nvidia has nothing to fear. Nvidia is more concerned about AMD or other machine-learning processor startups. Apple is going to keep Apple Silicon running only on Apple personal devices, because it's being used as a competitive advantage for Apple devices.
You are describing the current state of affairs. It's funny how people always think nothing will change. I am sure you can find similar statements from about 5 years ago about how Apple could never outperform Intel in their Macs.
It is simple: machine learning is part of the future and here to stay, and Apple will need to be in it as well. What better way to build expertise than to build your own hardware for it, especially given how far they have already come in that domain? It is laughable to think that machine learning dominance is not one of Apple's goals.
The memory on the M1 is nothing special. It's not part of the M1 die; the M1 still has traditional memory controllers. They just put the logic die and the memory dies in a single package for lower memory power. Increasing the package size is simple, or if you really need big memory, you can do the traditional memory connection on a PCB.
As for gaming, I don't think it's a hardware issue; it's software support. Microsoft now owns a lot of game studios for Xbox exclusives, and those will never come to the Mac. Also, big gaming companies are now working with the likes of Microsoft, Amazon, and Google on cloud gaming, which seems like the way forward for making AAA games run everywhere. I doubt these companies will spin up another team to support these games on the Mac platform. It sucks, because the M1 is actually a really good gaming processor.
x86 is wasteful in comparison to these M1 chips, not necessarily in comparison to 1990s chips. It wasn't wasteful back then, but today there are faster and more power efficient chips on the market. The longer Intel keeps making 100W designs that are slower than 10W ARM chips, the more wasteful it becomes.
ARMv8 instructions should be consistent between vendors. GCC and LLVM are very likely to produce standard ARMv8 code that is portable. At work we use Intel machines to cross-compile ARMv8 Linux executables and libraries and run them on Broadcom ARM chips.
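To make that concrete, here's a minimal sketch of that kind of cross-compile, assuming a generic aarch64-linux-gnu toolchain (the package name and the qemu runner are just illustrative; the actual Broadcom board is whatever you deploy to):

    /* hello_arm.c -- plain C with nothing vendor-specific; the compiler
     * emits standard ARMv8 (AArch64) code, so the same binary runs on
     * any compliant ARMv8 Linux target. */
    #include <stdio.h>

    int main(void) {
        printf("hello from a standard ARMv8 binary\n");
        return 0;
    }

    /* Cross-compile on an x86-64 machine (assumes the GNU cross
     * toolchain is installed, e.g. the gcc-aarch64-linux-gnu package):
     *
     *   aarch64-linux-gnu-gcc -O2 hello_arm.c -o hello_arm
     *
     * or with clang, reusing the same GNU sysroot:
     *
     *   clang --target=aarch64-linux-gnu -O2 hello_arm.c -o hello_arm
     *
     * `file hello_arm` should report an ELF 64-bit AArch64 executable,
     * and it runs unchanged on the ARM board (or under qemu-aarch64 for
     * a quick local check).
     */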
Apple's advantage is the virtuous cycle of big demand and a big market for their products. They have now built up the talent and the processes to design a range of chips, including very large ones. The M1 is not a small chip; its complexity is comparable to, if not greater than, Intel's, AMD's, or Nvidia's chips. They have also established a sustainable relationship with TSMC. I am sure TSMC has exclusive or volume contracts making Apple the first adopter of its densest node, and TSMC allocates a lot more wafer capacity to Apple than to other designers. The marginal cost for Apple to design and build a larger version of the M1 is pretty low.
Rosetta 2 probably also has a JIT mode to deal with software that generates x86 instructions at runtime, like browsers and the Java VM, and of course you can AOT-translate static x86 executables. I feel a lot of Rosetta's success comes from the sheer performance of the M1: even the JIT and translation passes are fast, and the execution of not-well-optimized translated code is fast. The Firestorm cores are at least 60% faster than Snapdragon's cores per AnandTech.
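To illustrate why an ahead-of-time pass alone isn't enough: a JIT (a browser's JavaScript engine, the JVM) writes fresh x86 machine code into memory while running, so those bytes don't exist in the on-disk binary for an AOT translator to see. Here is a minimal Linux-style x86-64 sketch of that pattern, just to show the shape of the problem (macOS hardened runtimes need extra steps like MAP_JIT, and this says nothing about how Rosetta itself is implemented):

    /* jit_demo.c -- emit x86-64 machine code at runtime and call it.
     * A static translator scanning the executable never sees these
     * bytes; they only appear once the program is running. */
    #include <stdio.h>
    #include <string.h>
    #include <sys/mman.h>

    int main(void) {
        /* x86-64 encoding of: mov eax, 42 ; ret */
        unsigned char code[] = { 0xB8, 0x2A, 0x00, 0x00, 0x00, 0xC3 };

        /* Writable + executable page, Linux-style. */
        void *buf = mmap(NULL, 4096, PROT_READ | PROT_WRITE | PROT_EXEC,
                         MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
        if (buf == MAP_FAILED) {
            perror("mmap");
            return 1;
        }
        memcpy(buf, code, sizeof code);

        int (*fn)(void) = (int (*)(void))buf;
        printf("runtime-generated code returned %d\n", fn());  /* 42 */

        munmap(buf, 4096);
        return 0;
    }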
I would say Intel didn't just fail on process nodes. It also failed to see the rise of the dedicated fab (TSMC) and the fabless business model as the more efficient model. Intel also failed to innovate on lower-power designs and the GPU. Even at 14nm, there was so much that could have been done to improve iGPU performance, from color compression to a more efficient shader array. The Nintendo Switch runs the Nvidia Tegra X1, a 20nm chip, and look at the performance of that GPU. Intel could also have put its market-share weight behind an alternative to CUDA, be it OpenCL or even its own API, to really gain traction, and it could have built its own dedicated GPU. Yet it just ignored the whole massive SIMD compute market. For Intel, losing mobile to ARM and watching mobile improve year over year should have been a wake-up call. Yet they didn't do anything to widen their moat for the past 10 years. Or maybe they did try, but their strategies were just flawed. The strategists at Intel just didn't have enough foresight and vision.
Feels like execution rather than strategy: they've tried mobile, GPUs, foundry, etc. - the right calls - but just haven't delivered. Apple / AMD / TSMC have executed much better.
I feel it's both. A good strategist would also have made sure that whatever plan and vision you have is actually going to be executed well. A strategy that doesn't end up being executed well is a failed strategy.