glasshead969's comments | Hacker News

Happy new year!


It's been a clear upgrade for me. I'm on T-Mobile; in my apartment I see around 200-300 Mbps, and across the Seattle metro area I hit 200 Mbps consistently.

T-Mobile has better coverage here than AT&T and Verizon because of the mid-band spectrum it got in the Sprint merger. I think the other two are in the process of deploying more mid-band spectrum, and maybe that will improve things in the future.

In Seattle I see T-Mobile's 5G UC (the marketing term for its mid-band spectrum) pretty much everywhere.


Intel's power consumption numbers are the ideal case; the chips routinely go over them. https://www.anandtech.com/show/16680/tiger-lake-h-performanc...


It would have to be ~3.5x faster to match the M1's efficiency, given the TDPs are 30W vs 105W.
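
A quick back-of-the-envelope check of that ratio (treating the 30W and 105W TDP figures above as rough proxies for sustained power, which is an assumption, not a measurement):

    # Perf-per-watt parity: the higher-TDP chip must be faster by the
    # same factor that its power draw is higher.
    m1_tdp_w = 30
    ryzen_tdp_w = 105
    required_speedup = ryzen_tdp_w / m1_tdp_w
    print(f"needs to be {required_speedup:.1f}x faster")  # 3.5x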


I couldn't care less about efficiency; it's still 2x faster and 2-3x less expensive. Isn't the M1 Max TDP between 60 and 90W?


Well yes, but one's a laptop chip with a 17-hour battery life, and the other I've got under my desk heating my entire condo lol


If you could conceivably run a chip at a much higher TDP before hitting thermal limits, you could get significantly more performance. Not that you can (probably) OC these chips at all, but it suggests there may be more rabbits in the bag.


You're comparing a single CPU with a complete machine.


It's 60-90W including the GPU.


If you're plugged in all the time, lower TDP is nice but not critical. And if you live in a cold country, you have to heat up the house six months per year; TDP is just heating with a computing side effect.


> And if you live in a cold country, you have to heat up the house six months per year; TDP is just heating with a computing side effect.

Except it is still direct electrical heating, which is atrociously inefficient.


> Except it is still direct electrical heating, which is atrociously inefficient.

Electric heating converts practically all energy into heat, making it ~100% efficient. You can make statements about cost-effectiveness compared to burning things, but not all houses can burn things.

CHP (combined heat and power) configurations are more common in colder climates with district heating, so the "waste" heat during generation often isn't wasted at all.


> Electric heating converts practically all energy into heat

No, not even close. There are huge losses in electricity production and transmission.


Which is why I covered them in my comment about CHP, which recoups a large portion of those "losses". Either way, other power sources also come with logistical challenges and/or big equipment installs, so it isn't exactly a 1:1 comparison.


But a heat pump is more than 100% efficient.


I feel this is a bit disingenuous, because by the same logic burning wood is thousands of percent efficient, or even ∞% if the system only uses convection, which makes heat pumps seem like a poor choice even when they're perfectly valid.


Until the temperature drops below 4°C.


Which is why you bore a hole deep enough that it isn't a problem.


I'm in Quebec (Eastern Canada). Most of the electricity is produced from hydroelectric power (dams) up north, while the cities are in the southern part of the province. Most houses are electrically heated (especially those built after 1970). Production from water turbines is very efficient. Transmission losses are about 30% because of the distance (>1500 km). Heating itself is 100% efficient: no moving parts, no maintenance. In this context, heating with a baseboard or a CPU makes no difference.


It's 100% efficient, and the share of houses with a heat pump is <1%.


Here in Sweden, roughly 50% of all small houses (i.e., single-household homes) have a heat pump. Direct electric heating is just somewhere around 15%.

In bigger buildings direct electric heating just isn't a thing: most have some sort of central heating, and lots have either a combustion or a heat pump solution. The latter is gaining.


This winter I'm going to experiment with having a Raspberry Pi act as a thermostat that starts/stops containers running on a server in our basement. That, combined with the laptop we just got for gaming that has a 3070 in it, should do nicely to supplement our heating system.
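
A minimal sketch of that setup in Python, assuming the Docker SDK for Python (pip install docker) with DOCKER_HOST pointed at the basement server, and a DS18B20-style 1-wire temperature sensor on the Pi; the sensor path, container name, and setpoints are all hypothetical:

    import time
    import docker  # Docker SDK for Python

    SENSOR = "/sys/bus/w1/devices/28-xxxxxxxxxxxx/w1_slave"  # hypothetical sensor ID
    CONTAINER = "space-heater"  # hypothetical CPU-burning container on the server
    LOW_C, HIGH_C = 19.0, 21.0  # hysteresis band to avoid rapid start/stop cycles

    def read_temp_c() -> float:
        # A DS18B20 ends its output with "t=20625" (millidegrees Celsius).
        with open(SENSOR) as f:
            return int(f.read().rsplit("t=", 1)[1]) / 1000.0

    client = docker.from_env()  # honors DOCKER_HOST, so this can target the server
    while True:
        heater = client.containers.get(CONTAINER)
        temp = read_temp_c()
        if temp < LOW_C and heater.status != "running":
            heater.start()  # too cold: spin up the workload
        elif temp > HIGH_C and heater.status == "running":
            heater.stop()   # warm enough: stop burning watts
        time.sleep(60)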


Watts go in, Watts come out. So basically, find a space heater with the TDP of your server, and that’s the upper bound on what you can expect heating-wise.

I know that’s a bummer for people who want to heat their house with their computer, but Thermodynamics is always a bummer, I don’t make the Laws.


> So basically, find a space heater with the TDP of your server, and that’s the upper bound on what you can expect heating-wise.

I wonder how many CPUs I need to build a cryptocurrency miner water heater... That would be so much better than just wasting energy heating up dumb elements.


That's why I heat my house with a Dell C4140! The _lowest_ power throttle it has is 1.4 kW.


Modern heat pumps provide a lot more than 100% efficiency. I'd stick to the heating system unless those containers are doing something profitable (mining?)


Literally nothing provides 100% efficiency. You're conflating coefficient of performance with efficiency. They're not even close to the same thing: modern heat pumps reach their CoP because they don't actually generate heat; they simply move it around, which provides more heat indoors than if you had converted an equivalent amount of electricity directly into heat.

Thermodynamics would not take kindly to you having a >99.999...% efficient anything.


> because they don't actually generate heat

Heat pumps still use electricity (or other power), and all that electricity also ends up as heat. That's why heat pumps have a higher CoP than the same system run as a refrigeration cycle.

> Thermodynamics would not take kindly to you having a >99.999...% efficient anything

Well, the cogen gas power plants here can produce 50 kWh of electricity from burning 100 kWh of natural gas. I can use that 50 kWh of electricity to put 200 kWh of heat into my house with a heat pump.

Seems like a good deal to me, and I think Carnot would be fine with that.
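
Spelling out that arithmetic (using the 50% plant efficiency and the implied CoP of 4 from the comment; both are assumed figures):

    gas_in_kwh = 100
    electricity_kwh = gas_in_kwh * 0.50   # cogen plant: 50 kWh of electricity
    cop = 4                               # heat pump moves 4 kWh of heat per kWh in
    heat_out_kwh = electricity_kwh * cop  # 200 kWh delivered indoors
    print(heat_out_kwh / gas_in_kwh)      # 2.0 -- "200%" of the gas energy as heat,
                                          # fine thermodynamically, because the pump
                                          # moves ambient heat rather than creating it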


Nobody said "thermodynamic efficiency". You're butting in for no good reason.

Coefficient of performance is a type of efficiency.

> modern heat pumps reach their CoP because they don't actually generate heat; they simply move it around

That's not even true! What a mess of a pedantic correction.


As someone who lives in south-east Queensland, Australia, lower TDPs and thermal output are always welcome, at least for me. I adore my Ryzen 5600X/3060 Ti mini-ITX desktop, but mining on it makes my room annoyingly warm, and it's not even summer yet.

Was semi-useful in winter though, all 6 weeks of it...


> If you're plugged in all the time, lower TDP is nice but not critical.

"If you're plugged in all the time" then try the Mac Pro version when it comes out.

This is about performance in laptop models, where it IS critical.


A hot laptop can have significant impacts on fertility in men. And in the summer you have to dissipate that heat. A heat pump is more efficient for heating, to boot.


It's a temporary infertility, at least.


M1 with all its cores pegged to 100% only hits 20W, so M1 still uses way less power than a single Zen core.

Edit: In fact, I just ran a Cinebench R23 single-thread test and package power barely crossed 5W.


Yup, the Mac mini M1 maxes out at 10.5W on single-threaded workloads:

https://www.anandtech.com/show/16252/mac-mini-apple-m1-teste...


> M1 with all its cores pegged to 100% only hits 20W, so M1 still uses way less power than a single Zen core.

So do Ryzen CPUs. The 16-core Ryzen only needs 5W per core. It's purely a matter of frequency management.

These comparisons are meaningless because you can say stupid things like "Ryzens are more power efficient than Ryzens" and still be right.


At 5W per core, you don't get anywhere near the single-threaded perf needed to beat an M1...

(20W per core is what it reaches on ST tests, with 17W more for the I/O die on desktop Ryzen, an overhead that'll be much lower on Cezanne hopefully)
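
A toy model of why per-core power collapses at lower clocks (first-order rule of thumb: dynamic power scales roughly with V^2 * f, and voltage roughly tracks frequency in the DVFS range; the numbers are illustrative, not measurements):

    def relative_power(f_ratio: float) -> float:
        # Dynamic power ~ C * V^2 * f; assume V scales ~linearly with f,
        # so power falls roughly with the cube of the clock ratio.
        v_ratio = f_ratio
        return v_ratio**2 * f_ratio

    boost_power_w = 20.0                         # ~what a Zen core draws on ST boost
    print(boost_power_w * relative_power(0.63))  # ~5W at 63% of the boost clock,
                                                 # with correspondingly lower ST perf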


Regarding the 5nm vs 7nm thing, I expect Apple to be at least one node or intra-node improvement (e.g., 5nm vs 5nm+ or 4nm) ahead for the foreseeable future. They pay TSMC a pretty penny for access to cutting-edge nodes. It's not much different from how Intel had a node advantage over everyone else, although here Apple is paying for it, so at least others aren't locked out of it for too long.


They'll presumably be the first to use each node, but that only gives them a number of months before competitors are on it. I would expect to see 5nm Ryzen before 3nm Apple Silicon.

There's also a major question about what happens with Intel. If they ever get their process advantage back then it's not clear what Apple's response is. But if they implode then AMD takes over the PC market and probably becomes TSMC's biggest customer, which could put them in a position to get on newer nodes at the same time as Apple.


I think it's interesting that Apple is continuing what they started with Intel. Remember Apple's deal with Intel, where Intel would hold back its newest CPUs from the rest of the market so Apple could have its big reveal and "world's fastest blah blah blah" blurb to push its Mac refreshes? But Apple never really upgraded its processor offerings, and you couldn't upgrade the processor in the machine you bought, so after six months or so they would be lagging in performance versus what you could build or buy elsewhere.

Well, I guess once AMD whooped Intel, that was no longer going to be an option. Not only would Intel not be able to deliver the "world's fastest blah blah" marketing claim, but Intel couldn't afford to hold anything back from the general market. I guess their little deal with TSMC lets them clear the cobwebs off this strategy and continue it for a bit longer going forward.


So now each year we can't do proper benchmarks anymore, because there will always be the (X)nm vs (X-1)nm argument.

What's wrong with using price-power-performance ratio benchmarks on what's available on the market?
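
For illustration, a score like that might be computed as points per dollar per watt; all numbers below are placeholders, not real benchmark results:

    def value_score(perf_points: float, price_usd: float, power_w: float) -> float:
        # Toy composite metric: performance per dollar per watt.
        return perf_points / (price_usd * power_w)

    chips = {
        "Chip A (5nm)": (1600, 700, 30),   # (perf, price, watts) -- hypothetical
        "Chip B (7nm)": (1500, 550, 105),
    }
    for name, (perf, price, power) in chips.items():
        print(name, value_score(perf, price, power))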


Yeah, I think this argument about node size comes up because in this case you can't buy just the SoC, only the whole Mac. People didn't care that some of the advantage Intel had came from its process superiority, because you didn't have to make any choice other than the CPU itself.


AMD would rather spend money on getting more capacity for its current 7nm products, which have seen a couple of years of severe shortages since launch, and gain market share. It's a problem Apple doesn't seem to have, though maybe with desktop/server chips they might, if they also start to gain market share.


Full nodes don't come along that often, and from what we've seen of Intel's 14nm++++++, revisions to a node don't make that much of a difference.


Yeah, it's hard to compare. This makes sense as "the most powerful CPU one can get, no matter the power usage," but there isn't anything close to the M1 when it comes to combining high performance and long battery life. Even Ryzen 4000 series laptops throttle a lot on battery power, while the M1's performance doesn't change. I don't expect this to change with Zen 3 laptops either.


I have the device with me, and at full load on both CPU and GPU it can go up to 25W, but for most use cases I see the whole SoC hovering around 15W.


iOS has many ad blockers that work as well as uBlock; 1Blocker, for example. On top of that, these blockers don't need to read the contents of the site to block ads.


uBlock Origin can filter HTML elements, including script tags, and it can replace scripts when blocking them breaks pages. Safari content blockers can't do either, so they can't work as well.


It's really not a big deal.


Thanks for this.

I am not the person who wrote the message, but I am in a similar situation. Maybe it's the quarantine and being alone at home for weeks now, but I have been thinking about my choices and mistakes.

I hope it gets better eventually.


It will get better eventually. Take care, and feel free to email me.

