Hacker News | muyuu's comments

latency, control, reliability of bandwidth and the associated costs - but this isn't just the pull towards specialised hardware, it's the pull towards local computing in general; specialised hardware is just the most extreme form of it

there are tasks that inherently benefit from being centralised away from the user, like coordination of peers across a large area - and there are tasks that strongly benefit from being as close to the user as possible, like low-latency tasks and privacy/control-centred tasks

simultaneously, there's an overlapping pull to either side caused by the monetary interests of corporations vs users - corporations want as much as possible under their control, especially when it's monetisable information (and at volume, most things are), while users want to be the sole controllers of the products they use, especially when they pay for them

dumb terminals were already being pushed in the 1960s, and the industry has cycled through the "cloud", "edge computing" and every other form of consolidation vs segregation since - it's not going to stop, because there's money to be made from the inherent advantages of each model, and even the industry leaders cannot prevent those advantages from being exploited by specialists

once leaders consolidate, they inevitably seek to maximise profit, and in doing so they lower the barrier for new alternatives

ultimately I think the market will never stop demanding just having your own *** computer under your control, and hopefully owning it outright - only the removal of that option will stop the demand; meanwhile businesses will never stop trying to control your computing, providing real advantages in exchange for it, only to enter cycles of pushing for ever-growing profitability, to the point that average users keep going back and forth


There's the selling point of using it as a relatively untrustworthy agent that has access to all the resources of a particular computer, plus limited access to online tools of its own. Essentially like Claude Code or OpenCode but with its own computer, which means it doesn't constantly hit roadblocks when attempting to use legacy interfaces meant for humans. Which is... most interfaces, of course.

absolutely - the cognitive friction of having to decide what to pay for compounds massively when there is uncertainty about the purchase, and the negative experience when users feel they have essentially been scammed, because the product is not what was expected, far outweighs the positive experiences, which are perceived at best as just the expected transaction

I guess it's an age thing, but I thought this was really basic CS knowledge. Then again, I can see why it may be much less relevant nowadays.


thanks for that, it's brilliant and very true

I've been in IT for decades but never knew that ctrl was (as easy as) masking some bits.

You can go back maybe two decades without this being very relevant, but not three, given the low-level scope that was expected in CS and EE back then.

I learned about it from 6502 machine language programming, from an example that did a simple bit manipulation to switch lower case to upper case. From that it became obvious that ASCII is divided into four banks of 32.
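
A minimal C sketch of both tricks, assuming plain 7-bit ASCII (the function names are just illustrative; the constants fall straight out of the table, where each bank is 0x20 = 32 characters wide):

    #include <stdio.h>

    /* ASCII is four banks of 32: 0x00-0x1F control codes, 0x20-0x3F
       punctuation and digits, 0x40-0x5F upper case, 0x60-0x7F lower case. */

    /* Toggling bit 5 (0x20) moves a letter between the upper- and
       lower-case banks - the 6502-era case-switch trick. */
    char toggle_case(char c) { return c ^ 0x20; }

    /* Masking down to the low 5 bits drops a character into the
       control bank - which is essentially what Ctrl does. */
    char ctrl(char c) { return c & 0x1F; }

    int main(void) {
        printf("'a' ^ 0x20 -> '%c'\n", toggle_case('a'));  /* 'A'        */
        printf("Ctrl-A     -> 0x%02X\n", ctrl('A'));       /* 0x01 (SOH) */
        printf("Ctrl-[     -> 0x%02X\n", ctrl('['));       /* 0x1B (ESC) */
        return 0;
    }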

Been an ASCII-naut since the '80s, so... it's always amusing to see people type 'man ascii' for the first time, gaze upon its beauty, and wonder at its relevance, even still today...

smalltalk missed the opportunity to incorporate more sophisticated versioning, including distributed versioning with current SotA ideas

of course modern smalltalks or st-inspired systems could still incorporate these ideas


Perhaps decades ago there was "more sophisticated versioning" for Smalltalk implementations:

2001 "Mastering ENVY/Developer".

https://www.google.com/books/edition/Mastering_ENVY_Develope...


ENVY suffered from a problem that many other Smalltalk technologies suffered from: a conflict between a culture of proprietary zeal as a business model and the powerful network effects of adoption. VisualAge in general was plagued by this. I used to blame Microsoft's and Apple's successes for the pervasive push for lock-in and "integration" as a feature, which defined the era so strongly.

On the one hand you had a technology that desperately needed adoption to build a culture and best-practices documentation, and on the other you had a short-term profit motive seriously getting in the way - so what had been completely cutting edge for decades eventually wasn't anymore, or the world moved in another direction and your once-revolutionary technology became an ill fit for it.

By the 2000s, with monotone and darcs but especially with the rise of git, other standards for versioning had superseded what could have been. By the 2010s Smalltalkers should have been wise enough to incorporate what is now clearly the standard, but instead a bunch of invented-here systems for versioning and repositories, and hybrids of the two, were developed in its place. And by incorporate I don't mean "let's make X for ST" but making it core to the implementation, so that the system itself is more easily understood and used - even if that means pieces of it can be taken away and used elsewhere, which is actually a strength and not a weakness, contrary to a certain brand of 90s-era beliefs.

Generally speaking, to this very day it's regarded as cool, and as a feature, in the ST world that something is ST-only: conveniently "integrated" into the system as tightly as possible and, implicitly but insidiously and glaringly, near-impossible to use elsewhere except maybe as a concept, laundered of its origin.


I've been threatened by the governments of Pakistan and Germany for stuff I've said pseudonymously on the Internet. As much as they may think everybody needs to care about their laws, I happen not to.

It's just a reality that law is harder to enforce when you cannot target a given server and take out an entire service. Regardless of what you think of the law.

This is why to this day torrenting of copyrighted material is alive and well.


wonderful stuff

I have a bunch of the old ones from my late father, and I've sunk thousands of hours into old computer magazines; there's something special about them that the new world cannot capture anymore.


It was the accessibility. You were learning computing concepts from scratch, and if you were actively engaged they would increase in complexity in real time as your learning caught up.

also the importance of, and the degree of care put into, the things that were published - and what all those constraints meant for computing itself

there were strong positives to that, and they just cannot be replicated in a society of hyper-abundance and slop


they just removed the 50% discount today

i can imagine they're a bit tight on compute, trying to keep those prices viable while also having enough left over to train their new models


I don't think it is about viability. The Christmas discount was supposed to expire Jan 15, then it was extended to Jan 31.

The deal was too good to miss out on.


Perhaps some power user of Claude Code can enlighten me here, but why not just use OpenCode? I admit I've only briefly tried Claude Code, so perhaps there are unique features stopping the switch, or some other form of lock-in.

Anthropic is actively blocking calls from anything but Claude Code for its Claude plans. At this point you either need to take part in the cat-and-mouse game of making that plan work with OpenCode, or you need to pay the much more expensive API prices.

i see

i guess they were blocking OpenCode for a reason

this will put the people who mainly use Anthropic to the test, making them take a second look at the results from other models

