
So with support for OCI container images, does this mean I can run docker images as LXCs natively in proxmox? I guess it's an entirely manual process, no mature orchestration like portainer or even docker-compose, no easy upgrades, manually setting up bind mounts, etc. It would be a nice first step.
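If/when that lands, my rough guess at the manual flow (untested sketch; the skopeo part is real syntax, but the pct side is an assumption based on how template-based CTs work today):

    # pull a Docker image as an OCI archive into the template cache
    skopeo copy docker://docker.io/library/nginx:latest \
        oci-archive:/var/lib/vz/template/cache/nginx.tar:latest

    # create and start an LXC container from it -- assuming pct accepts the
    # OCI archive like a normal CT template
    pct create 200 local:vztmpl/nginx.tar --hostname nginx --memory 512 \
        --net0 name=eth0,bridge=vmbr0,ip=dhcp
    pct start 200

Bind mounts, upgrades, restart policies, etc. would still all be on you, which is the manual part.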


Also hoping that this work continues and tooling is made available. I suppose eventually someone could even make a wrapper around it that implements Docker's remote API.


There's a video showing the process on their YouTube channel:

https://youtu.be/4-u4x9L6k1s?t=21

>no mature orchestration

Seems to borrow the LXC tooling, which has a decent command-line tool at least. You could in theory automate against that.

Presumably it'll mature.


It's just another declarative adblocker, as that is all Safari (and now Chrome) allows. There's vanishingly little room for differentiation in this space.


That info is outdated. Safari also allows extensions that run JS on pages, i.e. extensions working as script injectors. The difference from content blockers is that those extensions must first be explicitly granted access to the sites being browsed, for privacy reasons.


Chrome can do that too on desktop, and on iOS Chrome can't run any extensions at all. Safari web extensions have been around since iOS 15, so several years now.


MCP is convenient, and the context pollution issue is easily solved by running MCP servers in subagents. The real miss here was not doing that from the start.
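In Claude Code, for example, that can be as simple as a project subagent that owns the MCP work (minimal sketch from memory; double-check the frontmatter fields against the docs):

    # define a subagent so MCP tool schemas and outputs stay out of the main context
    mkdir -p .claude/agents
    cat > .claude/agents/mcp-worker.md <<'EOF'
    ---
    name: mcp-worker
    description: Handles anything that needs the MCP tools; returns only a short summary.
    ---
    Use the available MCP tools to complete the task, then report back just the result.
    EOF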

Well, stdio security issues when not sandboxed are another huge miss, although that's a bit of a derail.


Google approached this the right way. No, not with "AI Mode", that sucks. With the Chrome DevTools MCP: you allow AI to control the browser only if the user opts in and sets it up.
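Setup is about as minimal as MCP gets, something like this (package name from memory, so double-check it):

    # register Google's Chrome DevTools MCP server with Claude Code (stdio transport)
    claude mcp add chrome-devtools -- npx chrome-devtools-mcp@latest

The opt-in part is that nothing happens until you install and wire this up yourself.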


Exactly right; the OCR isn't the interesting part. 10x context compression is potentially huge. (With caveats: at only ~97% accuracy, it's not appropriate for everything.)


Gemini 2.5 Pro is generally not competitive with GPT-5-medium or Sonnet 4.5.

But never fear, Gemini 3.0 is rumored to be coming out Tuesday.


The tweets from random people that I've seen said Oct 9th, which is Thursday. I suppose we'll know when we know.


Based on what? LLM benchmarks are all bullshit, so this is based on... your gut?

Gemini outputs what I want with about the same regularity as the other bots.

I'm so tired of the religious thinking around these models. Show me a measurement.


> LLM benchmarks are all bullshit

> show me a measurement

Your comment encapsulates why we have religious thinking around models.


Please tell me this comment is a joke.


I wish Claude Code supported the new memory tool. The difference is that CLAUDE.md is always in your active context, while the new memory stuff is essentially local RAG.


I would love to replace Google/Apple News, but publishing once daily doesn't work for me.


Why not? Is it really that important for you to know of events a few hours earlier?


Nextcloud News works just fine, is free, is as biased as the feeds you configure and no more, does not (yet...) introduce/intrude LLM slop, is free software (beer/freedom), and has been around for a long, long time. You can configure it any way you want; the default update interval is 5 minutes, which should be enough for even the most FOMO-affected 'news' junkie. Of course the actual updates depend on the RSS sources, but if you configure a number of active feeds you'll get updates every few minutes.

https://github.com/nextcloud/news


This is why I don't run stdio MCP servers. All my MCPs run in Docker containers on a separate VM host on an untrusted VLAN, and I connect to them via SSE.

Still vulnerable to prompt injection of course, but I don't connect LLMs to my main browser profile, email, or cloud accounts either. Nothing sensitive.
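Roughly like this, with Claude Code as the example client (image name, IP, and port are placeholders for whatever server you actually run):

    # on the untrusted VM host: run the MCP server in a container, exposing its SSE endpoint
    docker run -d --name example-mcp -p 8808:8808 example/mcp-server-sse

    # on the workstation: register it with the client over the network instead of stdio
    claude mcp add --transport sse example-mcp http://10.0.50.10:8808/sse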


If you used this package, you would still have been a victim of this despite your setup. Every password reset, or anything else sent by your app, would have been BCC'd to the bad guy.


So they're essentially charging $5/month for unlimited tab completions, when you get 2k for free. That seems reasonable; many could just not pay anything at all.

But on the paid plan they charge 10% over API prices for metered usage... and also support bring-your-own-API. Why would anyone pay the +10%, just to be nice?

This is the same problem Cursor and Windsurf are facing. How the heck do you make money when you're competing with Cline/Roo Code for users who are by definition technically sophisticated? What can you offer that's so wonderful that they can't?

