Hacker News | tarasglek's comments

I love that age lets one reuse SSH identities, and thus existing identity-sharing systems. The single most useful thing I ever wrote was a tool to sync GitHub identities with age: https://github.com/tarasglek/github-to-sops

This way you get git for change tracking on your secrets, for who-has-access-to-secrets, and for key rotation, and this can be trivially extended to other forges.

It's easy to introduce age this way into any modern project, whereas gpg would've been a non-starter on most teams I've worked on.

disclaimer: this was mostly vibe-coded because I really did not want to work on this and wasn't sure if teammates would adopt it. Then it just worked, so it stayed ugly inside.


My $100 Mobiscribe lets me read HN. I only read online on my e-ink devices; LCD is much less enjoyable.


To make it open source in the fullest sense, one needs to document what you've done. This ESP repo could use some details on what protocols the hardware speaks, sequence diagrams, auth, etc. I doubt you're running WebRTC on the ESP.


WebRTC is running on the ESP32! The library is libpeer [0]

[0] https://github.com/sepfy/libpeer


I am still confused about what their software stack is. They don't use Ceph but bought NetApp, so do they use NFS?


The NetApps are just disk shelves; you can plug them into a SAS controller and use whatever software stack you please.


But they have multiple head nodes, so is it some distributed setup, or just an active/passive type thing?


We have a custom barebones solution that uses a hashring to route the files!
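For readers unfamiliar with the approach: a hashring (consistent hashing) deterministically maps each file key to a node, so every client agrees on placement without a central directory. A minimal sketch in Python, with made-up node names and virtual-node count (the actual implementation is not public):

```python
import bisect
import hashlib

class HashRing:
    """Minimal consistent-hash ring mapping file keys to head nodes."""

    def __init__(self, nodes, vnodes=64):
        # Each physical node gets `vnodes` points on the ring so load
        # spreads evenly and rebalancing on node changes stays small.
        self._ring = []
        for node in nodes:
            for i in range(vnodes):
                self._ring.append((self._hash(f"{node}#{i}"), node))
        self._ring.sort()
        self._keys = [h for h, _ in self._ring]

    @staticmethod
    def _hash(s):
        # First 8 bytes of SHA-256 as an integer position on the ring.
        return int.from_bytes(hashlib.sha256(s.encode()).digest()[:8], "big")

    def node_for(self, key):
        # Walk clockwise to the first point at or after the key's hash,
        # wrapping around to the start of the ring if needed.
        i = bisect.bisect(self._keys, self._hash(key)) % len(self._ring)
        return self._ring[i][1]

ring = HashRing(["rack1", "rack2", "rack3"])
print(ring.node_for("some/file/path"))
```

The nice property: adding or removing a node only remaps roughly 1/N of the keys, instead of reshuffling everything the way `hash(key) % N` would.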


I'm guessing the client software (outside the DC) is responsible for enumerating all the nodes, which each get their own IP.


I think each rack is one head node and several disk shelves (10?). No dual headed shelves.


It would be killer to be able to integrate this with 3D scans. E.g. "make me a mount that hugs this shape", where you can both render the scan and mark it up with a 2D paint tool.


But you can probably tail -F quite well! Which is perfect for logs (e.g. give me the last day so I can grep through it).
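For context, `tail -F` (capital F) follows a file by name and keeps working across log rotation, unlike `tail -f`, which sticks to the original file descriptor. A quick sketch with a throwaway file (the log path and contents are made up for the demo):

```shell
log=$(mktemp)
# simulate an app appending to a log in the background
( for i in 1 2 3; do echo "line $i" >> "$log"; sleep 0.1; done ) &
# follow by name; timeout just keeps this demo from blocking forever
timeout 2 tail -F "$log" | head -n 3
```

In real use you would point it at the live log and filter, e.g. `tail -F /var/log/app.log | grep --line-buffered ERROR`.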


Indeed, that was my original working title


Bruh, that's Google's original working title.


These articles seem to justify spending money without considering alternatives. It's like saying "cold meds let me get back to work, so I'm well justified paying $100/day for NyQuil".

I would appreciate less "just take my money" and more "here are the features various tools offer at particular prices; I chose x over y because z". That would sound more informed.

I would also like to see a reason for not using open source tools, since you lock yourself out of further AI-integration opportunities when the $200/mo service doesn't support them.


People put so much effort into streaming JSON parsing, whereas we have a format called YAML which takes up fewer characters on the wire and happens to work incrementally out of the box, meaning you can reparse the stream as it's coming in without having to do any actual incremental parsing.
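The prefix-parseability claim can be sketched without a full YAML parser: any line-boundary prefix of a YAML block sequence is itself a complete document, whereas a truncated JSON array is not. Below, a toy parser for just that block-sequence subset (not real YAML) illustrates the idea:

```python
import json

json_stream = '["item1", "item2", "item3"]'
yaml_stream = "- item1\n- item2\n- item3\n"

# A truncated JSON array is not a valid JSON document:
try:
    json.loads(json_stream[:18])
except json.JSONDecodeError:
    print("JSON prefix fails to parse")

# A trivial parser for the "- item" block-sequence subset of YAML:
def parse_block_seq(text):
    return [line[2:] for line in text.splitlines() if line.startswith("- ")]

# Every line-boundary prefix parses to a usable partial result:
for cut in (8, 16, 24):
    print(parse_block_seq(yaml_stream[:cut]))
```

The caveat the replies raise still applies: a prefix that cuts mid-scalar ("- it") also "parses", just to a truncated value, so you only get clean partials at line boundaries.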


> People put so much effort into streaming Json parsing whereas we have a format called (...)

There are many formats out there. If payload size is a concern, everyone is far better off enabling HTTP response compression instead of onboarding a flavor-of-the-month language.
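To put rough numbers on that, here is a sketch with a hypothetical repetitive payload (real ratios vary with the data and the compression level):

```python
import gzip
import json

# Repetitive JSON, as in typical API list responses, compresses heavily.
payload = json.dumps([{"id": i, "name": "user"} for i in range(1000)]).encode()
compressed = gzip.compress(payload)
print(f"{len(payload)} bytes raw -> {len(compressed)} bytes gzipped")
```

On the wire this is just `Content-Encoding: gzip`, which every mainstream HTTP client and server already negotiates, with no change to the payload format.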


YAML makes JSON appear user-friendly by comparison.

The last thing one wants in a wire format is whitespace sensitivity and ambiguous syntax. Besides, if you are really transferring that much JSON data, there are ways to achieve it that solve those issues.


YAML is more complex and harder to parse


If we're talking about niche data protocols, edn is hard to beat. Real dates and timestamps and comments, namespaced symbols, tagged elements, oh my!

https://github.com/edn-format/edn


When I was at Nokia around that time, they were giving out the weird keypad phone for free to all employees. It was really terrible; I guess they had an oversupply as a result of people not liking them.

