Hacker News | pacoWebConsult's comments

You can only ctrl+o the most recent response, and it's a lot worse than knowing the number of lines read or the pattern grepped, which are useful because they can tell you what the agent is thrashing on trying to find, or what context would be useful to give it upfront in the future.

WPF was full of footguns and rigid organization to alleviate said footguns. The MVVM (Model-View-ViewModel) architecture was so much boilerplate and toil to work with. It feels like the advent of Electron-based desktop apps caused MS to simply give up on the space.

I don't have too much experience with MAUI so I can't comment on that.

Blazor's initial bundle sizes made it quite difficult to consider as an option for web applications, despite the ability to share code between frontend and backend.

I still feel like the ASP.NET + frontend SPA story has a long way to go compared to what's available in the fullstack TypeScript ecosystem right now. Shared typings between frontend and backend via tools like tRPC/oRPC, or full RSC/SSR frameworks like Next and TanStack Start, are just so much more ergonomic. But the backend TS story, especially in data access and ORMs, is so much worse than Entity Framework: Prisma is abysmally slow, and Drizzle is getting there, but IMO nothing right now compares to the power and DX of EF Core + LINQ methods.
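To make the "shared typings" ergonomics concrete, here's a minimal TypeScript sketch of the pattern tRPC-style tools automate. All names here (Product, router, getProduct) are illustrative, not a real API; in tRPC the client type is inferred from `typeof router` over an HTTP boundary, while this sketch stays in-process:

```typescript
interface Product {
  sku: string;
  name: string;
  priceCents: number;
}

// "Backend" router: plain functions standing in for typed procedures.
const router = {
  getProduct(sku: string): Product {
    return { sku, name: "Widget", priceCents: 1999 };
  },
};

// "Frontend" client typed from the backend router itself, so schema
// drift (a renamed field, a changed return type) becomes a compile
// error rather than a runtime surprise.
type AppRouter = typeof router;

const client: AppRouter = router; // in-process stand-in for the HTTP client

const p = client.getProduct("SKU-123");
console.log(`${p.sku}: ${p.name} ($${(p.priceCents / 100).toFixed(2)})`);
```

The point is that there's exactly one source of truth for the API shape; nothing here survives a build step as duplicated hand-written types.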


Models each have their own, often competing, quirks in how they utilize AGENTS.md and CLAUDE.md. It's very likely a CLAUDE.md written for Claude Code uses prompting techniques that result in worse output if taken directly and used with Codex. For example, Anthropic recommends putting information an agent must adhere to in statements like "MUST run tests after writing code" and other all-caps directives, whereas people have found that the same language with GPT-5.2 results in weaker instruction following and more timid responses than if the AGENTS.md were written without them.


In my experience, even a version upgrade of the same model will tend to break many assumptions about its quirks, so most people don't have time to try to optimize for them anyway. This is the wrong technology if you're that concerned about reliability.


I recently watched the Jon Bois documentary "Fool Time" which relates the story of the men involved in the development of telegraph lines to the 90s sitcom Home Improvement. It's an excellent watch.

https://www.youtube.com/watch?v=zmyBSrQodnI


The schemas for Amazon and Walmart's product information are absolutely bonkers and constantly missing features that they demand be provided.

Here's the XML Schema Definition for "Product" on Amazon [1]

This is joined with each of the linked category schemas included in the type, each of which has unique properties that ultimately drive the metadata on a particular listing for the SKU. It's riddled with inconsistency and duplicated fields, and is oftentimes not up to date with required information.

Ultimately, this product catalog information gets provided to Amazon, Walmart, Target, and any other large 3rd party marketplace site as a feed file from a vendor to drive what product they can then list pricing and inventory against (through similar feeds).
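As a rough sketch of what such a catalog feed looks like, here's a TypeScript snippet that emits a drastically simplified product feed. The element names (ProductFeed, SKU, Title) are hypothetical stand-ins; real Amazon/Walmart feeds follow far larger XSDs with per-category sub-schemas, and pricing/inventory typically go in separate feeds:

```typescript
interface FeedItem {
  sku: string;
  title: string;
  priceCents: number;
  quantity: number;
}

// Escape the characters that would break XML element content.
function escapeXml(s: string): string {
  return s.replace(/&/g, "&amp;").replace(/</g, "&lt;").replace(/>/g, "&gt;");
}

function buildFeed(items: FeedItem[]): string {
  const body = items
    .map(
      (i) => `  <Product>
    <SKU>${escapeXml(i.sku)}</SKU>
    <Title>${escapeXml(i.title)}</Title>
    <Price currency="USD">${(i.priceCents / 100).toFixed(2)}</Price>
    <Quantity>${i.quantity}</Quantity>
  </Product>`
    )
    .join("\n");
  return `<?xml version="1.0" encoding="UTF-8"?>\n<ProductFeed>\n${body}\n</ProductFeed>`;
}

const feed = buildFeed([
  { sku: "AB-1", title: '3/8" Hex Bolt', priceCents: 499, quantity: 120 },
]);
console.log(feed);
```

In practice the hard part isn't emitting the XML; it's keeping a mapping layer current as each marketplace's category schemas churn underneath you.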

You are right that the control McMaster-Carr has on their catalog is the strategic and technological advantage.

[1]: https://images-na.ssl-images-amazon.com/images/G/01/rainier/...


Very interesting how nearly half the list is (presumably) every single chemical listed under California Prop 65. Do they really need to specify exactly which chemical it is? I've seen thousands of Prop 65 warnings in my life, but I've literally never seen one tell me what chemical it's warning me about. I just commented to a friend a couple weeks ago that I wished they'd tell me which one so I could look it up myself!


DO-178C is not a coding standard; it's a process standard. Projects following DO-178C processes would adopt a coding standard as part of the process, reviewing that software deliverables adhere to those standards.


The federal government has grown immensely since the early 20th century because interpretations of the Commerce Clause have allowed more and more federal legislation and rules to be applied broadly, essentially overriding state legislation.

The 10th Amendment exists for a reason. The system wasn't intended for Congress to even control something like this in the first place.


We definitely are straining the rules, but I think we actually want a federal government like this. The reality on the ground is that most people want things like the FDA and FCC at the federal level.

Maybe we just need to change the Constitution, which I know is technically possible, but in practice it's frozen. It's like a legacy API no one wants to touch.


This particular aircraft was acquired by UPS in 2006 and converted for cargo missions. It was originally delivered as a passenger aircraft to Thai Airways International in 1991. [1] I actually saw this exact aircraft at RDU International in August of this year and took a photo, since tri-engine aircraft in general are not very common these days.

[1]: https://www.flightradar24.com/blog/flight-tracking-news/majo...


It's not a fraction of what it would cost to actually employ those humans, since there were humans who clearly chose to do that work when presented with the opportunity.

I think this is a very first-world take. It efficiently distributed low-value workloads to people who were willing to do them for the pay provided. The market was efficient, and the wages were clearly acceptable to those doing the work, considering they did (and still do) it for the wages offered.


If you're going to use Claude to help you respond to feedback, the least you can do is restate it in your own words. The parent commenter deserves the respect of corresponding with a real human being.

