Hacker News | ab2525's comments

Well said - language (text input) is actually the vehicle you have to transfer neural state to the engine. When you are working in a greenfield project or pure-vibe project, you can get away with most of that neural state being in the "default" probability mode. But in a legacy project, you need significantly more context to constrain the probability distributions a lot closer to the decisions which were made historically; otherwise you quickly get into spaghetti-ville as the AI tries to drag the codebase towards its natural ruts.
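To make the idea concrete, here is a minimal sketch of what "transferring neural state" via context can look like in practice: front-loading a prompt with the legacy project's historical decisions so the model's output distribution is pulled toward them. The conventions text and the `build_prompt` helper are hypothetical illustrations, not from any specific tool.

```python
# Hypothetical example: constraining an LLM toward a legacy codebase's
# conventions by prepending recorded decisions to every task prompt.

LEGACY_CONVENTIONS = """\
- All DB access goes through repositories in app/repos/; never raw SQL in views.
- Errors are returned as Result objects, not raised, in the service layer.
- New modules follow the snake_case naming already used throughout the project.
"""

def build_prompt(task: str, conventions: str = LEGACY_CONVENTIONS) -> str:
    """Front-load historical decisions so the model is steered toward
    them instead of falling back to its 'default' probability mode."""
    return (
        "You are editing a legacy codebase. Follow these established "
        "decisions even where they differ from common practice:\n"
        f"{conventions}\nTask: {task}\n"
    )

prompt = build_prompt("Add a lookup endpoint for orders by customer id.")
print(prompt)
```

The same pattern is what convention files (project-level context documents fed to coding assistants) implement: they are the text channel through which the historical decisions reach the model.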


I have had similar observations to you and tried to capture them here: https://www.linkedin.com/posts/alex-buie-35b488158_ai-neuros...

The gist being - language (text input) is actually the vehicle you have to transfer neural state to the engine. When you are working in a greenfield project or pure-vibe project, you can get away with most of that neural state being in the "default" probability mode. But in a legacy project, you need significantly more context to constrain the probability distributions a lot closer to the decisions which were made historically.


You can get up to a TB RAID1'd
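For reference, mirroring two drives into a RAID1 array on a Linux plug computer is typically done with `mdadm`; a sketch, assuming two USB/SATA drives at `/dev/sda` and `/dev/sdb` (device names and mount point are illustrative, and these commands are destructive to the drives' contents):

```shell
# Create a two-disk RAID1 (mirror) array from whole drives (requires root).
mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sda /dev/sdb

# Put a filesystem on the array and mount it.
mkfs.ext4 /dev/md0
mkdir -p /mnt/storage
mount /dev/md0 /mnt/storage

# Persist the array definition so it assembles on boot.
mdadm --detail --scan >> /etc/mdadm/mdadm.conf
```

With two 1 TB drives this yields roughly 1 TB of usable mirrored storage, since RAID1 stores a full copy on each disk.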


LOL!


Well, they rebrand SheevaPlugs as TonidoPlugs and install their cloud suite of software on them. (We host both at Pluggr)


We (Pluggr) get ours from RainmanWeather in Florida. (Only SheevaPlugs though)

