Hacker News | rozzie's comments

I've been annoyed lately by posts (elsewhere) by vibe-coding startup CEOs who think they can build companies without developers. These are not serious people.

Recently, my head spun around a bit when I saw an experiment done by my team at Blues, and I realized it was time for me to dig a bit deeper. https://www.tjvantoll.com/2025/03/17/vibecoding/

And so, over the past few weeks I've done an interesting CEO vibe-coding experiment of my own, trying to test the limits and to see how close I could come to building something that could be regarded as 'ship quality'.

Did I succeed? You be the judge.

I didn't write a single line of code in this repo, but, as HN would expect, my instructions were voluminous and needed to be quite detailed. An interesting experience, nonetheless.


I began recruiting for what became Azure in Jan 2006. I was Chief Software Architect/CTO at the company. Amitabh Srivastava and the legendary Dave Cutler were the leads, with Dave focused on the hypervisor. (I'd met Dave in the '80s when he was at DEC and I was at DG.)

The project was in my team (CSA labs) but was cross-funded behind the scenes by Kevin Johnson, the president of Server & Tools. KJ & I did this because there was passive-aggressive resistance to a 'cloud first' design/architecture philosophy from within his org, where there was a deeply-rooted belief that enterprise servers and ops management tools would adequately scale-up.

KJ bought in and was all-in, as was the 'tools' part of his org (Soma & ScottGu). SteveB initially didn't quite know what to make of my desire and myriad efforts to fundamentally transform the company from packaged products toward services, and he had to cope with some of the wake I was leaving. It wasn't all smooth. But he believed in me and helped me to recruit internally, which was essential.

My explicit cross-funding agreement with KJ, my peer, was that when I decided it was the 'right time', I'd hand off my Azure org and it would be re-merged into S&T in more-or-less a 'reverse merger', with cloud leadership taking over server.

I launched Azure at PDC 2008 with what today we'd call lambdas (functions-as-a-service based on .NET) & blobs & cloud database as the core services. Why no Linux or Windows VMs? They were absolutely part of day 1 plans, but a major political ploy from within KJ's team ('this will kill the server business') resulted in an active decision (mine) to defer until post-launch. It wasn't a technology issue, nor was it an OSS issue; the team believed in OSS & Linux. But shipping was top priority, and we shipped.

When I ultimately left the company in 2011, it was time to do the reverse merger that KJ and I had planned. A proven, super-talented manager from Bing that everyone loved, Satya, was chosen to lead the org as it was moved into S&T upon my departure. James Hamilton, the architect of Azure's relational DB, left for AWS. Ultimately, under Satya, ScottGu & co ended up re-plumbing much of the original code with a by-then-ready Windows hypervisor, VMs & Linux, and all that you see today. By then the org finally was aligned and 'believed', and SteveB was genuinely 'all in'.

Getting products from 0 to 1 is sometimes a challenging process involving incredible people and stamina from believers at every level. In this case I'd say it was worth the effort.


Tangentially-related, it remains surprising that many U.S.-based developers I've spoken with fail to realize that they need to obtain an Export Control Classification Number (ECCN) from the US Dept of Commerce's Bureau of Industry and Security (BIS) before publishing their apps in the app store or otherwise making them available on the net. And then devs need to submit annual updates for their products.

Most will get a mass market exemption under 5D992, but a surprising number of modern applications making 'interesting' uses of crypto will need export licenses.

https://www.bis.doc.gov/index.php/encryption-and-export-admi...


They probably think they live in a free country without stupid requirements.


> They probably think they live in a free country without stupid requirements.

…says the "PHP developer from Russia," according to his biography.


People generally expect more from self-declared democracies with free speech in their constitution, as they should.


Weird way of making a case for US-only app stores but okay!


Thx for clearing the rights and for releasing, Scott. And of course thanks to Microsoft and IBM.

It would be fun at some point down the road to get some of the older code building and running again - particularly '84/'85-vintage Windows & Notes builds. Quite a lot of work, though, not just because of hardware but also likely because of toolchain gaps.


At Software Arts I wrote or worked on the IL interpreter for the TRS-80 Model III, the DEC Rainbow, the Vector Graphic, the beginnings of the Apple Lisa port, as well as the IBM PC port. To put you into the state of mind at the time,

- in the pre-PC era, the microcomputer ecosystem was extremely fragmented in terms of architectures, CPUs, and OS's. 6502, Z80, 68K, Z8000, 8088. DOS, CP/M, CP/M-86, etc. Our publisher (Personal Software) wanted as much breadth of coverage as possible, as you might imagine

- one strong positive benefit of porting from 6502 assembly to IL and using an interpreter was that it enabled the core code to remain the same while leaving the complex work of paging and/or memory mapping to the interpreter, enabling access to 'extended memory' without touching or needing to re-test the core VisiCalc code. Same goes for display architectures, printer support, file system I/O, etc.

- another strong benefit was the fact that, as the author alludes to, the company was trying to transition to being more than a one-hit wonder by creating a symbolic equation solver app - TK!Solver - that shared the interpreter.

Of course, the unavoidable result is that the interpreter - without modern affordances such as JIT compilation - was far less snappy than native code. We optimized the hell out of it and it wasn't unusable, but it did feel laggy.
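The portability benefit described above can be sketched with a toy stack-machine interpreter. This is my own minimal illustration with hypothetical opcodes, not the actual VisiCalc IL: the core program is machine-independent bytecode, and each port supplies its own interpreter whose handlers hide the machine-specific details (memory banking, display drivers, file I/O).

```python
# Toy IL interpreter: the core program is portable (opcode, arg) tuples;
# only the interpreter itself needs porting to each machine.
def run(program, io_write=print):
    """Interpret a tiny IL program on a stack machine."""
    stack = []
    for op, arg in program:
        if op == "PUSH":
            stack.append(arg)
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "OUT":            # a real port would route this through
            io_write(stack.pop())    # its own display/printer driver
    return stack

run([("PUSH", 2), ("PUSH", 3), ("ADD", None), ("OUT", None)])  # prints 5
```

The same IL program runs unchanged on every port; swapping out `io_write` (and, in a real system, the memory-access handlers) is what made re-testing the core unnecessary.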

Fast forward to when I left SoftArts and went across the street to work for my friend Jon Sachs who had just co-founded Lotus with Mitch Kapor. Mitch & Jon bet 100% that the PC would reset the ecosystem, and that the diversity of microcomputers would vanish.

Jon single-handedly wrote 1-2-3 in hand-tuned assembly language. Yes, 1-2-3 was all about creating a killer app out of 1.spreadsheet+2.graphics+3.database. That was all Mitch. But, equally, a killer aspect of 1-2-3 was SPEED. It was mind-blowing. And this was all Jon. Jon's philosophy was that there was no 'killer feature' more important than speed.

When things are moving fast and the industry is taking shape, you make the best decisions you can given hunches about the opportunities you spot, and the lay of the technical and market landscape at that moment. You need to make many key technical and business decisions in almost an instant, and in many ways that determines your fate.

Even in retrospect, I think the IL port was the right decision by Dan & Bob given the microcomputing ecosystem at the time. But obviously Mitch & Jon also made the right decision for their own time - just a matter of months later. All of them changed the world.


What versions of Visicalc can we find in the wild that would have used the IL interpreter?


Thank you!—that fills out the story very nicely.


https://safecast.org/devices/ https://safecast.org/history-of-safecast/ https://safecast.org/about/

It’s quite a wonderful set of globally-distributed volunteers brought together by varying passions - from hardware hacking to citizen science.

Whether from fixed or mobile sensors, whether from radiation or air quality sensors, all data is CC0-licensed at birth and is freely available for download.


If you scroll to Australia you can see the data coverage is largely from occasional driving about on a few roads.

It's interesting they don't include, as a baseline reference, the near complete Australian coverage by 256 channel radiometric data (available for free download): https://www.ga.gov.au/scientific-topics/disciplines/geophysi...

Such datasets are relatively common across the globe and none that I'm aware of appear to be included.

Their devices appear to be all total-count Geiger types which don't separate out energy bands or have much accuracy - i.e. "good enough" for coarse results, not that great for identifying radon vs. ??? as a source.


Beyond being 60-bit machines, programming the 6400/6500/6600/6700 was interesting and memorable in other ways.

- Ones' complement (rather than two's complement) binary representation of integers, and thus the need to cope with "-0" in your code. Modern programmers are surprised that there was a day when "-1" had a different binary representation than today.

- The CPUs were not actually 'in charge' of the machine. There were ten 12-bit processors called PPUs (peripheral processing units) which did all I/O, and which had the unique capability of executing an "Exchange Jump" instruction to perform a CPU task switch. In a sense, the CPUs were 'compute peripherals' to the PPUs.

- The architecture was fascinating in terms of memory hierarchy. The "central memory" used by the CPUs was augmented by a much larger "extended memory" (ECS - Extended Core Storage) with block transfer primitives. One could implement high-scale systems (such as the one I worked on - PLATO) that smoothly staged data between CM, ECS, and disk.

In those days, there was a necessarily-direct relationship between the machine language (the bit encoding of instructions for operations & registers) and the assembly language (COMPASS). As a developer it was incredibly enjoyable because, in Ellen Ullman's words, you felt very 'close to the machine'.
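The "-0" point above is easy to demonstrate. A quick sketch (using 8 bits for brevity rather than the 6600's 60): ones' complement negates by flipping every bit, so -1 has a different pattern than it does in two's complement, and all-ones is a distinct "negative zero".

```python
def ones_complement_neg(value, bits=8):
    """Bit pattern of -value in ones' complement: flip every bit."""
    mask = (1 << bits) - 1
    return format(~value & mask, f"0{bits}b")

def twos_complement_neg(value, bits=8):
    """Bit pattern of -value in two's complement: flip every bit, add one."""
    mask = (1 << bits) - 1
    return format((-value) & mask, f"0{bits}b")

print(ones_complement_neg(1))  # 11111110
print(twos_complement_neg(1))  # 11111111
print(ones_complement_neg(0))  # 11111111  <- "-0", distinct from 00000000
```

Code that compared values bit-for-bit had to treat 00000000 and 11111111 as equal, which is exactly the "-0" bookkeeping mentioned above.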


Hi! Author of this short post here. Thanks so much for this comment. It's been 'on my list' to do a much longer post on Control Data and Seymour Cray for quite a while. This has convinced me to bump it up the list!


I’m a subscriber and I’d be happy to read as much of that as you want to write. There’s not that much biographical or technical history coverage of Control Data and Cray other than what’s in the not-very-technical Supermen book and anecdotes from one or two individual engineer memoirs.


Thanks for subscribing! I've added CDC and Cray as one of my next 3 or 4 posts so watch this space. Feels like it needs to be a two-parter (at least) though!


And the intriguing load/store scheme.

There were no LOAD or STORE instructions. Instead there were "address" registers (18 bits wide), matched to each "operand" (60 bit data) register.

When you updated an address register, that memory address was automatically read into the corresponding operand register. Except for the last couple of address registers - updating them performed a write from the corresponding operand register into memory.

By our current way of thinking, it seems arse about. But it worked well when you understood it, and apparently improved concurrency. Loads and stores became sort-of transparent. (Remembering that memory was as fast as, and sometimes faster than, the CPU, so a few instructions saved was worth the occasional unnecessary load.)

See "Design of a Computer, The Control Data 6600" by J E Thornton.
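The scheme above can be sketched in a few lines. This is my own simplification, not a faithful model of the 6600's timing or encoding: setting one of the "read" address registers (A1-A5 on the real machine) implicitly loads memory into the matching operand register, while setting one of the "write" registers (A6-A7) implicitly stores the matching operand register to memory.

```python
class CDC6600Sketch:
    """Toy model of the 6600's implicit load/store via A/X register pairs."""
    def __init__(self, memory_size=64):
        self.memory = [0] * memory_size
        self.A = [0] * 8   # address registers (18-bit on the real machine)
        self.X = [0] * 8   # operand registers (60-bit on the real machine)

    def set_A(self, i, address):
        """Writing an address register triggers the load or store."""
        self.A[i] = address
        if 1 <= i <= 5:          # A1-A5: implicit load, Xi <- memory[address]
            self.X[i] = self.memory[address]
        elif i in (6, 7):        # A6-A7: implicit store, memory[address] <- Xi
            self.memory[address] = self.X[i]

m = CDC6600Sketch()
m.memory[10] = 42
m.set_A(1, 10)        # implicit load: X1 <- memory[10]
m.X[6] = m.X[1] + 1
m.set_A(6, 20)        # implicit store: memory[20] <- X6
```

Note that there is no explicit LOAD or STORE anywhere: data movement falls out of address-register updates, which is what made it feel "sort-of transparent".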


An alternative way to model that in an ISA is the indirect addressing model coupled with the ability to use it when expressing operands in any instruction.

The models are isomorphic.

Write to an "address" register -> write to a register directly

Write to an "operand" register -> write to "(register)" (write to the memory at the address stored in the register)

Not sure which was the first architecture to model it that way. The PDP-11 had it.

You just need one bit in the instruction encoding to determine whether to use direct or indirect access. Equivalently, you need one extra bit in the instruction encoding if you have twice the registers.

You can save that bit if you make most instructions able to access the "operand" register only, and require that manipulation of the "address" register use special instructions.

In that case you have an "inverse load / store" architecture, where instead of using load store instructions to do indirect access you use them to do direct access.


"Now, no one bats an eye if you ship the most secure crypto you want."

The most surprising thing to me is that, in speaking in the past several years with younger entrepreneurs, they're not even aware of the obligation to file for an export license for any/all software containing crypto (such as that submitted to the App Store).

I've not yet seen a case in which a mass market exemption isn't quickly granted, but devs still need to file - and re-file annually.


Is that still a requirement for US developers?

As in, currently.


If you submit the documentation via Apple, you don't also need to submit it to the government separately: https://developer.apple.com/documentation/security/complying...

Essentially Apple built a system so you have to agree to export restrictions with every single build you upload to Apple.


I use my HP-16C several times every week while debugging Notecard firmware. Yes, math is integrated into IAR EWARM and there's always Hex Calc on my iPhone. But there is something comforting about grabbing this same little artifact that has been on my desk since the days when I was debugging Lotus Notes using symdeb.


Not unrelated, this is quite an amazing device - even before you consider its $9.95 price at SparkFun.

https://usefulsensors.com/person-sensor/ https://usefulsensors.com/about/


~$10 shipped at AliExpress: https://www.aliexpress.us/item/3256804457160611.html (maybe this one)


Looks like you've found a different device there.

