geophile's comments | Hacker News

Z-order based indexes avoid the resolution problem. Basically:

- Generate z-values for spatial objects. Points -> a single z-value at the highest resolution of the space. Non-points -> multiple z-values. Each z-value is represented by a single integer (I use 64-bit z-values, which provide for a space resolution of 56 bits). Each integer represents a 1-d range, e.g. 0x123 would represent 0x123000 through 0x123fff. (See the sketch after this list.)

- Spatial join is basically a merge of these z-values. If you are joining one spatial object with a collection of N spatial objects, the time is O(log N). If you are joining two collections, then it's more of a linear-time merge.
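To make the mechanics concrete, here is a rough Python sketch of z-value generation by bit interleaving (Morton order) and of the prefix-to-range expansion. The function names and the 28-bits-per-dimension split are my illustration, not the geophile API:

    def z_value(x, y, bits=28):
        # Interleave the bits of x and y into a single integer, x
        # contributing the high bit of each pair. 2 * 28 = 56 bits of
        # spatial resolution fit comfortably in a 64-bit z-value.
        z = 0
        for i in range(bits):
            z |= ((x >> i) & 1) << (2 * i + 1)
            z |= ((y >> i) & 1) << (2 * i)
        return z

    def z_range(prefix, prefix_bits, total_bits):
        # A z-value prefix denotes the 1-d range covering all completions
        # of that prefix: z_range(0x123, 12, 24) -> (0x123000, 0x123fff).
        shift = total_bits - prefix_bits
        lo = prefix << shift
        return lo, lo | ((1 << shift) - 1)

The join then reduces to detecting overlapping ranges while merging z-ordered streams:

    def overlaps(a, b):
        # Ranges are inclusive (lo, hi) pairs. Candidates for a spatial
        # join are pairs whose z-ranges overlap; a merge over sorted
        # ranges finds them in linear time.
        return a[0] <= b[1] and b[0] <= a[1]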

For more information: Orenstein and Manola, "PROBE Spatial Data Modeling and Query Processing in an Image Database Application," IEEE Trans. Software Eng. 14(5): 611-629 (1988).

An open source java implementation: https://github.com/geophile/geophile. (The documentation includes a number of corrections to the published algorithm.)


The article gets at this briefly and moves on: "I can do all of this with the experience on my back of having laid the bricks, spread the mortar, cut and sewn for twenty years. If I don’t like something, I can go in, understand it and fix it as I please, instructing once and for all my setup to do what I want next time."

I think this dynamic applies to any use of AI, or indeed, any form of outsourcing. You can outsource a task effectively if you understand the complete task and its implementation very deeply. But if you don't, then you don't know whether what you are getting back is correct, maintainable, or scalable.


> instructing once and for all my setup to do what I want next time.

This works up to a point, but eventually your "setup" gets complicated, some of your demands conflict or have different priorities, and you're relying on the AI to sort it out the way you expect.


But setups get equally complicated, even with human software engineers. The approach that the OP is talking about applies only to experienced, good architect-level SWEs, and I suspect that the code quality and its problems are going to be the same whether they are directing LLMs vs a set of junior SWEs to write the code.

There is an inherent level of complexity in projects that solve some real-world problem, due to all the edge-case handling code that was added incrementally over time.


> any use of AI, or indeed, any form of outsourcing

Oh that's a good analogy/categorization, I hadn't thought about it in those terms yet. AI is just the next cheaper thing down from the current Southeast Asian sweatshop labor.

(And you generally get what you pay for.)


On the face of it, this (or at least acting as a code reviewer from an experienced point of view) seems like the solution. The problem is that we all naturally get lazy and complacent. I actually think AI was at its best for coding a year or so ago, when it could kind of do part of the work but there's no way you could ever ship it. Code that works today but breaks in 6 months is far more insidious.

It does beg the question of whether any of this applies to less experienced people. I have a hunch that the open-ended nature of what can be achieved with AI will lead right back to needing frameworks just as much as we do now, if not more, when it comes to less experienced people.

The analysis misses a point. Wordle uses two lists of five-letter words: words that are in the dictionary and can be used in a guess, and those that can be used as the daily secret word. The latter list is smaller and sticks to more common words. Wordle has been around for 1550 days, and the answer list holds roughly 2,300 words, so they have used about 67% of the possible words (1550/2300). In another couple of years, they will have to either start using uncommon words or recycle. There's no rush, so it's unclear why this is happening now.


> Wordle has been around for 1550 days

I'm confused. Today's Wordle is #1,688.


I did an approximate calculation.


For a long time, I had an MBP (this was in the Intel days), with a Linux VM. It was like a reverse mullet: party in front (multimedia), work in back (dev).

And then:

    - Butterfly keyboard
    - Touchbar
    - M-series CPUs, which, while technically awesome, did not allow for Linux VMs.
So I switched to System76/Linux (Pop OS) and that has been wonderful, not to mention, much cheaper.


- No esc


See, I'm an ends-justify-the-means guy:

The more people forced into the beautiful world of Caps Lock as Escape, the better!


Your website has stained my screen. lol

    background-image: radial-gradient(circle at 12% 24%, var(--text-primary) 1.5px, transparent 1.5px),
                      radial-gradient(circle at 73% 67%, var(--text-primary) 1px, transparent 1px),
                      radial-gradient(circle at 41% 92%, var(--text-primary) 1.2px, transparent 1.2px),
                      radial-gradient(circle at 89% 15%, var(--text-primary) 1px, transparent 1px);


FWIW, on Reddit I am seeing more and more discussions on the Linux subreddits of people getting fed up with Windows and switching to Linux. Usually, it's the Windows 11 upgrade that finally did it.


There is a good parallel here with Myspace and Facebook. Myspace added an ad network & was hammered by spammers around the same time Facebook was opening up user registration to everyone. Facebook had no ads. Myspace was dead.

This time Linux has very good game support to the point where some games have a higher FPS on Linux. It will be so expensive for Microsoft to attempt to turn this ship around, and it will likely still fail.

This is happening at the same time AI agents have gotten really good, so users will just use local AI agents to configure and troubleshoot the rough stuff about Linux. And then they will customize it so much they will never be able to go back to Windows.

Ubuntu is just fine for 99% of non-tech users. Windows has so many anti-patterns, tricks, and OneDrive rugpulls now that Ubuntu is actually much safer and simpler for non-techies to use. (I can also make the case that it beats iOS in that department too.)


This seems like a good time to remind everyone of a letter by David Packard, to his employees. There is more morality, common sense and insightful business advice here than in any 1000 business titles you would care to name.

https://aletteraday.substack.com/p/letter-107-david-packard-...

I think that OP's essay identifies that something bad happened at HP but completely misses what it was. Look at this quote:

    Around 1997, when I was working for the General Counsel, HP engaged
    a major global consulting firm in a multi-year project to help 
    them think about the question: “What happens to very large companies that
    have experienced significant growth for multiple successive years?”
OP says that the findings and recommendations included: "the decade long trend of double-digit growth was unlikely to continue", and "the company [should] begin to plan for much slower growth in the future."

OP then goes on to talk about fighting for resources for investments, a "healthy back and forth" on these tradeoffs, and then losing the "will to fight" following this report. "The focus became how not to lose".

Unlike OP, I did not work at HP. But I have seen up close startups, middle-sized companies, and huge companies, and the transitions among these states. So I feel justified in saying: OP has missed the point. And in particular, he makes no reference to that letter from David Packard.

Look at this quote from the letter:

    I want to discuss why a company exists in the first place. ...  why 
    are we here? I think many people assume, wrongly, that a company 
    exists simply to make money. While this is an important result of 
    a company's existence, we have to go deeper and find the real 
    reasons for our being. ... a group of people get together and exist
    as an institution that we call a company so they are able to accomplish 
    something collectively which they could not accomplish separately. 
    They are able to do something worthwhile—they make a contribution 
    to society .... You can look around and still see people who are 
    interested in money and nothing else, but the underlying drives 
    come largely from a desire to do something else—to make a product—to 
    give a service—generally to do something which is of value.
I think this is the essence of what it means to do useful and interesting work in any technical field. Unfortunately, there are many, many examples of companies that have lost their way, forgetting this key insight. HP was certainly one of them. I would argue that Google and Microsoft are examples too. Boeing, for sure.

And sadly, there are very, very few companies that actually embody Packard's ideas. I think that JetBrains is such a company, familiar to many HN readers. Another one that comes to mind, from a very different field, is Talking Points Memo -- an excellent website that does news reporting and analysis, mostly on US politics. It started as a "blogger in a bathrobe", and 25 years later, it is a small, independent news organization, supporting itself mostly through paid subscriptions by a very loyal readership.

To me, the saddest part of the essay is this:

    In the last few years more and more business people have begun to
    recognize this, have stated it and finally realized this is their
    true objective.
(This is right before the "You can look around ..." passage quoted earlier.) It seems to me that very, very few business people recognize this as the way to run a business, as outlined by Packard.


No, "we" are not replacing OOP with something worse. "We" are replacing layers of stupid shit that got layered on top of, and associated with OOP, with different renderings of the same stupid shit.

I have been programming since 1967. Early in my college days, when I was programming in FORTRAN and ALGOL-W, I came across structured programming. The core idea was that a language should provide direct support for frequently used patterns. Implementing what we now call while loops using IFs and GOTOs? How about adding a while loop to the language itself? And while we're at it: GOTO is never a good idea; don't use it even if your language provides it.

Then there were Abstract Datatypes, which provided my first encounter with the idea that the interface to an ADT was what you should program with, and that the implementation behind that interface was a separate (and maybe even inaccessible) thing. The canonical example of the day was a stack. You have PUSH and POP at the interface, and the implementation could be a linked list, or an array, or a circular array, or something else.
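As a minimal sketch of that separation (hypothetical names, nothing from the original literature), here are two interchangeable implementations behind the same push/pop interface:

    class ArrayStack:
        # Stack backed by a Python list (the "array" implementation).
        def __init__(self):
            self._items = []
        def push(self, x):
            self._items.append(x)
        def pop(self):
            return self._items.pop()

    class LinkedStack:
        # Same interface, backed by a linked list of (value, next) pairs.
        def __init__(self):
            self._head = None
        def push(self, x):
            self._head = (x, self._head)
        def pop(self):
            x, self._head = self._head
            return x

Client code written against push/pop runs unchanged against either one, which is the whole point of the ADT.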

And then the next step in that evolution, a few years later, was OOP. The idea was not that big a step from ADTs and structured programming. Here are some common patterns (modularization, encapsulation, inheritance), and some programming language ideas to provide them directly. (As originally conceived, OOP also had a way of objects interacting, through messages. That is certainly not present in all OO languages.)

And that's all folks.

All the glop that was added later -- Factories, FactoryFactories, GoF patterns, services, microservices -- that's not OOP as originally proposed. A bunch of often questionable ideas were expressed using OO, but they were not part of OO.

The OOP hatred has always been bizarre to me, and I think mostly motivated by these false associations. The essential OOP ideas are uncontroversial. They are just programming language constructs designed to support programming practices that are pretty widely recognized as good ones, regardless of your language choices. Pick your language, use the OO parts or not, it isn't that big a deal. And if your language doesn't have OO bits, then good programming often involves reimplementing them in a systematic way.

These pro- and anti-OOP discussions, which can get pretty voluminous and heated, seem a lot like religious wars. Look, we can all agree that the Golden Rule is a pretty good idea, regardless of the layers of terrible ideas that get piled onto different religions incorporating that rule.


I'm the kind of person that sees a bowl as a large cup without a handle.

Likewise, I see these patterns as equivalent style choices. The problem fundamentally dictates the required organization and data flow, so the same optimal solution will be visible to any skilled developer; these weak style choices of implementation are the only freedom they actually have.

For example, these two are exactly the same:

    state = concept_operation(state, *args)
and

    class Concept:
        def operation(self, *args):
            self.state = <whatever with self.state>
and an API call to https://url/concept/operation with a session ID where the state is held.
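To spell out the claimed equivalence, here is a small runnable sketch with hypothetical names (the "endpoint" is simulated by a dict of sessions):

    def concept_operation(state, *args):
        # Functional rendering: state in, state out.
        return state + sum(args)  # stand-in for "whatever with state"

    class Concept:
        # OO rendering: the state lives on the object.
        def __init__(self, state=0):
            self.state = state
        def operation(self, *args):
            self.state = concept_operation(self.state, *args)

    sessions = {}
    def handle_request(session_id, *args):
        # Service rendering: the state lives server-side, keyed by session ID.
        sessions[session_id] = concept_operation(sessions.get(session_id, 0), *args)

Same organization and data flow in all three; only where the state sits differs.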

I suspect people who get emotional about these things haven't spent much time in the other styles, or they would understand why each exists in such widespread use.

It's like food. If you go anywhere and see the common man eating something, there's a reason they're eating it: it's probably pretty OK, if you just try it. It's not that they're idiots.


A voice of sanity. The submitted article has zero meaningful content.

This comment I posted in an earlier thread, rehashing Inheritance vs. Composition for the gazillionth time, is highly relevant here: https://news.ycombinator.com/item?id=45943135

OOD/OOP has not gone away, has not shifted, etc., but is alive and well, just packaged under different-looking gloss. The meta-principles behind it are fundamental to large-scale systems development, namely Separation of Concerns, Modularization, Reuse, and Information Hiding.


I thought structured programming was about language support for control flow: the simplification and formalization of control flow, and single return. Not ADTs.


I thought I expressed that: "I came across structured programming. The core idea was ... Then there were Abstract Datatypes, ..."


I can see what you mean. I interpreted everything as elaboration of your introductory remark:

> "We" are replacing layers of stupid shit that got layered on top of, and associated with OOP, with different renderings of the same stupid shit.

Meaning that both structured programming and ADTs are different names for the same "stupid shit", i.e. the same ideas as OOP. I agree with this for ADTs, which really are just the same thing under another name, but I failed to see what structured programming has to do with OOP.

I now see that the paragraph wasn't meant to be read like that.

---

This:

> All the glop that was added later -- Factories, FactoryFactories, GoF patterns, services, microservices -- that's not OOP as originally proposed. A bunch of often questionable ideas were expressed using OO, but they were not part of OO.

> The OOP hatred has always been bizarre to me, and I think mostly motivated by these false associations. The essential OOP ideas are uncontroversial. They are just programming language constructs designed to support programming practices that are pretty widely recognized as good ones, regardless of your language choices. Pick your language, use the OO parts or not, it isn't that big a deal. And if your language doesn't have OO bits, then good programming often involves reimplementing them in a systematic way.

is really the summary under every explanation or criticism of OOP. It deserves publication more than most blog posts do, yet it is so concise that it hardly needs to be one.


> And while we're at it, GOTO is never a good idea, don't use it even if your language provides it.

Good luck with that if you're a C programmer.


These are not the same thing. The GOTO people complained about, and what the famous article "GOTO considered harmful" is about, is called longjmp in C. Nearly all C programmers will agree with you that you shouldn't use longjmp. The goto of C allows less freedom of control flow than try-catch constructs in other languages.


Well sure, but don't use it to implement if/while/for.


This is a very minor but pleasant surprise. An action like this is beyond what I thought the US government (my government, sadly) was capable of. It is kind of puzzling to me that this issue, unlike seemingly every other one, didn't get politicized, with right-wing talking heads bemoaning progress of any sort and appealing to the good old days, when America was great, the days that MAGAs want to return to.

It's a good start. Now let's do metric.


I came to source code control reluctantly. CVS, SourceSafe, others I’ve forgotten. One of them was very expensive, very complex, took months of customization, and then it scrambled our bits following a disk crash and the vendor had to piece things back together. An expensive nightmare.

I finally started using Subversion, and it clicked. It was easy to understand and use; it did everything I needed, and it was intuitive. But git was gaining popularity and eventually it was the only "choice". And I don't get git at all. I can do a few things that I need. I often have to consult Google or experts for help. While I get the concepts, the commands are incomprehensible to me. I hate it.


Subversion & Mercurial were decent. SourceSafe is utter trash. I've learned to use Git, but I've always used an IDE; I hate the CLI commands.


I got tinnitus in my late 20s. Forty years later, it's still there. Research into the causes, and treatments, has been disappointingly slow.

I would really like to experience total silence at some point, but that seems very unlikely.

