Hacker News | shaper_pmp's comments

You might want to clarify that you never retained any PII - unless you can't 100% confirm that the favicon request didn't include any embedded user or installation ID in cookies, headers or the like?


It was only a minor issue, and it got resolved quickly thanks to your question, but DDG's response to this whole loss-of-trust issue seems accurately characterised by the phrase "a succession of embarrassing own goals". :-/


Wow - they nicked my entire comment from reddit days after I posted it there, and didn't even give credit. :-(

http://www.reddit.com/r/Android/comments/138res/google_launc...


I'm not sure I buy the augmented reality angle - the hype-generating marketing adverts seem to imply it, but nobody who's playing it has mentioned anything about AR - just a GPS- and map-based game. Augmented reality is when you use things like the phone position and orientation to overlay graphics onto a camera-view of the real world, like this: https://www.youtube.com/watch?v=U2jSzmvm_WA&feature=play...

As such I doubt it has much to do with Google Glass specifically - more likely I suspect they're tracking people walking around as they play to build a massively improved pedestrian-route database for a new pedestrian (and in-building, for places like malls, museums, etc) version of Google Maps: http://www.reddit.com/r/Android/comments/138res/google_launc...


Shaper_pmp, but thanks for the mention. ;-)


This is true - doing is undoubtedly the best way to learn.

What I meant was that you shouldn't do it for money, or for code that's going into production, before you know what you're doing. ;)


What you really mean here is that humans are complex and adaptable and have a theory of mind, whereas computers aren't and don't.

This, however, is not a fault. I work in web development, and I see every single day confusion and problems caused by a mismatch between what someone asked for and what someone else understood they wanted.

When you're writing computer code you should specify exactly what you want - architects don't design a house by drawing four walls and a roof and some rough, to-the-nearest-metre dimensions on the back of an envelope, because then houses would fall down and kill people.

We shouldn't design software this way for the same reasons.

The main problem is that people are lazy, and think in generalities. That's fine for socialising, but many things in life require precision (especially in our increasingly-complex society)... and lazy people used to relying on implication and inference find it a great effort to be as precise as is required.

The solution is not for programming languages to guess what people want based on their imperfect, imprecise and un-thought-through requests - it's for people to learn to request what they actually want, instead of asking computers to essentially "do some stuff with a thing".

It's not a question of success of communication - it's a question of imprecise requirements, lossy communication methods and inescapable rules of information theory.


> What you really mean here is that humans are complex and adaptable and have a theory of mind, whereas computers aren't and don't.

That's exactly what I meant.

> This, however, is not a fault. I work in web development, and I see every single day confusion and problems caused by a mismatch between what someone asked for and what someone else understood they wanted.

Coming from the same environment, I also see how these confusions are avoided: through direct, fast, two-way communication that doesn't end too soon, with talking, drawing pictures, pointing fingers and prototyping. I imagine the same solutions should be incorporated into programming environments to make them more accessible.

When you don't get what someone said, you repeat back what you understood (and sometimes what they might have meant by it), and they agree or clarify until you're both sure enough that you share the same idea.

That kind of dialogue might have a place in programming environments. Code completion and parameter hinting are very crude attempts at such a dialogue, yet despite their crudeness they are immensely useful.

> When you're writing computer code you should specify exactly what you want - architects don't design a house by drawing four walls and a roof and some rough, to-the-nearest-metre dimensions on the back of an envelope, because then houses would fall down and kill people.

I just think the architect doesn't necessarily need to be entirely human. Some technical details could be worked out in a dialogue between human and machine.

> The main problem is that people are lazy, and think in generalities.

That is the problem, but that's just how humans are, even programmers. It's not going to change, so things should be designed in a way that allows people to be lazy, still get things done and not hurt themselves.

> That's fine for socialising, but many things in life require precision (especially in our increasingly-complex society)... and lazy people used to relying on implication and inference find it a great effort to be as precise as is required.

This effort comes more easily if you can clarify things step by step and don't have to be perfectly clear from moment zero. I think current programming environments seriously impede designing solutions to problems, because they require precision too early and don't infer enough in the proper way.

> it's for people to learn to request what they actually want, instead of asking computers to essentially "do some stuff with a thing".

It's not going to happen; it lies in our biology. Organisms always try to achieve their goals with the least possible effort.

Progress is made not by curing slaves of laziness, but by inventing the steam engine that lets lazy people do more.

> It's not a question of success of communication - it's a question of imprecise requirements, lossy communication methods and inescapable rules of information theory.

In my opinion, programming is about forming exactly the same idea in two places: the programmer's brain and the computer hardware. Designing a program is part of programming, and the computer should play a more active role in that part.


Hi there - article author here. ;)

> I don't like idea that you should keep inconvenience in place to keep of idiots. Idiots won't succeed anyway but inconveniences harm professionals and novices.

I think you've misunderstood - the argument is not to keep inconvenience in, it's to avoid taking things out which aren't actually a problem in practice.

As you rightly said, "human languages are geared towards interactive communication not towards describing problems and solutions precisely". Therefore any attempt to turn a programming language into English is not going to address the real difficulties of programming - specifying the problem and solution in enough detail, modelling the desired system in your head, avoiding/considering/handling edge-cases, etc.

All it's going to do is to make the experience of programming fractionally easier for people with effectively zero programming skills, and hence give them unwarranted self-confidence.

It's like making a bike easier to ride by gluing the rider to the seat - sure it's technically "harder to fall off", but the problem isn't usually the rider falling off the seat - it's the whole bike falling over, taking the rider with it.

Likewise, the hard part of programming isn't remembering precise and finicky syntax - it's making sure your own thinking is precise and finicky enough that you can accurately express it in such a finicky, precise language.

I'm not arguing in favour of long function names (I'm not entirely sure how you got that from my post) - I'm arguing against people who think that by changing "print('Hi there');" to "print 'Hi there'" they've somehow solved the problem of "programming being hard to do".


> I think you've misunderstood - the argument is not to keep inconvenience in, it's to avoid taking things out which aren't actually a problem in practice.

I really don't know which barriers can be kept because they aren't actually a problem, because I tried to get into programming and I succeeded; all the barriers were passed. But the same barriers that I passed without noticing may effectively discourage someone else.

A programming language is just the UI of a complex automation system. It should be as easy, clear and intuitive as it can be. In my opinion, the same general rules should apply to computer language design as to UI design. Designing a good language is not about enabling a programmer to use a hashmap (or whatever) with fewer keystrokes, but about letting someone who has never seen a hashmap know at first glance that it might be the solution to their problem, and how to use it.

> Therefore any attempt to turn a programming language into English is not going to address the real difficulties of programming

Turning keywords into English is not going to help much. But finding metaphors for programming concepts (for example, "dictionary" instead of "map") that are less mathematical and closer to concepts encountered in everyday life might improve the accessibility of the programming world.

I also think the computer should try to guess what you mean by what you're writing, not give you the finger over a missing semicolon or quote. If there is ambiguity in what you've written, it should ask you additional questions to establish what you want.

> It's like making a bike easier to ride by gluing the rider to the seat - sure it's technically "harder to fall off", but the problem isn't usually the rider falling off the seat - it's the whole bike falling over, taking the rider with it.

I see simple programming languages and systems more as training wheels that protect you from falling over with the bike and never trying again. They introduce you to the concepts of pedalling and steering so you can get some fun out of the activity, and they eventually teach you how to keep your balance by not punishing you too severely for losing it.

> Likewise, the hard part of programming isn't remembering precise and finicky syntax - it's making sure your own thinking is precise and finicky enough that you can accurately express it in such a finicky, precise language.

Right. Even if you know exactly what datatype you want to define, you may have a hard time expressing it properly in C, even if you're a pretty good novice programmer.

That's just an anecdotal example of syntax that can be an obstacle even for smart people. We may have no idea what syntax might be an obstacle for the less mathematically inclined.

> I'm not arguing in favour of long function names (I'm not entirely sure how you got that from my post) -

Sorry. I mentioned PHP and that particular function because PHP is often perceived as a too-easy language that lets people who don't get programming concepts write programs and make a mess. I strongly disagree with that notion. I think PHP's popularity is due to some very good simplifying design decisions (like a single datatype to express various lists, dictionaries and arrays; no need for conversions between strings and numbers; a run-and-forget model without long-lived server-side entities), and its security issues are due to some very bad design decisions (one of them being the one I mentioned).

> I'm arguing against people who think that by changing "print('Hi there');" to "print 'Hi there'" they've somehow solved the problem of "programming being hard to do".

Claiming that such a change solves the problem of "programming being hard to do" is just silly, because making programming easier is a much harder task that involves real research into how people learn to program.

But I think it does improve things slightly. Every unnecessary operation that has nothing to do with your goal and lies between you and that goal is bad, the same as in UI design. All this ( ) ; is strange crap to a graphic designer who just wants his Flash animation to go to a given page when it's clicked.

