
> The main benefit of knowing multiple languages is eavesdropping on people in the street speaking their language, but that's about it.

Hah, that's the least of the benefits IMO. I'm not sure if you have no interest in the following or just forgot them, but these are things I enjoy: literature in the native language; comparing words and idioms and understanding how different languages influence each other, and how different cultures led to the creation of certain idioms; and conversations with people in their native tongue when I travel, along with the stories, adventures, and knowledge that unlocks.

To anyone reading this who only speaks English: while I agree with this person and the study that I don't necessarily feel smarter, learning another language is absolutely worth it for the advantages I stated above. My life is richer because of it.


What do you figure is next for you?


I’m starting a residential trash/recycling hauling cooperative.

No investors, democratic organization, our vision is to give economic power to the working class


Yup, Plex is the way to go for me too. I use plexamp as my client and it's been a wonderful experience. Really love the sonic analysis features, especially the guest DJs.


I disagree, I think a degree of homogenization is good for information heavy websites, like government ones [1][2][3] and sites geared towards documentation. Consistency here is good because it makes things familiar and therefore means people spend less time trying to figure things out/find what they're looking for. Creativity isn't necessarily the point with sites like these. Now if you're marketing a product or showcasing something on the creative side, that's a different situation entirely and in that case I agree with you. The Bootstrap wave 10 years ago was indeed excessive.

[1] https://18f.gsa.gov/

[2] https://www.gov.uk/government/consultations/project-gigabit-...

[3] https://www.healthcare.gov/


Something about "I've been around since 2009" is hilarious. Evokes "I'm 14 and I am very smart" vibes


I recently started using plexamp and it’s been fantastic. I only use Music.app on my machine to rip CDs to a directory on my NAS that plexamp is serving.


> Often these roles are far less stressful

What did you have in mind specifically? I can’t think of anything myself that doesn’t seem just as stressful or more so, with less flexibility.


Product Owner, Product Manager can be loads of fun and very creative at the right company - though both can be high stress in the wrong team/company.

Project Management can be fun if you are organised; Programme Manager if you want it to be a more senior role.

Business Analyst can be great - and can get pretty senior.

In some companies the step out of programming and into architecture can be a very different pace, but all companies will vary. In my current company it seems full on but at the last it seemed a great role.

Strategy is really important and loads of people and companies do it poorly - great opportunities to do it well! It can be a lot of board papers and socialisation of ideas, but the pace will be very different, and while there can be some stress, it won't be constant.

Obviously ymmv - find a good company that you like and things will be a lot easier. Important jobs don't have to be high stress.


I wouldn't recommend PO or PM roles if you're looking for low stress. That's not to say these roles aren't fulfilling or rewarding, but they come with stress. When things are going well, praise and recognition go to the teams. When they aren't, you're the head on a spike.

I'd agree that the stress of Product roles varies with the company. Dealing with clueless, egotistical executives and HiPPOs is never enjoyable.


Organizer for tech conferences [0]. If you operate an event venue as a former engineer the attendees will thank you. You also save on costs because you can self-host most of the tech stack.

The stress turned out to be higher than a 9-5 job though, so think twice :)

[0] https://handmade-seattle.com


I know a person who was a technical customer service manager (for a very technical product) and a general-purpose writer.

She now writes the customer-visible bug-fix notes for a very large, very technical software product (with minor releases on the order of every 4 weeks, and major releases annually). A good bug takes 5-10 minutes to write up in a customer-friendly, legally and security-appropriate way.

A bad bug might take hours of tracking down engineers who did the work, claimed they did the work, mis-tagged the bug entry, improperly closed the bug, improperly left the bug open, improperly merged the bug...

But at the end of the day, she isn't responsible for fixing the bug, just documenting it properly. The workday is essentially 9-5. And there's always another bug.


Technical Writing is an example.


I agree that the article generalizes, basically to the point of condescension.

Were you “there” though? I wasn’t, so I’m genuinely curious whether his recollection reflects reality.

EDIT: they responded to another comment I made and were definitely working at Microsoft during the period.


Started programming in 1981. His recollection is correct. We were agile. Today's "Agile" is a disaster. We are not being condescending; we are mourning what once was an uninterrupted delight.


Could someone of the appropriate age group with experience please confirm for me whether everything this guy is saying was true, generally speaking?

I’m honestly curious, as I’m too young to know myself and have no way of knowing whether the article was written with rose-tinted glasses on.

But wow, if it’s accurate, the point about meetings and concentration sounds amazing; I would love to go back to that.


There's truth in it, but he omits the context for why these changes happened.

In 1990, real software companies could get away with shipping something major less than annually. Companies paced development to events like the annual Comdex. No company was expected to ship production software weekly, let alone daily. Competitive threats arose more slowly because there was less capital. The US software industry could be reasonably sure a Chinese firm wasn't going to suddenly arrive and become a material strategic concern in a few quarters. Seriously, read books describing 1990s-era software dev and see how slowly things were paced before the Internet really hit. Look at the time scales involved. Even the rise of mighty Microsoft, considered the poster child of rapid high-tech growth circa 1992, pales in comparison to the Internet giants.

The world sped up. I hate the meeting culture we have now, but it's important to understand that it is a response to the pace of change rapidly increasing over the past few decades.

The other thing that really happened is the consumerization of software. In 1990, a much smaller slice of people used any computer in a given day. Frequently, those people could be expected to have undergone specific training for your product. Today, the number approaches 100% of adults globally. This means a higher level of fit, finish, and robustness is required now than in 1990. Expectations are far higher.

Third, and I can't believe this needs to be said, but Microsoft was known for especially buggy software during the glory days highlighted in the article. Their OS releases were frequently late. They had to abort/restart the Vista project. Aside from the aesthetic factors of crashy software, it is somewhat obviously bad to have a multi-billion dollar company be unable to predict when (if?) its flagship products will ship, and in what form. Obviously, something had to change.

There's a lot to learn from the old days, but if one doesn't understand why things changed, one is likely to take the wrong lessons from history.


> The other thing that really happened is the consumerization of software.

Yes. Our customers were, for the most part, nerds like us in the 90's. We knew what features they wanted because we wanted them.


My experience at Microsoft in the late 90s and early 00s jibes with his descriptions, mostly. But the essay went out of its way to be cynical and reductive.

He's not making things up, but I don't think his takeaways are worth, well, taking any further.


Same period of employment for me too and I agree that it mostly matches my experience. Even though I find the article an exaggerated polemic, there are a few things that really resonate with me:

1. Open offices are stupid. For proof just see how many people love working from home where they have far fewer unwanted distractions. Yes there are other reasons to like remote work too, but I bet lack of distracting surroundings is up there.

2. Not having dedicated QA is mega stupid for something like Windows. You can clearly see that in Windows 10.

Other than that, yes, I am a bit more accepting of some of the newer stuff, but I get where he's coming from. Y'all need to get off my lawn.


If open offices were kept to the same atmosphere as libraries, I don't think there would be such an issue.

But it's truly bizarre that they're not, and that companies think it's OK to put people in an environment that is the complete opposite of the environment it should be.


So culturally which style of working do you prefer yourself? Or if it’s not binary, are there parts of how software was written yesterday that you’d like to bring into organizations today, whether mentioned in the article or not?


There's room for both extreme concentration and extreme communication, and most teams need some of each.

I actually snorted at the essay's idea that communication and concentration can't mix. I'm not sure what he thinks email and detailed specs are if not communication. I think the key is that they're batch-mode communication.

I can see value in several hours a day of concentration unbroken by communication, but insisting on a full week straight of unbroken concentration is impractical and possibly counterproductive.


It's not that there was NO communication. He said that he'd go to QA and fix anything they found. It's just that it was more informal: you did it when it was time to do it, rather than sitting through required daily meetings even when there's nothing really to say, or nothing that couldn't be handled in a fifteen-second or one-minute conversation. And you could do it on your own time, so you didn't have to stop right in the middle of some super productive stretch when you're on a roll. Because that's how it works - some coding time is more productive than other time. You're just into it and everything is just flowing, as he said.


I much prefer the old days. At the risk of the cliche (and I realize all the negative connotations as well), we were cowboys, but I liked it that way.

Of course, that may be what drew me to coding then. Perhaps the new culture attracts a new type of coder that likes process much more.


It is accurate in my experience.

We had "team (tech) leads" in the old days that took the place of marketing more or less — setting the general course for the software as a whole, the architecture as a whole.

But each engineer was given their own sandbox, their own piece of the framework/app to own. "Go implement an image cache."

Image cache engineer got to style their code however they pleased. They could tear it up and rewrite it when there was excessive technical debt. They knew it inside and out since they wrote it.

Often if it was code they inherited they would end up rewriting it eventually regardless — maybe piecemeal.

I don't remember ever having code reviews. Sanity check? Sure, when there was some tricky bit of hacking required.

Yes, QA. Unit tests seem to make management happy these days. "I want to see 85% coverage!" as though that magically means we have fewer bugs.


I'm a few years younger than the author, based on when he started working, but I can relate to some of the attitudes of "the old days".

I worked at Microsoft and do remember when everyone had an office and the goal of managers was to keep you from having too many meetings. Nowadays, I also have probably 1-2 hours of recurring daily meetings. Absolutely everything needs to be tracked (daily standup, sprints, 1:1 with manager, skip-level meetings, team meetings, all hands, peer feedback), so you end up with meetings just to make sure everyone is on track. It's ridiculous.

I think it's all related to the fact that the pace of software development was just so much slower back then. I remember you could really bullshit a lot more 10-20 years ago. You could hang out with a coworker for an hour or play games with other coworkers and it was acceptable use of time. Nowadays, with daily standup, I feel guilty for wasting too much time because I'd have to give an update on where I am with work. Before, you might sync once a week, at most, so you had some breathing room. The industry was just more "fun" to be in.

Product releases were on yearly, or multi-year, schedules as well. Now everything is updated every week. I remember even ten years ago, we were releasing our software every 2 or so years, then they worked towards once a year, and then 6 month updates, and then monthly, and so on.


Nope. It isn't. I predate the author by a decade+ (first job in 1996), and he's bloviating.

Ask me about the time we had an informal meeting to decide on the premeeting strategy where we'd define the strategy for the actual meeting which would then result in a presentation to management so a decision could be made :)

It's always been a question of the culture you're in, and what they're trying to do. It still is. More collaborators => more meetings. Less ability to tolerate bugs => more formal development processes.

There's only a limited set of things where you can bang away in your dev cave for a few months and emerge with something people actually want. (It's absolutely more fun than the more formal/meeting-heavy approaches, but you're limited in scope and quality)

(With the caveat that open offices are and always have been an extremely stupid idea, and those were indeed less common)


It’s valid for his interpretation of his experiences.

Companies varied tremendously. I don’t share many of his opinions and experiences, and I’ve been coding his entire career.

I LOVE pair programming, for example, in the rare event that I can convince someone to do it. And “waterfall” did happen, but it was mostly Fortune 500 and government contracts, and those of us not in that situation laughed at it. And I see benefits of test driven development, even if I write most of my tests after the fact.

The number of meetings, in my experience, was more a function of the size of the company and the team, than the era. I suspect some of the change he blames on millennials is just Microsoft getting more bureaucratic and bloated over the years.

One thing that I haven’t seen anyone comment on is the huge shift in how software is delivered. If you take a year or two to plan for a physical release of a product, there is a lot more pressure to get things perfect the first time. If you can patch something today and go live, it’s not the same as having to send physical updates to a million customers. QA was definitely important, and, I would say, more important than today. Developers as testers is great, but by no means enough. I think testing has been shifted to users somewhat.

I do think that there was more of an understanding that uninterrupted time was very important. IBM did a lot of studies on what produced productive programmers and uninterrupted time was a huge factor. But software development has changed hugely over the years. I’ve been building a project that is similar to something I built only 15 years ago, and it’s amazing to me how much easier it is. I’m spending way more time gluing components together, with the help of stack overflow, and less time on hard problems. This may have changed what the optimal strategy is. I certainly try hard to be a good programmer in today’s environment, and not waste time complaining that things were better when we carved software into stone tablets.

So, I’m not going to say he’s wrong, but a lot of that was not my experience.


I'm from the same age group and, while I did not work at Microsoft back then, everything he says matches my experience very closely (maybe just exaggerated a bit).


The author isn't wrong, but there are enough straw men in there that I'd recommend you refrain from smoking while reading it. What was written was most likely true for the group the author worked in at Microsoft at the time. It most certainly is not a report on the software industry in general at that time. Some of it is true; some of it might have been true at some companies. But do note that it's generalizations that don't give much to argue with; it's mostly an old man shaking his fist at clouds. Or a cranky old anti-social asshole ranting about kids today, depending on the read.

But this gem stands out:

"I would fix it, he’d verify, and we’d keep no records."

Yeah, you don't get to work on my team if you're going to sweep bugs under the rug.


First paid programming was 1981. It is all true. And easy to understand: http://www.paulgraham.com/makersschedule.html


This is all also described in the book "Peopleware: Productive Projects and Teams". Especially the importance of individual flowtime. I tried to explain this once to an agile coach and suggested he read that book. We then continued to stuff our faces with candy while he started a "flow" game mostly focusing on imaginary situations demonstrating that collaborating as much as possible increased the flow of the team.


Confirmed.


Not the same thing, but I believe the parent commenter because I’ve seen multiple studies done where it was found that breathing exercises seemed to help with reflux: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3807765/


Nice, I'll give this a read. Thanks

