In some ways, software development practices have degraded since that era. It largely has to do with the need for speed, which comes at the expense of careful consideration, quality, integrity, and the formal standards that support them.
In fact, I believe it pretty much killed the profession of software architect. Many teams had it as a dedicated role, and this indeed would be a person documenting/designing systems using UML or otherwise. And they'd know the classics, like memorizing all design patterns. Finally, they'd use formalized architectural decision-making methodologies to justify tech choices.
Nobody seems to do anything like that anymore. Everybody is half-assing design or skipping it entirely. Solutions are reinvented and tech choices made on a whim by the loudest person, who won't see the consequences anyway. Because we've told ourselves that shipping garbage in short cycles is the one and only way to do things.
Yeah, hard disagree on all that. As someone who lived through that era professionally, and who has had an "architect" title in the past, I was actually resistant to ever stepping into that role, because I had so many bad experiences with the Formal Architecture Methodology crew, who would produce the most absurd and out-of-touch designs. There's a reason that "architecture astronaut" is a term from that era.
Lots of people make bad designs today, sure, but it's not for a lack of formal methodologies, because average design quality was way worse back in the design patterns 'n' UML days.
Seconded. I once went to an IBM seminar with Grady Booch and have seldom heard such an inane string of platitudes. Architecture Astronauts are the worst, and if there is one thing Agile can legitimately take credit for, it's ridding us of that plague.
I agree with you both, except that architects are alive and well. I worked with an ex-IBM architect a couple of years ago. Lovely guy, but for someone designing software systems, I found it remarkable that he didn't know how to write code.
I'd say it's a company red flag if they still keep these people around. About four years ago I was at a company that still had someone like that. Useless and highfalutin, he obstructed many projects, thanks to an archaic director who thought he was still necessary.
I got out after butting heads with him constantly, and I don't think said company ever shipped anything meaningful during his entire tenure there.
I think you get these people in body shops - companies that basically rent out their skilled staff to other companies. The Architect sounds super important and is the most expensive resource, so obviously they sell it super hard.
When I worked there I was a product guy, and it took me an embarrassingly long time to understand why I didn't fit in... but yeah, I didn't last long.
All of these experiences resonate with me. The pendulum has swung hard to the continuous deployment model, and just as at the other end of its long arc, not everything about it is good.
I don't think it was software architecture, UML, or design patterns that were bad. I think the 'Open-Closed Principle' is one of the worst ideas ever to gain popular acceptance.
For anyone who didn't live through that time, the Open-Closed Principle states that software should be open for extension, but closed for modification.
However, you could also rephrase that principle to be: 'you should always prematurely abstract your designs'.
I think if abstraction was viewed as a negative to be avoided unless necessary, software architecture would have been far better off.
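To make that concrete, here's a minimal sketch (hypothetical names, not from any real codebase) of what the "always prematurely abstract" reading tends to produce: an extension point guarding exactly one implementation, versus just writing the one rule that exists today.

    from abc import ABC, abstractmethod

    # The OCP-style version: "closed for modification, open for extension" --
    # even though only one implementation exists, or ever will.
    class DiscountPolicy(ABC):
        @abstractmethod
        def discount(self, total: float) -> float: ...

    class StandardDiscount(DiscountPolicy):  # the only subclass, ever
        def discount(self, total: float) -> float:
            return total * 0.95

    # The direct version: modify it if and when the rule actually changes.
    def discounted(total: float) -> float:
        return total * 0.95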
To be fair, premature abstraction is a lot of fun for those that do it. It's just those that follow who aren't so keen.
That is an important perspective. I've always struggled to understand why (esp. in the Java enterprise world) things were so complex. It took me a while to see through it, and now I create abstractions when I need them, not just in case. That's why I don't like things like Clean Architecture.
E.g. I don't create an abstraction in case I someday need to switch the database from SQL to NoSQL, but when I need the abstraction right now for an alternative implementation (e.g. mocks for testing).
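A minimal sketch of that approach, with hypothetical names: no interface up front, and the in-memory fake only appears once the tests actually need a second implementation.

    # The real implementation, written directly against the database.
    class SqlUserStore:
        def __init__(self, conn):  # e.g. a sqlite3 connection
            self.conn = conn

        def get_name(self, user_id):
            row = self.conn.execute(
                "SELECT name FROM users WHERE id = ?", (user_id,)
            ).fetchone()
            return row[0] if row else None

    # Added later, for tests -- duck typing means no shared base class is
    # needed; the "abstraction" is just the implicit get_name() contract.
    class FakeUserStore:
        def __init__(self, users):
            self.users = users

        def get_name(self, user_id):
            return self.users.get(user_id)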
This is very on point. Premature abstraction has always struck me as way more evil than anything premature optimization can create.
Every class gets an interface abstraction, or worse, an inheritance hierarchy, that adds friction to change, even though there's only one implementation.
When something similar comes along, it's pressed into that shape, because the abstraction is already there.
It's like that meme video where it's hard to watch all the round and triangular pieces being forced into the square hole.
And the problem is that while software is more flexible than traditional architecture (hence "soft"), the complexity limit is also soft, or virtually non-existent. So in software it's worth more to be simple and flexible than to have a long, detailed plan.
I’d argue software architecture has never been more healthy.
DDD Europe by all accounts was a resounding success. We’ve replaced older architecture activities like data-modeling-first with event modeling and Domain-Driven Design.
We’ve learned to embrace monoliths when appropriate and reduce complexity with bounded contexts.
We have phenomenal testing capabilities that didn’t exist 20 years ago.
We have a myriad of data storage tools.
We understand front-end engineering from an information architecture perspective.
We can design detailed architectures, solution specifics included, very quickly in tools like LucidChart.
I’ve been at this for 40 years and I’ve never felt better about being a software architect.
Could we bring back the UI people from that era, though? Trying to standardize good software is insane, but having a consistent UI with standard elements across all programs, usable with both mouse and keyboard, was kind of nice.
> And they'd know the classics, like memorizing all design patterns.
That sentence is bringing back PTSD-like flashbacks.
I distinctly remember how broken everyone thought the approach you describe was, even when it was the trendy thing.
I'm not sure how universal it ever was, but it was at one point a trendy thing that everyone thought they should be doing... and that so many people subjected to it found disastrous. [The phrase "architecture astronaut" was a common epithet, and not a friendly one.]
That above paragraph [without the brackets], ironically, could certainly be said of "agile" more recently too. I don't know how universal it ever was, but it was at one point a trendy thing that everyone thought they should be doing... and that so many people subjected to it found disastrous.
But yeah, I'm pretty convinced going back to appointing some almighty deity architecture astronaut who isn't responsible for or involved in any implementation (let alone operations!), who hands down plans from on high after "memorizing all design patterns" and drafting some diagrams, never sullied by "contact with the enemy"... thank you, but no thank you.
----
Instead of just complaining about that, though, what I'll say in addition is -- I think the real problem is that engineers aren't given the time to carefully consider top-level designs. It's a basic business/resource issue -- until engineers have more breathing room to talk to each other, research, consider, and come to decisions in an unhurried way, the top-level design stuff will remain chaotic. It's not an issue of appointing an ivory-tower "architect", nor something solved by doing so.
Although sure, there should be senior and even "staff" or "principal" people with more authority/responsibility for higher-level designs.
Everyone should be responsible for design at the level they are working. Everyone needs enough time to feel like they are doing it well, instead of running on a sweatshop code production treadmill.
To be fair to the design patterns people, a lot of the patterns got baked into pieces of infrastructure like web servers and frameworks, while others are part of frequently included libraries.
If I went through and inspected any Node, Rails, Django, etc. app, I would find many Gang of Four design patterns, but very few of them would be in the project-specific code. They got implemented well, and now programmers can build new things that would have taken too long to do before.
And that was the intent of the Gang of Four book: not to teach you patterns to copy mindlessly, but to give examples of how to identify and extract useful patterns from the software you've already written and describe them to others. Since that is a lot more difficult than memorization, very few took up that work.
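One concrete case of the baked-in point: Python's iteration protocol is the GoF Iterator pattern absorbed into the language itself, so nobody even thinks of it as "a pattern" anymore.

    # Any object with __iter__/__next__ is an Iterator in the GoF sense.
    class Countdown:
        def __init__(self, n):
            self.n = n

        def __iter__(self):
            return self

        def __next__(self):
            if self.n <= 0:
                raise StopIteration
            self.n -= 1
            return self.n + 1

    print(list(Countdown(3)))  # [3, 2, 1] -- every for-loop relies on this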
This. As a hiring manager, I noticed in interviews that many engineers, when asked about patterns, don't even realize how much they rely on them in standard libraries, and rarely understand that patterns are not a fixed collection of templates but rather a way of working with code. If you identify a common design and give it a name, you can save a lot of time explaining solutions based on it.
So, here's something I hate on this site. Arguments that are made in the fashion "People who disagree with me must not understand X". When in fact you're often talking to some of the smartest people in programming on here (though, maybe not in other things.)
It's possible to understand design patterns and work in a language that doesn't require them as much.
Here's a talk about it from 2009 with regard to Python.
Some design patterns work around a bad language, but most are about complex problems. They are a way of describing abstraction, and that is what languages are about.
Even the GoF say that what is a language feature in C++ or Java, like classes and interfaces, can be a design pattern in a language like C.
I think in some ways AspectJ can make the observer design pattern obsolete in Java. I have arguments against doing it the aspect way, but it was a revelation nonetheless.
Back in the day (before open source became so prevalent), lots of software was designed from scratch with minimal use of outside components and frameworks.
Today, OSS frameworks (especially web frameworks) and libraries, PaaS, and the cloud provide you with already-baked-in design patterns. So there is less need for a SW architect and formal design.
Also, most of the Gang of Four patterns just address deficiencies of OOP and older PLs, so if you're using a modern PL with closures etc., there is no need for them.
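As a small illustration of the closures point (names are hypothetical): the GoF Strategy pattern, classically an interface plus a family of classes, collapses to just passing a function.

    from dataclasses import dataclass

    @dataclass
    class Item:
        price: float

    def total(items, pricing):
        # "pricing" plays the role a Strategy object would play in classic OOP.
        return sum(pricing(item) for item in items)

    cart = [Item(10.0), Item(4.0)]
    print(total(cart, lambda i: i.price))        # regular pricing: 14.0
    print(total(cart, lambda i: i.price * 0.9))  # discount "strategy": 12.6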
EDIT:
Nowadays a vast majority of design patterns deal with the complexities of microservices architecture, so if your product is a monolith there is no need for them.
IMO the most useful design patterns are the ones dealing with error handling, reliability, and the essential complexity coming from the real world and human actors, not the self-inflicted accidental complexity coming from bad design decisions. A good SW architect should be able to help in avoiding that.
I tend to agree about the time thing, namely that people aren't given enough of it to sit down and take their time with design decisions. If you squint you can kind of see that "software architects" are a business's way of creating a "solution" to this problem, but with lackluster results.
That said, I've been at places where management did a pretty good job of making sure there was enough time to do this kind of work, with middling results. People have to enjoy the work or feel invested in a way that makes them care about "3 years from now", and sadly in a lot of places there's a lot of "whatever, it'll be fine for as long as I work here."
It seemed to me that in, let's say, 50% of cases, having the extra time didn't really matter. Assumptions about what was being built were proven incorrect not long after a design was laid down and work had begun, or particular engineering types would use their time to create increasingly complicated designs rather than doing the hard work of distilling the problem, sometimes including $PET_TECHNOLOGY as part of their solution.
The "let's design something truly great" really has to be in a team's DNA in my experience or you wind up with a good design that isn't followed, one person who does all the work, a gold plated design, etc.
I'm not sure what to make of it as far as general narrative.
If people are creating stuff they don't really care about, of course it won't be very good? And there's a lot of money spent paying people to build things that honestly nobody would reasonably care much about?
One way or another, the pining for a software engineering world where we all collaborate on creating things that are maintainable products with high-quality user-facing experiences, and have the time to do that... well, that's not the one we've got.
That’s because software architecture is something any senior employee should be able to do, and it’s not as important as people thought.
Like many abandoned corporate practices, I think it was abandoned for good reason. It may have made sense under different circumstances, like when you had a large army of cheap offshore devs who could not be trusted to architect a maintainable application.
If I had some ivory tower “architect” trying to interfere with my work I’d be so pissed. Anybody I’ve seen with that title, that wasn’t doing 1P cloud consulting where the title means something different, usually had no clue what they were doing and had been given the title as a soft retirement.
As much as I do think that software architecture does matter (though it should probably be somebody with a staff or principal hat, not a specific job title) I recently took a job at a place that really does like architect titles. Mine's even "lead architect".
It is pretty funny when somebody runs into me and realizes for the first time that I have the job because I build stuff and write code, not because I'm good at LucidChart. I'm planning things out beyond immediate needs, but not because I'm looking for job security--it's because I've built the thing we're doing before and would like to not make the same mistakes I've made in the past. I'm over here demanding adequate standards of code and low- to mid-level design, and the "wait he's serious?" of it is sometimes honestly pretty fun to run up against.
Yeah, I guess I would in retrospect refine my opinion to “software architecture is iterable and not completely separable from implementation” or that “architecture (as imagined by distinct architects who are shielded from implementation concerns) is not important”.
Principal engineers and such who are still involved in operations, implementation, and more tactical approaches are who I also think are “supposed” to be doing architecture, but even then, more as guides and first among equals than as people who hand down decisions from on-high.
The fundamental issue I have with a separate architect position is that it disempowers teams and makes them beholden to decisions they may not agree with (and whose makers may well not understand the problem to the extent the team does). It sounds like you're doing the better thing of running up and down the layers of abstraction, so your contributions empower people rather than disempower them.
> Yeah, I guess I would in retrospect refine my opinion to “software architecture is iterable and not completely separable from implementation” or that “architecture (as imagined by distinct architects who are shielded from implementation concerns) is not important”.
This, I'd agree with. You have to be at the coalface to know what the hell is going on. At the same time, you have to be cognizant of business needs and why things are the way they are, which is to me a fair approximation of "the job of a principal engineer."
(My other hat here is "head of API governance" and that's largely a business-flavored analysis of APIs being brought onto our company-spanning platform. I couldn't escape having both in my head if I tried.)
> It sounds like you’re doing the better thing of running up and down the layers of abstraction so your contributions empower people rather than disempower them
Ideally, yes. In reality, I work for The Phone Company, and The Phone Company hires a lot, and I mean a lot, of vendor devs. I am doing their thinking for them a lot of the time; the swerve is that I can and do write code (I have released moderately popular open-source libraries on their framework of choice, for example), so the usual development practices of "sure, let's make a dozen packages for marginal functionality" don't fly.
I am disempowering them, because ultimately, we will eventually be cycling out our vendors and I will be the one who has to own their output. So that output has to be something I can live with. But this place is Processes Georg and should absolutely not be counted.
(I like the job. I will enjoy when I eventually go back to a shop where the developers have a reason to feel ownership over the work.)
German speaker here. Look at German IT to get an idea of what would have happened to SWE if those bureaucratic methods from the 2000s had been kept as the backbone of all SWE endeavors: a horrible, expensive, non-working mess with barely any progress.
I think what many people, esp. from outside the SWE world, don't get: software engineering is a deeply social kind of work. There are dozens of solutions to any given problem; you have to agree on one that works for all peers. That's the job. Drawing funny diagrams achieves nothing unless they serve that communication.
I'm also interested in the German software engineering culture. And a heavy plus to this being a social problem more than an "engineers just need more time" problem. I tried to articulate this in another comment of mine but mostly beat around the bush; this is more directly what I was attempting to say.
Well, my theory is that the core reason is that Germany never developed a thriving startup IT economy [0] that was ever relevant to GDP, especially not in comparison with the industrial sectors (cars, steel, chemistry), and so IT got completely ignored by politicians.
That resulted in no one challenging the biggest gatekeepers, Telekom and SAP, so they lobbied for and enforced whatever they wanted [1].
If you study CS at a German university you can easily get an MA without being able to write software at all (I personally know several such people). German universities teach what is easy to teach top-down and test for: the textbook stuff that came out of the whole Java EE/OOP/SOAP/UML sector. You barely get practical coding lessons and can avoid them completely if you want. The academic sector never realized how crappy German software products are and never bothered looking at what Big Tech is doing. With the current data protection and upcoming AI regulations, as a university you'd have a hard time collecting enough training data, because your law department would step in citing the current legal uncertainty (I've heard stories from friends).
Then we have this crazy little island Berlin, which up until maybe 10 years ago was mainly driven by the infamous Rocket Internet "startup incubator", led by a couple of MBA sociopath billionaires trying to copycat everything from SV and then sell it back to the SV company whenever it wanted to start conquering Europe. Thing is, they never really developed enough SWE excellence to make the copycats successful in Germany or anywhere in Europe (with some exceptions).
Third example? Here you go: today I learned that the gov't decided 20 years ago that it wanted to provide all the usual governmental services online. 20 years later, they have (allegedly) poured 3.5Bn EUR into an unholy setup of consulting businesses, incompetent civil servants, and a panel of software-architect astronauts who could never really agree on anything.
All their deliverables are click-dummies, gazillions of PDFs full of SOAP/WSDL/OMG/UML thingies, and prototype projects rolled out in "experimental" cities. So if you happen to live in Bremen you might be able to register your dog online, but not in Berlin; in Berlin, you might instead be able to get a license plate for your car online. Pretty much all governmental projects (Covid vaccination registration, special governmental aid for students because of high inflation, etc.) broke down because their systems are incapable of handling more than maybe 10k visitors (my theory: it always breaks down when the biggest single Oracle DB host they could buy goes down).
Germany has some decent software engineers, especially if they're self-trained and not brainwashed by one of the universities or big corps. But the environment manages to regularly piss them off and make them emigrate somewhere else.
Oof, lots of text. Hope at least someone enjoys reading it.
---
[0] This is because if you start a company in Germany you're faced with horrible bureaucracy wrt taxes, laws, politics, governmental authorities, etc. For example, you're forced to pay for membership in a funny non-IT institution called the "Industrie- und Handelskammer (IHK)", which essentially consists of a crowd of old men who are officially supposed to lobby for you and create a networking environment, but if you ask them something like "hey, can you tell me how many companies are having problem XYZ right now?" they will tell you that they don't have any numbers and no means to collect them. In 2023 they still send out a meaningless printed paper magazine. So not helpful at all, but they take a significant share of your gross turnover, mainly to pay for their pensions.
Additionally, with all the regulations the governments have set up over the years, they're now facing a significant shortfall of civil servants because Germany is getting older and older. As a result they don't have enough people to enforce or check regulations in time, and they never managed to develop any IT-based systems. This is a big problem with the influx of refugees in recent years, and it affects many other concerns as well.
Finally, there's this cultural difference to, e.g., the US: average Germans are not business-savvy at all. If you tell the average German mom that you want to start a business, she will tell you that you're a dreamer and should get a proper job. Germans generally tend to think that companies are something god-given.
[1] They are still the go-to businesses if the gov't quickly needs something, like the Covid tracing app, which German taxpayers AFAIR ended up paying 120M EUR for (lol).
I just read up on the IHK, and it sounds like the business equivalent of the dues an employee pays to be a member of a union. The fee is 47 EUR + 0.14% of gross income, which is not exactly "significant" compared to the other fees, taxes, etc.
As for the old people running the IHK: from what I read, the membership is one-company-one-vote, so what stops people from running for election?
As for the influx of refugees, that is a distinct advantage for Germany: an increased availability of workers, including many educated Syrians and others.
So it sounds like there's a problem with entrepreneurship in Germany outside the engineering / petro-chemical businesses. There's also a problem with your political choices due to the usual issues prevalent in every Western country. An aging population, the effects of the "financial industry" (a misnomer if ever there was one), the effects of climate change, etc.
We stopped doing that stuff because it was useless. We eliminated those practices and that profession because they were actively harmful to making good software. All those decision-making methodologies consistently led to worse technology choices than one dude actually trying to write some code with the thing for half an hour.
UML is mostly useless, but thinking ahead, even a bit, has value. I've seen stuff shipped in a sprint that was not used by any actual end users, only to be "redone" in the next sprint.
My experience is that shipping stuff that doesn't get used by any actual end users is more often caused by thinking ahead too much than by thinking ahead too little.
I've actually seen it both ways: whole features (or even products) that were simply too early. But at the implementation level, I've seen someone pick the wrong thing (library, database, whatever) "because it was simple", only to have it thrown out before getting any real usage.
I agree that outcome is worse. However, both extremes are bad. The optimal approach would be to do a little more work upfront, pick a "better" thing (where "better" means looking further out than the next sprint or three!), and use that.
The problem is that the expectation was to document the software design in excruciating detail (e.g. class diagrams with all fields and methods, etc.) before any coding. Taken to the extreme, you were meant to do all the coding in your head and in a Word document before doing it again in your IDE.
This wastes a huge amount of time, and the documents usually become obsolete as soon as you try to actually run some code.
This is the very issue the Agile manifesto identifies and proposes a solution to when it says "working software over comprehensive documentation". 'Comprehensive' is the key word: they don't mean NO documentation, but effectively just what's needed to plan and help people understand the code.
I don't think this is any more true than it ever was. If anything, the rise of open-source libraries, GitHub, and friends has made it easier than ever to reuse software and avoid NIH. This has led to a different set of problems, but relative to when I was writing C in the 90s, I'd say software reuse and design are far advanced from those days.
I do, however, agree that design is sorely missing from the current software project management zeitgeist, which means it's done in a more ad hoc way. People are taking frameworks like Scrum far too literally, and I agree that in some cases there is little vision or overall architecture because the framework doesn't include these things. There should be design and review activities both before and after coding, but these are largely neglected in most project management frameworks. Scrum, for example, doesn't even include backlog management, which in my work is a critical activity.
While that doesn't mean design, architecture, and scope management are not done, they are certainly less visible than they might be.
All of that said, I'm a big believer that the ultimate expression of any design is code. The design artefacts that come before code is written are scaffolding: they age quickly and are soon useless. So there's not much need to keep them around, and they are mostly of interest as preliminary directions.
> In fact, I believe it pretty much killed the profession of software architect.
And thank goodness. I am all in favor of more consideration, more quality, more integrity. I like good architecture. But I think it's a giant mistake to create a software architect role. At every company where I saw that in practice, the average architect was a bloviator who did no actual work but was very excited to tell everybody else how to work.
At the code level, this resulted in a lot of messes. The edicts and white papers sounded good in theory to the kind of people who decided the bonuses of the architects. But frequently they were unworkable in practice, causing a lot of code that conformed to the theory but was a pain to actually deal with.
I agree with you that a lot of things are half-assed and rushed, of course. But we are never going back to a world of 18-36 month release cycles where people could stroke their chins for a quarter or two before building anything. Instead, we need to move in the direction of continually investing in quality, so that the design of systems improves over time. Something that I think is actually easier, in that waterfall design practices locked in important decisions early on, when people knew the least.
We haven't told ourselves that shipping garbage in short cycles is the only way to do things. The market has pretty much determined that companies that over invest in formal software design activities lose out competitively in the long run, and the survivors are the ones with the right balance. You're welcome to prove the market wrong.
> The market has pretty much determined that companies that over invest in formal software design activities lose out competitively in the long run
The same market that gave us AT&T, Comcast, DRMed IoT Juice presses, FTX, Enron, and so on? Not sure it’s wise to conclude that the market produces optimal solutions, or even incrementally better solutions than yesterday. Especially in domains with extremely long feedback cycles, like organizational trends.
I DO think there is some hocus pocus market zeitgeist that does something resembling gradient descent, but it’s acting over very long cycles and there’s a ton of room for cargo culting, grifting, and opportunistic grabs along the way. Heck, marketing is an entire field dedicated to affecting “rational actors” in the marketplace.
AT&T, Comcast are great examples of companies that have tried to expand into online video distribution, and completely lost out to companies with better software management practices. Both of these companies had a huge leg up 10 years ago, in terms of customer base and equipment (set top boxes already present in many US households). Neither of them is more than a small blip now on Netflix's radar now, who have been able to leapfrog into installing their software on smart TVs. The margins they have and their growth rates are much, much smaller than software centric competitors.
So much this. Coding to marketing dictums on a cycle of days/weeks is NOTHING like engineering solutions where subtle design issues can make the difference between life and death (or IRL super-good thing A and IRL super-bad thing B).
The pendulum swung, and now we bias towards action rather than design. But to be honest, the "software architect" era wasn't great either, with people in the role sometimes not qualified, spending months on an architecture behind the curtain, releasing some over-verbose spec document that doesn't match the product needs, cargo-culting the latest fad from Martin Fowler.
The problem you're describing exists in all engineering disciplines. You don't hire a firm to do feasibility studies, develop a design plan, run a materials bid, and subcontract clearing and construction for your mailbox installation, do you? Most people hire a dude hanging out at Home Depot and pay him a flat fee.
The risk and complexity dictate how much architecture and engineering are involved, and as with all things, businesses will try to get by with the bare minimum. The automotive, chemical, and other industrial sectors invest a lot in software architecture and enterprise architecture because the risk is so high. But even a medium-size e-commerce platform has very low risk and complexity.
While I feel as if I largely understand what you're pointing out, I want to offer my own speculation as a SWE who is very guilty of thinking in terms of sequence diagrams (and also more formal UML). Sometimes UML is so fucking bogged down with implementation details that I'm just over here going "I don't give a shit about the implementation details, I just want to know the abstract concepts and logical flows, and focus on the necessary system interactions" -- because I can (and should, at this stage) worry about implementation details later.
I prefer to reason about a system and its component relationships using, say, a single word as representation, instead of glaring at one or many inheritance arrows, interface details, and the other "field" information usually conveyed in a UML node.
I do not think the lack of formalism is a degradation -- we work with abstractions after all, so it makes sense to further leverage that fact and express things simply, at a high level, and straightforwardly -- you can pack a lot into a single word.
Of course, as with any tool, there is a time and a place for its application. But I don't think it is fair to call this a degradation.
I've noticed the "over-detailed" problem a lot at my company, which has dedicated "software architect" roles. IMO part of the issue is that most of them do not write any code anymore, but 'architecture' alone just doesn't have a full-time job's worth of stuff to do (or they're bad at finding it). The 'solution' they end up on is treating UML like code and specifying exactly how they would have written the whole thing -- which is a huge waste of time when they could have just been writing real code.
Software architects, much like real architects, often end up with absolutely beautiful elegant designs which are unfit for purpose and have leaky roofs.
And other architects make buildings/software that get done on time, because unlike with babies, adding people can make things go faster (it is always sublinear, but good architecture can get close to linear, while bad architecture can trend negative).
> Everybody is half-assing design or skipping it entirely. Solutions are reinvented and tech choices made on a whim by the loudest person, who won't see the consequences anyway. Because we've told ourselves that shipping garbage in short cycles is the one and only way to do things.
Hardware and bandwidth are so cheap now that you don't have to care about design so much. Make a service, JSON in, JSON out. Some kind of loose versioning but no formal schema, and fix it up as you go.
Sequence diagrams and state machines should probably be taught much better in CS courses. You generally see state machines in a theoretical course and things like sequence diagrams in an object-oriented course.
What we should probably do, in the advanced data structures programming course that everyone has to take, is have students create a model of an elevator and diagram the behavior using sequence diagrams. This would be achieved by using an associative array, AKA a map [1], that represents where the elevator is and what it has to do next (current state and next state based on input).
If you program this correctly it even gets around needing unit tests, because you have diagrammed it, all parts of the system are known, and they can simply be walked through, which is a sufficient test. An example of a library that implements this is at https://pypi.org/project/python-statemachine/
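A minimal sketch of the map-based approach (the states and inputs here are simplified assumptions, not a full elevator spec):

    # (current state, input) -> next state: exactly the map described above.
    TRANSITIONS = {
        ("idle", "call_up"): "moving_up",
        ("idle", "call_down"): "moving_down",
        ("moving_up", "arrive"): "doors_open",
        ("moving_down", "arrive"): "doors_open",
        ("doors_open", "timeout"): "idle",
    }

    def step(state, event):
        """Return the next state; stay put on an event the diagram omits."""
        return TRANSITIONS.get((state, event), state)

    # Walking every entry in TRANSITIONS doubles as the exhaustive check
    # the parent comment has in mind.
    assert step("idle", "call_up") == "moving_up"
    assert step("moving_up", "arrive") == "doors_open"
    assert step("doors_open", "timeout") == "idle"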
Overdesign is one of the worst things to do. Trying to make something fit perfectly into some obscure pattern with horrible class hierarchies.
Look at the linux kernel sources, it does not look "beautiful" to some architects but the actual ideas and patterns are simple, anyone can jump into it if they know C.
No, that was the era of the worst, shittiest legacy software to date. Computers were just fast enough to handle layers and layers of abstraction, and oh boy, the industry loved to pile them on; UML was just another way to pile up the rot.
- There's very high-level engineering occurring ('computer science'). I always assumed this would be at places with 'web-scale' problems, but I've been seeing amazing work in local (not-web-scale) product companies here in Australia.
- On the other hand, I think a lot of other development work is not much above building flat-pack furniture. I guess this is where lo/no-code solutions will thrive
The important thing is to recognise which is being done, and which practices apply in each case.
In my short experience this is mostly due to OOP falling out of style, being replaced by a more hybrid approach that combines functional, procedural, and OO programming. That, together with newer languages providing alternatives to inheritance (traits/concepts) and the like, means you rarely find yourself needing much more than the visitor and factory patterns.
I disagree. Software back then failed just as often; there was just less of it. Now there's lots of working software; most software sucks and lots of it fails. But the people who pay well and enable their teams usually get good software at a reasonable price reflecting the complexity.
The project management triangle never stopped existing.
Apparently, you probably haven't started a real business or created a truly successful product. This is what people believe when they are just a cog in the machine and think they can spend endless time designing or planning for things that are out of touch with reality.
I can’t remember where I read this (Martin Fowler maybe?), but I agree that “box and line” diagrams should be enough for most design cases.
I was eager to use UML when it came out, but I agree with the article: the only thing I kept was sequence diagrams. Most of the rest was just a complicated way to represent stuff that is best represented in code or plain documentation.
I’m also not afraid to use stuff that’s out of fashion. I use simplified ER diagrams, and even flow charts on occasion.
But the tool I use most often in design is SQL DDL…
I have also used Mermaid (using a modified Docsy theme in Hugo) and it is also great. Using Typora to edit Mermaid is also excellent.
I like use case diagrams as well. Thing is, these should be short-lived, used to make a specific illustration at a specific time -- not some holy grail of documentation that people take as god-given truth for the whole life of the system.
Yeah, I posted elsewhere in this thread that I'm a big fan of Jack Reeves' essays on "code as design"; namely, that the code is what's important, and the design is just scaffolding.
"The following essays by Jack W. Reeves offer three perspectives on a single theme, namely that programming is fundamentally a design activity and that the only final and true representation of "the design" is the source code itself. This simple assertion gives rise to a rich discussion—one which Reeves explores fully in these three essays."
Yes, actors and actions. It's the crux of the use case. I'm still operating under the assumption that UML invented the modern use case. I will not be googling this next.
Aggressive, IDE-supported, semi-automated refactoring has replaced UML. Modern languages have been designed with toolability in mind. Modern IDEs allow ruthless redesign of the architecture of code on the fly, allowing architecture to be modified to reflect actual need rather than being a fairytale written before the first line of code.
Teams aren't shunning design. They're just doing it incrementally, because as it turns out, trying to design a piece of software before the first line of code is written doesn't actually work.
Even at peak UML (2000 or so) there were no really practical UML diagramming tools. (Rational Rose was impractically expensive, and buggy enough that it was a race to see whether it would crash before you actually finished your diagram.)
Corel Draw had passable UML diagramming support. But I don't imagine it still does.
Draw the boxes and lines and make sure that people understand what they mean. Describe the system from the perspective of the reader. Just like a real architect will have different drawings/plans/elevations for their customer, the planning authorities, and the builder.
The architecture diagrams are meant to communicate the design, not comply with some over-worked standard.
UML was a clusterfuck that evolved from the trifecta of late-90s OOP (inheritance, not composition), design patterns that mostly provided fill-ins for what was missing from Java as a language, and the ridiculous concept of generating code from diagrams that could be regenerated from code -- which never, ever worked, in much the same way that ORMs suffer an impedance mismatch between OO and relational logic.
It was yet another silver bullet. The late 70s/early 80s had "structured programming", the late 80s/early 90s had CASE, and the 90s had all the stupid diagramming tools where people argued about the shape of the bubbles and what arrows to put on the lines.
There was also the Capability Maturity Model, where everyone was trying to get to "Level 5" which was only useful if you were doing exactly the same software over and over again, along with the "6 Sigma" and "Black Belt" nonsense.
The 2000s had the "iterative Rational Unified Process" (an excuse to sell expensive tools from IBM), along with CORBA et al.
The last decade has suffered from Agile-with-a-capital-A and especially "Scaled Agile", which is just an excuse for project managers to again treat programmers as fungible, while losing all the affordances of actual project management, like Gantt charts, critical path, S-curves, earned value, etc.
Sprints are nonsense, and so are "t-shirt" sizing, velocity, burn-down charts, retrospectives, and scrum "masters".
There are a couple of useful things:
* Domain Driven Design, using Names That Make Sense To The Customer
* Entity Relationship Diagrams, where the Entities are the same Names as the DDD
* State Diagrams for those Entities, describing the events that will cause them to change state
* Event definitions that match the State Diagram transitions, where externally generated events end up being your external API (see the sketch after this list).
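A minimal sketch of those last two bullets, using a hypothetical Invoice entity: the event table mirrors the state diagram, and the externally triggered events are exactly the public API surface.

    # Each event names one transition from the Invoice state diagram:
    # event -> (state it is allowed from, state it moves to).
    EVENTS = {
        "issue": ("draft", "issued"),
        "pay": ("issued", "paid"),
        "void": ("issued", "void"),
    }

    def apply(state, event):
        """Apply an event to an Invoice state, enforcing the diagram."""
        src, dst = EVENTS[event]
        if state != src:
            raise ValueError(f"cannot {event} an invoice in state {state}")
        return dst

    assert apply("draft", "issue") == "issued"
    assert apply("issued", "pay") == "paid"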
Bah, I've been in this industry for almost 40 years and it's the same shit over and over, just with different names and different consultants selling expensive courses.
Don't take this as an endorsement of BDUF practices and the like -- they're as fraught as anything -- but it's important to also recognize what modern development practices trend hard toward themselves: trapping oneself in a local maximum.
I have a blog post in the works about this, but it's not ready to share. In short, I can't help but notice that hyper-focused, optimize-for-time-to-market, minimum-viable-product projects have a real, and frequently killer, problem once you have built your initial, hopefully-better mousetrap. Almost every company I've ever worked for has gotten that initial mousetrap done, tried to expand horizontally to actually have enough stuff-that-works-together to sell, and fallen on its face because that initial super-specific development effort created not just code, but product assumptions that are prohibitive to unwind.
Most of them hit the hillside because reality is no longer playing ball with their (frequently VC-driven) needs to cut scope and ship.