This has not been my experience, and I give "systems design" interviews.
Generally the entire interview is, "Let's design an X", where X is some kind of system that has a substantial software component. The goal is to see how a candidate handles an open-ended design problem of undefined size when we only have 45-60 minutes to discuss it.
For example, do they expect to be handed requirements? Do they ask what the requirements and use cases are? How much detail do they go into? Can they strike the right balance of depth and breadth of the requirements? (Or better yet, can they ask the right questions for guidance on what level of depth I'm looking for?) Once they've settled on a design, can they at least write pseudocode to demonstrate how the software would work? Can they come up with reasonable test cases? Do they even think about how they're going to test the system at all? Did they design for testability or was that an afterthought?
Some of the most effective systems for X include things that feel trivial or simple on the surface, but have some layer of hidden complexity that's only apparent once you really start thinking about how it works, where different people may have different expectations or assumptions about the desired system behavior. For example, I frequently use a garage door opener. It seems simple enough until you start enumerating the different states the system can be in, what should happen in each state when the button is pressed, what happens if multiple buttons are pressed at the same time, how to integrate the safety laser sensor which is supposed to stop the door if something is under it, etc.
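To make the hidden complexity concrete, here's a minimal sketch of a one-button opener as a state machine. The state names, transitions, and method names are my own assumptions about plausible behavior (press while moving stops the door; press while stopped resumes in the opposite direction; the safety sensor reverses a closing door), not anything dictated by the interview question itself:

```python
from enum import Enum, auto

class DoorState(Enum):
    CLOSED = auto()
    OPENING = auto()
    OPEN = auto()
    CLOSING = auto()
    STOPPED = auto()  # stopped partway through its travel

class GarageDoor:
    """One-button opener: the button's meaning depends on the current state."""

    def __init__(self):
        self.state = DoorState.CLOSED
        self._last_direction = None  # direction of travel before a mid-travel stop

    def press_button(self):
        s = self.state
        if s is DoorState.CLOSED:
            self.state = DoorState.OPENING
        elif s is DoorState.OPEN:
            self.state = DoorState.CLOSING
        elif s in (DoorState.OPENING, DoorState.CLOSING):
            # Pressing while moving stops the door where it is.
            self._last_direction = s
            self.state = DoorState.STOPPED
        else:
            # STOPPED: resume travel in the opposite direction.
            self.state = (DoorState.CLOSING
                          if self._last_direction is DoorState.OPENING
                          else DoorState.OPENING)

    def obstruction_detected(self):
        # Safety sensor: only meaningful while the door is closing.
        if self.state is DoorState.CLOSING:
            self.state = DoorState.OPENING

    def travel_complete(self):
        # A limit switch fired: the door reached the end of its travel.
        if self.state is DoorState.OPENING:
            self.state = DoorState.OPEN
        elif self.state is DoorState.CLOSING:
            self.state = DoorState.CLOSED
```

Even this toy version forces the "what does the button mean right now?" question that trips people up, and it still ignores multiple simultaneous button presses, motor timeouts, and power loss mid-travel.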
More senior engineers tend to ask lots of clarifying questions and probe the bounds of the design and expectations of the customer before they start. They usually end up in good shape, because they discovered the potential blind alleys early in the discussion, their questions helped them understand the right scope for the system design, etc. It usually works out well with some very minor refactoring along the way.
More junior engineers tend to assume a LOT of the requirements and use cases if the system's operation sounds simple, and jump straight into coding without establishing these things explicitly. This is almost always a mistake, because you discover important design considerations too late and end up having to do major refactors for things that could have been identified at the very beginning, if only you'd taken more initiative and asked more questions.
For example, I recently gave this interview to a junior with 3 years of experience. He didn't ask many questions up front, and only after 45 minutes did we come to realize that he assumed the garage door remote would have 2 buttons, one to explicitly make the door go up and one to make the door go down. I (the customer) had assumed there would be one button that would toggle directions based on the door state, but this never came up or was asked about before he launched into a complex design involving 7 different classes and several circular dependencies.
Well, I don't know how you conduct the interview, but in my experience on the interviewee side, the interviewer usually doesn't want to disclose information unless it's explicitly asked for, which is bad. In your garage door example, it's better if you disclose the one-button detail when asked "are there any more requirements?" or about any constraints, etc. If you only answer when asked "does it use one button or two?", that's bad practice.
This practice puts those with only a passing knowledge of the problem domain at a disadvantage, because the interviewer doesn't disclose the requirement even when asked (unless asked about it specifically). In the junior's case, it's possible they'd seen garage doors with two buttons, and so assumed the same, since you didn't disclose any information about it (assuming the junior asked). And thinking about it, one- versus two-button operation shouldn't be the top priority to ask about.
I'll happily disclose anything if they ask about it, but the goal I'm trying to get to is that you land at requirements through conversation. Starting with a blank slate and saying "Give me all the requirements" and expecting that list to be complete, accurate, and never change is, at best, a complete fantasy when dealing with actual customers. If you did that to an actual customer, half the time they would blankly stare back at you with no idea what to say.
Now, if that's the level they engage at, that might be OK for a junior role. I expect juniors to require projects to be spelled out in excruciating detail, and to not deviate from exactly what they were told to do. That's what, by definition, makes them junior.
A senior engineer, on the other hand, can generally be given vaguely defined tasks and has the initiative to figure out what actually needs to be built in the first place, before launching into building something.
I don't have a specific threshold of performance I'm expecting for passing or failing the interview. It's about gauging where a candidate's skills are on a spectrum, and then seeing if that skill level lines up with their experience and the job level they are interviewing for.
If you have senior experience and are interviewing for a senior role, my expectations are higher. If you perform at a junior level, then either we'd offer you a junior role instead or not make an offer. If you have junior experience and do well, then it might be time to make the leap to a senior role, or we'd make you a very attractive junior offer because we see that you're likely to advance quickly.
It can be used for a software architect role, but in general I'm just trying to poke at systems design ability regardless of the job level. Expectations are different for different job roles, obviously. If a software architect didn't do well on this question, for example, it would very likely be a "no hire", but for a junior engineer their performance just needs to be appropriate for their level.