> If those features aren't supported by the widget's hard-coded schema, you're out of luck as a user.
Chat paired with pre-built and on-demand widgets addresses this limitation.
For example, in the keynote demo, they showed how the chat interface lets you perform advanced filtering that pulls together information from multiple sources, like filtering for only the Zillow houses near a dog park.
Yes, because it seems that Zillow exposes those specific filters as part of the widget's input schema. As long as a filter is part of the schema, ChatGPT can generate a useful input to the widget. But my point is that this is very brittle.
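To make the brittleness concrete, here's a minimal sketch (the field names and `buildWidgetInput` helper are hypothetical, not Zillow's actual API): a pre-built widget only accepts the fields its schema declared up front, so any filter the schema authors didn't anticipate is silently dropped.

```typescript
// Hypothetical hard-coded input schema for a listings widget.
const widgetSchema = ["maxPrice", "bedrooms", "nearDogPark"];

// Split a model-generated input into fields the widget accepts
// and fields it has no slot for.
function buildWidgetInput(requested: Record<string, unknown>) {
  const accepted: Record<string, unknown> = {};
  const dropped: string[] = [];
  for (const [key, value] of Object.entries(requested)) {
    if (widgetSchema.includes(key)) {
      accepted[key] = value;
    } else {
      dropped.push(key);
    }
  }
  return { accepted, dropped };
}

// "Near a dog park" works only because the schema anticipated it;
// a filter outside the schema is lost, no matter how smart the model is.
const result = buildWidgetInput({
  maxPrice: 750000,
  nearDogPark: true,
  within10MinCycleOfSchool: true, // not in the schema → dropped
});
```

The model can be arbitrarily good at filling in `accepted`, but nothing it does can make the widget act on `dropped` fields; that's the ceiling a hard-coded schema imposes.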
A fully generative UI with an on-the-fly schema would be less brittle, because you can guarantee that the schema and the intelligent widget fully satisfy the user's request. The bottleneck is the intelligence of the model generating it, but we are already at the point where that is rarely a problem, and it will disappear as models continue to improve.
I think most software will follow this trend and become generated on-demand over the next decade.
> Chat paired with pre-built and on-demand widgets addresses this limitation
The only place I can see this working is if the LLM is generating a rich UI on the fly. Otherwise, you're arguing that a text-based UX is going to beat flashy, colourful things.