I'm building a holistic financial guidance platform and getting mixed signals about how people actually want to interact with AI.
We decided not to use AI to build the custom investment portfolios. We use deterministic algorithms because we need control and predictability. A hallucination in a portfolio allocation is a non-starter for us.
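To make "deterministic" concrete: a minimal sketch of the kind of rules-based allocator I mean. This is illustrative only, not our actual model; the tickers, risk bands, and the linear equity glide are made-up assumptions.

```python
# Hypothetical rules-based allocator: same input always yields the same
# output, so every allocation is auditable and reproducible.

def allocate(risk_score: int) -> dict[str, float]:
    """Map a 1-10 risk score to fixed ETF weights (illustrative numbers)."""
    if not 1 <= risk_score <= 10:
        raise ValueError("risk_score must be between 1 and 10")
    equity = 0.30 + 0.06 * (risk_score - 1)  # 30% to 84% in equities
    return {
        "VTI": round(equity * 0.7, 4),   # US total market
        "VXUS": round(equity * 0.3, 4),  # international equities
        "BND": round(1.0 - equity, 4),   # bonds fill the remainder
    }

# Deterministic: repeated calls with the same score always agree,
# and the weights always sum to 1.0.
assert allocate(5) == allocate(5)
```

No sampling, no temperature, no model drift: if a regulator or a user asks why the portfolio looks the way it does, we can point at the rule.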
The app shows users our underlying estimates and gives context for the math. We only use GenAI to generate the text explaining why the portfolio fits them.
Here is the weird part.
We're seeing a trust gap: users find the transparent, logic-based advice interesting, but they hesitate to act on it. Yet these same users tell us they often ask ChatGPT for financial advice, even while admitting they should be skeptical of it.
It feels like people prefer a confident "black box" over a transparent "glass box" that shows its work.
We've tried to bridge the gap by keeping the platform non-custodial (we never touch the money) and sticking to blue-chip ETFs, but I'm still trying to figure out the psychological block here.
Separately, we're building an agentic system to analyze user accounts and suggest specific financial planning actions like tax loss harvesting, but I'm hesitant to lean too hard into the AI label if it scares people.
My question for this crowd: Are you fine with AI running your money if the UX is good? Or do you genuinely prefer a traditional, transparent algorithmic approach? What would actually get you to pull the trigger on a trade?
(Platform is https://www.FulfilledWealth.co if you want to check the approach)