I’ll tell you this - it decreases it, especially for small teams. You have to do a lot of work to make your query engine actually good and reliable, and reporting is also a huge bear to deal with. You have to cache somewhere to get your queries anywhere near real-time, and it takes a lot of reinvention.
I don't understand this response. Most of our use cases have always done fine with a simple JSON document store -- no reinvention or anything. Most of our query-side read models are just a simple PostgreSQL table with an aggregate ID and a JSON column.
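To make that concrete, the shape of such a read model is roughly this -- table name, columns, and the psycopg2 driver are just illustrative assumptions, not the actual schema:

    # One table per view, keyed by aggregate ID, whole document in a JSONB column.
    import psycopg2
    from psycopg2.extras import Json

    # Run once at startup (or via migrations).
    DDL = """
    CREATE TABLE IF NOT EXISTS order_view (
        aggregate_id UUID PRIMARY KEY,
        document     JSONB NOT NULL
    );
    """

    def upsert_view(conn, aggregate_id, document):
        """Projection update: replace the stored document for one aggregate."""
        with conn.cursor() as cur:
            cur.execute(
                """
                INSERT INTO order_view (aggregate_id, document)
                VALUES (%s, %s)
                ON CONFLICT (aggregate_id) DO UPDATE SET document = EXCLUDED.document
                """,
                (aggregate_id, Json(document)),
            )
        conn.commit()

    def get_view(conn, aggregate_id):
        """The 'query' is just a primary-key lookup returning the JSON document."""
        with conn.cursor() as cur:
            cur.execute(
                "SELECT document FROM order_view WHERE aggregate_id = %s",
                (aggregate_id,),
            )
            row = cur.fetchone()
            return row[0] if row else None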
Remember: for smallish applications you can still have the application itself be a monolith, so there's no need for complicated setups with message queues (Kafka, whatever). In that case you can basically rely on near-instant communication from the Command side of things to the Query side of things. You can also have the frontend (web) subscribe to updates.
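As a sketch of what that in-process wiring can look like (event and handler names here are made up for illustration, not from our library or any particular framework):

    # Command side appends events, then calls projection handlers synchronously
    # in the same process -- no broker, no queue, query side is updated immediately.
    from dataclasses import dataclass
    from typing import Callable, Dict, List, Type

    @dataclass
    class OrderPlaced:
        aggregate_id: str
        total_cents: int

    class InProcessBus:
        """Synchronous, in-memory dispatch from the command side to projections."""
        def __init__(self) -> None:
            self._handlers: Dict[Type, List[Callable]] = {}

        def subscribe(self, event_type: Type, handler: Callable) -> None:
            self._handlers.setdefault(event_type, []).append(handler)

        def publish(self, event) -> None:
            # Handlers run right here, in the same transaction of work.
            for handler in self._handlers.get(type(event), []):
                handler(event)

    # Query-side state: the read model the frontend queries or subscribes to.
    order_view: Dict[str, dict] = {}

    def project_order(event: OrderPlaced) -> None:
        order_view[event.aggregate_id] = {"total_cents": event.total_cents}

    def notify_frontend(event: OrderPlaced) -> None:
        # In a real app this might push over WebSockets or server-sent events.
        print(f"push update for {event.aggregate_id}")

    bus = InProcessBus()
    bus.subscribe(OrderPlaced, project_order)
    bus.subscribe(OrderPlaced, notify_frontend)

    # Command handler: validate, append to the event log (elided), then publish.
    bus.publish(OrderPlaced(aggregate_id="order-123", total_cents=4200))
    print(order_view["order-123"])  # read model is already up to date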
(We have our own in-house library to do all of this so YMMV. I'm not sure what the commodity CQRS/ES libraries/frameworks are doing currently.)