Models aren't static. To stay relevant, they have to be retrained on new data continuously. Plus there's a model arms race going on, which will probably continue for the foreseeable future.
Fair point - though various distillation and retraining tricks do reduce the cost quite a bit. It's not as if everyone redoes all the work from scratch every time.
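For context on the distillation point: the core idea is to train a small "student" model to match the softened output distribution of a large "teacher", which is far cheaper than retraining from scratch. A minimal sketch of the classic soft-label KL objective (in the style of Hinton et al.'s knowledge distillation); the function names and the toy logits are illustrative, not from any particular library:

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-softened softmax; higher T flattens the distribution."""
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL(teacher || student) on temperature-softened distributions,
    scaled by T^2 so gradients stay comparable across temperatures."""
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(T * T * np.sum(p * (np.log(p) - np.log(q))))

# Matching logits give zero loss; diverging logits give a positive loss.
same = distillation_loss([1.0, 2.0, 3.0], [1.0, 2.0, 3.0])
diff = distillation_loss([3.0, 2.0, 1.0], [1.0, 2.0, 3.0])
```

In practice this KL term is usually mixed with the ordinary cross-entropy loss on hard labels, which is why distillation cuts cost without sacrificing much accuracy.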