IMO, if you're checking the URL for directory traversal, it's already too late. Whenever I build a server that serves files, I maintain a whitelist of the exact files it may serve, and the first thing the file request handler does is check whether the requested URL is in that set; if not, it immediately returns a 404 (sketch below). Too much can go wrong when you try to sanitize inputs; it's better to rule out unsanitized data reaching the filesystem by design. There's more than one way to do this, and none of them admit directory traversal.
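A minimal sketch of what I mean, in Go with the standard net/http package; the URL paths, disk paths, and in-memory map are hypothetical stand-ins for however you actually populate the set:

    package main

    import (
        "net/http"
        "path"
        "sync"
    )

    // whitelist maps exact URL paths to files on disk. Both sides are
    // controlled by us, never by the client; anything not in the set
    // gets an immediate 404, so traversal sequences in the URL never
    // reach the filesystem.
    var (
        mu        sync.RWMutex
        whitelist = map[string]string{
            "/files/report.pdf": "/srv/files/report.pdf",
        }
    )

    func serveFile(w http.ResponseWriter, r *http.Request) {
        mu.RLock()
        diskPath, ok := whitelist[path.Clean(r.URL.Path)]
        mu.RUnlock()
        if !ok {
            http.NotFound(w, r) // first thing in the handler: not in the set, 404
            return
        }
        http.ServeFile(w, r, diskPath)
    }

    func main() {
        http.HandleFunc("/files/", serveFile)
        http.ListenAndServe(":8080", nil)
    }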
Exactly. The set of acceptable files can be modified at runtime. Now you've localized the problem of sanitizing paths to one small area of your code (file upload) rather than every request. A good way to do this is to save each file on disk under the hex encoding of its SHA256 hash, and maintain a mapping from file names to hashes. The only attack left is overwriting a preexisting file, and that would require pulling off a second-preimage attack on SHA256, which is not generally thought to be feasible.
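Roughly like this in Go (a sketch, not a definitive implementation): storeDir and the in-memory nameToHash map stand in for whatever storage and database you actually use:

    package upload

    import (
        "crypto/sha256"
        "encoding/hex"
        "io"
        "os"
        "path/filepath"
    )

    const storeDir = "/srv/files" // hypothetical content-addressed store

    // nameToHash maps user-supplied names (opaque strings, e.g. rows in a
    // database) to the hex SHA256 of the stored content. A real server
    // would guard this with a lock or keep it in the database itself.
    var nameToHash = map[string]string{}

    // Save streams the upload to a temp file while hashing it, then renames
    // the file to its digest. The user-supplied name never touches the
    // filesystem, so this is the only place path handling matters at all.
    func Save(name string, content io.Reader) (string, error) {
        tmp, err := os.CreateTemp(storeDir, "upload-*")
        if err != nil {
            return "", err
        }
        defer os.Remove(tmp.Name()) // no-op once the rename succeeds

        h := sha256.New()
        if _, err := io.Copy(io.MultiWriter(tmp, h), content); err != nil {
            tmp.Close()
            return "", err
        }
        tmp.Close()

        digest := hex.EncodeToString(h.Sum(nil))
        if err := os.Rename(tmp.Name(), filepath.Join(storeDir, digest)); err != nil {
            return "", err
        }
        nameToHash[name] = digest
        return digest, nil
    }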
Problematic how? The whitelist doesn't have to live in the code; it can be generated from a database (let the whitelist be the result of SELECT file_name FROM uploaded_files_table).
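Something like this with database/sql, reusing the table and column names above; opening the *sql.DB (and registering a driver) is assumed to happen elsewhere:

    package whitelist

    import "database/sql"

    // Load rebuilds the set of served file names from the database.
    func Load(db *sql.DB) (map[string]bool, error) {
        rows, err := db.Query("SELECT file_name FROM uploaded_files_table")
        if err != nil {
            return nil, err
        }
        defer rows.Close()

        set := make(map[string]bool)
        for rows.Next() {
            var name string
            if err := rows.Scan(&name); err != nil {
                return nil, err
            }
            set[name] = true
        }
        return set, rows.Err()
    }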
A hex-encoded file hash as a file name is a safe bet. You can resolve file names (unsanitized, stored safely in a database) to hashes and load the files from disk.
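For concreteness, the read path under that scheme might look like this (a sketch; nameToHash and storeDir are assumed to be kept in sync with the upload code, and the map here stands in for the database lookup):

    package serve

    import (
        "os"
        "path/filepath"
    )

    const storeDir = "/srv/files" // hypothetical content-addressed store

    // nameToHash: user-visible names -> hex SHA256 digests of content.
    var nameToHash = map[string]string{}

    // Open resolves an unsanitized, user-visible name to its hash and opens
    // the file. The only path component built at request time is a hex
    // digest we generated ourselves, so traversal is impossible.
    func Open(name string) (*os.File, error) {
        digest, ok := nameToHash[name]
        if !ok {
            return nil, os.ErrNotExist // unknown name; serve a 404 upstream
        }
        return os.Open(filepath.Join(storeDir, digest))
    }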
This general approach (whitelisting file URLs) lets us localize all path sanitization to the file upload code, rather than every single request.