I'm by no means a specialized data scientist, but I've done some (very surface-level) text crunching with both the TF/NumPy stack and Rust.
To me, the nice thing about switching to Rust for that kind of stuff was that it dramatically raised the bar for what I could do before reaching for those hyper-optimized descriptive libraries.
Want to calculate the Levenshtein distances of the cross product of 100k strings? Sure, just load them into a Vec, find a Levenshtein crate on crates.io, and it'll probably be fast enough.
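For instance, here's a minimal sketch of what I mean (strsim is just one of several Levenshtein crates on crates.io, and the three strings are a stand-in for the real 100k):

    // Cargo.toml: strsim = "0.11" (or any other Levenshtein crate)
    use strsim::levenshtein;

    fn main() {
        // Stand-in for the real 100k strings.
        let strings: Vec<String> = vec!["kitten".into(), "sitting".into(), "mitten".into()];

        // The full cross product is n^2 pairs (10^10 at 100k strings), so for a
        // real workload you'd at least restrict to i < j and probably
        // parallelize the outer loop (e.g. with rayon).
        for (i, a) in strings.iter().enumerate() {
            for b in &strings[i + 1..] {
                println!("{a} vs {b}: {}", levenshtein(a, b));
            }
        }
    }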
Could it be done in Python? Sure, but with Rust I didn't have to think about how to do it in a viable amount of time. Does that mean Rust is going to take over the DS world? Probably not in the short term; Rust currently can't compete with Python's DS ecosystem.
But if I'm doing something that I'm not sure will fit an existing Python mold (or at least one I know about), then I'll strongly consider reaching for Rust.
In fact, I'd basically argue against using hand-rolled, language-native implementations of algorithms where performance is at stake, because most one-off implementations don't have all the algorithmic optimizations that dedicated libraries do.
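To make "algorithmic optimizations" concrete, here's a sketch of the standard two-row Levenshtein (my own sketch, not any particular crate's code): the textbook DP allocates a full (n+1) x (m+1) matrix, keeping just two rows drops memory to O(m), and good libraries typically go further still (bit-parallel variants, SIMD, early exits).

    // Two-row Levenshtein: same O(n*m) time as the textbook DP, but O(m)
    // memory instead of the full matrix. Quick hand-rolled versions often
    // skip even this much.
    fn levenshtein(a: &str, b: &str) -> usize {
        let a: Vec<char> = a.chars().collect();
        let b: Vec<char> = b.chars().collect();
        // Row 0: distance from the empty prefix of `a` to each prefix of `b`.
        let mut prev: Vec<usize> = (0..=b.len()).collect();
        let mut curr = vec![0usize; b.len() + 1];

        for (i, ca) in a.iter().enumerate() {
            curr[0] = i + 1; // distance from a[..=i] to the empty string
            for (j, cb) in b.iter().enumerate() {
                let cost = if ca == cb { 0 } else { 1 };
                curr[j + 1] = (prev[j + 1] + 1) // deletion
                    .min(curr[j] + 1)           // insertion
                    .min(prev[j] + cost);       // substitution
            }
            std::mem::swap(&mut prev, &mut curr);
        }
        prev[b.len()]
    }

    fn main() {
        assert_eq!(levenshtein("kitten", "sitting"), 3);
    }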