Yeah but is it? Both outputs have logarithmic Kolmogorov complexity, and Kolmogorov complexity is only defined up to an additive constant unless you commit to a specific computational model.
If you do commit to one, which of the two is shorter is just a function of the specifics of the model you chose, which isn't really interesting; it's literally code golf at that point.
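For concreteness on the "logarithmic complexity" point: a fixed-size program plus the number N suffices to print the first N digits of pi, so the description length grows only with log N (the bits needed to write N). A sketch in Python using Gibbons' unbounded spigot algorithm:

```python
def pi_digits(n):
    # Gibbons' unbounded spigot algorithm for the decimal digits of pi.
    # The program text is constant-size; only the argument n varies,
    # which is the sense in which pi's first n digits have a
    # description of length O(log n).
    q, r, t, k, m, x = 1, 0, 1, 1, 3, 3
    out = []
    while len(out) < n:
        if 4*q + r - t < m*t:
            out.append(m)
            # RHS uses the pre-update values of q and r
            q, r, m = 10*q, 10*(r - m*t), (10*(3*q + r)) // t - 10*m
        else:
            q, r, t, k, m, x = (q*k, (2*q + r)*x, t*x, k + 1,
                                (q*(7*k + 2) + r*x) // (t*x), x + 2)
    return out

print(pi_digits(10))  # [3, 1, 4, 1, 5, 9, 2, 6, 5, 3]
```

The same reasoning applies to a seeded PRNG: a fixed generator plus a seed and a count N is again an O(log N)-bit description of its first N outputs.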
On discriminating between the output of an RNG and the digits of pi: you are right that they have the same Kolmogorov complexity to within additive constants.
However, you can ask about resource-bounded Kolmogorov complexity. Suppose you ask for the length of the shortest description of a PTIME machine that outputs the first N bits of an RNG, versus one that outputs the first N bits of pi. Complexity theorists believe that the first will be longer. Proving that conclusively might have a bearing on P = BPP.
When you go all the way down to finite-state machines, KC becomes equivalent to finite-state information-lossless compression, along the lines of Lempel-Ziv compressibility.
Long story short: by restricting the resources available for the computation, it is possible to discriminate between examples like the ones you point out.
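To illustrate the finite-state end of the spectrum, here's a sketch using Python's zlib (DEFLATE, which is LZ77-based) as a stand-in for Lempel-Ziv compressibility: bytes from an RNG barely compress, while a highly regular sequence collapses to a tiny fraction of its length.

```python
import os
import zlib

n = 100_000
random_bytes = os.urandom(n)               # output of an OS-level RNG
regular_bytes = b"0123456789" * (n // 10)  # highly regular sequence

# DEFLATE (LZ77 + Huffman) at maximum effort:
print(len(zlib.compress(random_bytes, 9)))   # roughly n: essentially incompressible
print(len(zlib.compress(regular_bytes, 9)))  # a tiny fraction of n
```

Of course a finite-state compressor can't spot the structure in pi's digits either; the discrimination it gives you is between "regular" and "patternless" strings, which is exactly what restricting to weaker models buys.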
> They are both programs with logarithmic Kolmogorov complexity
In the case of true random numbers, how is that so?
Very few random sequences can be generated by a logarithmic-sized program, since most strings are not significantly compressible [1]. A simple counting argument shows this: there are 2^n strings of n bits, but only about 2^(lg n) = n programs of logarithmic length, a vanishingly small fraction!
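The counting argument is just arithmetic, and can be sketched directly (the choice of n below is arbitrary):

```python
# There are 2**n strings of n bits, but only 2**(k+1) - 1 binary
# programs of length at most k (summing 2**0 + 2**1 + ... + 2**k).
# With k ~ lg n, almost no n-bit string can have a logarithmic-sized
# description, regardless of the computational model.
n = 1000
k = n.bit_length()                    # ~ lg n
total_strings = 2**n                  # all n-bit strings
short_descriptions = 2**(k + 1) - 1   # all programs of <= k bits
print(short_descriptions)             # 2047 for n = 1000
print(short_descriptions / total_strings)  # vanishingly small fraction
```

Each program outputs at most one string, so at most 2^(k+1) - 1 of the 2^n strings can be compressed to k bits or fewer; the rest are incompressible at that scale.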