This is pretty neat! I was toying around with the problem and it appears you can use generating functions to derive the same sequence of operations. If you start with:
G(x) = 1 + x + x^2 + ... = 1/(1-x)
The coefficients of this series are the sequence (0^0, 1^0, 2^0, ...).
If you take the derivative of G(x) and multiply by x you get:
x * G'(x) = x + 2*x^2 + 3*x^3 + ... = x * d/dx 1/(1-x) = x/(1-x)^2
The coefficients of this series are the sequence (0^1, 1^1, 2^1, ...). If you repeat this step, you get a series whose coefficients are (0^2, 1^2, 2^2, ...), and if you do this operation N times, you get a closed form for a series whose coefficients are (0^N, 1^N, 2^N, ...).
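In case it helps, here is a quick sympy sketch of that step (my own illustration, not from the original problem; the helper name xdx_iterated and the choice N = 3 are arbitrary):

```python
import sympy as sp

x = sp.symbols('x')

def xdx_iterated(N):
    # apply "differentiate, then multiply by x" N times to 1/(1-x)
    f = 1 / (1 - x)
    for _ in range(N):
        f = sp.simplify(x * sp.diff(f, x))
    return f

f3 = xdx_iterated(3)
print(sp.factor(f3))  # closed form for N = 3

# check the series coefficients really are n**3
poly = sp.series(f3, x, 0, 8).removeO()
print([poly.coeff(x, n) for n in range(8)])  # [0, 1, 8, 27, 64, 125, 216, 343]
```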
The infinite sum converges for -1 < x < 1. If you set x=1/c (so we need |c| > 1), you get the infinite sum
0^N/c^0 + 1^N/c^1 + 2^N/c^2 + ...
which is exactly the sum we are trying to solve for. This means you can solve any infinite sum of this form by repeatedly differentiating 1/(1-x) and multiplying by x, N times in total, then plugging in x=1/c at the end.
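As a concrete check (my own numbers, c = 2 and N = 3, not taken from the original problem): three rounds of the step above turn 1/(1-x) into x*(1 + 4*x + x^2)/(1-x)^4, and plugging in x = 1/2 should match the direct sum.

```python
import sympy as sp

x = sp.symbols('x')
n = sp.symbols('n', integer=True, nonnegative=True)

# closed form after three "differentiate, multiply by x" steps (N = 3)
closed_form = x * (1 + 4*x + x**2) / (1 - x)**4

print(closed_form.subs(x, sp.Rational(1, 2)))    # 26
print(sp.summation(n**3 / 2**n, (n, 0, sp.oo)))  # should also print 26
```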
True, but the generating functions make it easier to prove that this works, rather than relying on the properties of a particular function (properties that you can most easily prove by reverting to the generating functions).