
I got really excited about call-by-push-value about 15 years ago, when I first encountered it, but at this point I think it is overhyped, particularly the presentation of "values" and "computations" as duals. For example, see http://www.itu.dk/people/mogel/papers/eec.pdf: it is a CBPV-style lambda calculus, except that computations are a subset of values and there is no duality. Similarly, in Levy's book he discusses "complex values" like 1+2, which allow pure computations to happen inside the "value" type; more evidence that they are not duals at all.

If you squint at CBPV, there is exactly one primitive that sequences computation, `M to x. N`. In Haskell this is just the monadic bind `M >>= \x -> N`, or `do { x <- M; N }`. If I had to bet, I would bet on monads, not CBPV.
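To make the correspondence concrete, here is a minimal Haskell sketch. The `to` name mirrors CBPV's notation; using a monadic type for computations is my assumption here, not Levy's formulation:

```haskell
-- CBPV's "M to x. N" sequences computation M, binding its result
-- to x in computation N; in a monadic setting it is exactly (>>=).
to :: Monad m => m a -> (a -> m b) -> m b
to = (>>=)

main :: IO ()
main =
  -- "return 41 to x. print (x + 1)" in CBPV-style notation
  pure (41 :: Int) `to` \x -> print (x + 1)
```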



So what they have made would be a more complicated version of https://en.wikipedia.org/wiki/A-normal_form


I hadn't thought about it that way, but it makes sense. I would say the main difference is that ANF is type-preserving, whereas CBPV introduces new types. But they are similar in that they are IRs with accompanying translations from standard lambda calculus. You can see the translations of CBV/CBN on page 11 of https://www.cs.bham.ac.uk/~pbl/papers/tlca99.pdf
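For readers without the paper handy, the flavor of those two translations can be sketched in Haskell. The constructor names below are mine, not Levy's notation, and fresh-name generation for "f"/"a" is omitted; this is a rough illustration, not the paper's exact rules:

```haskell
-- Source: untyped lambda calculus
data Tm = Var String | Lam String Tm | App Tm Tm
  deriving Show

-- Target: CBPV, split into values and computations
data Val = VVar String | Thunk Comp
  deriving Show

data Comp
  = Force Val           -- run a thunked computation
  | CLam String Comp    -- computation-level lambda
  | CApp Comp Val       -- apply a computation to a value
  | Return Val          -- lift a value into a computation
  | To Comp String Comp -- "M to x. N": sequence computations
  deriving Show

-- CBV: evaluate the function, then the argument, then apply.
cbv :: Tm -> Comp
cbv (Var x)   = Return (VVar x)
cbv (Lam x m) = Return (Thunk (CLam x (cbv m)))
cbv (App m n) = To (cbv m) "f"
                  (To (cbv n) "a"
                    (CApp (Force (VVar "f")) (VVar "a")))

-- CBN: pass the argument unevaluated, as a thunk.
cbn :: Tm -> Comp
cbn (Var x)   = Force (VVar x)
cbn (Lam x m) = CLam x (cbn m)
cbn (App m n) = CApp (cbn m) (Thunk (cbn n))

main :: IO ()
main = print (cbv (App (Lam "x" (Var "x")) (Var "y")))
```

Note how the CBV translation wraps everything in `Return`/`To`, while the CBN one wraps arguments in `Thunk`; that difference is the whole point of the IR.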


I was very relieved to discover from https://www.cs.bham.ac.uk/~pbl/papers/siglogcbpv2.pdf that PBL has learned how to explicitly number pages in the intervening decades.

(see Fig. 6 p.21)


I'm not so sure.

I believe that by applying ANF, you're baking in either CBN or CBV at the time of writing the compiler.

If I've read it right, the author is suggesting that CBPV evaluates (ordinary) values before passing them, but does not evaluate functions (even though they are also values!) before passing them.


Well, the transformation doesn't change semantics, so technically you aren't "baking in" any evaluation strategy. It is a bit obscure, but ANF is achievable by a series of simple rewrite rules: Figure 1 in https://dl.acm.org/doi/pdf/10.1145/141471.141563 or Figure 7 in https://dl.acm.org/doi/pdf/10.1145/173262.155113. But the transformations are certainly biased towards CBV; the syntactic notion of "machine value" that ANF uses doesn't make much sense in call-by-need or call-by-name, where any expression can be a thunk.

That does remind me of another thing that CBPV glosses over, though: it is only a fusion of call-by-name and call-by-value. In particular, it doesn't model call-by-need particularly well; there is no notion of sharing thunks that are duplicated during evaluation. Actually, call-by-need is sort of arbitrary; generally one is better off using "an implementation-specific evaluation strategy somewhere between CBN and CBV", which again is not modeled well by CBPV, as the choice of CBN or CBV evaluation is encoded in the IR and so cannot be easily changed by the compiler.
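The sharing gap is easy to observe in GHC, which implements call-by-need. A small sketch using Debug.Trace (note that `trace` writes to stderr) to watch how many times an expression is evaluated:

```haskell
import Debug.Trace (trace)

-- Under call-by-need (GHC), the let-bound thunk is evaluated once
-- and its result shared between both uses of x; a pure call-by-name
-- semantics would re-evaluate it at each use, tracing twice.
main :: IO ()
main =
  let x = trace "evaluating x" (2 + 2 :: Int)
  in print (x + x)
```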


If it doesn't evaluate functions, how do closures get their lexical environments?


How do they figure out what goes into their environment, or how do their environments get populated?

I believe the first happens at compile time, and the second happens after the function is passed.

  1  f x =
  2    let xx = x * x in
  3    (\y -> y + xx)
  4
  5  main = do
  6    let clo = f 2
  7    let applied = clo 3
  8    print applied
The closure is defined on line 3, and instantiated on line 6.

xx is known to be the captured variable at compile time. But in a lazy setting, it's fine for the closure to be passed around before the value of xx is known:

  import Debug.Trace (trace)

  f x =
    let xx = trace "Evaluating xx" (x * x) in
    trace "Returning closure" (\y -> y + xx)

  main = do
    let clo = f 2
    let applied = clo 3
    print applied
> Returning closure

> Evaluating xx

> 7


I thought monadic binds would typically be implemented in terms of this intermediate language, not within it.



