The coinductive interpretation of a big-step relation for a call-by-value functional language is insufficient for expressing all divergent computations. A commonly adopted alternative is to add a separate divergence predicate, but this approach suffers from a serious rule-duplication problem. We instead treat divergence as state in a coinductive big-step semantics, and show that this avoids the duplication problem. Big-step rules with divergence as state are slightly less expressive than either a divergence predicate or pretty-big-step rules, but are more concise than both.
Presented at NWPT 2014.