In information theory, entropy is a measure of the amount of randomness in a probability distribution. It's defined as

H(X) = −Σᵢ p(xᵢ) log p(xᵢ)

In my code, it's

(defn entropy [X]
  (* -1 (Σ [i X]
          (* (p i) (log (p i))))))

Compare my code above with the actual definition. Having a language that lets you declare constructs that look very similar to their mathematical definition is a huge win for readability.
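For contrast, here is a minimal runnable sketch of the same definition in Python. The `entropy` and `dist` names are mine, the distribution is assumed to arrive as a plain list of probabilities, and I use log base 2 so the result is in bits:

```python
import math

def entropy(dist):
    """Shannon entropy of a discrete distribution.

    `dist` is assumed to be a sequence of probabilities summing to 1.
    Terms with p = 0 contribute nothing (lim p log p -> 0), so skip them.
    """
    return -sum(p * math.log2(p) for p in dist if p > 0)

# A fair coin is maximally uncertain: exactly one bit of entropy.
print(entropy([0.5, 0.5]))  # → 1.0

# A certain outcome carries no information at all.
print(entropy([1.0]))  # → 0.0
```

Note how the `-sum(... for ...)` shape still tracks the math, but less directly than a language where Σ itself is a first-class construct.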