A question that has bugged me for a while is whether it is possible to simulate a Turing machine in a single semicolon-free Python expression. If you allow liberal use of eval, the answer is trivially yes, but it is not obvious that it holds without such workarounds. Python's lack of tail-call elimination means that any pure lambda-calculus encoding is bounded in recursion depth, as the user i
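To make the question concrete, here is a minimal sketch of the non-eval direction: running a fixed number of steps of a toy Turing machine where the whole simulation is one semicolon-free expression (a `functools.reduce` call threading the machine configuration). The machine, its `delta` encoding, and all names here are my own illustration, not anything from the original question; note that the step count is bounded up front, which is exactly where the lack of tail-call elimination bites.

```python
from functools import reduce

# Toy one-tape machine: in state "s", write 1, move right, stay in "s".
# The tape is a dict from head position to symbol (blank = 0), and
# delta maps (state, symbol) -> (new_state, written_symbol, head_move).
delta = {("s", 0): ("s", 1, +1)}

# The simulation itself is a single semicolon-free expression:
# reduce threads the configuration (state, tape, head) through 5 steps.
final = reduce(
    lambda cfg, _: (
        delta[(cfg[0], cfg[1].get(cfg[2], 0))][0],                       # next state
        {**cfg[1], cfg[2]: delta[(cfg[0], cfg[1].get(cfg[2], 0))][1]},   # updated tape
        cfg[2] + delta[(cfg[0], cfg[1].get(cfg[2], 0))][2],              # moved head
    ),
    range(5),
    ("s", {}, 0),  # initial configuration: state, blank tape, head at 0
)
print(final)  # configuration after 5 steps
```

After five steps this machine has written 1 to cells 0 through 4 and parked its head at position 5. The awkward part is visible in the lambda: with no statements available, the current symbol lookup has to be repeated three times, and the step count cannot depend on the machine halting.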

