Let's conceptualize -- for the sake of argument -- a numerical simulation of a particular quantum system as discretizing space into cells (it might not work exactly like this, but we're just aiming for a general estimate of the number of calculations needed). Say we solve Schroedinger's equation in a cube of side M cells, so there are M^3 sites. For N particles, the wavefunction is then a vector whose number of components equals the number of possible values of its arguments: (M^3)^N = M^{3N}. Solving the equation means diagonalizing the Hamiltonian, which in this discretization is just a matrix acting on the wavefunction vector. (That's what you're secretly doing when you use finite difference methods for differential equations.) I'm not going to go into the computational complexity of the diagonalization, but you can imagine it doesn't make things better -- you surely need at least O(number of components) = O(M^{3N}) operations.

The problem is: the complexity of a quantum system is exponential in the number of particles, which makes this absolutely impractical even for few-body systems. For example, if we discretize the three electrons of Lithium on a box of side 100 steps, the wavefunction has 100^9 = 10^18 components -- it doesn't even fit in memory!

Ultimately I'd argue this traces back to the possibility of entanglement. Approximations that do away with most of the entanglement can reduce the search space considerably and work OK most of the time -- density functional theory, for example. What say you?
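To make the blow-up concrete, here's a quick back-of-the-envelope sketch of the M^{3N} count and the memory it implies. The function names and the 16-bytes-per-amplitude figure (one complex double) are my own assumptions for illustration, not anything canonical:

```python
# Back-of-the-envelope size of the discretized wavefunction vector.
# M = grid points per spatial dimension, N = number of particles (3D space).

def wavefunction_components(M: int, N: int) -> int:
    """Number of components: (M^3)^N = M^(3N)."""
    return M ** (3 * N)

def memory_bytes(M: int, N: int, bytes_per_amplitude: int = 16) -> int:
    """Memory to store every amplitude, assuming a complex double (16 bytes)."""
    return wavefunction_components(M, N) * bytes_per_amplitude

# One electron on a 100-per-side grid: 100^3 = 10^6 components -- trivial.
print(wavefunction_components(100, 1))  # 1000000

# Lithium's three electrons: 100^9 = 10^18 components,
# i.e. 1.6e19 bytes (~16 exabytes) just to hold the vector.
print(wavefunction_components(100, 3))  # 1000000000000000000
print(memory_bytes(100, 3))
```

The point of the exercise: adding one more particle multiplies the storage by M^3 = 10^6, which is why exact diagonalization dies so quickly with particle number.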