In information theory, the four so-called Shannon-Khinchin (SK) axioms uniquely determine Boltzmann-Gibbs entropy as the only possible entropy. Physics differs from information theory in that physical systems can be non-ergodic; many complex systems in fact are. To describe strongly interacting, non-ergodic statistical systems (i.e., complex systems) within a thermodynamical framework, it becomes necessary to introduce generalized entropies. A number of such entropies have been proposed in the past, yet the fundamental origin of these entropies and their deeper relation to complex systems has remained unclear. Non-ergodicity explicitly violates the fourth SK axiom. We show that violating this axiom while keeping the other three intact determines an explicit form of a more general entropy,

$S_{c,d} \sim \sum_i \Gamma(d+1,\, 1 - c \log p_i).$
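As a quick sanity check of the limiting behaviour, the following sketch evaluates the sum numerically and confirms that for $(c,d)=(1,1)$ it reduces to Shannon (Boltzmann-Gibbs) entropy up to an affine transformation (the normalization absorbed in the $\sim$), since $\Gamma(2, 1-\log p) = (2-\log p)\,p/e$. It assumes SciPy's `gammaincc`/`gamma` for the upper incomplete gamma function; the function name is ours, not the paper's.

```python
import numpy as np
from scipy.special import gammaincc, gamma

def generalized_entropy_sum(p, c, d):
    """Unnormalized (c,d)-entropy: sum_i Gamma(d+1, 1 - c*log(p_i)).

    SciPy's gammaincc is the *regularized* upper incomplete gamma,
    so multiply by gamma(d+1) to recover the unregularized Gamma(s, x).
    """
    p = np.asarray(p, dtype=float)
    x = 1.0 - c * np.log(p)          # argument of the incomplete gamma
    return np.sum(gammaincc(d + 1, x) * gamma(d + 1))

# Random normalized distribution for the check
rng = np.random.default_rng(0)
p = rng.random(8)
p /= p.sum()

shannon = -np.sum(p * np.log(p))
s11 = generalized_entropy_sum(p, c=1.0, d=1.0)

# For (c,d) = (1,1): Gamma(2, 1 - log p) = (2 - log p) * p / e,
# so the sum equals (2 + H)/e with H the Shannon entropy,
# i.e. Shannon entropy up to the affine factors hidden in "~".
assert np.isclose(s11, (2.0 + shannon) / np.e)
print(s11, (2.0 + shannon) / np.e)
```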

All recently proposed entropies appear to be special cases. We further prove that each statistical system is uniquely characterized by the pair of scaling exponents (c, d), which define equivalence classes for all interacting and non-interacting systems, and that no other possibilities for entropies exist. The corresponding distribution functions are special forms of so-called Lambert-W exponentials, containing as special cases the Boltzmann, stretched-exponential, and Tsallis (power-law) distributions, all of which are abundant in nature. We show how the phase-space volume of a system is related to its (generalized) entropy, and illustrate this with physical examples of spin systems on constant-connectancy networks and accelerating random walks.
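The general (c,d)-distribution (the inverse of $S_{c,d}$) is built from the Lambert-W function; rather than reproduce its exact closed form here, the sketch below (assuming SciPy's `lambertw`; parameter values are illustrative only) evaluates the three limiting distributions named above and checks the defining Lambert-W identity $W(z)\,e^{W(z)} = z$.

```python
import numpy as np
from scipy.special import lambertw

x = np.linspace(0.0, 5.0, 6)

# Special-case distributions named in the text (illustrative parameters):
boltzmann = np.exp(-x)                     # (c,d) = (1,1): pure exponential
d = 0.5
stretched = np.exp(-x**d)                  # c = 1, d != 1: stretched exponential
q = 1.3
tsallis = (1.0 + (q - 1.0) * x) ** (-1.0 / (q - 1.0))  # d = 0: q-exponential (power law)

# The Lambert-W function is defined by W(z) * exp(W(z)) = z;
# SciPy exposes its branches (here the principal branch W_0).
z = 2.0
w = lambertw(z).real
assert np.isclose(w * np.exp(w), z)

for row in zip(x, boltzmann, stretched, tsallis):
    print("x=%.1f  exp=%.4f  stretched=%.4f  q-exp=%.4f" % row)
```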