This book presents the concepts needed to deal with self-organizing complex systems from a unifying point of view that uses macroscopic data. The various meanings of the concept of 'information' are discussed, and a general formulation of the maximum information (entropy) principle is used. With the aid of results from synergetics, adequate objective constraints for a large class of self-organizing systems are formulated, and examples are given from physics, the life sciences and computer science. The relationship to chaos theory is examined, and it is further shown that, on the basis of possibly scarce and noisy data, unbiased guesses about the processes of complex systems can be made and the underlying deterministic and random forces determined. This allows for probabilistic predictions of processes, with applications to numerous fields in science, technology, medicine and economics.

The extensions of the third edition are essentially devoted to an introduction to the meaning of information in the quantum context. Quantum information science and technology is presently one of the most active fields of research at the interface of physics, technology and the information sciences, and has already established itself as one of the major future technologies for processing and communicating information on any scale.

This book addresses graduate students and nonspecialist researchers wishing to become acquainted in greater depth with the concept of information from a scientific perspective. It is suitable as a textbook for advanced courses or for self-study.
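As a minimal illustration of the maximum information (entropy) principle mentioned above (a generic sketch, not the book's own formulation), the following Python snippet computes the maximum-entropy distribution over the six faces of a die subject to a prescribed mean. The distribution has the exponential form p_k ∝ exp(λk), and the multiplier λ is found by bisection; the function name and tolerance are illustrative choices.

```python
import math

def maxent_die(target_mean, faces=range(1, 7)):
    """Maximum-entropy distribution p_k ∝ exp(lam * k) on the given faces,
    with the Lagrange multiplier lam chosen so the mean equals target_mean."""
    faces = list(faces)

    def mean(lam):
        # Mean of the exponential-family distribution for a given multiplier.
        w = [math.exp(lam * k) for k in faces]
        z = sum(w)
        return sum(k * wk for k, wk in zip(faces, w)) / z

    # mean(lam) is monotonically increasing in lam, so bisection applies.
    lo, hi = -50.0, 50.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mean(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(lam * k) for k in faces]
    z = sum(w)
    return [wk / z for wk in w]

# A die whose observed mean is 4.5: the maximum-entropy (least biased)
# distribution consistent with that single constraint.
p = maxent_die(4.5)
print([round(pk, 3) for pk in p])
```

With only the mean as a constraint, the least biased guess is not uniform but exponentially tilted toward the high faces; any sharper assignment would encode information the data do not contain.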