Complex systems find and maintain internal coherence through communication and coordination among their components. This chapter considers information theory as the study of how senders and receivers can pass messages that reduce uncertainty in the presence of noise. After a short introduction to probability, Claude Shannon’s definition of information is discussed. These ideas are built upon in exploring computation, which is then applied to a range of complex systems that execute sophisticated algorithmic processes. The traveling salesman problem and the halting problem are introduced to illustrate the difficulty of solving some problems algorithmically. Bayes’ law is covered as a means by which complex systems can update probabilistic predictions as more information becomes available. The possibility that some information might be unknowable is discussed through a simplified version of Gödel’s incompleteness theorem. Examples are drawn from genetics, artificial intelligence, psychology, linguistics, and cell biology. More provocative topics include how meaning might emerge from information, the possible nature of mind, consciousness, and self, a perspective on the meaning of privacy, and what quantum computing may (or may not) have to offer. The chapter concludes with questions for reflection or group discussion, as well as resources for further exploration.