Posted on Feb 02 2020
Computer science, information, general works
When people hear about “information theory” they may think it’s related to classifying information, maybe the way a librarian does. In formal terms, though, the phrase refers to a very modern science: the study of how densely information can be encoded and how reliably it can be transmitted from one place to another, ideally without losing any important data. It usually concerns digital information and is closely tied to coding theory and cryptography, as well as signal processing more generally.
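That question of “how densely information can be packed” has a famous quantitative answer: Shannon entropy, which gives the minimum average number of bits per symbol needed to encode a source. As a rough illustration (a minimal sketch, not from the text above), here is the standard entropy formula in Python:

```python
import math

def shannon_entropy(probabilities):
    """Average bits per symbol needed to encode a source with the
    given symbol probabilities -- the theoretical compression limit."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin flip carries exactly 1 bit of information per flip:
print(shannon_entropy([0.5, 0.5]))        # 1.0

# A heavily biased coin is more predictable, so it carries less:
print(shannon_entropy([0.9, 0.1]))        # about 0.47 bits per flip

# Four equally likely symbols need 2 bits each:
print(shannon_entropy([0.25] * 4))        # 2.0
```

The practical upshot is that a good compression scheme can approach, but never beat, this bound on average.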
People usually use the term “computer science” to mean computer programming or writing software. Technically, the field includes much deeper ideas: information theory, the study of algorithms, and the study of how programming languages themselves are built and how they can be improved. Anyone who earns a degree in computer science will spend some time on these deeper topics, but most students are primarily interested in learning to write programs so they can get a job building software for a tech industry that already dominates society and never seems to stop growing.
This book is considered a difficult read, but that’s the price of such a dense and rewarding topic. Students, especially undergraduates, will probably find it hard to get into at first, but the book is designed to build confidence and familiarity as you go. It dives deep into the theory of computation underlying modern programming practice and lays out the ideas you’ll need if you want to streamline a programming language or improve algorithmic efficiency. If you’re a hobbyist looking for background before becoming a programmer yourself, however, this is probably not the best place to start: it’s more theoretical than immediately practical.