The modern development of symbolic logic began with the English mathematician George Boole. In 1847, he published the pamphlet "The Mathematical Analysis of Logic," in which he argued that logic should be allied with mathematics rather than with philosophy. Observing the connection between deductive reasoning and the symbols of algebra, he devised an algebraic language built on three basic operations: AND, OR and NOT. These three operations formed the core of his system, and they were, and still are, sufficient to express comparisons and basic mathematical functions.
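To make this concrete, here is a minimal sketch (in modern Python, not anything from Boole's own notation) that models AND, OR and NOT on the values 0 and 1 and combines them into an equality comparison, illustrating how the three operations suffice for more elaborate tests:

```python
# Boole's three basic operations, modelled on the values 0 and 1.
def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 1 - a

# A comparison built only from AND, OR and NOT:
# "a equals b" holds when both are 1 or both are 0.
def EQUAL(a, b):
    return OR(AND(a, b), AND(NOT(a), NOT(b)))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, EQUAL(a, b))   # prints 1 only when a and b agree
```

The EQUAL function is true exactly when its two inputs agree, the comparison a circuit designer would call XNOR, and it is assembled from nothing but the three basic operations.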
Symbolic logic has contributed to the development of new axiomatic frameworks (formal systems used to derive theorems) in several branches of mathematics, including arithmetic, analysis and geometry. The study of symbolic logic within mathematics gave rise to set theory and to formal proof theory, with early 20th-century pioneers including David Hilbert, Kurt Gödel and Gerhard Gentzen. Work in set theory showed that almost all of ordinary mathematics can be formalized in terms of sets.
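As one concrete illustration of rebuilding ordinary mathematics from sets, the sketch below (illustrative Python, assuming the standard von Neumann encoding rather than any historical formulation) defines 0 as the empty set and the successor of n as n together with {n}:

```python
# Von Neumann encoding: 0 is the empty set, the successor of n is n ∪ {n}.
def zero():
    return frozenset()

def successor(n):
    return n | {n}   # union of n with the singleton set containing n

# Build the first few natural numbers purely as sets.
numbers = [zero()]
for _ in range(3):
    numbers.append(successor(numbers[-1]))

# Under this encoding, the "number" n is literally a set with n elements.
for i, n in enumerate(numbers):
    print(i, len(n))   # 0 0, 1 1, 2 2, 3 3
```

Each natural number ends up being a set with exactly that many elements, which is the sense in which arithmetic, and from it much of mathematics, can be formalized in terms of sets alone.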
In language, symbolic logic can be reduced to propositions, statements that can't be broken down further without a loss of meaning. Propositions are represented symbolically, as in "if A = B and B = C, then A = C," with A, B and C each standing for a proposition. Propositions are joined by operators -- "and," "either...or," "if...then," "only if," and "implies," among others -- that act like connecting blocks. In the proposition "Joe will come to the party only if Jane is there," "only if" acts as the operator. If the proposition "Jane is not at the party" is true, then the proposition "Joe is not at the party" follows. Adding more operators results in more complex logical structures.
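A short truth-table check makes the "only if" example explicit. The sketch below (Python, with joe and jane as stand-in variables for the propositions "Joe is at the party" and "Jane is at the party") reads "Joe will come only if Jane is there" as the implication joe implies jane:

```python
# "Joe will come to the party only if Jane is there" read as an implication:
# joe -> jane, which is logically equivalent to (not joe) or jane.
def only_if(joe, jane):
    return (not joe) or jane

# Enumerate all truth assignments and keep those where the proposition holds.
for joe in (False, True):
    for jane in (False, True):
        if only_if(joe, jane):
            print(f"joe={joe}, jane={jane}")
```

In every surviving assignment where jane is False, joe is also False: from "Jane is not at the party" we may infer "Joe is not at the party," exactly the inference drawn above.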
However complex it becomes, all symbolic logic can ultimately be reduced to working with ones and zeros. As a result, Boole's mathematical work has contributed dramatically to the field of computer science. Today, computers implement Boolean logic in microchips containing millions or billions of tiny electronic switches arranged into logic "gates" that carry out the three basic operations AND, OR and NOT. These gates produce predictable, reliable outputs and allow the computer to execute its operations in binary.
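As a sketch of how arithmetic emerges from such gates (again in Python, reusing the same three gate functions and not modelling any particular chip), the half adder below adds two single bits using only AND, OR and NOT; chaining copies of this circuit is how hardware adds longer binary numbers:

```python
# Gate-level model using only AND, OR and NOT.
def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 1 - a

# XOR built from the three basic gates: (a OR b) AND NOT (a AND b).
def XOR(a, b):
    return AND(OR(a, b), NOT(AND(a, b)))

# A half adder: the sum bit is XOR of the inputs, the carry bit is AND.
def half_adder(a, b):
    return XOR(a, b), AND(a, b)

for a in (0, 1):
    for b in (0, 1):
        s, carry = half_adder(a, b)
        print(f"{a} + {b} -> carry {carry}, sum {s}")
```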