Author: Thomas Metcalf
Category: Logic and Reasoning
Word count: 1000
An argument is a set of statements (the premises) intended to provide evidence for, or prove, some conclusion. Formal logic is a tool we can use to present and evaluate arguments. Some arguments are better than others and formal logic can help us see exactly how some argument is supposed to work, and whether it is a good or bad argument (and why).
To use formal logic, we symbolize the arguments: we represent arguments in English (or some other natural language) in some other set of symbols. These symbolic representations can make it much easier to see whether a certain argument is valid (i.e. necessarily, if its premises are true, its conclusion is true), and to identify other important properties of arguments.
Here we’ll survey the simplest variety of formal logic: sentential logic.
1. Sentence-Letters and Constants
In sentential logic, it’s standard to symbolize particular declarative sentences, i.e. statements, with capital Roman letters, for example:
- A: ‘I apologize for tipping over your motorcycles.’
- B: ‘Beans are good for the heart.’
- C: ‘In the city there’s a thousand things I want to say to you.’
Notice that none of A-C contain the words ‘not,’ ‘and,’ ‘or,’ ‘if,’ or ‘then.’ That’s because those words (sometimes called “logical constants”) are very important in formal logic. We reserve special symbols for them, listed below. In contrast, we could use the letter ‘A’ to stand for ‘I apologize for tipping over your motorcycles’ in one argument, and that same letter ‘A’ for ‘The abacus is inferior to the digital calculator’ in another.
If a sentence contains one of those words ‘and,’ ‘not,’ and so on, we break it up into its components separated by those words. For example, if my sentence was:
- I apologize for tipping over your motorcycles and beans are not good for the heart
then I could write:
- A & ¬B.
If an individual sentence doesn’t contain any logical constants, we can call it an “atomic” sentence: it can’t be split up any more.
Here’s a set of symbols for the constants, along with the English words we use those constants to approximately symbolize, for any sentences φ, ψ, etc. Different languages of logic sometimes use different symbols, so I’ll include all the most common ones with their standard names and meanings.
| Name | Symbol | Alternative Symbols | English approx. |
| --- | --- | --- | --- |
| ‘Negation’ | ¬φ | ~φ | ‘not φ’ |
| ‘Conjunction’ | φ & ψ | φ ∧ ψ, φ ● ψ | ‘φ and ψ’ |
| ‘Disjunction’ | φ ∨ ψ | | ‘φ or ψ’ |
| ‘Conditional’ | φ → ψ | φ ⊃ ψ | ‘if φ then ψ’ or ‘φ only if ψ’ |
| ‘Biconditional’ or ‘Logical Equivalence’ | φ ↔ ψ | φ ≡ ψ | ‘φ if and only if ψ’ (sometimes written ‘φ iff ψ’) |
| ‘Conclusion’ | ∴ φ | | ‘therefore φ’ |
| ‘Contradiction’ | ⊥ | | [some false sentence] |
| ‘Entailment’ | φ, ψ ⊢ χ | | ‘φ and ψ jointly entail χ in the system of logic in use’ |
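These truth-functional constants can be sketched as functions on truth-values. Here is a minimal Python illustration (the function names are my own, not standard logical notation):

```python
# A hypothetical sketch: each truth-functional constant modeled as a
# Python function on the truth-values True and False.

def neg(p):          # ¬φ: true exactly when φ is false
    return not p

def conj(p, q):      # φ & ψ: true exactly when both are true
    return p and q

def disj(p, q):      # φ ∨ ψ: inclusive 'or', true when at least one is true
    return p or q

def cond(p, q):      # φ → ψ: false only when φ is true and ψ is false
    return (not p) or q

def bicond(p, q):    # φ ↔ ψ: true when φ and ψ have the same truth-value
    return p == q
```

For example, `cond(False, False)` comes out `True`, which previews a feature of the conditional discussed in the notes below.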
As you see, another important set of symbols we can use are the lower-case Greek letters. We use them as variables, rather than abbreviations of determinate sentences: they stand in for some sentence or other. That sentence doesn’t have to be atomic; it could itself contain logical constants too.
The capital letters abbreviate particular sentences we’ve already defined, while the lower-case stand in for any sentence at all. Here’s a few examples:
| English | Symbolization |
| --- | --- |
| I’m not dreaming or if I’m an elephant, then fire is hot. | ¬D ∨ (E → F) |
| If the money is in the envelope and isn’t in the envelope, then a contradiction is true. | (M & ¬M) → ⊥ |
| The gate is open if and only if: the helicopter landed and either the ice has melted or you’re not joking. | G ↔ (H & (I ∨ ¬J)) |
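Once we assign truth-values to the sentence-letters, a complex sentence like the gate example can be evaluated mechanically. A small Python sketch (the particular truth-values here are arbitrary assumptions of mine):

```python
# Arbitrary (assumed) truth-values for the atomic sentence-letters.
G, H, I, J = True, True, False, True

# G ↔ (H & (I ∨ ¬J)), with '↔' rendered as '==', '&' as 'and',
# '∨' as 'or', and '¬' as 'not'.
result = (G == (H and (I or (not J))))
print(result)  # prints False under this assignment
```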
We can also use parentheses to help make our sentences easier to read and make our meanings clear. For example, in the third sentence just above, if I arranged the parentheses differently, the meaning would change:
| English | Symbolization |
| --- | --- |
| The gate is open if and only if the helicopter landed. And, either the ice has melted or you’re not joking. | (G ↔ H) & (I ∨ ¬J) |
Notice that we used English words to approximate our symbols. That’s no accident. In sentential logic, for example, if you know that A is true and you know that B is true, you can conclude that ‘A & B’ is true. And that’s how we use our English word ‘and’ as well. You can’t have both an Acura and a battleship unless it’s true that you have an Acura and it’s true that you have a battleship.
Notice, then, that if you know that you have an Acura and a battleship, you can conclude that you have an Acura. To symbolize this, we would say that if you know ‘A & B,’ you can conclude ‘A.’
2. Proofs
Now you can start to see how we might write an entire ‘proof’: a symbolization of an entire argument. We can assume that if you have an Acura, then you have a car. We might symbolize that with ‘A → C.’ In turn, we can, assuming that you have an Acura and a battleship, prove that you have a car. It might look like this:
| Line | Sentence | Justification |
| --- | --- | --- |
| 1. | A & B | Assumption. |
| 2. | A → C | Assumption. |
| 3. | A | Follows from 1. |
| 4. | ∴ C | Follows from 2 and 3. |
As you can see, we go step-by-step, numbering our steps and indicating how we got our new sentence.
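The two inferences used in this proof, from ‘A & B’ to ‘A’ (often called “conjunction elimination”) and from ‘A → C’ together with ‘A’ to ‘C’ (“modus ponens”), can be mimicked in code. Here is a toy Python sketch (the tuple encoding and function names are my own, not a standard proof system):

```python
# A toy sketch: sentences encoded as nested tuples, with one function
# per inference rule used in the four-line proof above.

def conj_elim(sentence):
    # From a conjunction ('&', p, q), infer its left conjunct p.
    assert sentence[0] == '&'
    return sentence[1]

def modus_ponens(conditional, antecedent):
    # From ('->', p, q) together with p, infer q.
    assert conditional[0] == '->' and conditional[1] == antecedent
    return conditional[2]

line1 = ('&', 'A', 'B')              # 1. A & B   Assumption.
line2 = ('->', 'A', 'C')             # 2. A → C   Assumption.
line3 = conj_elim(line1)             # 3. A       Follows from 1.
line4 = modus_ponens(line2, line3)   # 4. ∴ C     Follows from 2 and 3.
print(line4)  # prints C
```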
This is just a quick example, but you can probably imagine how we can use these symbols to construct more-interesting proofs. The topic of proofs in general is important and complex enough that it will have its own entry.
3. Beyond Sentential Logic
Once we know the rules of sentential logic and its logical constants, it’s not difficult to symbolize more-complex sentences. For example, we might want to talk about which objects have which properties, or whether some sentence is necessarily true, or probably true. But those are topics for other entries.
 In this essay we don’t explicitly talk about deduction or induction, because sentential logic can be used for either. However, sentential logic is typically introduced by deductive arguments.
 Here we follow the standard way of symbolizing arguments. See for example Huber 2019 and Sider 2010.
This is occasionally called “propositional” logic. But ‘sentential’ is more neutral: it commits us only to the existence of sentences, i.e. grammatical strings of words in some language. It doesn’t commit us to the existence of propositions, which, if they exist, are (roughly speaking) the meanings of sentences, or the contents of thought, or what “it” is that’s “true” when I say, ‘It’s true that electrons are smaller than protons.’ To say “sentential” is also, strictly speaking, more accurate: what we mainly do in formal logic is syntactic rather than semantic. That is, we’re looking at strings of symbols rather than at the meanings behind them.
 We call the special symbols “logical constants” because, unlike the sentence-letters, they always mean the same thing no matter where they appear.
 There are a few other English words or terms that match up to those symbols too. For example, the string ‘p only if q’ turns out to be equivalent to ‘if p then q.’ And in standard logic, ‘but’ and ‘and’ both follow the same rules; we don’t mention the connotation of contrast that’s inherent in ‘but.’
Of course, in English we normally don’t talk about whether ‘not p’ is true or false; we talk about whether ‘not [some sentence]’ is true or false. That is, we don’t mention ‘p’ itself; we fill it in with whatever sentence we’re talking about. In order to mention a phrase containing a variable but indicate that you’re intending to mention the value of the variable, rather than the name of the variable itself, you can use corner-quotes (i.e. ‘⌜’ and ‘⌝’) for what is sometimes called “quasi-quotation” (Quine 1981: 33-37). To avoid confusion, however, I’ve elected to keep the single quotation marks.
 Note that in English, our word ‘or’ can mean the inclusive or the exclusive ‘or.’ That is, sometimes we use “or” to imply that either or both can be true, while we sometimes use it to imply that only one can be true. For example, if I ask, ‘Have you ever ridden a scooter, or a motorcycle?’ I probably inflect my voice up at the end of my sentence, implying that the answer is ‘yes’ or ‘no,’ and you’d say ‘yes’ if you’d ridden either one, but also, if you’d ridden both at some point. But if I ask ‘Do you want soup or salad?’ at a restaurant, I might inflect my voice down at the end of the sentence, implying that you must choose, and ‘both’ isn’t an option. In standard logic, our disjunction symbol is for the inclusive ‘or,’ that is, it comes out true even if both p and q are true. If we want to indicate an exclusive or, we can use a negation symbol. For example: ‘(A ∨ B) & ¬(A & B).’
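The difference between the two readings of ‘or’ can be checked by brute force over all four combinations of truth-values; a small Python sketch:

```python
# Compare inclusive and exclusive 'or' across all four combinations
# of truth-values for A and B.
for A in (False, True):
    for B in (False, True):
        inclusive = A or B
        exclusive = (A or B) and not (A and B)   # '(A ∨ B) & ¬(A & B)'
        print(A, B, inclusive, exclusive)
# The two agree except when A and B are both true: there the
# inclusive 'or' is true while the exclusive 'or' is false.
```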
This approximation can be particularly confusing. Strictly speaking, the “conditional” symbols (sometimes also called “material conditional” symbols) in standard logic say that it is false that: p is true and not-q is true. In real life, we can almost always read this as ‘if p then q,’ but there are somewhat confusing cases as well. We’ll talk more about those in the entries on truth tables. But for now, in standard sentential logic, a conditional statement is counted true whenever the antecedent (the “if” part) is false, and whenever the antecedent and consequent (the “then” part) are both true. As I said, that can be a little confusing.
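That truth condition can be tabulated directly, reading ‘p → q’ as ‘not (p and not q).’ A hypothetical Python sketch:

```python
# Tabulate the material conditional 'p → q', read as 'not (p and not q)'.
for p in (False, True):
    for q in (False, True):
        conditional = not (p and not q)
        print(p, q, conditional)
# The only row that comes out False is p = True, q = False; in
# particular, the conditional is True whenever p is False.
```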
The symbol is normally used to indicate a contradiction, but for the purposes of proofs, it’s usually enough that the sentence be false. We’ll return to this topic when we look at actual proofs.
 That last line in the table isn’t about the truth of sentences, but about derivability in our system of logic. That may be confusing until we learn about actual proofs using these symbols, so don’t worry too much about it here, but I wanted to include it for completeness. Basically, it says that if you have p on a line and you have q on a line, you can get to r by using the rules of the logic.
Huber, Franz. 2019. A Logical Introduction to Probability and Induction. New York, NY and Oxford, UK: Oxford University Press.
Quine, W. V. 1981. Mathematical Logic, Revised Edition. Cambridge, MA: Harvard University Press.
Sider, Ted. 2010. Logic for Philosophy. New York, NY and Oxford, UK: Oxford University Press.
The author would like to thank the editors of 1000-Word Philosophy, plus Michael Ferry, Andrew Lavin, and Daniel Massey for helpful comments on this essay.
Arguments: Why Do You Believe What You Believe? by Thomas Metcalf
Formal Logic: Symbolizing Arguments in Quantificational or Predicate Logic by Timothy Eshing
The Probability Calculus by Thomas Metcalf
Bayesianism by Thomas Metcalf
Quantum Mechanics and Philosophy III: Implications by Thomas Metcalf
About the Author
Tom Metcalf is an associate professor at Spring Hill College in Mobile, AL. He received his PhD in philosophy from the University of Colorado, Boulder. He specializes in ethics, metaethics, epistemology, and the philosophy of religion. Tom has two cats whose names are Hesperus and Phosphorus. shc.academia.edu/ThomasMetcalf