Tyler Tidwell

How Do We Know?

Updated: Feb 2

Epistemology: A Beginner’s Guide by Robert Martin

Review by Tyler Tidwell

What is knowledge?

Each of us confronts an astounding amount of data each day, often as a jumble of contradictory beliefs and competing truth claims. We unreflectively filter much of this information into various mental bins based on perceived veracity.

For example, your child’s pronouncement that there is a fairy hiding under her bed is automatically filtered into the “mere belief” bin, while the article you read in The Wall Street Journal gets tagged as a “well-informed opinion.” However, when your doctor tells you that you have a tumor in your kidney as he examines an x-ray on his desk, you suddenly pass into the realm of what most of us would call “knowledge.” This is the information presented to us that compels us to believe it even (and especially) if we don’t want to.

But what is the curious nature of this thing we call knowledge? What specifically elevates it above the level of mere belief or opinion? Do I believe certain things count as knowledge when I shouldn’t? And who exactly is it that gets to draw the line between mere opinion and knowledge? Me? Television news? The ostensible experts on a particular subject?

The formalized study of such questions is called epistemology. It may be the most important word you have never heard of, as its relevance to our daily lives is vastly underappreciated. Martin’s primer walks us through the history, ideas, and prominent characters of modern epistemology (1):


Though the question “How do we know what we know?” is as old as philosophy itself, it was not until the seventeenth century that Frenchman René Descartes pried the lid completely off the epistemology can of worms.

He ignited a debate that, as we shall see, only seems to get less settled with time. In one of the great ironies of history, Descartes’ project to shore things up in the knowledge department set in motion a process that would culminate in the 21st century with professional philosophers having highly technical, highly esoteric, and highly serious (to them, at least) debates about how they can be sure that the cup of coffee on the table in front of them is truly there.

Here is a simplified play-by-play of what got epistemology (and, by association, philosophy in general) into this crazy mess:

To put the concept of knowledge on a surer footing, Descartes writes a series of tracts that are summarized and immortalized in the famous saying Cogito, ergo sum: I think, therefore I am.

For Descartes, the bedrock of knowledge is a priori beliefs – beliefs that are supposedly valid without reference to sense experience of external objects or events. Some of the unique characteristics of such beliefs are (purportedly) their indubitability and incorrigibility – that is, they are incapable of being doubted or corrected, unlike our sense perceptions, which seem to be prone to all sorts of errors. For example, we think we see a person in the woods but it is only a shadow, or we think we hear our wife tell us to go watch football but she is actually telling us to go wash the dishes.

Cogito, ergo sum is the most famous example of indubitability: there is no way, says Descartes, that he or anyone else could doubt that they were in the process of thinking. It is simply not possible. Likewise, there is no fathomable way to correct or modify the belief that the interior angles of a perfect triangle sum to precisely 180 degrees – it is incorrigible. According to Descartes, only a priori beliefs provide the level of certainty required to be standing surefootedly on the ground we call “knowledge.” Descartes becomes one of the fathers of the school of Rationalism, in which a priori beliefs reign supreme.

With the wit, style, and accent that only the Scottish can muster, Enlightenment philosopher David Hume delivers a skeptical counterpunch, putting an exclamation point on a line of thinking developed by Thomas Hobbes, John Locke, and George Berkeley: sense experiences – a posteriori beliefs – are the necessary, unavoidable building blocks of knowledge.

Descartes can think about thinking all he wants, but real knowledge comes through our experience and interaction with the external physical world. After all, the idea of a triangle couldn’t possibly be innate to the human mind; it must have originally come from our observation of some triangle-like object in nature. In other words, Descartes’ supposedly abstract a priori ideas are really just sense perceptions dressed up in highly elaborate garb. Hume becomes one of the early champions of the growing school of Empiricism, in which a posteriori ideas are king. For many years epistemology mostly consists of a series of debates between Rationalists and Empiricists.

During the late eighteenth century, Prussian thinker Immanuel Kant attempts to synthesize Rationalism and Empiricism into a single school of thought, creating what is perhaps the greatest (and most perplexing) philosophy project of all time.

A key pillar of this project is the idea of “analytic” and “synthetic” propositions. A satisfactory explanation of these two concepts is beyond the scope of this review, but, in short, their function was to elucidate the interconnectedness of a priori and a posteriori beliefs, suggesting that both were essential building blocks of knowledge. Kant said the Rationalists and Empiricists could finally stop arguing since they were both right; their real error was believing their competing theories to be mutually exclusive. How well Kant wed the Rationalist and Empiricist traditions is debated to this day, partly due to the opaque, highly complex nature of his writings, which has left them open to numerous interpretations. (2)

In the late nineteenth century, German logician Gottlob Frege moves Kant’s ideas of the analytic and synthetic into the realm of linguistics, where standardized sentence structures (which translate nicely into the world of formal logic in which Frege lived) finally seem to provide some clarity to all of Kant’s abstruse concepts. Again, explaining the details of Frege’s method would be tedious, but the takeaway is this: formal logic and the syntax of language enter the scene as gatekeepers and standard-bearers for what passes as knowledge. Many Analytic philosophers feel a true sense of optimism, as they seem to finally be on a path that might lead to real consensus in the world of epistemology. (3)


This optimism is short-lived, however. In the mid-twentieth century, American philosopher Willard Quine publishes an article that, to many, definitively shows the analytic/synthetic concepts to be nothing more than a bunch of – and I’m using a technical term here – mumbo jumbo. Furthermore, Quine argues that epistemology should stop concerning itself with propagating endless theories about what the word “knowledge” means. He claims the only real knowledge is scientific knowledge, and once we move outside the scientific realm, mere beliefs are all we have. (4) For Quine and his acolytes, the implications are clear: epistemology needs to reinvent itself as a branch of natural science, using empirical evidence and observation to describe the belief-formation process as it actually happens in real life – vice trying to establish rules, standards, and laws for what constitutes a “justified” belief or “real” knowledge since, outside of science, there is no such thing. (5)

Before the academic community has fully digested Quine’s ideas, another American named Thomas Kuhn inadvertently turns everything upside down once more. As with Descartes, there is some irony involved with Kuhn. Though he set out to better understand how scientists approach their research, Kuhn’s work actually had far greater (and unintended) implications in the worlds of philosophy, psychology, and anthropology than in science proper. Kuhn’s extended essay, The Structure of Scientific Revolutions, demonstrated that the selection and application of supposedly objective standards among competing scientific theories was often a highly subjective affair, despite scientists’ efforts to act otherwise. This discovery was as surprising as it was embarrassing. Quine had just convinced many philosophers that objective knowledge could only be found in the realm of natural science, but Kuhn seemed to show that subjectivity is an incurable part of any human enterprise, science included.


So, where do things stand now? What defines knowledge? A priori beliefs? A posteriori? Language? Logic? Science? Some of these? All of these? None of these?

Martin’s assessment isn’t a happy one: “This represents the current philosophical state of play: all the varieties of theory have their partisans, who are working to try to get around the objections we’ve looked at – and more.” However, the influence of thinkers like Quine and Kuhn – combined with the apparent insolubility of the above questions – has slowly pushed Western thought from the “modern” to the “postmodern” age.

While the modern age was characterized by the search for universally applicable and accepted standards of truth and knowledge (the search described thus far), our current postmodern age is characterized by a rejection of this search. Instead, the concepts of truth and knowledge are viewed as relative to the situations and environments in which they are discussed (i.e., the real reason you think something is true or constitutes knowledge is because of your race, gender, socioeconomic class, etc.). This explains why certain subject matter that had previously been a slam-dunk in the “knowledge” bin (like mathematics and the natural sciences) has started to be questioned in certain milieux. Understandably, it is with the postmodern frame that Martin brings his story to a close, as the underlying assumptions of postmodernism dissolve the possibility of a systematic epistemology as traditionally conceived.

Of course, this doesn’t mean that epistemology has simply vanished! Though the transition to a postmodern framework has been evident in some areas (painting and literature come to mind), it has not been accepted everywhere. The modern epistemological tradition is alive and well in many academic circles. In fact, as we will discuss below, the premodern tradition is still highly operative in our world as well.


1. While Martin does a great job making epistemology accessible to the lay reader, one could easily read this book, understand all the concepts presented, and then simply ask, “So what? Why should I care? This sounds like a bunch of college professors talking to themselves!”

However, many of the practical aspects of our day-to-day lives are derivations of abstract, theoretical answers to the question “How should we live?” If you live in a communist society, for instance, the practical ramifications of the communist system for your daily life are myriad. However, despite these concrete realities, Communism is essentially a series of highly theoretical beliefs about human nature and how humanity should live. Like all theorists, Marx, Engels, and their later Communist acolytes have marshaled arguments and evidence supporting their theories. Some are a priori; some are a posteriori. Some are synthetic; some analytic. Some are passionate emotional appeals; some are economic data sets.

Is each argument or piece of evidence acceptable, though?

The answer largely depends on your epistemological outlook – even if you’ve never heard of epistemology! If the only evidence supporting Communism you find compelling is the economic data sets, you might just be an empiricist without realizing it.

So, while the epistemological question “How do we know what we know?” may seem detached from the practicalities of our daily lives, we implicitly answer it every time we justify our ideas concerning “How should we live?”

2. Martin fails to mention that the age of Descartes was preceded by roughly a thousand years of Medieval philosophy in which most of the issues discussed above would have made little sense. This premodern tradition held that if truth and knowledge are to have universally accepted standards, then these standards could not possibly come from something so contingent and error-prone as a human mind. Rather, they necessarily had to be characteristics of a supernatural, transcendent realm which we can only grasp with our intellect and study through divine revelation. (6)

Under this paradigm, we can trust our reason and senses to the degree that they are in harmony with this transcendent realm where truth and knowledge reside. It is God Himself who grounds them, and humanity – being made in the divine image – was intended to share in them. For Descartes to look for the ultimate grounding of truth and knowledge inside the human mind would have seemed not just counterintuitive but downright crazy. Why wasn’t Descartes’ project dismissed then?

The answer is complicated, involving numerous factors:

  • The invention of the printing press

  • The Protestant Reformation

  • The Thirty Years War in Europe

  • The rise of modern science

  • The peculiarly mathematical mind of Descartes himself

Numerous historians and philosophers have told this story far better than I ever could. (7) Still, for present purposes, we simply need to recognize that the list of events mentioned above from the 15th through the 17th centuries slowly undermined the premodern epistemological tradition and paved the way for thinkers like Descartes to try something new.

3. The premodern epistemological tradition did not cease to be influential with the rise of the modern tradition. To this day, countless doctors and scientists employ empiricist methods from the modern tradition while simultaneously adhering to the premodern belief that truth and knowledge are grounded in something beyond the human mind. In actuality, the premodern tradition can accommodate and incorporate most of the modern tradition.

This point is worth stressing. Some influential atheists have recently preached a (false) dichotomy between “religion” and “science” whereby we are told that the two are mutually exclusive and we, therefore, must choose the latter over the former. (8) Once we recast this argument in epistemological terms, we quickly see that it simply isn’t true. As demonstrated above, the premodern tradition (“religion”) can serve as a solid foundation for the modern tradition (“science”). To present the two as an either/or proposition is simply a logical fallacy. (9) The only aspects of the traditions which are mutually exclusive are where they find the ultimate grounding of knowledge. The premodern tradition finds it in the divine. The modern tradition tries to find it in the human mind or in techniques and methods developed by the human mind.

However, the lack of consensus within the modern tradition helped give rise to the postmodern tradition in which all concepts of truth and knowledge are called into doubt. (10) In a drastic departure from previous epistemology, the postmodern tradition says that the reason we can’t decisively agree on where to find the grounding of knowledge is because there isn’t any. (11) Instead, we are simply left with the contingent perspective of each group or person (12) – which is precisely why the premodern tradition didn’t look to the human mind for the grounding of knowledge in the first place!

Hopefully, we can all see the irony in this situation. It is akin to two men at night looking for their lost keys out on a dark city street. The first man (who is modern epistemology) says that the only place to look for the keys is under the streetlamps, for these are the parts of the street where the eye can best see. The second man (who is postmodern epistemology) surveys the ground under the streetlamps and, seeing no keys, declares that they do not exist! All the while, premodern epistemology is whispering in the shadows, telling us that there might be more things in heaven and earth than are dreamt of in our narrow philosophy.


(1) The word “modern” is often used in everyday speech to mean something like “right now” or “current times” but in academic fields like philosophy and history it generally refers to the thoughts and events of the 16th to 19th/20th centuries.

(2) Hume’s insistence on the use of sense perceptions and observation raised a serious issue which he fully recognized: how can we really trust our senses when they are clearly so prone to error? It seems that the responsible, default position should be one of radical skepticism. According to Kant, it was Hume who awoke him from his “dogmatic slumber” in questions of epistemology, and an important part of Kant’s work is his attempt to dispel this Humean skepticism. The details of his attempt to do so are incredibly complicated, but one of his salient ideas is that our minds don’t actually conform to reality; rather, it is reality that conforms to our minds. This was a radical philosophical maneuver, and in it we can see an origin story for postmodern “subjectivity” becoming the new arbiter of truth, knowledge, and morality. (I am indebted to my good friend Greg Jurschak for this point).

(3) The history of modern Western philosophy is typically broken up into two broad, concurrent schools: Analytic philosophy and Continental philosophy. The former is characterized by rigorous analysis, math, logic, and linguistics. The latter deals more with emotion, aesthetics, and everyday life.

(4) Quine said our beliefs are best viewed as a giant self-supporting web. When we come up against something new and need to decide if we believe it, we see how well it fits with our already existing beliefs. If it seems to fit well enough, we accept it; if it doesn’t, we don’t (if you’re getting a mental image of interior decorating right now, you’re on the right track). To many philosophers, this smells like heresy – as if each individual’s “web of beliefs” gets to set the rules for what constitutes knowledge.

(5) Quine’s strong endorsement of science as the only true realm of knowledge did not make him an empiricist, however. Whereas empiricism, like other traditional schools of epistemology, seeks to be normative (that is, to set universal standards that tell us how things ought to be), Quine argued that epistemology can only ever be descriptive (that is, it can only describe how things actually happen). This is not mere academic hair-splitting. The shift from normative to descriptive endeavors within any subject has profound effects.

(6) For those in the premodern tradition, “divine revelation” did not simply mean inspired scripture, as the term often connotes today. Rather, it also consisted of the totality of nature and history. For example, see the Apostle Paul’s letter to the Romans, 1:20, or Emanuel Swedenborg’s doctrine of correspondences.

(7) Perhaps the most thorough explanation of how the Cartesian project became plausible is A Secular Age by Charles Taylor.

(8) Richard Dawkins and Daniel Dennett are two examples of prominent atheists who argue that if science is true then supernatural religion must necessarily be false. This, however, is an argument that many atheist intellectuals have declined to hang their hats on. Some have even gone so far as to attack Dawkins for peddling the religion/science dichotomy as a viable foundation for atheism when, as discussed above, it clearly is not. To be fair, this doesn’t mean that atheists haven’t marshaled other, stronger arguments worth analyzing.

(9) Just because “religion” and “science” are not mutually exclusive does not mean that they are always in perfect harmony. Yet it is from nuanced and specific instances of disharmony that the case for absolute disharmony is erroneously derived.

(10) The rise of the postmodern tradition in Western philosophy is of course a much more complicated story, and it was far from being a purely academic affair. After the destruction and carnage of two world wars, many Europeans were disillusioned enough to cast aside the search for universal standards of truth and knowledge for reasons that were probably more emotional and psychological than intellectual.

(11) There is usually one concept that survives the relativism of postmodernism: power. In a world of no universal truth or knowledge, many postmodern theorists turn to power as the new organizing principle of analysis. Instead of asking if a proposition is true, we instead ask who gains power from the proposition. It is through this vein of thought that something like mathematics can be construed as a tool of cultural oppression vice simply a useful process for understanding nature.

(12) A primary critique of the postmodern tradition is that, in order to claim that there is no ultimate grounding of truth and knowledge, the speaker must stand on the very ground he or she says does not exist. For if all truth and knowledge is partial and doubtful, then so are the proclamations of postmodernism, and we need not take them seriously! So, the postmodern tradition as typically formulated is self-refuting. In order not to be, it must make the exact type of claim it supposedly loathes from the modern tradition: that there is a grounding for truth and knowledge, and I am the one standing on it, not you.

Tyler Tidwell is a retired Marine who lives in the Oklahoma City area with his wife and three children.
