On the map we’re situated in western hegemonic philosophy: geographically, Germany, France, England, and so on. It’s useful to spend time with analytical philosophy because it is the lingua franca for philosophy; its terms provide the basis for philosophical discourse in English (and German, and French).

The analytic tradition emerged in the early twentieth century with a focus on “clarity”, “logic”, language analysis, and conceptual “rigour”. I’m going to stop quoting things now, because otherwise everything will end up in quotes. I’m using these terms with a subtle eye roll, but language still has meaning, and it’s necessary to be aware that within this tradition are discursive norms that enable the field to be positioned as natural, essential, and dominating. It is not natural, nor is it essential.

Prior to analytical philosophy, organised religion had served as the ruling common sense across Europe. In the early eighteenth century, Anglican theological common sense in particular rested on the assumption that God had created a stable, orderly, material world whose existence was independent of human perception. This assumption was used to justify a wide variety of atrocities carried out “in the name of god”. The worldview treated matter as the anchor of divine design: physical objects existed “out there,” sustained by God but not reducible to ideas in the mind. It was a quiet metaphysical hegemony, rarely articulated because nobody (“important”) asked, linking everyday realism to a broader religious narrative about creation, providence, and moral order. This metaphysical order, in turn, reinforced the political organisation of the time.

Aristotelian philosophy, which emerged out of Plato’s Academy in the 4th century BCE, fundamentally shaped Western thought through its systematic approach to knowledge, causation, and ethics. Aristotle’s doctrine of hylomorphism (matter and form), his four causes, and his teleological worldview provided a comprehensive framework that dominated medieval scholasticism and continues to influence contemporary virtue ethics and neo-Aristotelian metaphysics. Yet this system naturalised hierarchies through its concept of natural slavery, relegated women to inferior status through claims about their supposedly defective rationality, and supplied a teleological essentialism that has been weaponised to oppose social change by treating contingent social arrangements as natural end points.

Cartesian philosophy, initiated by Descartes in the 17th century, revolutionised epistemology through methodological scepticism and the cogito, while its substance dualism created the modern mind-body problem that occupies philosophy of mind to this day. Descartes’ mechanistic view of nature enabled “modern science” by treating the physical world as mathematically describable extension, though this came at the cost of rendering consciousness inexplicable within his own system. Politically, Cartesian dualism served colonial projects by denying interiority to those deemed less rational, while its radical separation of mind from body reinforced gendered divisions between reason and emotion.

Both of these philosophies, despite their undeniable contributions to “systematic thought”, show that metaphysical frameworks inevitably encode and perpetuate the power relations of their time, whether through Aristotle’s natural hierarchies or Descartes’ disembodied rationality that conveniently resembled the European male subject.


The collapse of successive orthodoxies

In the late 17th century, alongside Anglican common sense, Newtonian science was rising as a cultural force, offering a mathematically “precise” universe governed by forces and laws rather than scholastic metaphysics. While the Church largely adopted Newton (given his piety and the theological usefulness of his “natural law”), the shift was not frictionless. Newton’s world could, in principle, run without continual divine intervention, and this provoked unease: the more one relied on mechanical laws, the more precarious the Church’s role as epistemic authority became. In this tension-filled environment, Locke became a pivotal figure, grounding knowledge in experience while maintaining a realist ontology. Ideas, for him, were in the mind, yes, but still securely anchored to an external material world created by God.

Newton had an outsized impact on philosophy, mathematics, and physics, shifting us from Aristotelian and Cartesian dominance to the rise of experimental philosophy. His co-discovery of calculus and his groundbreaking work in physics, detailed in the Principia and the Opticks, revolutionised the study of nature. Newton challenged Cartesian vortex theory with universal gravitation, argued for absolute space and time against Leibniz’s relational view, established mathematical laws of motion that grounded Hume’s later causation debates, and pioneered an experimental method combining mathematics with empirical observation. His rejection of purely hypothetical reasoning and his insistence on mathematical and experimental methods set a new standard for scientific inquiry, while his concept of absolute space, his theory of universal gravitation, and his views on divine action in nature sparked significant debates, particularly with Leibniz, shaping philosophical discourse for centuries.
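To give a flavour of what that mathematical precision meant in practice, here is universal gravitation in modern notation (a later rendering: Newton stated the proportionality geometrically in the Principia, and the constant G was only measured much later):

```latex
% Universal gravitation: any two masses attract with a force
% proportional to the product of the masses and inversely
% proportional to the square of the distance between them.
F = G \, \frac{m_1 m_2}{r^2}
```

One relation covers falling apples and orbiting planets alike: no vortices, no natural places, just measurable quantities bound in a fixed mathematical form.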

Newton’s experimental philosophy became the foundation of early 18th-century thought, merging Anglican theology with mathematical physics in what seemed an unshakeable synthesis. Matter obeyed mathematical laws, God sustained the cosmic order, and empirical observation revealed truth. This consensus dominated British intellectual life, with Locke providing its epistemological justification: we know the material world through ideas caused by external objects. But this apparent stability contained its own contradictions.

Berkeley enters here as an insider-critic, taking empiricism further than Locke or the Anglican-Newtonian mainstream was willing to tolerate. If all we ever perceive are ideas, he asks, why assume an invisible material substance behind them? His immaterialism directly challenged the twin pillars of the age:

→ Newtonian matter; and

→ Anglican common-sense realism

all while trying to preserve theological order by making God the guarantor of perceptual stability. The effect was destabilising: Berkeley exposed the philosophical fault lines that the hegemonic synthesis of theology, science, and empiricism had been papering over.

Into these fractures step Hume and Kant, each responding to the crisis Berkeley sharpened. Hume accepts the empiricist starting point but drops Berkeley’s theological safety net, revealing how thin the foundations of causation, selfhood, and external-world certainty are. Kant then attempts to rebuild stability by relocating the conditions of order from God or matter to the structures of the mind. In this arc, Berkeley becomes the hinge figure, one who forces the transition from early modern realism to the radical critiques and transcendental reconstructions that shape the next century of philosophy.

Kant’s transcendental idealism dominated philosophy for a century, spawning German and British Idealist movements that made mind or spirit the fundamental reality. By the late 19th century, British Idealists like Bradley and McTaggart had pushed this logic to its limit, arguing that ordinary distinctions between self and world, even space and time, were appearances of an underlying Absolute. This idealist consensus seemed as secure as the Newtonian one Berkeley had challenged. Yet it too contained fatal tensions. Where Berkeley had asked why assume matter behind ideas, Moore would ask why assume ideas swallow their objects.

In “The Refutation of Idealism” (1903), Moore argues that Idealists (that previous wave of British philosophy) fall into a self-contradictory error by failing to distinguish between an object and the experience of that object. He proposes that sensations or ideas consist of two distinct elements:

→ consciousness (the common element); and,

→ the object (the differing element).

Moore contends that identifying the object with the sensation itself is a fundamental mistake. He asserts that we are directly aware of material things in space just as we are of our own sensations, and that we have the same evidence for the existence of both. Therefore, if there’s a reason to doubt the existence of matter because it’s an inseparable aspect of our experience, the same reasoning would disprove the existence of our experience itself. Moore concludes that the assumption “esse est percipi” (to be is to be perceived) is unfounded, and that without valid reasons for believing in Idealism, doctrines like Idealism and Agnosticism are as baseless as superstitions.

Read the Refutation of Idealism for more.


From metaphysics to language

Later, Bertrand Russell’s essay “On Denoting” (1905) provided a foundational theory of definite descriptions. Russell shows how statements like “the present King of France” might be analysed logically so that we avoid ontological confusion. The concern here was linguistic.

Russell introduced the theory of descriptions, transforming how analytical philosophers understand language, reference, and meaning. He showed that definite descriptions (“the present King of France”) don’t directly refer to objects but can be analysed as quantificational expressions: claims about existence and uniqueness (the logical form is sketched after the list below).

→ Shifted philosophy toward linguistic analysis, laying foundations for analytic philosophy and logical atomism.
→ Shaped logic, semantics, and epistemology, engaging Frege and influencing Wittgenstein, Strawson, and later philosophy of language.
→ Provided a model of philosophical method, showing how precise logical form can clarify long-standing conceptual problems.
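To make that concrete, here is the standard first-order rendering of Russell’s analysis (a textbook reconstruction, not his 1905 notation):

```latex
% "The present King of France is bald", on Russell's analysis:
% (1) something is presently King of France,
% (2) at most one thing is, and
% (3) that thing is bald.
\exists x \, \bigl( K(x) \land \forall y \, ( K(y) \rightarrow y = x ) \land B(x) \bigr)
% where K(x) abbreviates "x is presently King of France"
% and B(x) abbreviates "x is bald".
```

Since nothing satisfies K(x), the whole sentence comes out false rather than meaningless, and we never need to posit a shadowy non-existent King for the description to refer to. That is the ontological confusion the analysis avoids.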

His approach is formal, analytic, and linguistic, seeking to clarify philosophical problems through logic and the precise structure of propositions. He doesn’t care about politics; his interests lie with the status quo.

Read Stevens - 2018 - Russell on Denoting and Language for more.


Analytical philosophy turns on itself

Quine in “Two Dogmas of Empiricism” (1951) attacks the distinction between analytic and synthetic truths and calls into question the reduction of meaningful statements to constructions out of immediate experience. This work marks the analytic tradition’s engagement with its own foundations and the philosophy of science. It’s meta, but it’s meta for white men (weak sauce™).

The first dogma is the analytic-synthetic distinction: Quine basically argues that this distinction doesn’t make any sense (from an empiricist view) and that the traditional ways of defending it fail. The most obvious defence would be that we define “bachelor” as “an unmarried man” and can therefore know, simply from these definitions, that bachelors are unmarried men. But Quine objects (sort of; this is the most technical part of the paper) that the empiricist doesn’t get to base knowledge on definitions and claim they are separable from experience: words are not things that exist apart from experience. I get my definitions from a dictionary, and I could be wrong about them; they are just like anything else I get from my senses. “Analytic” sentences, therefore, would be judged using the same senses we use to tell whether the cat is meowing right now. In other words, I could be wrong about the definition of “bachelor”, and the only way I could tell is through empirical means.

The second dogma (and, some argue, the one needed to support the first conclusion) is that we can isolate the truth of individual sentences, so that when confronted with experience x I can deny (or confirm) sentence y on its own. Quine claims that all the things I believe to be true are connected. This is the “web” he talks about: I can continue to hold a proposition (including ones about abstract objects) anywhere on the web so long as I am willing to give up enough of the other things I hold true. Not all beliefs are equally revisable: some sit closer to the centre of the web, and revising them would force us to give up more of the others.

The essential idea is that Quine was arguing against a group of theories that wanted to treat certain propositions, whether analytic or sensory, as foundational: theories on which, if we get these propositions right, we can figure everything else out from there. What I interpret Quine to be saying, at his most basic, is that we cannot separate those foundational sentences from the other sentences (or theories) we would want to build on top of them. Such a point probably doesn’t make much practical difference to science (i.e., to people who just are empiricists), but it might make a larger difference to our theories of epistemology and language and our understanding of how science works.

Quine is arguing against a very specific form of empiricism. Beyond “he looked at the meta” (a bit, and then claimed he was the guy who invented that move), his usefulness to us is basically exhausted.

Read Quine - 1951 - Main Trends in Recent Philosophy Two Dogmas of Empiricism for more.


Politics and the analytic tradition

While Quine’s Two Dogmas is typically read as a purely technical critique of the analytic/synthetic distinction, its impact coincided with a profound political reorganisation of American philosophy. The collapse of logical positivism did not occur in isolation—it intersected with McCarthyism, the rise of Cold War technocracy, and the professionalisation of academic departments in ways that reshaped what counted as legitimate philosophical work. The analytic tradition’s characteristic neutrality, formalism, and technical precision were not simply intellectual virtues but historically situated responses to institutional pressures and political constraints.

For a fuller account of how Cold War politics, RAND-style rationality, and the disciplining of academic dissent shaped the development of analytic philosophy after Quine, see Political themes in analytic philosophy after Quine.


Beyond analytic philosophy

While analytical philosophy provides conceptual tools and establishes discursive norms for anglophone philosophical work, its systematic depoliticisation and technical orientation leave crucial questions unaddressed. The tradition’s emphasis on logical clarity and scientific rigour proved congenial to Cold War technocracy but poorly equipped to analyse contemporary forms of power and domination.

Critical theory and cultural studies engage questions that analytical philosophy systematically avoids. How does power operate through apparently free choices? What mechanisms enable exploitation without visible coercion? How do digital technologies transform subjectivity and control? These questions require different methodological approaches from those analytical philosophy typically provides.

The genealogical methods developed by Foucault, the critical analysis of capitalism advanced by the Frankfurt School, and contemporary work on psychopolitics and emotional capitalism offer complementary frameworks. These approaches examine how philosophical concepts and academic practices serve political functions, how knowledge production relates to power, and how contemporary capitalism operates through psychological rather than merely physical domination.

Understanding analytical philosophy’s political context and historical specificity enables productive engagement whilst avoiding both uncritical acceptance and wholesale rejection. The tradition’s contributions to logic, language and epistemology remain valuable. Its systematic avoidance of political economy, structural power and historical situatedness marks significant limitations requiring supplementation from critical traditions.