An example of the social construction of knowledge in physics: the speed of light

The graph below shows historical estimates of the speed of light, c, alongside uncertainty intervals (Klein & Roodman, 2005, Figure 1). The horizontal line shows the currently agreed value, now measured with high precision.

Note the area I’ve pointed to with the pink arrow, between 1930 and 1940. These estimates are around 17 km/s too slow relative to what we know now, but with relatively high precision (narrow uncertainty intervals). Some older estimates were closer! What went wrong? Klein and Roodman (2005, p. 143) cite a post-mortem offering a potential explanation:

“the investigator searches for the source or sources of […] errors, and continues to search until he [sic] gets a result close to the accepted value.

“Then he [sic] stops!”

A fantastic case study illustrating the social construction of scientific knowledge, even in the “hard” sciences.

References

Klein, J. R., & Roodman, A. (2005). Blind analysis in nuclear and particle physics. Annual Review of Nuclear and Particle Science, 55, 141–163. doi: 10.1146/annurev.nucl.55.090704.151521 [preprint available]

Dedekind on natural numbers

The “standard model” of arithmetic is the idea you probably have when you think about natural numbers (0, 1, 2, 3, …) and what you can do with them. So, for instance, you can keep counting as far as you like and will never run out of numbers. You won’t get stuck in a loop anywhere when counting: the numbers don’t suddenly go 89, 90, 91, 80, 81, 82, … Also 2 + 2 = 4, x + y = y + x, etc.

One of the things mathematicians do is take structures like this standard model of arithmetic and devise lists of properties describing how it works and constraining what it could be. You could think of this as playing mathematical charades. Suppose I’m thinking of the natural numbers. How do I go about telling you what I’m thinking without just saying, “natural numbers” or counting 0, 1, 2, 3, … at you? What’s the most precise, unambiguous, and concise way I could do this, using principles that are more basic or general?

Of the people who gave this a go for the natural numbers, the most famous are Richard Dedekind (1888, What are numbers and what should they be?) and Giuseppe Peano (1889, The principles of arithmetic, presented by a new method). The result is called Peano Arithmetic or Dedekind-Peano Arithmetic. What I find interesting about this is where the ideas came from. Dedekind helpfully explained his thinking in an 1890 letter to Hans Keferstein. A chunk of it is quoted verbatim by Hao Wang (1957, p. 150). Here’s part:

“How did my essay come to be written? Certainly not in one day, but rather it is the result of a synthesis which has been constructed after protracted labour. The synthesis is preceded by and based upon an analysis of the sequence of natural numbers, just as it presents itself, in practice so to speak, to the mind. Which are the mutually independent fundamental properties of this sequence [of natural numbers], i.e. those properties which are not deducible from one another and from which all others follow? How should we divest these properties of their specifically arithmetical character so that they are subsumed under more general concepts and such activities of the understanding, which are necessary for all thinking, but at the same time sufficient, to secure reliability and completeness of the proofs, and to permit the construction of consistent concepts and definitions?”

Dedekind spelt out his list of properties of what he called a “system” of N. Key properties are as follows (this is my paraphrase except where there is quoted text; also I’m pretending Dedekind started the numbers at zero when he actually started at one):

  1. N consists of “individuals or elements” called numbers.
  2. Each element of N is related to another element by a relation (now called the successor): intuitively, “the number which succeeds or is next after” a number. But remember that we don’t have “next after” in this game. The successor of an element of N is another element of N. This captures part of the idea of counting along the numbers.
  3. If two numbers are distinct, then their successors are also distinct. So you can’t have, say, 3 as the successor of 2 and also 3 as the successor of 4.
  4. Not every element of N is the successor of an element.
  5. In particular, zero isn’t a successor of any element.
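
As a rough symbolic sketch (my notation and numbering, not Dedekind’s), writing S for the successor, properties 2–5 say:

\[
\begin{aligned}
&\text{(2)} \quad S : N \to N \\
&\text{(3)} \quad \forall m, n \in N: \; m \neq n \implies S(m) \neq S(n) \\
&\text{(4)} \quad \exists n \in N \; \forall m \in N: \; n \neq S(m) \\
&\text{(5)} \quad \forall m \in N: \; 0 \neq S(m)
\end{aligned}
\]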

Dedekind notes that there are many systems that satisfy these properties and have N as a subset but also have arbitrary “alien intruders” which aren’t the natural numbers:

“What must we now add to the facts above in order to cleanse our system […] from such alien intruders […] which disturb every vestige of order, and to restrict ourselves to the system N? […] If one assumes knowledge of the sequence N of natural numbers to begin with and accordingly permits himself an arithmetic terminology, then he has of course an easy time of it. […]”

But we aren’t allowed to use arithmetic to define arithmetic. Dedekind explains again the intuitive idea of a number being in N if and only if you can get to it by starting at 0 and working along successors until you reach that number. This he formalises as follows:

  6. An element n belongs to N if and only if n is an element of every system K such that (i) the element zero belongs to K and (ii) the successor of any element of K also belongs to K.
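
In symbols (again my paraphrase rather than Dedekind’s own notation):

\[
n \in N \iff \forall K \left[ \big( 0 \in K \ \wedge\ \forall k \, (k \in K \implies S(k) \in K) \big) \implies n \in K \right]
\]

In other words, N is the smallest system that contains zero and is closed under the successor, which is exactly what shuts out the alien intruders.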

So, we get the number 0 by 6(i), the number 1 by 6(ii) since it’s the successor of 0, the number 2 by applying the successor to 1, and so on, generating the infinite set of natural numbers. This property is also the basis of what we now call mathematical induction.
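
To see the link, suppose a property P holds of zero and is passed from each number to its successor. Then the system K = { n ∈ N : P(n) } satisfies 6(i) and 6(ii), so by property 6 every element of N is in K. In modern notation (a sketch, not Dedekind’s formulation):

\[
\big( P(0) \ \wedge\ \forall k \in N \, (P(k) \implies P(S(k))) \big) \implies \forall n \in N \, P(n)
\]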

There are a few issues with Dedekind-Peano Arithmetic, though – for another time…

“The tendency of empiricism, unchecked, is always anti-realist…”

“The tendency of empiricism, unchecked, is always anti-realist; it has a strong tendency to degenerate into some form of verificationism: to treat the question of what there is (and even the question of what we can – intelligibly – talk about) as the same question as the question of what we can find out, or know for certain; to reduce questions of metaphysics and ontology to questions of epistemology.”
—Strawson, G. (1987, p. 267)

Strawson, G. (1987). Realism and causation. The Philosophical Quarterly, 37, 253–277.

Theories explain phenomena, not data (Bogen and Woodward, 1988)

“The positivist picture of the structure of scientific theories is now widely rejected. But the underlying idea that scientific theories are primarily designed to predict and explain claims about what we observe remains enormously influential, even among the sharpest critics of positivism.” (p. 304)

“Phenomena are detected through the use of data, but in most cases are not observable in any interesting sense of that term. Examples of data include bubble chamber photographs, patterns of discharge in electronic particle detectors and records of reaction times and error rates in various psychological experiments. Examples of phenomena, for which the above data might provide evidence, include weak neutral currents, the decay of the proton, and chunking and recency effects in human memory.” (p. 306)

“Our general thesis, then, is that we need to distinguish what theories explain (phenomena or facts about phenomena) from what is uncontroversially observable (data).” (p. 314)

Bogen, J., & Woodward, J. (1988). Saving the phenomena. The Philosophical Review, 97(3), 303–352.

The aim of critical realist philosophy

“The aim of critical realist philosophy is, when the practice is adequate, to provide a better or more adequate theory of the practice; and, when it is not, to transform the practice in the appropriate way. That is to say the aim of critical realist philosophy is enhanced reflexivity or transformed practice (or both). […]

“Since there is only one world, the theories and principles of critical realist philosophy should also apply to our everyday life. If they do not, then something is seriously wrong. This means that our theories and explanations should be tested in everyday life, as well as in specialist research contexts.”

—Bhaskar, R. (2013). The consequences of the revindication of philosophical ontology for philosophy and social theory. In M. Archer & A. Maccarini (Eds.), Engaging with the world (pp. 11–21). London: Routledge.

The social model of disability as a case study of social ontology

[Image: a staircase. Photo by Alessia Cocconi on Unsplash.]

Social ontology is a branch of philosophy that tries to understand the building blocks of the social world. Debates in social ontology can be abstract and seem pointless. Even defining social ontology, and how it differs from, say, sociology, is a challenge (see Epstein, 2021). A case has even been made to “rid social sciences of ontology altogether – of all philosophized metaphysics of how the social world is” (Kivinen & Piiroinen, 2007, p. 99). This brief post tries to show why social ontology is important, using the social model of disability as an example.

In 1975, the Union of the Physically Impaired Against Segregation, a group of disability activists, published a series of fundamental principles which challenged the ontology of disability:

“In our view, it is society which disables physically impaired people. Disability is something imposed on top of our impairments, by the way we are unnecessarily isolated and excluded from full participation in society. Disabled people are therefore an oppressed group in society. […] For us as disabled people it is absolutely vital that we get this question of the cause of disability quite straight, because on the answer depends the crucial matter of where we direct our main energies in the struggle for change. We shall clearly get nowhere if our efforts are chiefly directed not at the cause of our oppression, but instead at one of the symptoms.”

Here a distinction is made between impairment and disability. From this perspective, it makes no sense to say that someone “has a disability”; individual people can have impairments, but it is society that determines whether someone is disabled. A vivid example of this is how common it still is for buildings not to be wheelchair accessible – or only partly so, e.g., wheelchair users can enter a building but not use its toilets. Note how the conceptualisation is used to unite people behind a social struggle. It has a practical purpose rather than only adding to our knowledge.

A related example is illustrated by the difference between deaf and Deaf identity:

“To be ‘deaf’ (small d) is to fit into the medical definition of deafness as something to be cured and eradicated. Being deaf means you have a hearing loss, but you choose or don’t feel able to function within the Deaf Community. […] Deaf – with a capital “D” (and occasionally with capital E, A and F too) – is used to refer to people who are culturally Deaf. These people actively use British Sign Language; they see themselves as being culturally Deaf and part of the Deaf community. […] I consider myself to be culturally Deaf; this is my Deaf Identity. […] I don’t see it as a disability – there is nothing I feel I cannot do – rather, I see it as an important aspect of my character that makes and shapes me.”

These conceptualisations of impairment and disability, social barriers, adjustments, aids, community, and Deaf identity concern social ontology. Debates on these topics occur naturally in social struggles and in discussions of social policy and identity, whether or not they are explicitly articulated as being about ontology. They also have clear implications for how social science is carried out and how research findings are used.

References

Epstein, B. (2021). Social Ontology. The Stanford Encyclopedia of Philosophy (Winter 2021 Edition), Edward N. Zalta (ed.).

Kivinen, O., & Piiroinen, T. (2007). Sociologizing Metaphysics and Mind: A Pragmatist Point of View on the Methodology of the Social Sciences. Human Studies, 30, 97–114.

Measurements presuppose theories

(Cited in Gillies’ “Philosophical Theories of Probability”):

“Against this view [operationalism] it can be shown that measurements presuppose theories. There is no measurement without a theory and no operation which can be satisfactorily described in non-theoretical terms. The attempts to do so are always circular; for example, the description of the measurement of length needs a (rudimentary) theory of heat and temperature-measurement; but these in turn involve measurements of length.” (Popper, Conjectures and Refutations, 1963)

Distress

“The address that Moore delivered to the British Academy, entitled ‘Proof of an External World,’ caused him a great deal of torment in its preparation. He worked hard at it, but the concluding portion displeased him, and he could not get it right as the time approached for his appearance before the Academy. On the day of the lecture he was still distressed about the ending of the paper. As he was about to leave the house to take the train to London, Mrs. Moore said, in order to comfort him, ‘Cheer up! I’m sure they will like it.’ To which Moore made this emphatic reply: ‘If they do, they’ll be wrong!’” [Source]

The role of measurement in science

“The road from scientific law to scientific measurement can rarely be traveled in the reverse direction. To discover quantitative regularity one must normally know what regularity one is seeking and one’s instruments must be designed accordingly; even then nature may not yield consistent or generalizable results without a struggle. […] I venture the following paradox: The full and intimate quantification of any science is a consummation devoutly to be wished. Nevertheless, it is not a consummation that can effectively be sought by measuring. As in individual development, so in the scientific group, maturity comes most surely to those who know how to wait.” (Kuhn, 1961, pp. 189–190)

Kuhn, T. S. (1961). The function of measurement in modern physical science. Isis, 52(2), 161–193.