March 16, 2012

Measurement, Matter, & Meaning

Today's results suggesting that neutrinos cannot, in fact, travel faster than the speed of light should have surprised no one; there were too many variables, too much opportunity for human error, and too little reason to believe that such a result could be explained without a profound revision of our understanding of special relativity. The entire episode might not have played out so spectacularly in the media if not for the open nature of scientific data-sharing; certainly, physicists were reluctant to assume anything prior to rigorous empirical verification. I suspect that, most of the time, errant results like the too-fast neutrino would remain out of the public eye until seriously considered viable by a large swath of the scientific community.

Meanwhile, three months after the initial Higgs results were announced from CMS and ATLAS, further data analysis still points strongly toward a Standard Model Higgs somewhere around 125 GeV/c². This is significant, again, because of the nature of subatomic experimentation: both the theoretical framework of quantum field theory and the machinery designed to execute experiments that might support it are unbelievably complex. No one wants to crow about the discovery of the Higgs without being absolutely sure--and even that is complicated. Most of the predictions of the Standard Model have been verified to within an extremely small margin of error, but this kind of new physics simply may not yield such certainties. Brian Greene writes about this as it relates to multiverse theory: eventually, the culture of physics may have to adjust to the fact that our current language of investigation--be it mathematical, experimental, or both--lags behind, requiring (ahem) faith in the numbers.

It's a tricky epistemological state: traditional models of knowledge construction require experimental verification, period. Just ask Alan Sokal. But one of the things that fascinates me most about our relationship with information is that the very act of measurement affects the matter under analysis. Niels Bohr writes at length about this, arguing that bodies have no inherent boundary characteristics and that the apparatus necessarily shapes the measurement. The wave-particle duality of light is a classic example of the fact that objects can exhibit contrasting properties under different experimental circumstances (the scientific term for this is "complementarity"). Bohr, rather than concluding that there are different kinds of light, proposed a theory of the phenomenal whole: waves and particles are simply differing phenomenal expressions of light, enacted by differing experimental cuts. Therefore, the experimental apparatus itself--and, logically, its human operator--always affects the experimental result.

Karen Barad takes Bohr's philosophy one step further, arguing that matter and meaning are mutually constituted and constantly reconfigured. She calls the onto-epistemological differences in phenomenal expression "agential cuts," or articulations of agency based on local material definitions. Light as a wave is the result of one particular agential cut; light as a particle is another. This logic can be applied infinitely, I think: the Higgs boson is both a mathematical abstraction and a simulated visualization at CERN; gravity may be an expression of both cosmology and quantum physics.

I could write about this forever--but you can see how experiments with neutrinos and bosons, which require the most conceptually advanced and technically sensitive apparatus in the history of science, imply quite a bit about the co-constitutive nature of matter (neutrinos) and meaning (whether they can travel faster than the speed of light). Implicit in every scientific endeavor is, ultimately, knowledge; and as any fan of Foucault will tell you, knowledge is power, and power is the ultimate material. Just like quantum jitters, if you look closely enough, it's always there: you just need the ability to measure it.
