Bernhard Siegert is currently in residency at New York University’s Department of Media, Culture and Communication, where he’s giving a series of seminars and talks, including the LeBoff Public Lecture last Thursday, April 7, entitled “Codes and Coding.” The material for the talk, as Siegert explained in personal correspondence, was originally intended to be a chapter in Passage des Digitalen, but was left out for reasons of time.
The talk focused in particular on the British engineer and mathematician Oliver Heaviside, with a parallel interest in postwar discussions about the ontological status of the digital within the context of early computing. Siegert began aphoristically, splitting the difference between Kittler and Wittgenstein: “Coding determines everything that is the case,” and according to Wittgenstein, that is the world (i.e., “The world is everything that is the case”). In a brief reflection on Kittler’s contribution, he noted that critics had oversimplified Kittler’s arguments into techno-determinism, and that the “cultural technical turn” began with Kittler. Every culture begins with the introduction of distinctions: human/animal, male/female, etc., and a focus on cultural techniques allows us to consider the role of “the third” that precedes “the first and second” in these binaries; in other words, the ground that allows these distinctions to be made in the first place. The primary function of media is the articulation of difference: creating distinctions that may (but need not) be used for making meaning. The study of cultural techniques thus examines how differences are articulated, processed, and made operational. This makes hybrid and emergent states, where these articulations and distinctions are still being made, particularly interesting. A key question, in this talk and for Siegert more generally, is “the question of the digital”: what should a cultural-technical approach to the digital and digital media look like?
“Coding is the most basic cultural technique of the digital age.” Two major focal points served to illustrate the emergence of this cultural technique: first, the debates of the Macy Conferences (1946-1953), “the founding documents of cybernetics,” and especially their debates about the ontology and meaning of the digital; and second, the work of Oliver Heaviside, whose work on telegraphy and circuits led him to develop an operational calculus that transformed differential equations into a form of algebra, with particular emphasis on “step functions” with discrete, discontinuous values (e.g., the function is 0 for a negative argument and 1 for a positive one). The first point, the Macy Conference debates (especially in 1950, when the conference was renamed with an explicit emphasis on cybernetics), focused on the particular question: is the digital real or symbolic? (Most engineers: symbolic; neurologists: real.) Norbert Wiener took a middle-ground position, “baptizing” the analog as “continuously coded” and the digital as “discretely coded.” His explanation described signals in terms of time. The two stable states are regarded as real; the continuity between them is effectively a non-reality. As one participant (Julian Bigelow?) described, it’s not enough simply to have two states, A and B: there must also be some “forbidden ground” between them that is never assigned a value. The “sacred” is both consecrated to the gods and afflicted with perpetual blemish; it’s a black box or forbidden zone that humans can’t access. Coding thus becomes the encapsulation of fuzziness. As in Claude Shannon’s master’s thesis, “A Symbolic Analysis of Relay and Switching Circuits,” the question is one of givenness: “at any given time” (i.e., circumscribed by certain prohibitions). The formulation ‘at any given time’ is like a black box that, when opened, shows the workings of cultural techniques that otherwise remain invisible.
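The step function at the center of both narratives can be sketched in a few lines. As a minimal illustration (the function name and the choice to leave the value at zero unassigned are mine, not Siegert’s or Heaviside’s), note how the single point between the two stable states plays the role of the “forbidden ground” the Macy participants described:

```python
def heaviside_step(x):
    """Heaviside's unit step: 0 for a negative argument, 1 for a positive one.

    The value at exactly x == 0 is deliberately left unassigned here,
    echoing the 'forbidden ground' between the two stable states that
    the Macy Conference participants insisted on: a zone that is never
    given a value.
    """
    if x < 0:
        return 0
    if x > 0:
        return 1
    raise ValueError("H(0) is left undefined: the 'forbidden ground'")

print(heaviside_step(-3.2))   # 0
print(heaviside_step(0.001))  # 1
```

In engineering practice the point at zero is usually assigned some conventional value (0, 1, or 1/2); refusing to assign one, as above, is simply a way of making the “never assigned a value” condition visible in code.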
The second narrative, focusing on Heaviside, considers signals not as functions of time but rather of frequency. (Here Siegert gave a “history of coding in two minutes,” emphasizing coding as a signaling process dating back to the Greek development of the phonetic alphabet, which allowed permutation, combination, etc. He then leaped forward past fire telegraphy to the Chappe brothers’ optical telegraph of the 1790s, up to Soemmering’s attempts at electric telegraphy and on to Morse, with his emphasis not on secrecy but on efficiency, and William Thomson’s formulations of signal delays in 1855, just after the Crimean telegraph lines were laid.) By the time of Michael Faraday’s work on electricity (c. 1820s), Newtonian physics was already struggling to adequately account for electricity. The discovery of induction and of alternating current required new understanding, which would be provided by the novel and unorthodox thinking of Heaviside. In forging his new idea of self-inductance to limit the effects of noise, Heaviside made many enemies in the scientific community (e.g., professors and members of the Royal Society). His famous step function (with possible values 0 and 1) “corresponds to nothing in analog reality” yet remained cutting-edge and controversial all the way to the 1950 Macy Conference gathering. Siegert’s concluding theses included: the nonsensical becomes the non-existent; and the undefinable becomes the basis for the possibility of notation as such.