How did we get here? Notes on Orit Halpern’s Beautiful Data (part 1)

Our contemporary media environments were not technically determined; aesthetic practices were central to the production of the condition of possibility for the acceptance not only of computing but also of our contemporary ideas about psychology, attention, and space (Halpern, 2014, p.143).

In some respects, Orit Halpern’s (2014) Beautiful Data traces the rise of dataism – the mindset or philosophy of big data that underpins many of the operations of contemporary society. Like Ian Hacking’s The Taming of Chance (find my blog post on this book here) and James Gleick’s (2011/2012) The Information, it helps to explain how data and information have become so integral to the operation of contemporary society. Critical scholars today are concerned about the nonsensical, and sometimes punitive and dangerous, uses of data in society; understanding the logics that led to this is essential if we are to counter and change them.

While Hacking’s book explains the rise of numerisation, Halpern’s Beautiful Data is a more comprehensive look not only at the increasing importance placed on information, but also at how this has shaped vision and cognition. Through this lens, resisting dataism becomes more difficult because the logics are all-encompassing. Halpern argues that systems of knowledge and representation – such as the archive, design, art and aesthetics – are shaped by the idea that a successful future involves a surfeit of data. The 19th century, she writes, was one of exploration, characterised by the naturalist ‘extracting value from natural resources’ (p.15). The second half of the 20th century, however, marked a turning point, with increasing importance placed on information and data. Like Hacking and Foucault, Halpern (2014) argues political reason became biopolitical, and ‘intimately tied to data, calculation and the economy’ (p.25).

There is a lot in this book that is of interest; in this blog post, however, I want to concentrate on a few key points that have relevance to this project and that are raised in previous blog posts: the emergence of ‘black-box’ entities in World War II; the role of visualisation and art as a kind of ‘interface’ to data and technology; and the vision of the contemporary subject as conduit and channel for data. I can’t do justice to the detail of Halpern’s work here, but I can explore how these concepts are relevant to the materialising data project and how they might advance methodologies for visualising and interpreting digital data.

The first substantive chapter of the book details Norbert Wiener’s fascination with data and the reasoning that underpinned cybernetics. World War II was thought to catalyse a new world composed of data, and Wiener sought to make use of this. In the context of ‘rapid defence’, and working with neurophysiologists, Wiener argued that ‘human behaviour could be mathematically modelled and predicted, particularly under stress; thereby articulating a new belief that both machines and humans could speak the language of mathematics’ (Halpern, 2014, p.43). His early simulations of pilots flying in different situations and patterns showed that people actually do ‘act quite mechanically under duress’ (p.43). It was therefore unnecessary to represent the enemy as such, but rather to train the ‘machine or person to communicate and anticipate further signals’ (p.44). This process was opaque and unseen, but the outcomes and actions were ‘intelligible and predictable’ (p.44). This signalled the emergence of black-boxed entities ‘whose behaviour or signal was intelligible to each other, but whose internal function or structure was opaque, and not of interest’ (p.44).

Also of significance is Halpern’s exploration of how visualisation and art became subservient to science and technology in the 20th century. Aesthetic practitioners, like cyberneticians, were interested in understanding how systems make sense of data, and not necessarily in the data itself. Halpern traces the work of mid-20th-century designer and artist Gyorgy Kepes, who was interested in ‘seeking the patterns that organised perception’ and formulating ‘a visual pedagogy based on this concept of expressive abstraction’ (p.87). The focus was on understanding the ‘landscape of vision’ (p.94), or how the eye becomes an information-processing system. There are fascinating sections in the book where Halpern describes Kepes inundating his students at MIT with information in lectures to see how much of it they could process. Issues around filtering information were not considered problematic, as the focus was on facilitating data flows. As an interesting aside, Halpern explains that the importance placed on data and information led to an increased focus on speed reading and analytical skills in the school education system.

Kepes, Halpern writes, wanted to make technology more human. When it came to art, the focus was not on challenging or forming technology, as we might see today, but rather on ‘art’s subservience to science and technology…using art to enhance science’ (Halpern, 2014, p.95). During this period, art and visualisation were an interface through which the inhuman, or that ‘which is beyond or outside sensory recognition’, could be rendered understandable to human beings. In this way, visualisation enabled ‘interaction between different scales and agents – human, network, global, non-human’ (Halpern, 2014, p.22).

A final point of interest is the positioning of the subject in the burgeoning information era. In many ways, the mid-20th and early 21st centuries have been less interested in the actual data and more in how the systems that process data can be replicated. From this mimetic perspective, the ‘ideal’ subject – what Kepes imagined as the ‘intimated observer’ – is imagined as part of the system, like a conduit or channel for the flow of data, rather than an independent, agentic entity. The subject is integrated into the system, so their ‘consciousness, cognition, perception, and memory were now envisioned as part of one interactive process’ (Halpern, 2014, p.138).

Now, it must be remembered that these dramatic shifts were not taken up wholesale or smoothly across the world, or even the US for that matter. Yet these logics clearly underpin the way data is used today. For example, even though black-box entities emerged over 50 years ago, it is only recently that a robust critique has been put forward by scholars such as Frank Pasquale (2015). Despite this, operationalism and post-representation are key features of a datafied world, as Andrejevic (2020) points out, with consequences for humans’ capacity to interpret and critique these systems. A key question for this project, then, is: if this logic is baked into these computing systems, is it possible to open the black boxes and enable critique? While there may be opportunities to resist operationalism immanently from within the system, perhaps more profound change requires a transcendent critique initiated external to these systems.

Second, it is interesting to consider the changing relationship between visualisation, art and data. Visualisation is a useful way of interfacing with the inscrutable aspects of big data. However, these representations are themselves partial and subjective, and if they are ‘in service’ to data they can also reify the issues that emerge. The role of art, in particular, has changed profoundly. As I have written in this blog, contemporary art by Hito Steyerl, Mimi Onuoha and Trevor Paglen has become a kind of public pedagogy, providing some of the most powerful critiques of data today. With Cam Bishop (Pangrazio & Bishop, 2017), I have previously written about critical art as a cipher for the broken tool, and at the same time as a method for breaking tools and dislocating our conventional relations to digital technologies. The creative act can be a strategy to upset our embrace of technology and reveal it at the same time. In doing so, Heidegger’s powerful metaphor of the broken hammer is invoked while also conjuring an alter-image: the body that moves at the behest of the tools it has created.

This is a far cry from the notion that art should be subservient to technology. In many respects, this project seeks to use art and visualisation to disrupt the notion that the subject is an integrated channel in datafied systems and instead to imbue a sense of agency in the audience/user. If art and visualisation can be used as an interface to data, then they can also be used to critically examine and re-position the human within datafied systems.


Andrejevic, M. (2020). Automated Media. London and New York: Routledge.

Gleick, J. (2011/ 2012). The Information. London: Fourth Estate.

Halpern, O. (2014). Beautiful Data: A History of Vision and Reason Since 1945. Durham and London: Duke University Press.

Pangrazio, L., & Bishop, C. (2017). Art as digital counterpractice. CTheory, 21C 054.

Pasquale, F. (2015). The Black Box Society: The Secret Algorithms that Control Money and Information. Cambridge, Massachusetts: Harvard University Press.
