Expansion/Reduction

Let us, just for a moment, put aside all issues of digital manipulation and of staged and synthetic photographs, and assume that the photograph, analogue or digital, is a photograph of something in the world. If the photograph shows us, to borrow a phrase from Barthes, the necessarily real thing that was placed before the lens, [1] then it follows that there must have been some other necessarily real things just beyond the camera’s view. If the photograph shows us an embalmed moment in time, then it follows that time continued to pass after the camera’s shutter closed. In short, the photograph shows us a small slice of space and time, both of which extend well beyond its frame.

For the most part, this extension remains in the imagination – unlike a stilled film, the photograph is not going to suddenly move forward to reveal what lies just beyond the frame, or make readily visible the passage of time. The photograph, then, has a limited spatiotemporal field of view. But what if it did not?

The ‘theoretical object’, or presumed form, [2] of photography continues, in large part, to be the print produced by chemical means, which embodies this stasis of space and time – as too does the print of a digital photograph. [3] Our day-to-day experience of photography, however, is increasingly not of the printed image but of the image displayed on screen. While stasis continues to define the printed photograph, photographs on screen are often set into motion, or, more precisely, vacillate between fixity and motion. [4]

Often these photographic images – these “moving stills” [5] – attempt to overcome the aforementioned spatiotemporal limits of the print by incorporating multiple points of view (Google Street View, for example) or by being set in motion using visual effects from the realm of cinema (for instance the ‘Ken Burns effect’ [6]). I use the term “photographic images” here not to describe a synthetic photograph produced by software, but to distinguish what we normally consider a photograph – a small slice of space and time – from these newer forms, which have a far less certain relationship with time and space.

Both George Baker [7] and Ingrid Hölzl [8] have pointed to Nancy Davenport’s 2004 work Weekend Campus as one that sits somewhere in the liminal space between photography and film – an expanded photograph or “augmented document”. [9] Weekend Campus is a panorama, or photographic frieze, constructed from hundreds of individual photographs taken at college campuses and junkyards across the United States. [10] A virtual camera pans along its length, taking in the scene of an accident: cars are piled up on the road, police stand by cordoned-off vehicles, students survey the damage or, stunned into passivity, stare out towards the viewer, with green grass and brutalist college buildings visible in the background. Although it mimics a cinematic tracking shot (a deliberate homage to Jean-Luc Godard’s 1967 film Weekend [11]), the scene is perfectly still – static – broken only by the flashing lights of police cars.

According to Davenport, Weekend Campus was “never meant to deceive as film, nor sit quietly as photography”. [12] Writing about the work, she talks about a desire for “endlessness”: for the single car crash in Godard’s original film to be the jumping-off point for another car crash, and for the tracking shot to continue in an endless succession of accidents. While Davenport uses the vacillation between movement and stasis in her work to foreground what she sees as the condition of all digital stills, Hölzl reads her desire for endlessness as a mechanism to question, indeed overcome, the “photographic cut”. [13] In other words, by expanding the photographic field of view Davenport seeks to include those other necessarily real things that normally lie outwith the view of the camera: the “photographic off”. [14] Furthermore, as an assemblage of photographs, Weekend Campus does not represent a single moment in time, but simultaneously many and none at all.

Hölzl suggests – and this concurs with Barthes’s notion of the punctum as that which punches out from the photograph, creating a “blind-field” [15] – that the force of the photograph lies in the tension between its dependence on the world and its absent other. [16] In other words, it is precisely the photographic cut that gives the photograph its power. The cut allows the photograph to extend beyond the frame and, if we follow Barthes’s reasoning, to awaken sensations within the body of the viewer. [17] By extending the photograph, by including the “photographic off”, one may actually run the risk of disempowering it. [18]

Davenport’s work may stem from a desire for “endlessness”, [19] but it remains on a limited loop. Though different elements of the frieze move in and out of view, meaning the work has many possible viewpoints and temporalities, the movement of the virtual camera still directs the viewer’s attention. Contrast this with something like Google Street View and we can see a quite different expression of endlessness.

Google Street View is often likened to the map from Jorge Luis Borges’s short story On Exactitude in Science, in which an empire produces a map so large and so detailed that it comes to cover the whole of the territory it was meant to represent, at which point it becomes useless. Although it is not intended to be viewed as a work of photography – and even less as a work of art – Google Street View might be where ‘the end of endlessness’ is most visible.

Like Weekend Campus, Google Street View is not a single photograph, but is constructed from millions of individual photographs. The photographs are captured by a specially equipped car, which has nine cameras mounted to its roof in a spherical configuration that allows multiple points of view to be captured simultaneously. As the car is driven down a street it captures geo-tagged photographs, which are later aligned and combined into a 360-degree panoramic image. While these panoramas are not entirely seamless, [20] the overall effect is of a still image that has no single viewpoint. It is a photographic image, but, unlike a printed photograph, it has no clear spatial or temporal boundaries.

Rather than being set into motion by the movement of a virtual camera, Street View is set into motion by the action of the viewer, who clicks and drags the image in order to move through the photographic recreation of the street. The lack of any end points, the very totality of the image, leaves little or no space to imagine anything outside of it – shift the ‘camera’ a little to the left and you can see what’s there. In some streets, we can even scroll back through several years of images, to see how the street has changed over time. [21]

A number of artists have explored the use of Google Street View as a new site for street photography, creating their own images from a section of the panorama. [22] These works show how, out of the endlessness of Google Street View, can come images that take on a life of their own, perhaps precisely because they re-assert the photographic cut. In the work of Jon Rafman (who runs the photo-blog 9-Eyes [23]) we see a tiger wandering an empty street, a woman sitting seemingly abandoned in the middle of the road, a shop-owner clutching a gun behind his back, body bags being carried from the scene of an accident. Here, we cannot move the camera or explore the different temporalities of the street and so what lies beyond the photograph is once again limited to our imagination.

Google Street View may be an extreme example, but the ability to create similar panoramic images is becoming a standard feature of smartphones, [24] or one easily added with a third-party app. Microsoft Photosynth in particular has been highlighted as an example of an app that seeks to create a more detailed snapshot of the world than can be achieved with a single photograph. [25] Indeed, part of Photosynth’s own marketing rhetoric is the promise to place whomever a “synth” is shared with in the shoes of the person who created it, [26] which suggests a desire to get closer to the ‘real world’.

It was precisely this desire to connect the photograph to the ‘real world’, and by extension to the photographer, that inspired my own 2013 work Composed under Electric Stars, and specifically its iPhone application. [27] With this application, the user was able to retrace my steps around London through a series of geo-tagged photographs. On arriving at the site of a photograph, the corresponding image was displayed on screen, allowing the viewer to place it within the landscape. It was my intention that the viewer should do this and, from that connection, begin to understand something of my process and subjectivity as a photographer.

Here too there is a sense of expanding the photograph beyond its frame: by locating the photograph in the landscape, the viewer is not only free to discover what lies outside its frame, but also to see the effects of the passage of time, invisible in the static photograph. This is clearly visible in a clip I made to document the work. One photograph, taken in early spring, shows a kite stuck in a tree, the branches of which are bare; in the video clip, made in late summer, this is contrasted with the same tree, the kite still stuck in its branches, which are now thick with green leaves. [28]

The work may strengthen the viewer’s sense of a connection to the ‘real world’; indeed, some viewers reported enjoying the process of connecting the photograph to the landscape in front of them, one likening it to a “treasure hunt”. In that respect its aims were fulfilled, but I now find myself wondering if I risked muting the photographs themselves. For a photographic practitioner, this is clearly not the most desirable outcome.

Notes

[1] Roland Barthes, Camera Lucida (London: Vintage, 2000).

[2] Helen Jackson, “Knowing Photographs Now: The knowledge economy of photography in the twenty-first century,” Photographies 2, 2 (2009): 169 – 183.

[3] Nancy Davenport, “Weekend Campus,” in Still Moving: Between Cinema and Photography, eds. Karen Beckman and Jean Ma (Durham, NC: Duke University Press, 2008), 190 – 195.

[4] Ibid.

[5] Ingrid Hölzl, “Moving Stills: Images that are no longer immobile,” Photographies 3, 1 (2010): 99 – 108.

[6] The Ken Burns effect is a method of adding the appearance of movement to still photographs by panning across the image, zooming in and out, or a combination of both. Its popularity owes much to its inclusion in Apple’s iPhoto and iMovie software, of which it became a standard feature in 2003. See: Ingrid Hölzl, “Moving Stills,” 104 – 105.

[7] George Baker, “Photography’s Expanded Field,” October 114, Autumn (2005): 120 – 140.

[8] Ingrid Hölzl, “Blast-off Photography: Nancy Davenport and Expanded Photography,” History of Photography 35, 1 (2011): 33 – 43.

[9] Ibid., 33.

[10] Nancy Davenport, “Weekend Campus.”

[11] Ibid.

[12] Ibid., 193.

[13] Ingrid Hölzl, “Blast-off Photography,” 34.

[14] Ibid.

[15] Roland Barthes, Camera Lucida, 57.

[16] Ingrid Hölzl, “Blast-off Photography.”

[17] Marie Shurkus, “Camera Lucida and Affect: Beyond representation,” Photographies 7, 1 (2014): 67 – 83.

[18] Ingrid Hölzl, “Blast-off Photography.”

[19] Nancy Davenport, “Weekend Campus,” 190.

[20] Several artists have developed works that highlight the glitches often found in Google Street View. See, for example: Emilio Vavarella, “Report a Problem” [Online], http://emiliovavarella.com/archive/google-trilogy/report-a-problem/ (accessed 15 January 2015).

[21] For example, looking at the Glasgow School of Art on Street View, we can see photographs taken at various intervals between June 2009 and July 2014. See: Google, “Street View: Renfrew Street, Glasgow” [Online], https://www.google.co.uk/maps/@55.8662158,-4.2637011,3a,90y,103.13h,86.09t/data=!3m4!1e1!3m2!1saA8q-L6KoQ4bIockWNKfsQ!2e0 (accessed 19 January 2015).

[22] See, for example: Mishka Henner, “No Man’s Land” [Online], http://mishkahenner.com/filter/works/No-Man-s-Land (accessed 15 January 2015) and Michael Wolf, “Street View: Paris” [Online], http://photomichaelwolf.com/#paris-street-view/1 (accessed 15 January 2015).

[23] Jon Rafman, “9-Eyes” [Online], http://9-eyes.com/ (accessed 15 January 2015).

[24] For iPhone models 4S and above, panoramic photographs have been a standard option of the operating system’s ‘Camera’ app since the release of iOS 6 in 2012. See: Craig Grannell, “iOS 6 Review” [Online] Tech Radar, 17 May 2013, http://www.techradar.com/reviews/pc-mac/software/operating-systems/ios-6-1096515/review#article-body (accessed 19 January 2015).

[25] Daniel Palmer, “Redundancy in Photography,” Philosophy of Photography 3, 1 (2012): 36 – 44.

[26] Ibid., 38.

[27] Catherine M. Weir, “Composed under Electric Stars” [Online], http://www.cmweir.com/portfolio/composed-under-electric-stars/ (accessed 15 January 2015).

[28] Catherine M. Weir, “Composed under Electric Stars: Video Documentation” [Online], http://youtu.be/mfCrQvE8p4w?t=2m47s (accessed 19 January 2015).
