Journal articles: 'Boot Camp (Computer file)' – Grafiati (2024)


To see the other types of publications on this topic, follow the link: Boot Camp (Computer file).

Author: Grafiati

Published: 4 June 2021

Last updated: 1 February 2022

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the top 36 journal articles for your research on the topic 'Boot Camp (Computer file).'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles from a wide variety of disciplines and organise your bibliography correctly.

1

Lyon, Louise Ann, and Emily Green. "Coding Boot Camps." ACM Transactions on Computing Education 21, no. 2 (June 2021): 1–30. http://dx.doi.org/10.1145/3440891.

Abstract:

College-educated women in the workforce are discovering a latent interest in and aptitude for computing, motivated by the prevalence of computing as an integral part of jobs in many fields as well as continued headlines about the number of unfilled, highly paid computing jobs. One of these women's options for retraining is the so-called coding boot camp, which teaches programming skills through intensive multi-week courses. This article reports on a qualitative research study focused on the Silicon Valley area of California. We used social cognitive career theory (SCCT) to investigate the larger context surrounding women entering computing professions through boot camp learning sites, including: the environment of a booming technology workforce, boot camps as learning settings, the characteristics of women who attend boot camps, how retraining at a coding boot camp influences women's computing self-efficacy and outcome expectations, and the performance attainments of women at boot camps. Interview data were collected from 14 women who had attended boot camps—first before graduating from the boot camp and again after six months in the workforce. To contextualize the computing ecosystem, we conducted single interviews with 6 boot camp organizers/curriculum developers, 7 industry hiring managers, and 14 university computer science faculty. To provide a contrast with women at boot camps, we interviewed 5 women who majored in computer science at the university and 17 men who had attended boot camps. Structural data coding and analysis focused on the SCCT mechanisms of environment, person inputs, learning experiences, self-efficacy, outcome expectations, and performance attainments. The findings demonstrate that training at a boot camp can be the catalyst for college-educated women to attain computing jobs and careers, although these entry-level jobs may be a compromise relative to the goal of a software development job and are unlikely to lead to jobs at large, well-known, established technology companies.

2

Ford, Richard, and Deborah A. Frincke. "Building a Better Boot Camp." IEEE Security & Privacy Magazine 8, no. 1 (January 2010): 68–71. http://dx.doi.org/10.1109/msp.2010.31.

3

Langford, Tatum W., Christopher B. Reznich, and Sarah Erwin. "A Computer “Boot Camp” for Academic Medicine Faculty." Academic Medicine 75, no. 5 (May 2000): 555–56. http://dx.doi.org/10.1097/00001888-200005000-00088.

4

Jordan, Sheryl G., Elizabeth C. Deans, Meredith Johnson, Sheila S. Lee, and Gary L. Beck Dallaghan. "Rules of Engagement: Implementing Student-Centered Learning in Breast Imaging." Journal of Breast Imaging 2, no. 1 (January 2020): 67–71. http://dx.doi.org/10.1093/jbi/wbz084.

Abstract:

In response to changing learner needs, our institution launched a new translational curriculum wherein basic sciences and clinical skills were integrated, longitudinal patient care experiences provided, and earlier opportunities in specialty fields introduced to better inform residency program decisions. Medical students taking the breast imaging elective were enrolled in a breast imaging immersive experience designed to meet the School of Medicine’s core competencies. In focusing the elective on a narrow range of specialized topics and skills, we labeled this experience the Breast Imaging Boot Camp. Outcome data from March 6, 2017, to April 26, 2019, have been analyzed for this report. The modifications made to the elective include: e-mailing a detailed orientation document to students prior to the start of the rotation; assigning students to diagnostic, procedural, and screening roles; the attendance of students at introductory radiology symposia; students’ weekly attendance at institutional multidisciplinary and divisional breast radiologic–pathologic correlation conferences; student self-study assignments using faculty-vetted resources; student participation in breast biopsy simulation and small parts ultrasound laboratories; the development of a student-centric radiology educational website; and student-directed publishing of digital case files. Medical student feedback and course analytics indicated superior course evaluation scores reinforced by narrative feedback. In website domain utilization data, the breast file domain is the most viewed subspecialty domain. The Breast Imaging Boot Camp is a successful curriculum. Its novelty lies in both its integrated approach to medical student core competencies and its clinician educators’ use of today’s student-favored teaching tools.

5

John, Nigel W., Nicholas I. Phillips, Llyr ap Cenydd, David Coope, Nick Carleton-Bland, Ian Kamaly-Asl, and William P. Gray. "A Tablet-Based Virtual Environment for Neurosurgery Training." Presence: Teleoperators and Virtual Environments 24, no. 2 (May 1, 2015): 155–62. http://dx.doi.org/10.1162/pres_a_00224.

Abstract:

The requirement for training surgical procedures without exposing the patient to additional risk is well accepted and is part of a national drive in the UK and internationally. Computer-based simulations are important in this context, including neurosurgical resident training. The objective of this study is to evaluate the effectiveness of a custom-built virtual environment in assisting training of a ventriculostomy procedure. The training tool (called VCath) has been developed as an app for a tablet platform to provide easy access and availability to trainees. The study was conducted at the first boot camp organized for all year-one trainees in neurosurgery in the UK. The attendees were randomly distributed between the VCath training group and the control group. Efficacy of performing ventriculostomy for both groups was assessed at the beginning and end of the study using a simulated insertion task. Statistically significant changes in performance of selecting the burr hole entry point, the trajectory length and duration metrics for the VCath group, together with a good indicator of improved normalized jerk (representing the speed and smoothness of arm motion), all suggest that there has been a higher-level cognitive benefit to using VCath. The app is successful as it is focused on the cognitive task of ventriculostomy, encouraging the trainee to rehearse the entry point and use anatomical landmarks to create a trajectory to the target. In straight-line trajectory procedures such as ventriculostomy, cognitive task-based education is a useful adjunct to traditional methods and may reduce the learning curve and ultimately improve patient safety.

6

Lubas, Rebecca. "Boot Camp for the 21st Century Metadata Manager. A Report of the Program Presented by ALCTS and Online Audiovisual Catalogers (OLAC), American Library Association Annual Conference, Washington, DC, June 2010." Technical Services Quarterly 28, no. 3 (May 19, 2011): 335–39. http://dx.doi.org/10.1080/07317131.2011.574520.

7

John, Nigel W., Nicholas I. Phillips, Llyr ap Cenydd, Serban R. Pop, David Coope, Ian Kamaly-Asl, Christopher de Souza, and Simon J. Watt. "The Use of Stereoscopy in a Neurosurgery Training Virtual Environment." Presence: Teleoperators and Virtual Environments 25, no. 4 (December 22, 2016): 289–98. http://dx.doi.org/10.1162/pres_a_00270.

Abstract:

We have previously investigated the effectiveness of a custom-built virtual environment in assisting training of a ventriculostomy procedure, which is a commonly performed procedure by a neurosurgeon and a core task for trainee surgeons. The training tool (called VCath) was initially developed as a low-fidelity app for a tablet platform to provide easy access and availability to trainees. Subsequently, we have developed a high-fidelity version of VCath that uses a stereoscopic display to immerse the trainee in the virtual environment. This article reports on two studies that have been carried out to compare the low- and high-fidelity versions of VCath, particularly to assess the value of stereoscopy. Study 1 was conducted at the second annual boot camp organized for all year-one trainees in neurosurgery in the UK. Study 2 was performed on lay people, with no surgical experience. Our hypothesis was that using stereoscopy in the training task would be beneficial. Results from Study 1 demonstrated that performance improved for both the control group and the group trained with the tablet version of VCath. The group trained on the high-fidelity version of VCath with a stereoscopic display showed no performance improvement. The indication is that our hypothesis is false. In Study 2, six different conditions were investigated that covered the use of training with VCath on a tablet, a mono display at two different sizes, a stereo display at two different sizes, and a control group who received no training. Results from this study with lay people show that stereoscopy can make a significant improvement to the accuracy of needle placement. The possible reasons for these results and the apparent contradiction between the two studies are discussed.

8

Cleary, Daniel R., Dominic A. Siler, Nathaniel Whitney, and Nathan R. Selden. "A microcontroller-based simulation of dural venous sinus injury for neurosurgical training." Journal of Neurosurgery 128, no. 5 (May 2018): 1553–59. http://dx.doi.org/10.3171/2016.12.jns162165.

Abstract:

OBJECTIVE: Surgical simulation has the potential to supplement and enhance traditional resident training. However, the high cost of equipment and limited number of available scenarios have inhibited wider integration of simulation in neurosurgical education. In this study the authors provide initial validation of a novel, low-cost simulation platform that recreates the stress of surgery using a combination of hands-on, model-based, and computer elements. Trainee skill was quantified using multiple time and performance measures. The simulation was initially validated using trainees at the start of their intern year. METHODS: The simulation recreates intraoperative superior sagittal sinus injury complicated by air embolism. The simulator model consists of 2 components: a reusable base and a disposable craniotomy pack. The simulator software is flexible and modular to allow adjustments in difficulty or the creation of entirely new clinical scenarios. The reusable simulator base incorporates a powerful microcomputer and multiple sensors and actuators to provide continuous feedback to the software controller, which in turn adjusts both the screen output and physical elements of the model. The disposable craniotomy pack incorporates 3D-printed sections of model skull and brain, as well as artificial dura that incorporates a model sagittal sinus. RESULTS: Twelve participants at the 2015 Western Region Society of Neurological Surgeons postgraduate year 1 resident course (“boot camp”) provided informed consent and enrolled in a study testing the prototype device. Each trainee was required to successfully create a bilateral parasagittal craniotomy, repair a dural sinus tear, and recognize and correct an air embolus. Participant stress was measured using a heart rate wrist monitor. After participation, each resident completed a 13-question categorical survey. CONCLUSIONS: All trainee participants experienced tachycardia during the simulation, although the point in the simulation at which they experienced tachycardia varied. Survey results indicated that participants agreed the simulation was realistic, created stress, and was a useful tool in training neurosurgical residents. This simulator represents a novel, low-cost approach for hands-on training that effectively teaches and tests residents without risk of patient injury.
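The interplay the abstract describes between the microcomputer, the sensors and actuators in the craniotomy pack, and the on-screen scenario amounts to a sense-decide-actuate feedback loop. The C sketch below is a minimal illustration of that loop shape only; every function name, threshold, and value in it is a hypothetical stand-in rather than a detail of the actual simulator.

/* Illustrative sketch only: a microcomputer reads model sensors and drives
 * actuators and the on-screen vitals in a continuous feedback loop. Names
 * and numbers are hypothetical, not taken from the published simulator. */
#include <stdbool.h>
#include <stdio.h>

/* Mock hardware reads; a real build would talk to the sensor hardware. */
static double read_sinus_flow_sensor(void)   { return 0.8; }   /* arbitrary units, assumed scale */
static bool   tear_repair_detected(void)     { return false; }

/* Mock actuator/display hooks. */
static void set_bleed_pump_rate(double rate) { printf("pump rate: %.1f\n", rate); }
static void update_vitals_display(double map_mmHg, bool air_embolus)
{
    printf("MAP %.0f mmHg, air embolus: %s\n", map_mmHg, air_embolus ? "yes" : "no");
}

/* One iteration of the feedback loop: sense, decide, actuate, redraw. */
static void simulation_tick(void)
{
    double flow = read_sinus_flow_sensor();

    if (!tear_repair_detected()) {
        set_bleed_pump_rate(flow);                /* keep the dural tear bleeding */
        update_vitals_display(55.0, flow > 0.7);  /* vitals worsen; high flow raises embolus risk */
    } else {
        set_bleed_pump_rate(0.0);                 /* tear repaired: stop bleeding */
        update_vitals_display(80.0, false);
    }
}

int main(void)
{
    simulation_tick();  /* a real controller would repeat this at a fixed rate */
    return 0;
}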

9

Sha, Yun Dong, Jia Han, and Feng Tong Zhao. "The Design of Embedded AeroEngine Measurement and Test System Based on ARM and FPGA." Advanced Engineering Forum 2-3 (December 2011): 458–62. http://dx.doi.org/10.4028/www.scientific.net/aef.2-3.458.

Abstract:

Present measurement and test systems for aeroengine rig testing are based primarily on DSPs and PC104 industrial computers, which suffer from large volume, high power consumption, single-purpose designs, and limited reusability. It is therefore worthwhile to improve system performance using current methods of hardware–software collaborative design and embedded technology. For the new, improved aeroengine test system, a hardware–software co-design strategy is employed. After a thorough trade-off analysis and comparison of different hardware and software options, an embedded aeroengine test system based on an ARM processor and an FPGA is designed, using the S3C2440A CPU with an ARM920T core and a Xilinx FPGA as the fundamental data acquisition and processing platform, with Linux as the real-time operating system. Data transmission between the ARM and the FPGA takes place through dual-port RAM, with the read and write operations coordinated by interrupts. The system uses U-Boot to initialize the hardware and load the operating system. The FPGA serves as the front end of the data acquisition system; its sampling parameters are loaded from a configuration file during start-up and written into internal registers over the SPI bus. Data are exported from the test system's flash memory by USB disk or network and analyzed on a PC equipped with specialized software. Strong post-processing capability for the multiple parameters of aeroengine rig testing can be realized, especially dynamic signal analysis in the time and frequency domains. Because of the new platform, significant improvements are achieved: (I) data processing capacity increases by 15%; (II) the number of sampling channels doubles; (III) richer software functions enable more professional, intelligent, and generic tests; and (IV) the sampling frequency increases by 20%, meeting the demand of monitoring dynamic random signals such as noise, vibration, and wide-band fluctuating pressure. The embedded aeroengine test system based on ARM and FPGA is successful; a prototype has been manufactured and tested, and its performance is significantly improved, demonstrating the effectiveness and reliability of the approach.
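The abstract's description of the ARM–FPGA data path (blocks of samples moved through dual-port RAM, with the read and write sides coordinated by an interrupt) follows a common embedded pattern. The bare-metal C sketch below illustrates that pattern under stated assumptions; the addresses, block size, and register names are invented placeholders, not values from the paper.

/* Bare-metal style sketch: the FPGA writes a block of samples into dual-port
 * RAM and raises an interrupt; the ARM copies the block out and acknowledges
 * it. All addresses, sizes, and register names are hypothetical. */
#include <stdint.h>

#define DPRAM_BASE       ((volatile uint16_t *)0x30000000u)  /* hypothetical dual-port RAM window */
#define DPRAM_BLOCK_LEN  512u                                /* samples per block (assumed) */
#define FPGA_IRQ_ACK     (*(volatile uint32_t *)0x30001000u) /* hypothetical acknowledge register */

static volatile int block_ready;                 /* set by the ISR, cleared by the main loop */
static uint16_t     block_buf[DPRAM_BLOCK_LEN];  /* staging buffer on the ARM side */

/* Interrupt service routine: the FPGA signals that a sample block is ready. */
void fpga_block_irq_handler(void)
{
    block_ready = 1;
}

/* Acquisition loop: drain blocks from dual-port RAM as the FPGA produces them. */
void acquisition_loop(void)
{
    for (;;) {
        if (!block_ready)
            continue;                            /* a real system would sleep/WFI here */

        for (uint32_t i = 0; i < DPRAM_BLOCK_LEN; i++)
            block_buf[i] = DPRAM_BASE[i];        /* copy the block out of dual-port RAM */

        block_ready  = 0;
        FPGA_IRQ_ACK = 1;                        /* tell the FPGA the block was consumed */

        /* ...hand block_buf to flash storage or the network export path here... */
    }
}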

10

Graham, Donna M., Gemma Wickert, Leanna Goodwin, Joanna Clarke, Carla Timmins, Dilshad Chang, Alison Walker, et al. "A multidisciplinary-tailored digital solution to data capture in early phase clinical trials." Journal of Global Oncology 5, suppl (October 7, 2019): 2. http://dx.doi.org/10.1200/jgo.2019.5.suppl.2.

Abstract:

2 Background: Data capture in early phase cancer clinical trials (EPCCT) is usually via paper records with manual transcription to the sponsor’s case report form. Capturing real time trial data directly to computer (eSource) may reduce errors and increase completeness and timeliness of data entry. A simulated system pilot took place between Oct 2018 and Jan 2019 at an EPCCT facility to appraise Foundry Health’s eSource system “ClinSpark”. Aims were to assess consistency and effectiveness of creating electronic templates for source data capture and live data collection compliance. Methods: A multidisciplinary focus group (MFG) (2 research nurses, 1 doctor, 3 data managers) was created to collaborate with Foundry Health staff. Specialised features of the eSource system were adapted to handle the complex needs of EPCCT. The pilot incorporated a 5 day boot camp for familiarisation to the digital platform; a conference room test using simulated patient data; construction of a trial template including contingency planning; and a clinic floor test with live simulated patient data collection using digital tablets. The MFG agreed on a 52 item user acceptance test listing ideal features for a data collection tool, with items classified as high, medium or low priority. Results: During the 3 month pilot, templates for 2 EPCCT were planned and created by the MFG. Using eSource, 43 items (83%) of the acceptance test were passed compared with 27 items (52%) for the current (paper) system. For the 30 high-priority items, eSource passed 30 (100%) compared with 22 for the paper system (73%). The paper system was not superior to eSource for any items assessed. Time saving and potential error reduction were noted as additional benefits. Conclusions: This process demonstrates that a multidisciplinary approach can be used to successfully integrate a customised eSource system working with previously untrained staff. Improved performance across pre-specified domains and potential additional benefits were noted. As FDA encourages use of digital solutions in clinical trials, using eSource provides a potential solution for compliant and efficient data capture from protocol assessments at investigator sites and rapid data transfer to sponsors.

11

Graham, Donna M., Joanna Clarke, Gemma Wickert, Leanna Goodwin, Carla Timmins, Dilshad Chang, Alison Walker, et al. "A multidisciplinary-guided digital solution to data capture in early-phase clinical trials." Journal of Clinical Oncology 37, no. 15_suppl (May 20, 2019): e18063. http://dx.doi.org/10.1200/jco.2019.37.15_suppl.e18063.

Abstract:

e18063 Background: Data capture in early phase cancer clinical trials (EPCCT) is usually via paper records with manual transcription to the sponsor’s case report form. Capturing real time trial data directly to computer (eSource) may reduce errors and increase completeness and timeliness of data entry. A simulated system pilot took place between Oct 2018 and Jan 2019 at an EPCCT facility to appraise Foundry Health’s eSource system “ClinSpark”. Aims were to assess consistency and effectiveness of creating electronic templates for source data capture and live data collection compliance. Methods: A multidisciplinary focus group (2 research nurses, 1 doctor, 3 data managers) was created to collaborate with Foundry Health staff. The focus group agreed on a 52 item user acceptance test listing ideal features for a data collection tool, classifying items as high, medium or low priority. Specialised features of the eSource system were adapted to handle the complex needs of EPCCT. The pilot incorporated a 5 day boot camp for familiarisation to the digital platform; a conference room test using simulated patient data; construction of a trial template including contingency planning; and a clinic floor test with live simulated patient data collection using digital tablets. Results: During the 3 month pilot, templates for 2 EPCCT were planned and created. Using eSource, 43 items (83%) of the acceptance test were passed compared with 27 items (52%) for the current (paper-based) system. The paper system did not pass any of the 9 items for which eSource failed. For the 30 high priority items, eSource passed 30 (100%) compared with 22 for the paper system (73%). Time saving and potential error reduction were noted as additional benefits. Conclusions: This process demonstrates that a multidisciplinary approach can be used to successfully integrate a customised eSource system working with previously untrained staff. Improved performance across pre-specified domains and potential additional benefits were noted. As FDA encourages the use of digital solutions in clinical trials, using eSource provides a potential solution for compliant and efficient capture of data from protocol assessments at investigator sites and rapid data transfer to sponsors.

12

Wu, Zenan, Liqin Tian, Yi Zhang, and Zhigang Wang. "Web User Trust Evaluation: A Novel Approach Using Fuzzy Petri Net and Behavior Analysis." Symmetry 13, no. 8 (August 13, 2021): 1487. http://dx.doi.org/10.3390/sym13081487.

Abstract:

With the development of society and information technology, people’s dependence on the Internet has gradually increased, including online shopping, downloading files, reading books, and online banking. However, how to ensure the safety and legitimacy of these network user behaviors has become a focus of attention. As we all know, cybersecurity and system resilience originate from symmetry. Due to the diversity and unpredictability of cyber-attacks, absolute cybersecurity is difficult to achieve; system resilience indicates that protecting system security should shift from resisting attacks to ensuring system continuity. The trust evaluation of network users is a research hotspot in improving network system security. Addressing the defects of incomplete evaluation processes and inaccurate evaluation results in current online user behavior trust evaluation methods, this paper builds on the basic principles of online user trust evaluation and proposes a trust evaluation model that combines fuzzy Petri nets with user behavior analysis. First, for “unfamiliar” users, we used a fuzzy Petri net to calculate the user’s recommended trust value as the system’s indirect trust value; next, we used the user’s behavior records as evidence to conduct a direct trust evaluation and obtain the system’s direct trust value for the user; finally, the two results were combined to obtain the user’s comprehensive trust value. For experimental verification, the data came from a self-developed e-book management system. Theoretical analysis and simulation results showed that the model met the optimization conditions of subjective and objective relative balance, that the evaluation process was more complete, and that the trust evaluation values of network users could be obtained more accurately. This evaluation method provides a solid theoretical basis and research ideas for judging user credibility on key network application platforms such as online shopping malls, online transactions, and online banking.
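The model's final step fuses an indirect (recommended) trust value, obtained from the fuzzy Petri net for unfamiliar users, with a direct trust value derived from the user's behavior records. The abstract does not give the exact fusion rule, so the C sketch below uses a generic convex combination purely for illustration; the weighting parameter and the sample values are assumptions, not figures from the paper.

/* Illustrative sketch only: combine an indirect (recommended) trust value
 * with a direct (behavior-based) trust value into a comprehensive trust
 * value. The convex combination below is a generic stand-in for the
 * paper's actual fusion rule. */
#include <stdio.h>

/* All inputs are assumed to be normalised to [0, 1]; alpha weights the
 * direct (behavioral) evidence against the recommendation. */
double comprehensive_trust(double indirect_trust, double direct_trust, double alpha)
{
    if (alpha < 0.0) alpha = 0.0;
    if (alpha > 1.0) alpha = 1.0;
    return alpha * direct_trust + (1.0 - alpha) * indirect_trust;
}

int main(void)
{
    /* Hypothetical values: a new user with a good recommendation (0.7)
     * whose observed behavior so far scores 0.9. */
    double t = comprehensive_trust(0.7, 0.9, 0.6);
    printf("comprehensive trust = %.2f\n", t);   /* prints 0.82 */
    return 0;
}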

13

Shamapant, Shilpa, Shelley Adair, and James C. Grotta. "Abstract W MP55: Intensive Speech Therapy Promotes Recovery in Chronic Post-Stroke Aphasia." Stroke 46, suppl_1 (February 2015). http://dx.doi.org/10.1161/str.46.suppl_1.wmp55.

Abstract:

Background: Research suggests that Acute Post-Stroke Aphasia (APSA) improves with Intensive Speech Therapy (IST). However, there is scarce research comparing IST for APSA vs Chronic Post-Stroke Aphasia (CPSA), and its relation to Time Post Stroke (TPS). We hypothesized that IST creates improvement for both APSA and CPSA patients irrespective of TPS. Methods: We examined 26 participants chosen from the existing database of Austin Speech Labs, age range 23-79 yrs (mean 53 yrs). Clients who attended all eight weeks of therapy consistently were chosen. All clients received group and individual therapy using different computer applications every time they attended a session. The hours spent in therapy varied due to the individualized nature of the therapy program, but averaged 442 hours (range 80-2,279 hours). Clients maintained their hours per week for at least eight weeks, which was defined as the IST “Boot Camp” (BC) interval. We classified clients into two groups, CPSA (n=12) and APSA (n=14), depending on TPS. CPSA clients had come to ASL at least one year post-stroke onset while APSA clients had come less than one year post-stroke. Two tests, the Western Aphasia Battery (WAB) or the Examining for Aphasia, Fourth Edition (EFA-4), were administered at enrollment and at discharge or every six months to avoid memorization. An ASL Language Evaluation to test automatic speech, auditory comprehension, naming, reading, writing, and expressive language was done after every BC session. Results: APSA and CPSA clients differed on several levels. TPS was 1 to 19.5 yrs and age was 23-75 yrs in CPSA, compared to TPS of 0.08-0.75 yrs and age of 30-79 yrs in APSA. CPSA had four females (F) and nine males (M) while APSA had two F and 12 M. Using an independent samples test, we found no significant differences in final scores between APSA and CPSA participants across aphasia types. Secondly, using a bivariate correlation test, we found no correlation between TPS and post-test scores across all categories. Lastly, age had no effect on treatment response; elderly and young clients improved equally with IST. Conclusion: IST can help CPSA clients as much as those with APSA. Research is continuing to determine the minimal and optimal length and duration of IST to produce and maintain improved language function.

14

Kerasidou, Xaroula (Charalampia). "Regressive Augmentation: Investigating Ubicomp’s Romantic Promises." M/C Journal 16, no. 6 (November 7, 2013). http://dx.doi.org/10.5204/mcj.733.

Abstract:

Machines that fit the human environment instead of forcing humans to enter theirs will make using a computer as refreshing as taking a walk in the woods. Mark Weiser on ubiquitous computing (21st Century Computer 104) In 2007, a forum entitled HCI 2020: Human Values in a Digital Age sought to address the questions: What will our world be like in 2020? Digital technologies will continue to proliferate, enabling ever more ways of changing how we live. But will such developments improve the quality of life, empower us, and make us feel safer, happier and more connected? Or will living with technology make it more tiresome, frustrating, angst-ridden, and security-driven? What will it mean to be human when everything we do is supported or augmented by technology? (Harper et al. 10) The forum came as a response to, what many call, post-PC technological developments; developments that seek to engulf our lives in digital technologies which in their various forms are meant to support and augment our everyday lives. One of these developments has been the project of ubiquitous computing along with its kin project, tangible computing. Ubiquitous computing (ubicomp) made its appearance in the late 1980s in the labs of Xerox’s Palo Alto Research Center (PARC) as the “third wave” in computing, following those of the mainframe and personal computing (Weiser, Open House 2). Mark Weiser, who coined the term, along with his collaborators at Xerox PARC, envisioned a “new technological paradigm” which would leave behind the traditional one-to-one relationship between human and computer, and spread computation “ubiquitously, but invisibly, throughout the environment” (Weiser, Gold and Brown 693). Since then, the field has grown and now counts several peer-reviewed journals, conferences, and academic and industrial research centres around the world, which have set out to study the new “post-PC computing” under names such as Pervasive Computing, Ambient Intelligence, Tangible Computing, The Internet of Things, etc. Instead of providing a comprehensive account of all the different ubicomp incarnations, this paper seeks to focus on the early projects and writings of some of ubicomp’s most prominent figures and tease out, as a way of critique, the origins of some of its romantic promises. From the outset, ubiquitous computing was heavily informed by a human-centred approach that sought to shift the focus from the personal computer back to its users. On the grounds that the PC has dominated the technological landscape at the expense of its human counterparts, ubiquitous computing promised a different human-machine interaction, with “machines that fit the human environment instead of forcing humans to enter theirs” (104, my italics) placing the two in opposite and antagonistic terrains. The problem comes about in the form of interaction between people and machines … So when the two have to meet, which side should dominate? In the past, it has been the machine that dominates. In the future, it should be the human. (Norman 140) Within these early ubicomp discourses, the computer came to embody a technological menace, the machine that threatened the liberal humanist value of being free and in control. For example, in 1999 in a book that was characterized as “the bible of ‘post-PC’ thinking” by Business Week, Donald Norman exclaimed: we have let ourselves to be trapped. … I don’t want to be controlled by a technology. I just want to get on with my life, … So down with PC’s; down with computers. 
All they do is complicate our lives. (72) And we read on the website of MIT’s first ubicomp project Oxygen: For over forty years, computation has centered about machines, not people. We have catered to expensive computers, pampering them in air-conditioned rooms or carrying them around with us. Purporting to serve us, they have actually forced us to serve them. Ubiquitous computing then, in its early incarnations, was presented as the solution; the human-centred, somewhat natural approach, which would shift the emphasis away from the machine and bring control back to its legitimate owner, the liberal autonomous human subject, becoming the facilitator of our apparently threatened humanness. Its promise? An early promise of regressive augmentation, I would say, since it promised to augment our lives, not by changing them, but by returning us to a past, better world that the alienating PC has supposedly displaced, enabling us to “have more time to be more fully human” (Weiser and Brown). And it sought to achieve this through the key characteristic of invisibility, which was based on the paradox that while more and more computers will permeate our lives, they will effectively disappear. Ubicomp’s Early Romantic Promises The question of how we can make computers disappear has been addressed in computer research in various ways. One of the earliest and most prominent of these is the approach, which focuses on the physicality of the world seeking to build tangible interfaces. One of the main advocates of this approach is MIT’s Tangible Media Group, led by Professor Hiroshi Ishii. The group has been working on their vision, which they call “Tangible Bits,” for almost two decades now, and in 2009 they were awarded the “Lasting Impact Award” at the ACM Symposium on User Interface Software and Technology (UIST) for their metaDesk platform, presented in 1997 (fig.1), which explores the coupling of everyday physical objects with digital information (Ullmer and Ishii). Also, in 2004 in a special paper titled “Bottles: A Transparent Interface as a Tribute to Mark Weiser”, Ishii presented once again an early project he and his group developed in 1999, and for which they were personally commented by Weiser himself. According to Ishii, bottles (fig. 2)—a system which comprises three glass bottles “filled with music” each representing a different musical instrument, placed on a Plexiglas “stage” and controlled by their physical manipulation (moving, opening or closing them)—no less, “illustrates Mark Weiser’s vision of the transparent (or invisible) interface that weaves itself into the fabric of everyday life” (1299). Figure 1: metaDesk platform (MIT Tangible Media Group) Figure 2: musicBottles (MIT Tangible Media Group) Tangible computing was based on the premise that we inhabit two worlds: the physical world and cyberspace, or as Ishii and Ullmer put it, the world of atoms and the world of bits claiming that there is gap between these two worlds that left us “torn between these parallel but disjoint spaces” (1). This agreed with Weiser’s argument that cyberspace, and specifically the computer, has taken centre stage leaving the real world—the real people, the real interactions—in the background and neglected. Tangible computing then sought to address this problem by "bridging the gaps between both cyberspace and the physical environment" (1). 
As Ishii and Ullmer wrote in 1997: The aim of our research is to show concrete ways to move beyond the current dominant model of GUI [Graphic User Interface] bound to computers with a flat rectangular display, windows, a mouse, and a keyboard. To make computing truly ubiquitous and invisible, we seek to establish a new type of HCI that we call "Tangible User Interfaces" (TUIs). TUIs will augment the real physical world by coupling digital information to everyday physical objects and environments. (2) “Our intention is to take advantage of natural physical affordances to achieve a heightened legibility and seamlessness of interaction between people and information” (2). In his earlier work computer scientist Paul Dourish turned to phenomenology and the concept of embodiment in order to develop an understanding of interaction as embodied. This was prior to his recent work with cultural anthropologist Bell where they examined the motivating mythology of ubiquitous computing along with the messiness of its lived experience (Dourish and Bell). Dourish, in this earlier work observed that one of the common critical features early tangible and ubiquitous computing shared is that “they both attempt to exploit our natural familiarity with the everyday environment and our highly developed spatial and physical skills to specialize and control how computation can be used in concert with naturalistic activities” (Context-Aware Computing 232). They then sought to exploit this familiarity in order to build natural computational interfaces that fit seamlessly within our everyday, real world (Where the Action Is 17). This idea of an existing set of natural tactile skills appears to come hand-in-hand with a nostalgic, romantic view of an innocent, simple, and long gone world that the early projects of tangible and ubiquitous computing sought to revive; a world where the personal computer not only did not fit, an innocent world in fact displaced by the personal computer. In 1997, Ishii and Ullmer wrote about their decision to start their investigations about the “future of HCI” in the museum of the Collection of Historic Scientific Instruments at Harvard University in their efforts to get inspired by “the aesthetics and rich affordances of these historical scientific instruments” concerned that, “alas, much of this richness has been lost to the rapid flood of digital technologies” (1). Elsewhere Ishii explained that the origin of his idea to design a bottle interface began with the concept of a “weather forecast bottle;” an idea he intended to develop as a present for his mother. “Upon opening the weather bottle, she would be greeted by the sound of singing birds if the next day’s weather was forecasted to be clear” (1300). Here, we are introduced to a nice elderly lady who has opened thousands of bottles while cooking for her family in her kitchen. This senior lady; who is made to embody the symbolic alignment between woman, the domestic and nature (see Soper, Rose, Plumwood); “has never clicked a mouse, typed a URL, nor booted a computer in her life” (Ishii 1300). Instead, “my mother simply wanted to know the following day’s weather forecast. Why should this be so complicated?” (1300, my italics). Weiser also mobilised nostalgic sentiments in order to paint a picture of what it would be to live with ubiquitous computing. 
So, for example, when seeking a metaphor for ubiquitous computing, he proposed “childhood – playful, a building of foundations, constant learning, a bit mysterious and quickly forgotten by adults” (Not a Desktop 8). He viewed the ubicomp home as the ideal retreat to a state of childhood; playfully reaching out to the unknown, while being securely protected and safely “at home” (Open House). These early ideas of a direct experience of the world through our bodily senses along with the romantic view of a past, simple, and better world that the computer threatened and that future technological developments promised, could point towards what Leo Marx has described as America’s “pastoral ideal”, a force that, according to Marx, is ingrained in the American view of life. Balancing between primitivism and civilisation, nature and culture, the pastoral ideal “is an embodiment of what Lovejoy calls ‘semi-primitivism’; it is located in a middle ground somewhere ‘between’, yet in a transcendent relation to, the opposing forces of civilisation and nature” (Marx 23). It appears that the early advocates of tangible and ubiquitous computing sought to strike a similar balance to the American pastoral ideal; a precarious position that managed to reconcile the disfavour and fear of Europe’s “satanic mills” with an admiration for the technological power of the Industrial Revolution, the admiration for technological development with the bucolic ideal of an unspoiled and pure nature. But how was such a balance to be achieved? How could the ideal middle state be achieved balancing the opposing forces of technological development and the dream of the return to a serene pastoral existence? According to Leo Marx, for the European colonisers, the New World was to provide the answer to this exact question (101). The American landscape was to become the terrain where old and new, nature and technology harmonically meet to form a libertarian utopia. Technology was seen as “‘naturally arising’ from the landscape as another natural ‘means of happiness’ decreed by the Creator in his design of the continent. So, far from conceding that there might be anything alien or ‘artificial’ about mechanization, technology was seen as inherent in ‘nature’; both geographic and human” (160). Since then, according to Marx, the idea of the “return” to a new Golden Age has been engrained in the American culture and it appears that it informs ubiquitous computing’s own early visions. The idea of a “naturally arising” technology which would facilitate our return to the once lost garden of security and nostalgia appears to have become a common theme within ubiquitous computing discourses making appearances across time and borders. So, for example, while in 1991 Weiser envisioned that ubiquitous technologies will make “using a computer as refreshing as taking a walk in the woods” (21st Century Computer 11), twelve years later Marzano writing about Philip’s vision of Ambient Intelligence promised that “the living space of the future could look more like that of the past than that of today” (9). While the pastoral defined nature in terms of the geographical landscape, early ubiquitous computing appeared to define nature in terms of the objects, tools and technologies that surround us and our interactions with them. 
While pastoral America defined itself in contradistinction to the European industrial sites and the dirty, smoky and alienating cityscapes, within those early ubiquitous computing discourses the role of the alienating force was assigned to the personal computer. And whereas the personal computer with its “grey box” was early on rejected as the modern embodiment of the European satanic mills, computation was welcomed as a “naturally arising” technological solution which would infuse the objects which, “through the ages, … are most relevant to human life—chairs, tables and beds, for instance, … the objects we can’t do without” (Marzano 9). Or else, it would infuse the—newly constructed—natural landscape fulfilling the promise that when the “world of bits” and the “world of atoms” are finally bridged, the balance will be restored. But how did these two worlds come into existence? How did bits and atoms come to occupy different and separate ontological spheres? Far from being obvious or commonsensical, the idea of the separation between bits and atoms has a history that grounds it to specific times and places, and consequently makes those early ubiquitous and tangible computing discourses part of a bigger story that, as documented (Hayles) and argued (Agre), started some time ago. The view that we inhabit the two worlds of atoms and bits (Ishii and Ullmer) was endorsed by both early ubiquitous and tangible computing, it was based on the idea of the separation of computation from its material instantiation, presenting the former as a free floating entity able to infuse our world. As we saw earlier, tangible computing took the idea of this separation as an unquestionable fact, which then served as the basis for its research goals. As we read in the home page of the Tangible Media Group’s website: Where the sea of bits meets the land of atoms, we are now facing the challenge of reconciling our dual citizenship in the physical and digital worlds. "Tangible Bits" is our vision of Human Computer Interaction (HCI): we seek a seamless coupling of bits and atoms by giving physical form to digital information and computation (my italics). The idea that digital information does not have to have a physical form, but is given one in order to achieve a coupling of the two worlds, not only reinforces the view of digital information as an immaterial entity, but also places it in a privileged position against the material world. Under this light, those early ideas of augmentation or of “awakening” the physical world (Ishii and Ullmer 3) appear to be based on the idea of a passive material world that can be brought to life and become worthy and meaningful through computation, making ubiquitous computing part of a bigger and more familiar story. Restaging the dominant Cartesian dualism between the “ensouled” subject and the “soulless” material object, the latter is rendered passive, manipulable, and void of agency and, just like Ishii’s old bottles, it is performed as a mute, docile “empty vessel” ready to carry out any of its creator’s wishes; hold perfumes and beverages, play music, or tell the weather. At the same time, computation was presented as the force that could breathe life to a mundane and passive world; a free floating, somewhat natural, immaterial entity, like oxygen (hence the name of MIT’s first ubicomp project), like the air we breathe that could travel unobstructed through any medium, our everyday objects and our environment. 
But it is interesting to see that in those early ubicomp discourses computation’s power did not extend too far. While computation appeared to be foregrounded as a powerful, almost magic, entity able to give life and soul to a soulless material world, at the same time it was presented as controlled and muted. The computational power that would fill our lives, according to Weiser’s ubiquitous computing, would be invisible, it wouldn’t “intrude on our consciousness” (Weiser Not a Desktop 7), it would leave no traces and bring no radical changes. If anything, it would enable us to re-establish our humanness and return us to our past, natural state promising not to change us, or our lives, by introducing something new and unfamiliar, but to enable us to “remain serene and in control” (Weiser and Brown). In other words, ubiquitous computing, as this early story goes, would not be alienating, complex, obtrusive, or even noticeable, for that matter, and so, at the end of this paper, we come full circle to ubicomp’s early goals of invisibility with its underpinnings of the precarious pastoral ideal. This short paper focused on some of ubicomp’s early stories and projects and specifically on its promise to return us to a past and implicitly better world that the PC has arguably displaced. By reading these early promises of, what I call, regressive augmentation through Marx’s work on the “pastoral ideal,” this paper sought to tease out, in order to unsettle, the origins of some of ubicomp’s romantic promises. References Agre, P. E. Computation and Human Experience. New York: Cambridge University Press, 1997. Dourish, P. “Seeking a Foundation for Context-Aware Computing.” Human–Computer Interaction 16.2-4 (2001): 229-241. ———. Where the Action Is: The Foundations of Embodied Interaction. Cambridge: MIT Press, 2001. Dourish, P. and Genevieve Bell. Divining a Digital Future: Mess and Mythology in Ubiquitous Computing. Cambridge, Massachusetts: MIT Press, 2011.Grimes, A., and R. Harper. “Celebratory Technology: New Directions for Food Research in HCI.” In CHI’08, Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. New York: ACM, 2008. 467-476. Harper, R., T. Rodden, Y. Rogers, and A. Sellen (eds.). Being Human: Human-Computer Interaction in the Year 2020. Microsoft Research, 2008. 1 Dec. 2013 ‹http://research.microsoft.com/en-us/um/Cambridge/projects/hci2020/downloads/BeingHuman_A3.pdf›. Hayles, K. How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics. Chicago: University of Chicago Press, 1999. Ishii, H. “Bottles: A Transparent Interface as a Tribute to Mark Weiser.” IEICE Transactions on Information and Systems 87.6 (2004): 1299-1311. Ishii, H., and B. Ullmer. “Tangible Bits: Towards Seamless Interfaces between People, Bits and Atoms.” In CHI ’97, Proceedings of the ACM SIGCHI Conference on Human Factors in Computing Systems. New York: ACM, 1997. 234-241. Marx, L. The Machine in the Garden: Technology and the Pastoral Ideal in America. 35th ed. New York: Oxford University Press, 2000. Marzano, S. “Cultural Issues in Ambient Intelligence”. In E. Aarts and S. Marzano (eds.), The New Everyday: Views on Ambient Intelligence. Rotterdam: 010 Publishers, 2003. Norman, D. The Invisible Computer: Why Good Oroducts Can Fail, the Personal Computer Is So Complex, and Information Appliances Are the Solution. Cambridge, Mass.: MIT Press, 1999. Plumwood, V. Feminism and the Mastery of Nature. London, New York: Routledge, 1993. Rose, G. Feminism and Geography. 
Cambridge: Polity, 1993. Soper, K. “Naturalised Woman and Feminized Nature.” In L. Coupe (ed.), The Green Studies Reader: From Romanticism to Ecocriticism. London: Routledge, 2000. Ullmer, B., and H. Ishii. “The metaDESK: Models and Prototypes for Tangible User Interfaces.” In UIST '97, Proceedings of the 10th Annual ACM Symposium on User Interface Software and Technology. New York: ACM, 1997. 223-232. Weiser, M. “The Computer for the 21st Century." Scientific American 265.3 (1991): 94-104. ———. “The Open House.” ITP Review 2.0, 1996. 1 Dec. 2013 ‹http://makingfurnitureinteractive.files.wordpress.com/2007/09/wholehouse.pdf›. ———. “The World Is Not a Desktop." Interactions 1.1 (1994): 7-8. Weiser, M., and J.S. Brown. “The Coming Age of Calm Technology.” 1996. 1 Dec. 2013 ‹http://www.johnseelybrown.com/calmtech.pdf›. Weiser, M., R. Gold, and J.S. Brown. “The Origins of Ubiquitous Computing at PARC in the Late 80s.” Pervasive Computing 38 (1999): 693-696.

15

Watson, Robert. "E-Press and Oppress." M/C Journal 8, no. 2 (June 1, 2005). http://dx.doi.org/10.5204/mcj.2345.

Abstract:

From elephants to ABBA fans, silicon to hormone, the following discussion uses a new research method to look at printed text, motion pictures and a teenage rebel icon. If by ‘print’ we mean a mechanically reproduced impression of a cultural symbol in a medium, then printing has been with us since before microdot security prints were painted onto cars, before voice prints, laser prints, network servers, record pressings, motion picture prints, photo prints, colour woodblock prints, before books, textile prints, and footprints. If we accept that higher mammals such as elephants have a learnt culture, then it is possible to extend a definition of printing beyond Homo sapiens. Poole reports that elephants mechanically trumpet reproductions of human car horns into the air surrounding their society. If nothing else, this cross-species, cross-cultural reproduction, this ‘ability to mimic’ is ‘another sign of their intelligence’. Observation of child development suggests that the first significant meaningful ‘impression’ made on the human mind is that of the face of the child’s nurturer – usually its mother. The baby’s mind forms an ‘impression’, a mental print, a reproducible memory data set, of the nurturer’s face, voice, smell, touch, etc. That face is itself a cultural construct: hair style, makeup, piercings, tattoos, ornaments, nutrition-influenced skin and smell, perfume, temperature and voice. A mentally reproducible pattern of a unique face is formed in the mind, and we use that pattern to distinguish ‘familiar and strange’ in our expanding social orbit. The social relations of patterned memory – of imprinting – determine the extent to which we explore our world (armed with research aids such as text print) or whether we turn to violence or self-harm (Bretherton). While our cultural artifacts (such as vellum maps or networked voice message servers) bravely extend our significant patterns into the social world and the traversed environment, it is useful to remember that such artifacts, including print, are themselves understood by our original pattern-reproduction and impression system – the human mind, developed in childhood. The ‘print’ is brought to mind differently in different discourses. For a reader, a ‘print’ is a book, a memo or a broadsheet, whether it is the Indian Buddhist Sanskrit texts ordered to be printed in 593 AD by the Chinese emperor Sui Wen-ti (Silk Road) or the US Defense Department memo authorizing lower ranks to torture the prisoners taken by the Bush administration (Sanchez, cited in ABC). Other fields see prints differently. For a musician, a ‘print’ may be the sheet music which spread classical and popular music around the world; it may be a ‘record’ (as in a ‘recording’ session), where sound is impressed to wax, vinyl, charged silicon particles, or the alloys (Smith, “Elpida”) of an mp3 file. For the fine artist, a ‘print’ may be any mechanically reproduced two-dimensional (or embossed) impression of a significant image in media from paper to metal, textile to ceramics. ‘Print’ embraces the Japanese Ukiyo-e colour prints of Utamaro, the company logos that wink from credit card holographs, the early photographs of Talbot, and the textured patterns printed into neolithic ceramics. Computer hardware engineers print computational circuits. Homicide detectives investigate both sweaty finger prints and the repeated, mechanical gaits of suspects, which are imprinted into the earthy medium of a crime scene.
For film makers, the ‘print’ may refer to a photochemical polyester reproduction of a motion picture artifact (the reel of ‘celluloid’), or a DVD laser disc impression of the same film. Textualist discourse has borrowed the word ‘print’ to mean ‘text’, so ‘print’ may also refer to the text elements within the vision track of a motion picture: the film’s opening titles, or texts photographed inside the motion picture story such as the sword-cut ‘Z’ in Zorro (Niblo). Before the invention of writing, the main mechanically reproduced impression of a cultural symbol in a medium was the humble footprint in the sand. The footprints of tribes – and neighbouring animals – cut tracks in the vegetation and the soil. Printed tracks led towards food, water, shelter, enemies and friends. Having learnt to pattern certain faces into their mental world, children grew older and were educated in the footprints of family and clan, enemies and food. The continuous impression of significant foot traffic in the medium of the earth produced the lines between significant nodes of prewriting and pre-wheeled cultures. These tracks were married to audio tracks, such as the song lines of the Australian Aborigines, or the ballads of tramping culture everywhere. A typical tramping song has the line, ‘There’s a track winding back to an old-fashion shack along the road to Gundagai,’ (O’Hagan), although this colonial-style song was actually written for radio and became an international hit on the airwaves, rather than the tramping trails. The printed tracks impressed by these cultural flows are highly contested and diverse, and their foot prints are woven into our very language. The names for printed tracks have entered our shared memory from the intersection of many cultures: ‘Track’ is a Germanic word entering English usage comparatively late (1470) and now used mainly in audio visual cultural reproduction, as in ‘soundtrack’. ‘Trek’ is a Dutch word for ‘track’ now used mainly by ecotourists and science fiction fans. ‘Learn’ is a Proto-Indo-European word: the verb ‘learn’ originally meant ‘to find a track’ back in the days when ‘learn’ had a noun form which meant ‘the sole of the foot’. ‘Tract’ and ‘trace’ are Latin words entering English print usage before 1374 and now used mainly in religious, and electronic surveillance, cultural reproduction. ‘Trench’ in 1386 was a French path cut through a forest. ‘Sagacity’ in English print in 1548 was originally the ability to track or hunt, in Proto-Indo-European cultures. ‘Career’ (in English before 1534) was the print made by chariots in ancient Rome. ‘Sleuth’ (1200) was a Norse noun for a track. ‘Investigation’ (1436) was Latin for studying a footprint (Harper). The arrival of symbolic writing scratched on caves, hearth stones, and trees (the original meaning of ‘book’ is tree), brought extremely limited text education close to home. Then, with baked clay tablets, incised boards, slate, bamboo, tortoise shell, cast metal, bark cloth, textiles, vellum, and – later – paper, a portability came to text that allowed any culture to venture away from known ‘foot’ paths with a reduction in the risk of becoming lost and perishing. So began the world of maps, memos, bills of sale, philosophic treatises and epic mythologies. Some of this was printed, such as the mechanical reproduction of coins, but the fine handwriting required of long, extended, portable texts could not be printed until the invention of paper in China about 2000 years ago. 
Compared to lithic architecture and genes, portable text is a fragile medium, and little survives from the millennia of its innovators. The printing of large non-text designs onto bark-paper and textiles began in neolithic times, but Sui Wen-ti’s imperial memo of 593 AD gives us the earliest written date for printed books, although we can assume they had been published for many years previously. The printed book was a combination of Indian philosophic thought, wood carving, ink chemistry and Chinese paper. The earliest surviving fragment of paper-print technology is ‘Mantras of the Dharani Sutra’, a Buddhist scripture written in the Sanskrit language of the Indian subcontinent, unearthed at an early Tang Dynasty site in Xian, China – making the fragment a veteran piece of printing, in the sense that Sanskrit books had been in print for at least a century by the early Tang Dynasty (Chinese Graphic Arts Net). At first, paper books were printed with page-size carved wooden boards. Five hundred years later, Pi Sheng (c.1041) baked individual reusable ceramic characters in a fire and invented the durable moveable type of modern printing (Silk Road 2000). Abandoning carved wooden tablets, the ‘digitizing’ of Chinese moveable type sped up the production of printed texts. In turn, Pi Sheng’s flexible, rapid, sustainable printing process expanded the political-cultural impact of the literati in Asian society. Digitized block text on paper produced a bureaucratic, literate elite so powerful in Asia that Louis XVI of France copied China’s print-based Confucian system of political authority for his own empire, and so began the rise of the examined public university systems, and the civil service systems, of most European states (Watson, Visions). By reason of its durability, its rapid mechanical reproduction, its culturally agreed signs, literate readership, revered authorship, shared ideology, and distributed portability, a ‘print’ can be a powerful cultural network which builds and expands empires. But print also attacks and destroys empires. A case in point is the Spanish conquest of Aztec America: The Aztecs had immense libraries of American literature on bark-cloth scrolls, a technology which predated paper. These libraries were wiped out by the invading Spanish, who carried a different book before them (Ewins). In the industrial age, the printing press and the gun were seen as the weapons of rebellions everywhere. In 1776, American rebels staffed their ‘Homeland Security’ units with paper makers, knowing that defeating the English would be based on printed and written documents (Hahn). Mao Zedong was a book librarian; Mao said political power came out of the barrel of a gun, but Mao himself came out of a library. With the spread of wireless networked servers, political ferment comes out of the barrel of the cell phone and the internet chat room these days. Witness the cell phone displays of a plane hitting a tower that appear immediately after 9/11 in the Middle East, or witness the show trials of a few US and UK lower ranks who published prints of their torturing activities onto the internet: only lower ranks who published prints were arrested or tried. The control of secure servers and satellites is the new press. These days, we live in a global library of burning books – ‘burning’ in the sense that ‘print’ is now a charged silicon medium (Smith, “Intel”) which is usually made readable by connecting the chip to nuclear reactors and petrochemically-fired power stations. 
World resources burn as we read our screens. Men, women, children burn too, as we watch our infotainment news in comfort while ‘their’ flickering dead faces are printed in our broadcast hearths. The print we watch is not the living; it is the voodoo of the living in the blackout behind the camera, engaging the blood sacrifice of the tormented and the unfortunate. Internet texts are also ‘on fire’ in the third sense of their fragility and instability as a medium: databases regularly ‘print’ fail-safe copies in an attempt to postpone the inevitable mechanical, chemical and electrical failure that awaits all electronic media in time. Print defines a moral position for everyone. In reporting conflict, in deciding to go to press or censor, any ‘print’ cannot avoid an ethical context, starting with the fact that there is a difference in power between print maker, armed perpetrators, the weak, the peaceful, the publisher, and the viewer. So many human factors attend a text, video or voice ‘print’: its very existence as an aesthetic object, even before publication and reception, speaks of unbalanced, and therefore dynamic, power relationships. For example, Graham Greene departed unscathed from all the highly dangerous battlefields he entered as a novelist: riot-torn Germany, the London Blitz, the Belgian Congo, Voodoo Haiti, Vietnam, Panama, Reagan’s Washington, and mafia Europe. His texts are peopled with the injustices of the less fortunate of the twentieth century, while he himself was a member of the fortunate (if not happy) elite, as is anyone today who has the luxury of time to read Greene’s works for pleasure. Ethically a member of London and Paris’ colonizers, Greene’s best writing still electrifies, perhaps partly because he was in the same line of fire as the victims he shared bread with. In fact, Greene hoped daily that he would escape from the dreadful conflicts he fictionalized via a body bag or an urn of ashes (see Sherry). In reading an author’s biography we have one window on the ethical dimensions of authority and print. If a print’s aesthetics are sometimes enduring, its ethical relationships are always mutable. Take the stylized logo of a running athlete: four limbs bent in a rotation of action. This dynamic icon has symbolized ‘good health’ in Hindu and Buddhist culture, from Madras to Tokyo, for thousands of years. The cross of bent limbs was borrowed for the militarized health programs of 1930s Germany, and, because of what was only a brief, recent, isolated yet monstrously horrific segment of its history in print, the bent-limbed swastika is now a vilified symbol in the West. The sign remains ‘impressed’ differently on traditional Eastern culture, and without the taint of Nazism. Dramatic prints are emotionally charged because, in depicting Homo sapiens in danger, or passionately in love, they elicit a hormonal reaction from the reader, the viewer, or the audience. The types of emotions triggered by a print vary across the whole gamut of human chemistry. A recent study of three genres of motion picture prints shows marked differences in the hormonal responses of men compared to women when viewing a romance, an actioner, and a documentary (see Schultheiss, Wirth, and Stanton). Society is biochemically diverse in its engagement with printed culture, which raises questions about equality in the arts.
Motion picture prints probably comprise around one third of internet traffic, in the form of stolen digitized movie files pirated across the globe via peer-to-peer file transfer networks (p2p), and burnt as DVD laser prints (BBC). There is also a US 40 billion dollar per annum legitimate commerce in DVD laser pressings (Grassl), which would suggest a US 80 billion dollar per annum world total in legitimate laser disc print culture. The actively screen literate, or the ‘sliterati’ as I prefer to call them, research this world of motion picture prints via their peers, their internet information channels, their television programming, and their web forums. Most of this activity occurs outside the ambit of universities and schools. One large site of sliterate (screen literate) practice outside most schooling and official research is the net of online forums at imdb.com (Internet Movie Database). Imdb.com ‘prints’ about 25,000,000 top pages per month to client browsers. Hundreds of sliterati forums are located at imdb, including a forum for the Australian movie, Muriel’s Wedding (Hogan). Ten years after the release of Muriel’s Wedding, young people who are concerned with victimization and bullying still log on to <http://us.imdb.com/title/tt0110598/board/> and put their thoughts into print: “I still feel so bad for Muriel in the beginning of the movie, when the girls ‘dump’ her, and how much the poor girl cried and cried! Those girls were such biartches…I love how they got their comeuppance!” bunniesormaybemidgets’s comment is typical of the current discussion. Muriel’s Wedding was a very popular film in its first cinema edition in Australia and elsewhere. About 30% of the entire over-14 Australian population went to see this photochemical polyester print in the cinemas on its first release. A decade on, the distributors printed a DVD laser disc edition. The story concerns Muriel (played by Toni Collette), the unemployed daughter of a corrupt, ‘police state’ politician. Muriel is bullied by her peers and she withdraws into a fantasy world, deluding herself that a white wedding will rescue her from the torments of her blighted life. Through theft and deceit (the modus operandi of her father) Muriel escapes to the entertainment industry and finds a ‘wicked’ girlfriend mentor. From a rebellious position of stubborn independence, Muriel plays out her fantasy. She gets her white wedding, before seeing both her father and her new married life as hollow shams which have goaded her abandoned mother to suicide. Redefining her life as a ‘game’ and assuming responsibility for her independence, Muriel turns her back on the mainstream, image-conscious, female gang of her oppressed youth. Muriel leaves the story, having rekindled her friendship with her rebel mentor. My methodological approach to viewing the laser disc print was to first make a more accessible, coded record of the entire movie. I was able to code and record the print in real time, using a new metalanguage (Watson, “Eyes”). The advantage of Coding is that it ‘thinks’ the same way as film making; it does not sidetrack the analyst into prose. The Code splits the movie print into three streams: Vision Action, [vision graphic elements, including text], and (sound). The Coding splits the vision track into normal action and graphic elements, such as text, so this Coding is an ideal method for extracting all the text elements of a film in real time.
After playing the film once, I had four and a half tightly packed pages of the coded story, including all its text elements in square brackets. Being a unique, indexed hard copy, the Coded copy allowed me immediate access to any point of the Muriel’s Wedding saga without having to search the DVD laser print. How are ‘print’ elements used in Muriel’s Wedding? Firstly, a rose-coloured monoprint of Muriel Heslop’s smiling face stares enigmatically from the plastic surface of the DVD picture disc. The print is a still photo captured from her smile as she walked down the aisle of her white wedding. In this print, Toni Collette is the Mona Lisa of Australian culture, except that fans of Muriel’s Wedding know the meaning of that smile is a magical combination of the actor’s art: the smile is both the flush of dreams come true and the frightening self-deception that will kill her mother. When the disc is inserted and played, the text-dominant menu appears, and the film commences with the text-dominant opening titles. Text and titles confer a legitimacy on a work, whether it is a trade mark of the laser print owners, or the household names of stars. Text titles confer status relationships on both the presenters of the cultural artifact and the viewer who has entered into a legal license agreement with the owners of the movie. A title makes us comfortable, because the mind always seeks to name the unfamiliar, and a set of text titles does that job for us so that we can navigate the ‘tracks’ and settle into our engagement with the unfamiliar. The apparent ‘truth’ and ‘stability’ of printed text calms our fears and beguiles our uncertainties. Muriel attends the white wedding of a school bully bride, wearing a leopard print dress she has stolen. Muriel’s spotted wild animal print contrasts with the pure white handmade dress of the bride. In Muriel’s leopard textile print, we have the wild, rebellious, impoverished, inappropriate intrusion into the social ritual and fantasy of her high-status tormentor. An off-duty store detective recognizes the printed dress and calls the police. The police are themselves distinguished by their blue-and-white checked prints and other mechanically reproduced impressions of cultural symbols: in steel, brass, embroidery, leather and plastics. Muriel is driven in the police car past the stenciled town sign (‘Welcome To Porpoise Spit’ heads a paragraph of small print). She is delivered to her father, a politician who presides over the policing of his town. In a state where the judiciary, police and executive are hijacked by the same tyrant, Muriel’s father, Bill, pays off the police constables with a carton of legal drugs (beer) and Muriel must face her father’s wrath, which he proceeds to transfer to his detested wife. Like his daughter, the father also wears a spotted brown print costume, but his is a batik print from neighbouring Indonesia (incidentally, a nation that takes the political status of its batik prints very seriously). Bill demands that Muriel find the receipt for the leopard print dress she claims she has purchased. The legitimate ownership of the object is enmeshed with a printed receipt, the printed evidence of trade. The law (and the paramilitary power behind the law) is legitimized, or contested, by the presence or absence of printed text. Muriel hides in her bedroom, surrounded by poster prints of the pop group ABBA. Torn-out prints of other people’s weddings adorn her mirror.
Her face is embossed with the clown-like primary colours of the marionette as she lifts a bouquet to her chin and stares into the real time ‘print’ of her mirror image. Bill takes the opportunity of a business meeting with Japanese investors to feed his entire family at ‘Charlie Chan’s’ restaurant. Muriel’s middle sister sloppily wears her father’s state election tee shirt, printed with the text: ‘Vote 1, Bill Heslop. You can’t stop progress.’ The text sets up two ironic gags that are paid off on the dialogue track: ‘He lost,’ we are told. ‘Progress’ turns out to be funding the concreting of a beach. Bill berates his daughter Muriel: she has no chance of becoming a printer’s apprentice and she has failed a typing course. Her dysfunction in printed text has been covered up by Bill: he has bribed the typing teacher to issue a printed diploma to his daughter. In the gambling saloon of the club, under the arrays of mechanically repeated cultural symbols lit above the poker machines (‘A’ for ace, ‘Q’ for queen, etc.), Bill’s secret girlfriend Deidre risks giving Muriel a cosmetics job. Another text icon in lights announces the surf nightclub ‘Breakers’. Tania, the newly married queen bitch who has made Muriel’s teenage years a living hell, breaks up with her husband, deciding to cash in his negotiable text documents – his Bali honeymoon tickets – and go on an island holiday with her girlfriends instead. Text documents are the enduring site of agreements between people and also the site of mutations to those agreements. Tania dumps Muriel, who sobs and sobs. Sobs are a mechanical, percussive reproduction impressed on the sound track. Returning home, we discover that Muriel’s older brother has failed a printed test and been rejected for police recruitment. There is a high incidence of print illiteracy in the Heslop family. Mrs Heslop (Jeannie Drynan), for instance, regularly has trouble at the post office. Muriel sees a chance to escape the oppression of her family by tricking her mother into giving her a blank cheque. Here is the confluence of the legitimacy of a bank’s printed negotiable document with the risk and freedom of a blank space for rebel Muriel’s handwriting. Unable to type, her handwriting has the power to steal every cent of her father’s savings. She leaves home and spends the family’s savings at an island resort. On the island, the text print-challenged Muriel dances to a recording (sound print) of ABBA, her hand gestures emphasizing her bewigged face, which is made up in an impression of her pop idol. Her imitation of her goddesses – the ABBA women, her only hope in a real world of people who hate or avoid her – is accompanied by her goddesses’ voices singing: ‘the mystery book on the shelf is always repeating itself.’ Before jpeg and gif image downloads, we had postcard prints and snail mail. Muriel sends a postcard to her family, lying about her ‘success’ in the cosmetics business. The printed missive is clutched by her father Bill (Bill Hunter), who proclaims about his daughter, ‘you can’t type but you really impress me’. Meanwhile, on Hibiscus Island, Muriel lies under a moonlit palm tree with her newly found mentor, ‘bad girl’ Rhonda (Rachel Griffiths). In this critical scene, where foolish Muriel opens her heart’s yearnings to a confidante she can finally trust, the director and DP have chosen to shoot a flat, high-contrast, blue-filtered image. The visual result is very much like the semiabstract Japanese Ukiyo-e woodblock prints by Utamaro.
This Japanese printing style informed the rise of European modern painting (Monet, Van Gogh, Picasso, etc., were all important collectors and students of Ukiyo-e prints). The above print and text elements in Muriel’s Wedding take us 27 minutes into her story, as recorded on a single page of real-time handwritten Coding. Although not discussed here, the Coding recorded the complete film – a total of 106 minutes of text elements and main graphic elements – as four pages of Code. Referring to this Coding some weeks after it was made, I looked up the final code on page four: ‘taxi [food of the sea]’. Translation: a shop sign whizzes past in the film’s background, as Muriel and Rhonda leave Porpoise Spit in a taxi. Over their heads the text ‘Food Of The Sea’ flashes. We are reminded that Muriel and Rhonda are mermaids, fantastic creatures sprung from the brow of author PJ Hogan, and illuminated even today in the pantheon of women’s coming-of-age art works. That the movie is relevant ten years on is evidenced by the current usage of the Muriel’s Wedding online forum, an intersection of wider discussions by sliterate women on imdb.com who, like Muriel, are observers (and in some cases victims) of horrific pressure from ambitious female gangs and bullies. Text is always a minor element in a motion picture (unless it is a subtitled foreign film) and text usually whizzes by subliminally while viewing a film. By Coding the work for [text], all the text nuances made by the film makers come to light. While I have viewed Muriel’s Wedding on many occasions, it has only been in Coding it specifically for text that I have noticed that Muriel is a representative of that vast class of talented youth who are discriminated against by print (as in text) educators who cannot offer her a life-affirming identity in the English classroom. Severely depressed at school, and failing to type or get a printer’s apprenticeship, Muriel finds paid work (and hence, freedom, life, identity, independence) working in her audio visual printed medium of choice: a video store in a new city. Muriel found a sliterate admirer at the video store but she later dumped him for her fantasy man, before leaving him too. One of the points of conjecture on the imdb Muriel’s Wedding site is: did Muriel (in the unwritten future) get back together with admirer Brice Nobes? That we will never know. While a print forms a track that tells us where culture has been, a print cannot be the future, a print is never animate reality. At the end of any trail of prints, one must lift one’s head from the last impression, and negotiate satisfaction in the happening world. References Australian Broadcasting Corporation. “Memo Shows US General Approved Interrogations.” 30 Mar. 2005 <http://www.abc.net.au>. British Broadcasting Corporation. “Films ‘Fuel Online File-Sharing’.” 22 Feb. 2005 <http://news.bbc.co.uk/1/hi/technology/3890527.stm>. Bretherton, I. “The Origins of Attachment Theory: John Bowlby and Mary Ainsworth.” 1994. 23 Jan. 2005 <http://www.psy.med.br/livros/autores/bowlby/bowlby.pdf>. Bunniesormaybemidgets. Chat Room Comment. “What Did Those Girls Do to Rhonda?” 28 Mar. 2005 <http://us.imdb.com/title/tt0110598/board/>. Chinese Graphic Arts Net. Mantras of the Dharani Sutra. 20 Feb. 2005 <http://www.cgan.com/english/english/cpg/engcp10.htm>. Ewins, R. Barkcloth and the Origins of Paper. 1991. 20 Feb. 2005 <http://www.justpacific.com/pacific/papers/barkcloth~paper.html>. Grassl, K.R. The DVD Statistical Report. 14 Mar. 2005 <http://www.corbell.com>. Hahn, C.
M. The Topic Is Paper. 20 Feb. 2005 <http://www.nystamp.org/Topic_is_paper.html>. Harper, D. Online Etymology Dictionary. 14 Mar. 2005 <http://www.etymonline.com/>. Mark of Zorro, The. Dir. Fred Niblo. Screenplay by J. McCulley. UA, 1920. Muriel’s Wedding. Dir. PJ Hogan. Perf. Toni Collette, Rachel Griffiths, Bill Hunter, and Jeannie Drynan. Village Roadshow, 1994. O’Hagan, Jack. On The Road to Gundagai. 1922. 2 Apr. 2005 <http://ingeb.org/songs/roadtogu.html>. Poole, J.H., P.L. Tyack, A.S. Stoeger-Horwath, and S. Watwood. “Animal Behaviour: Elephants Are Capable of Vocal Learning.” Nature 24 Mar. 2005. Sanchez, R. “Interrogation and Counter-Resistance Policy.” 14 Sept. 2003. 30 Mar. 2005 <http://www.abc.net.au>. Schultheiss, O.C., M.M. Wirth, and S.J. Stanton. “Effects of Affiliation and Power Motivation Arousal on Salivary Progesterone and Testosterone.” Hormones and Behavior 46 (2005). Sherry, N. The Life of Graham Greene. 3 vols. London: Jonathan Cape, 2004, 1994, 1989. Silk Road. Printing. 2000. 20 Feb. 2005 <http://www.silk-road.com/artl/printing.shtml>. Smith, T. “Elpida Licenses ‘DVD on a Chip’ Memory Tech.” The Register 20 Feb. 2005 <http://www.theregister.co.uk/2005/02>. —. “Intel Boffins Build First Continuous Beam Silicon Laser.” The Register 20 Feb. 2005 <http://www.theregister.co.uk/2005/02>. Watson, R. S. “Eyes And Ears: Dramatic Memory Slicing and Salable Media Content.” Innovation and Speculation, ed. Brad Haseman. Brisbane: QUT. [in press] Watson, R. S. Visions. Melbourne: Curriculum Corporation, 1994. Citation reference for this article MLA Style Watson, Robert. “E-Press and Oppress: Audio Visual Print Drama, Identity, Text and Motion Picture Rebellion.” M/C Journal 8.2 (2005). <http://journal.media-culture.org.au/0506/08-watson.php>. APA Style Watson, R. (Jun. 2005) “E-Press and Oppress: Audio Visual Print Drama, Identity, Text and Motion Picture Rebellion,” M/C Journal, 8(2). Retrieved from <http://journal.media-culture.org.au/0506/08-watson.php>.

16

Maxwell, Richard, and Toby Miller. "The Real Future of the Media." M/C Journal 15, no.3 (June27, 2012). http://dx.doi.org/10.5204/mcj.537.

Full text

APA, Harvard, Vancouver, ISO, and other styles

Abstract:

When George Orwell encountered ideas of a technological utopia sixty-five years ago, he acted the grumpy middle-aged man: ‘Reading recently a batch of rather shallowly optimistic “progressive” books, I was struck by the automatic way in which people go on repeating certain phrases which were fashionable before 1914. Two great favourites are “the abolition of distance” and “the disappearance of frontiers”. I do not know how often I have met with the statements that “the aeroplane and the radio have abolished distance” and “all parts of the world are now interdependent”’ (1944). It is worth revisiting the old boy’s grumpiness, because the rhetoric he so niftily skewers continues in our own time. Facebook features “Peace on Facebook” and even claims that it can “decrease world conflict” through inter-cultural communication. Twitter has announced itself as “a triumph of humanity” (“A Cyber-House” 61). Cue George. In between Orwell and latter-day hoody cybertarians, a whole host of excitable public intellectuals announced the impending end of materiality through emergent media forms. Marshall McLuhan, Neil Postman, Daniel Bell, Ithiel de Sola Pool, George Gilder, Alvin Toffler—the list of 1960s futurists goes on and on. And this wasn’t just a matter of punditry: the OECD decreed the coming of the “information society” in 1975 and the European Union (EU) followed suit in 1979, while IBM merrily declared an “information age” in 1977. Bell theorized this technological utopia as post-ideological, because class would cease to matter (Mattelart). Polluting industries seemingly no longer represented the dynamic core of industrial capitalism; instead, market dynamism radiated from a networked, intellectual core of creative and informational activities. The new information and knowledge-based economies would rescue First World hegemony from an “insurgent world” that lurked within as well as beyond itself (Schiller). Orwell’s others and the Cold-War futurists propagated one of the most destructive myths shaping both public debate and scholarly studies of the media, culture, and communication. They convinced generations of analysts, activists, and arrivistes that the promises and problems of the media could be understood via metaphors of the environment, and that the media were weightless and virtual. The famous medium they wished us to see as the message—a substance as vital to our wellbeing as air, water, and soil—turned out to be no such thing. Today’s cybertarians inherit their anti-Marxist, anti-materialist positions, as a casual glance at any new media journal, culture-industry magazine, or bourgeois press outlet discloses. The media are undoubtedly important instruments of social cohesion and fragmentation, political power and dissent, democracy and demagoguery, and other fraught extensions of human consciousness. But talk of media systems as equivalent to physical ecosystems—fashionable among marketers and media scholars alike—is predicated on the notion that they are environmentally benign technologies. This has never been true, from the beginnings of print to today’s cloud-covered computing. Our new book Greening the Media focuses on the environmental impact of the media—the myriad ways that media technology consumes, despoils, and wastes natural resources. We introduce ideas, stories, and facts that have been marginal or absent from popular, academic, and professional histories of media technology.
Throughout, ecological issues have been at the core of our work and we immodestly think the same should apply to media communications, and cultural studies more generally. We recognize that those fields have contributed valuable research and teaching that address environmental questions. For instance, there is an abundant literature on representations of the environment in cinema, how to communicate environmental messages successfully, and press coverage of climate change. That’s not enough. You may already know that media technologies contain toxic substances. You may have signed an on-line petition protesting the hazardous and oppressive conditions under which workers assemble cell phones and computers. But you may be startled, as we were, by the scale and pervasiveness of these environmental risks. They are present in and around every site where electronic and electric devices are manufactured, used, and thrown away, poisoning humans, animals, vegetation, soil, air and water. We are using the term “media” as a portmanteau word to cover a multitude of cultural and communications machines and processes—print, film, radio, television, information and communications technologies (ICT), and consumer electronics (CE). This is not only for analytical convenience, but because there is increasing overlap between the sectors. CE connect to ICT and vice versa; televisions resemble computers; books are read on telephones; newspapers are written through clouds; and so on. Cultural forms and gadgets that were once separate are now linked. The currently fashionable notion of convergence doesn’t quite capture the vastness of this integration, which includes any object with a circuit board, scores of accessories that plug into it, and a global nexus of labor and environmental inputs and effects that produce and flow from it. In 2007, a combination of ICT/CE and media production accounted for between 2 and 3 percent of all greenhouse gases emitted around the world (“Gartner Estimates,”; International Telecommunication Union; Malmodin et al.). Between twenty and fifty million tonnes of electronic waste (e-waste) are generated annually, much of it via discarded cell phones and computers, which affluent populations throw out regularly in order to buy replacements. (Presumably this fits the narcissism of small differences that distinguishes them from their own past.) E-waste is historically produced in the Global North—Australasia, Western Europe, Japan, and the US—and dumped in the Global South—Latin America, Africa, Eastern Europe, Southern and Southeast Asia, and China. It takes the form of a thousand different, often deadly, materials for each electrical and electronic gadget. This trend is changing as India and China generate their own media detritus (Robinson; Herat). Enclosed hard drives, backlit screens, cathode ray tubes, wiring, capacitors, and heavy metals pose few risks while these materials remain encased. But once discarded and dismantled, ICT/CE have the potential to expose workers and ecosystems to a morass of toxic components. Theoretically, “outmoded” parts could be reused or swapped for newer parts to refurbish devices. But items that are defined as waste undergo further destruction in order to collect remaining parts and valuable metals, such as gold, silver, copper, and rare-earth elements. This process causes serious health risks to bones, brains, stomachs, lungs, and other vital organs, in addition to birth defects and disrupted biological development in children. 
Medical catastrophes can result from lead, cadmium, mercury, other heavy metals, poisonous fumes emitted in search of precious metals, and such carcinogenic compounds as polychlorinated biphenyls, dioxin, polyvinyl chloride, and flame retardants (Maxwell and Miller 13). The United States’ Environmental Protection Agency estimates that by 2007 US residents owned approximately three billion electronic devices, with an annual turnover rate of 400 million units, and well over half such purchases made by women. Overall CE ownership varied with age—adults under 45 typically boasted four gadgets; those over 65 made do with one. The Consumer Electronics Association (CEA) says US$145 billion was expended in the sector in 2006 in the US alone, up 13% on the previous year. The CEA refers joyously to a “consumer love affair with technology continuing at a healthy clip.” In the midst of a recession, 2009 saw $165 billion in sales, and households owned between fifteen and twenty-four gadgets on average. By 2010, US$233 billion was spent on electronic products, three-quarters of the population owned a computer, nearly half of all US adults owned an MP3 player, and 85% had a cell phone. By all measures, the amount of ICT/CE on the planet is staggering. As investigative science journalist, Elizabeth Grossman put it: “no industry pushes products into the global market on the scale that high-tech electronics does” (Maxwell and Miller 2). In 2007, “of the 2.25 million tons of TVs, cell phones and computer products ready for end-of-life management, 18% (414,000 tons) was collected for recycling and 82% (1.84 million tons) was disposed of, primarily in landfill” (Environmental Protection Agency 1). Twenty million computers fell obsolete across the US in 1998, and the rate was 130,000 a day by 2005. It has been estimated that the five hundred million personal computers discarded in the US between 1997 and 2007 contained 6.32 billion pounds of plastics, 1.58 billion pounds of lead, three million pounds of cadmium, 1.9 million pounds of chromium, and 632000 pounds of mercury (Environmental Protection Agency; Basel Action Network and Silicon Valley Toxics Coalition 6). The European Union is expected to generate upwards of twelve million tons annually by 2020 (Commission of the European Communities 17). While refrigerators and dangerous refrigerants account for the bulk of EU e-waste, about 44% of the most toxic e-waste measured in 2005 came from medium-to-small ICT/CE: computer monitors, TVs, printers, ink cartridges, telecommunications equipment, toys, tools, and anything with a circuit board (Commission of the European Communities 31-34). Understanding the enormity of the environmental problems caused by making, using, and disposing of media technologies should arrest our enthusiasm for them. But intellectual correctives to the “love affair” with technology, or technophilia, have come and gone without establishing much of a foothold against the breathtaking flood of gadgets and the propaganda that proclaims their awe-inspiring capabilities.[i] There is a peculiar enchantment with the seeming magic of wireless communication, touch-screen phones and tablets, flat-screen high-definition televisions, 3-D IMAX cinema, mobile computing, and so on—a totemic, quasi-sacred power that the historian of technology David Nye has named the technological sublime (Nye Technological Sublime 297).[ii] We demonstrate in our book why there is no place for the technological sublime in projects to green the media. 
But first we should explain why such symbolic power does not accrue to more mundane technologies; after all, for the time-strapped cook, a pressure cooker does truly magical things. Three important qualities endow ICT/CE with unique symbolic potency—virtuality, volume, and novelty. The technological sublime of media technology is reinforced by the “virtual nature of much of the industry’s content,” which “tends to obscure their responsibility for a vast proliferation of hardware, all with high levels of built-in obsolescence and decreasing levels of efficiency” (Boyce and Lewis 5). Planned obsolescence entered the lexicon as a new “ethics” for electrical engineering in the 1920s and ’30s, when marketers, eager to “habituate people to buying new products,” called for designs to become quickly obsolete “in efficiency, economy, style, or taste” (Grossman 7-8).[iii] This defines the short lifespan deliberately constructed for computer systems (drives, interfaces, operating systems, batteries, etc.) by making tiny improvements incompatible with existing hardware (Science and Technology Council of the American Academy of Motion Picture Arts and Sciences 33-50; Boyce and Lewis). With planned obsolescence leading to “dizzying new heights” of product replacement (Rogers 202), there is an overstated sense of the novelty and preeminence of “new” media—a “cult of the present” is particularly dazzled by the spread of electronic gadgets through globalization (Mattelart and Constantinou 22). References to the symbolic power of media technology can be found in hymnals across the internet and the halls of academe: technologies change us, the media will solve social problems or create new ones, ICTs transform work, monopoly ownership no longer matters, journalism is dead, social networking enables social revolution, and the media deliver a cleaner, post-industrial, capitalism. Here is a typical example from the twilight zone of the technological sublime (actually, the OECD): A major feature of the knowledge-based economy is the impact that ICTs have had on industrial structure, with a rapid growth of services and a relative decline of manufacturing. Services are typically less energy intensive and less polluting, so among those countries with a high and increasing share of services, we often see a declining energy intensity of production … with the emergence of the Knowledge Economy ending the old linear relationship between output and energy use (i.e. partially de-coupling growth and energy use) (Houghton 1) This statement mixes half-truths and nonsense. In reality, old-time, toxic manufacturing has moved to the Global South, where it is ascendant; pollution levels are rising worldwide; and energy consumption is accelerating in residential and institutional sectors, due almost entirely to ICT/CE usage, despite advances in energy conservation technology (a neat instance of the age-old Jevons Paradox). In our book we show how these are all outcomes of growth in ICT/CE, the foundation of the so-called knowledge-based economy. ICT/CE are misleadingly presented as having little or no material ecological impact. In the realm of everyday life, the sublime experience of electronic machinery conceals the physical work and material resources that go into them, while the technological sublime makes the idea that more-is-better palatable, axiomatic; even sexy. 
In this sense, the technological sublime relates to what Marx called “the Fetishism which attaches itself to the products of labour” once they are in the hands of the consumer, who lusts after them as if they were “independent beings” (77). There is a direct but unseen relationship between technology’s symbolic power and the scale of its environmental impact, which the economist Juliet Schor refers to as a “materiality paradox” —the greater the frenzy to buy goods for their transcendent or nonmaterial cultural meaning, the greater the use of material resources (40-41). We wrote Greening the Media knowing that a study of the media’s effect on the environment must work especially hard to break the enchantment that inflames popular and elite passions for media technologies. We understand that the mere mention of the political-economic arrangements that make shiny gadgets possible, or the environmental consequences of their appearance and disappearance, is bad medicine. It’s an unwelcome buzz kill—not a cool way to converse about cool stuff. But we didn’t write the book expecting to win many allies among high-tech enthusiasts and ICT/CE industry leaders. We do not dispute the importance of information and communication media in our lives and modern social systems. We are media people by profession and personal choice, and deeply immersed in the study and use of emerging media technologies. But we think it’s time for a balanced assessment with less hype and more practical understanding of the relationship of media technologies to the biosphere they inhabit. Media consumers, designers, producers, activists, researchers, and policy makers must find new and effective ways to move ICT/CE production and consumption toward ecologically sound practices. In the course of this project, we found in casual conversation, lecture halls, classroom discussions, and correspondence, consistent and increasing concern with the environmental impact of media technology, especially the deleterious effects of e-waste toxins on workers, air, water, and soil. We have learned that the grip of the technological sublime is not ironclad. Its instability provides a point of departure for investigating and criticizing the relationship between the media and the environment. The media are, and have been for a long time, intimate environmental participants. Media technologies are yesterday’s, today’s, and tomorrow’s news, but rarely in the way they should be. The prevailing myth is that the printing press, telegraph, phonograph, photograph, cinema, telephone, wireless radio, television, and internet changed the world without changing the Earth. In reality, each technology has emerged by despoiling ecosystems and exposing workers to harmful environments, a truth obscured by symbolic power and the power of moguls to set the terms by which such technologies are designed and deployed. Those who benefit from ideas of growth, progress, and convergence, who profit from high-tech innovation, monopoly, and state collusion—the military-industrial-entertainment-academic complex and multinational commandants of labor—have for too long ripped off the Earth and workers. As the current celebration of media technology inevitably winds down, perhaps it will become easier to comprehend that digital wonders come at the expense of employees and ecosystems. This will return us to Max Weber’s insistence that we understand technology in a mundane way as a “mode of processing material goods” (27). 
Further to understanding that ordinariness, we can turn to the pioneering conversation analyst Harvey Sacks, who noted three decades ago “the failures of technocratic dreams [:] that if only we introduced some fantastic new communication machine the world will be transformed.” Such fantasies derived from the very banality of these introductions—that every time they took place, one more “technical apparatus” was simply “being made at home with the rest of our world’ (548). Media studies can join in this repetitive banality. Or it can withdraw the welcome mat for media technologies that despoil the Earth and wreck the lives of those who make them. In our view, it’s time to green the media by greening media studies. References “A Cyber-House Divided.” Economist 4 Sep. 2010: 61-62. “Gartner Estimates ICT Industry Accounts for 2 Percent of Global CO2 Emissions.” Gartner press release. 6 April 2007. ‹http://www.gartner.com/it/page.jsp?id=503867›. Basel Action Network and Silicon Valley Toxics Coalition. Exporting Harm: The High-Tech Trashing of Asia. Seattle: Basel Action Network, 25 Feb. 2002. Benjamin, Walter. “Central Park.” Trans. Lloyd Spencer with Mark Harrington. New German Critique 34 (1985): 32-58. Biagioli, Mario. “Postdisciplinary Liaisons: Science Studies and the Humanities.” Critical Inquiry 35.4 (2009): 816-33. Boyce, Tammy and Justin Lewis, eds. Climate Change and the Media. New York: Peter Lang, 2009. Commission of the European Communities. “Impact Assessment.” Commission Staff Working Paper accompanying the Proposal for a Directive of the European Parliament and of the Council on Waste Electrical and Electronic Equipment (WEEE) (recast). COM (2008) 810 Final. Brussels: Commission of the European Communities, 3 Dec. 2008. Environmental Protection Agency. Management of Electronic Waste in the United States. Washington, DC: EPA, 2007 Environmental Protection Agency. Statistics on the Management of Used and End-of-Life Electronics. Washington, DC: EPA, 2008 Grossman, Elizabeth. Tackling High-Tech Trash: The E-Waste Explosion & What We Can Do about It. New York: Demos, 2008. ‹http://www.demos.org/pubs/e-waste_FINAL.pdf› Herat, Sunil. “Review: Sustainable Management of Electronic Waste (e-Waste).” Clean 35.4 (2007): 305-10. Houghton, J. “ICT and the Environment in Developing Countries: Opportunities and Developments.” Paper prepared for the Organization for Economic Cooperation and Development, 2009. International Telecommunication Union. ICTs for Environment: Guidelines for Developing Countries, with a Focus on Climate Change. Geneva: ICT Applications and Cybersecurity Division Policies and Strategies Department ITU Telecommunication Development Sector, 2008. Malmodin, Jens, Åsa Moberg, Dag Lundén, Göran Finnveden, and Nina Lövehagen. “Greenhouse Gas Emissions and Operational Electricity Use in the ICT and Entertainment & Media Sectors.” Journal of Industrial Ecology 14.5 (2010): 770-90. Marx, Karl. Capital: Vol. 1: A Critical Analysis of Capitalist Production, 3rd ed. Trans. Samuel Moore and Edward Aveling, Ed. Frederick Engels. New York: International Publishers, 1987. Mattelart, Armand and Costas M. Constantinou. “Communications/Excommunications: An Interview with Armand Mattelart.” Trans. Amandine Bled, Jacques Guot, and Costas Constantinou. Review of International Studies 34.1 (2008): 21-42. Mattelart, Armand. “Cómo nació el mito de Internet.” Trans. Yanina Guthman. El mito internet. Ed. Victor Hugo de la Fuente. Santiago: Editorial aún creemos en los sueños, 2002. 25-32. 
Maxwell, Richard and Toby Miller. Greening the Media. New York: Oxford University Press, 2012. Nye, David E. American Technological Sublime. Cambridge, Mass.: MIT Press, 1994. Nye, David E. Technology Matters: Questions to Live With. Cambridge, Mass.: MIT Press. 2007. Orwell, George. “As I Please.” Tribune. 12 May 1944. Richtel, Matt. “Consumers Hold on to Products Longer.” New York Times: B1, 26 Feb. 2011. Robinson, Brett H. “E-Waste: An Assessment of Global Production and Environmental Impacts.” Science of the Total Environment 408.2 (2009): 183-91. Rogers, Heather. Gone Tomorrow: The Hidden Life of Garbage. New York: New Press, 2005. Sacks, Harvey. Lectures on Conversation. Vols. I and II. Ed. Gail Jefferson. Malden: Blackwell, 1995. Schiller, Herbert I. Information and the Crisis Economy. Norwood: Ablex Publishing, 1984. Schor, Juliet B. Plenitude: The New Economics of True Wealth. New York: Penguin, 2010. Science and Technology Council of the American Academy of Motion Picture Arts and Sciences. The Digital Dilemma: Strategic Issues in Archiving and Accessing Digital Motion Picture Materials. Los Angeles: Academy Imprints, 2007. Weber, Max. “Remarks on Technology and Culture.” Trans. Beatrix Zumsteg and Thomas M. Kemple. Ed. Thomas M. Kemple. Theory, Culture [i] The global recession that began in 2007 has been the main reason for some declines in Global North energy consumption, slower turnover in gadget upgrades, and longer periods of consumer maintenance of electronic goods (Richtel). [ii] The emergence of the technological sublime has been attributed to the Western triumphs in the post-Second World War period, when technological power supposedly supplanted the power of nature to inspire fear and astonishment (Nye Technology Matters 28). Historian Mario Biagioli explains how the sublime permeates everyday life through technoscience: "If around 1950 the popular imaginary placed science close to the military and away from the home, today’s technoscience frames our everyday life at all levels, down to our notion of the self" (818). [iii] This compulsory repetition is seemingly undertaken each time as a novelty, governed by what German cultural critic Walter Benjamin called, in his awkward but occasionally illuminating prose, "the ever-always-the-same" of "mass-production" cloaked in "a hitherto unheard-of significance" (48).

17

Zuvela, Danni. "An Interview with the Makers of Value-Added Cinema." M/C Journal 6, no.3 (June1, 2003). http://dx.doi.org/10.5204/mcj.2183.

Full text

APA, Harvard, Vancouver, ISO, and other styles

Abstract:

Things would never be the same again. As sales went through the roof, with some breathless estimates in the region of a 200% increase overnight, marketers practically wet their pants at the phenomenal success of the chocolate bar seen by millions in ET: The Extraterrestrial. That was back in 1982. Though not the first instance of product placement ‘at the movies’, the strategic placement of Reese’s Pieces in ET is often hailed as the triumphant marketing moment heralding the onset of the era of embedded advertising in popular media. Today, much media consumption is characterised by aggressive branding strategies. We’ve all seen ostentatious product wrangling – the unnatural handling of items (especially chocolate bars and bottled drinks) to best display their logo (regardless of considerations of verisimilitude, or even common sense), and ungainly product mentions in dialogue (who can forget the early Jude Law shocker Shopping?) that have passed into the realm of satire. In television and feature filmmaking, props bearing corporate trademarks not only supplement, but often sustain, production budgets. Some programs appear to be entirely contrived around such sponsors. Australian commercial television makes no secret of the increasingly non-existent line between ‘entertainment’ and ‘advertising’, though it still purports to describe ‘lifestyle’ shows as ‘reality’ television. With the introduction of technologies like TiVo which enable consumers to skip over ads, the move is from ‘interruptive’ style advertising between programs or segments, to products insinuated in the décor – and increasingly scripts – of programs themselves, with correspondent online shopping opportunities for digital consumers. An entire industry of middle-people – sometimes euphemistically self-described as ‘prop houses’ – has sprung up to service the lucrative product placement industry, orchestrating the insertion of branded products into television and films. The industry has grown to such an extent that it holds an annual backpatting event, the Product Placement Awards, “to commemorate and celebrate product placement” in movies, television shows, music etc. But ‘advertising by stealth’ is not necessarily passively accepted by media consumers – nor media makers. The shoe-horning of brands and their logos into the products of popular culture not only defines the culture industry today, but also characterises much of the resistance to it. ‘Logo-backlash’ is seen as an inevitable response to the incursion of brands into public life, an explicit rejection of the practice of securing consumer mindshare, and subvertisements and billboard liberation activities have been mainstays of culture jamming for decades now. However, criticism of product placement remains highly problematic: when the Center for the Study of Commercialism argued that movies have become “dangerously” saturated with products and suggested full disclosure in the form of a list, in a film’s credits, of paid product appearances, many noted the counterproductivity of such an approach, arguing that it would only result in further registration – and hence promotion – of the brand. Not everyone subscribes to advertising’s ‘any news is good news’ thesis, however. Peter Conheim and Steve Seidler decided to respond to the behemoth of product placement with a ‘catalogue of sins’. Their new documentary Value Added Cinema meticulously chronicles the appearance of placed products in Hollywood cinema.
Here they discuss the film, which is continuing to receive rave reviews in the US and Europe. Danni Zuvela: Can you tell me a little about yourselves? Peter: I’m a musician and filmmaker living in the San Francisco Bay Area who wears too many hats. I play in three performing and recording groups (Mono Pause, Wet Gate, Negativland) and somehow found the time to sit in front of a Mac for six weeks to edit and mix VALUE-ADDED CINEMA. Because Steve is a persuasive salesperson. Steve: I’ve been a curator for the past decade and a half, showing experimental works week after week, month after month, year after year, at the Pacific Film Archive. It was about time to make a tape of my own and Peter was crazy enough to indulge me. DZ: Why product placement? Why do you think it’s important? Where did this documentary come from? S: Steven Spielberg released Minority Report last year and it just raised my hackles. The film actually encourages the world it seems to critique by stressing the inter-relationship of his alleged art with consumerism in the present day and then extending that into a vision of the future within the film itself. In other words, he has already realized the by-product of an alarming dystopia of surveillance, monolithic policing, and capital. That by-product is his film. The rumor mill says that he was reimbursed to the tune of $25 million for the placements. So not only can he not see a constructive path out of dystopia, a path leading toward a more liberating future, he makes millions from his exhausted imagination. What could be more cynical? But Spielberg isn’t alone within the accelerating subsumption of mainstream cinema into the spectacle of pure consumption. He’s just more visible than most. But to consider product placements more directly for a moment: during the past few years, mainstream cinema has been little more than an empty exercise in consumerist viewership. The market-driven incentives that shape films, determining story-lines, exaggerating cultural norms, striving toward particular demographics, whatever, have nothing to do with art or social change and everything to do with profit, pandering, and promulgation. Movies are product placements, the product is a world view of limitless consumption. Value-Added Cinema is about the product-that-announces-itself, the one we recognize as a crystallization of the more encompassing worldview, the sole commodity, spot-lit, adored, assimilated. So why Value-Added Cinema? You’ve got to start somewhere. DZ: Can you tell me a bit about the production process – how did you go about getting the examples you use in the film? Were there any copyright hassles? P: Steve did nearly all of the legwork in that he spent weeks and weeks researching the subject, both on-line and in speaking to people about their recollections of product placement sequences in films they’d seen. He then suffered through close to a hundred films on VHS and DVD, using the fast-forward and cue controls as often as possible, to locate said sequences. We then sat down and started cutting, based at first on groupings Steve had made (a bunch of fast food references, etc.). Using these as a springboard, we quickly realized the narrative potential inherent in all these “narrative film” clips , and before long we were linking sequences and making them refer to one another, sort of allowing a “plot” to evolve. And copyright hassles? Not yet! I say... bring ‘em on! 
I would be more than happy to fight for the existence of this project, and one of the groups I am in, Negativland, has a rather colourful history of “fair use” battles in the music arena (the most nefarious case, where the band was sued by U2 and their big-label music lawyers over a parody we made, happened before I came on board, but there’s been some skirmishes since). We have folks who would be happy to help defend this sort of work in a court of law should the occasion arise. DZ: Can you talk to me about the cultural shift that’s occurred, where the old ‘Acme’ propmaster has been replaced by ‘product peddler’? What is this symptomatic of, and what’s its significance now? S: In the past, privacy existed because there were areas of experience and information that were considered off limits to exploitation. A kind of tacit social contract assumed certain boundaries were in place to keep corporate (and State) meddling at bay and to allow an uncontaminated space for disengaging from culture. Nowadays the violation of boundaries is so egregious it’s hard to be sure that those boundaries in fact exist. Part of that violation has been the encroachment, at every conceivable level, of daily experience by all manner of corporate messages—urinal strainers with logos, coffee jackets with adverts, decals on supermarket floors, temporary tattoos on random pedestrians. Engagement with corporate predation is now foisted on us 24 hours a day. It’s the GPS generation. The corporations want to know where we “are” at all times. Again: in the past there was a certain level of decorum about the sales pitch. That decorum has vanished and in its place is the inter-penetration of all our waking moments by the foghorn of capital. If that foghorn gets loud enough, we’ll never get any sleep. DZ: How do you think product placement affects the integrity of the film? P: Well, that’s definitely a question of the moment, as far as audience reactions to our screenings have been thus far. It really depends on the work itself, doesn’t it? I think we would be highly judgmental, and perhaps quite out of line, if we dismissed out of hand the idea of using actual products in films as some sort of rule. The value of using an actual product to the narrative of a film can’t be discounted automatically because we all know that there are stories to be told in actual, marketed products. Characterizations can develop. If a flustered James Cagney had held up a bottle of Fred’s Cola instead of Pepsi in the climactic shot of One, Two, Three (Billy Wilder’s 1961 Coke-executive comedy), it wouldn’t have resonated very well. And it’s an incredibly memorable moment (and, some might say, a little dig at both cola companies). But when you get into something like i am sam, where Sean Penn’s character not only works inside a Starbucks, and is shown on the job, in uniform and reading their various actual coffee product names aloud, over and over again, but also rides a bus with a huge Nike ad on the side (and the camera tracks along on the ad instead of the bus itself), plus the fact that he got onto that bus underneath an enormous Apple billboard (not shown in our work, actually), or that his lawyer has a can of Tab sitting on an entirely austere, empty table in front of a blank wall and the camera tracks downward for no other discernible purpose than to highlight the Tab can… you can see where I’m going with this. The battle lines are drawn in my mind.
PROVE to me the value of any of those product plugs on Penn’s character, or Michelle Pfeiffer’s (his lawyer). DZ: What do you make of the arguments for product placement as necessary to, even enhancing, the verisimilitude of films? Is there a case to be made for brands appearing in a production design because they’re what a character would choose? S: It’s who makes the argument for product placements that’s troublesome. Art that I value is a sort of problem solving machine. It assumes that the culture we currently find ourselves strapped with is flawed and should be altered. Within that context, the “verisimilitude” you speak of would be erected only as a means for critique—not to endorse, venerate, or fortify the status quo. Most Hollywood features are little more than moving catalogs. P: And in the case of Jurassic Park that couldn’t be more explicit – the “fake” products shown in the amusement park gift shop in the film are the actual tie-in products available in stores and in Burger King at that time! Another film I could mention for a totally different reason is The Dark Backward (1991). Apparently due to a particular obsession of the director, the film is riddled with placements, but of totally fake and hilarious products (i.e. Blump’s Squeezable Bacon). Everyone who has seen the film remembers the absurdist products… couldn’t Josie and the Pussycats have followed this format, instead of loading the film with “funny” references to literally every megacorporation imaginable, and have been memorable for it? DZ: What do you think of the retroactive insertion of products into syndicated reruns of programs and films (using digital editing techniques)? Is this a troubling precedent? P: Again, to me the line is totally crossed. There’s no longer any justification to be made because the time and space of the original television show is lost at that point, so any possibility of “commentary” on the times, or development of the character, goes right out the window. Of course I find it a troubling precedent. It’s perhaps somewhat less troubling, but still distressing, to know that billboards on the walls of sports stadiums are being digitally altered, live, during broadcast, so that the products can be subtly switched around. And perhaps most disturbingly, at least here in the States, certain networks and programs have begun cross-dissolving to advertisements from program content, and vice-versa. In other words, the advertisers are aware that the long-established “blackout” which precedes the start of advertising breaks on TV causes people to tune out, or turn the volume off, or have their newfangled sensing devices “zap” the commercial… so they’re literally integrating the start of the ad with the final frames of the program instead of going black, literally becoming part of the program. And we have heard about more reliance on products WITHIN the programs, but this just takes us right back to TV’s past, where game show contestants sat behind enormous “Pepsodent” adverts pasted right there on the set. History will eat itself… DZ: Could you imagine a way advertisers could work product placement into films where modern products just don’t fit, like set in the past or in alternate universes (Star Wars, LOTR etc)? P: Can’t you? In fact, it’s already happening. Someone told us about the use of products in a recent set-in-the-past epic… but the name of the film is escaping me. S: And if you can’t find a way to insert a product placement in a film then maybe the film won’t get made.
The problem is completely solved with films like Star Wars and Lord of the Rings—most of the characters are available in the store as action figures, making them de facto placements. In Small Soldiers just about every toy-sized character was, in fact, nicely packaged by Hasbro. DZ: What is the role of the logo in product placement? S: There are the stars, and there are the many supporting roles—the logo is just one of them. We’re hoping to see this category at the next Oscars. P: And categories like “Best Song” are essentially product placement categories already… DZ: I’ve heard about the future of product placement being branding in computer games, interactive shop-at-home television – what other visions of the (branded) future can you imagine? P: The future is now. If you can’t watch a documentary on so-called public television in this country without having text boxes pop up on screen to suggest “related” web sites which “might be of interest” to the viewer, you’re already well on the way to being part of a branded environment. Computer games already have ads built-in, and shop-at-home already seems plenty interactive (and isn’t internet shopping, also?). I think if the various mega-corporations can not only convince people to wear clothing emblazoned with their logo and product name, but so successfully convince us to pay for the privilege of advertising them, then we are already living in a totally branded future. Where else can it go? It may seem a trite statement but, to my mind, wearing an entire Nike outfit is the ultimate. At least the British ad company called Cunning Stunts actually PAYS their human billboards… but those folks have to agree to have the company logo temporarily tattooed onto their foreheads for three hours as they mingle in public. I’m not joking about this. DZ: Is there any response to product placement? How can audiences manage their interactions with these texts? S: Films have been boycotted for culturally heinous content, such as racist and homophobic characters. Why not boycott films because of their commodity content? Or better yet boycott the product for colluding with the filmmakers to invade your peace of mind? What I hope Value-Added Cinema does is sensitize us to the insinuation of the products, so that we critically detect them, rather than passively allow them to pass before us. When that happens, when we’re just insensate recipients of those advertising ploys, we’re lost. DZ: Do you have anything to add to contemporary debates on culture jamming, especially the charge that culture jamming’s political power is limited by its use of logos and signs? Anne Moore has written that detourning ads ends up just re-iterating the logo - “because corporate lifeblood is profit, and profit comes from name recognition”, culture jammers are “trafficking in the same currency as the corporations” – what do you think of this? P: It’s an interesting assertion. But the best culture jams I’ve seen make total mincemeat of the product being parodied; just as you can’t simply discount the use of actual products in films in the context of a narrative, you can’t NOT try to reclaim the use of a brand-name. Maybe it’s a dangerous comparison because “reclaiming” use of the word Coke is not like reclaiming the use of the word “queer”, but there’s something to it, I think. Also, I wear t-shirts with the names of bands I like sometimes (almost always my friends’ bands, but I suppose that’s beside the point). Am I buying into the advertising concept?
Yes, to a certain extent, I am. I guess to me it’s about just what you choose to advertise. Or what you choose to parody. DZ: Do you have any other points you’d like to make about product placement, advertising by stealth, branding, mindshare or logos? P: I think what Steve said, that above all we hope with our video to help make people aware of how much they are advertised to, beyond accepting it as a mere annoyance, sums it up. So far, we’ve had some comments at screenings which indicate a willingness of people to want to combat this in their lives, to want to “do something” about the onslaught of product placement surrounding them, in films and elsewhere.

18

Currie, Susan, and Donna Lee Brien. "Mythbusting Publishing: Questioning the ‘Runaway Popularity’ of Published Biography and Other Life Writing." M/C Journal 11, no.4 (July 1, 2008). http://dx.doi.org/10.5204/mcj.43.

Full text

APA, Harvard, Vancouver, ISO, and other styles

Abstract:

Introduction: Our current obsession with the lives of others “Biography—that is to say, our creative and non-fictional output devoted to recording and interpreting real lives—has enjoyed an extraordinary renaissance in recent years,” writes Nigel Hamilton in Biography: A Brief History (1). Ian Donaldson agrees that biography is back in fashion: “Once neglected within the academy and relegated to the dustier recesses of public bookstores, biography has made a notable return over recent years, emerging, somewhat surprisingly, as a new cultural phenomenon, and a new academic adventure” (23). For over a decade now, commentators have been making similar observations about our obsession with the intimacies of individual people’s lives. In a lecture in 1994, Justin Kaplan asserted the West was “a culture of biography” (qtd. in Salwak 1) and more recent research findings by John Feather and Hazel Woodbridge affirm that “the undiminished human curiosity about other people’s lives is clearly reflected in the popularity of autobiographies and biographies” (218). At least in relation to television, this assertion seems valid. In Australia, as in the USA and the UK, reality and other biographically based television shows have taken over from drama in both the numbers of shows produced and the viewers these shows attract, and these forms are also popular in Canada (see, for instance, Morreale on The Osbournes). In 2007, the program Biography celebrated its twentieth anniversary season to become one of the longest running documentary series on American television; so successful that in 1999 it was spun off into its own eponymous channel (Rak; Dempsey). Premiered in May 1996, Australian Story—which aims to utilise a “personal approach” to biographical storytelling—has won a significant viewership, critical acclaim and professional recognition (ABC). It can also be posited that the real home movies viewers submit to such programs as Australia’s Favourite Home Videos, and “chat” or “confessional” television are further reflections of a general mania for biographical detail (see Douglas), no matter how fragmented, sensationalized, or even inane and cruel. A recent example of the latter, the USA-produced The Moment of Truth, has contestants answering personal questions under polygraph examination and then again in front of an audience including close relatives and friends—the more “truthful” their answers (and often, the more humiliated and/or distressed contestants are willing to be), the more money they can win. Away from television, but offering further evidence of this interest are the growing readerships for personally oriented weblogs and networking sites such as MySpace and Facebook (Grossman), individual profiles and interviews in periodical publications, and the recently widely revived newspaper obituary column (Starck). Adult and community education organisations run short courses on researching and writing auto/biographical forms and, across Western countries, the family history/genealogy sections of many local, state, and national libraries have been upgraded to meet the increasing demand for these services. Academically, journals and e-mail discussion lists have been established on the topics of biography and autobiography, and North American, British, and Australian universities offer undergraduate and postgraduate courses in life writing.
The commonly aired wisdom is that published life writing in its many text-based forms (biography, autobiography, memoir, diaries, and collections of personal letters) is enjoying unprecedented popularity. It is our purpose to examine this proposition. Methodological problems There are a number of problems involved in investigating genre popularity, growth, and decline in publishing. Firstly, it is not easy to gain access to detailed statistics, which are usually only available within the industry. Secondly, it is difficult to ascertain how publishing statistics are gathered and what they report (Eliot). There is the question of whether bestselling booklists reflect actual book sales or are manipulated marketing tools (Miller), although the move from surveys of booksellers to electronic reporting at point of sale in new publishing lists such as BookScan will hopefully obviate this problem. Thirdly, some publishing lists categorise by subject and form, some by subject only, and some do not categorise at all. This means that in any analysis of these statistics, a decision has to be made whether to use the publishing list’s system or impose a different mode. If the publishing list is taken at face value, the question arises of whether to use categorisation by form or by subject. Fourthly, there is the bedeviling issue of terminology. Traditionally, there reigned a simple dualism in the terminology applied to forms of telling the true story of an actual life: biography and autobiography. Publishing lists that categorise their books, such as BookScan, have retained it. But with postmodern recognition of the presence of the biographer in a biography and of the presence of other subjects in an autobiography, the dichotomy proves false. There is the further problem of how to categorise memoirs, diaries, and letters. In the academic arena, the term “life writing” has emerged to describe the field as a whole. Within the genre of life writing, there are, however, still recognised sub-genres. Academic definitions vary, but generally a biography is understood to be a scholarly study of a subject who is not the writer; an autobiography is the story of an entire life written by its subject; while a memoir is a segment or particular focus of that life told, again, by its own subject. These terms are, however, often used interchangeably even by significant institutions such as the USA Library of Congress, which utilises the term “biography” for all. Different commentators also use differing definitions. Hamilton uses the term “biography” to include all forms of life writing. Donaldson discusses how the term has been co-opted to include biographies of place such as Peter Ackroyd’s London: The Biography (2000) and of things such as Lizzie Collingham’s Curry: A Biography (2005). This reflects, of course, a writing/publishing world in which non-fiction stories of places, creatures, and even foodstuffs are called biographies, presumably in the belief that this will make them more saleable. The situation is further complicated by the emergence of hybrid publishing forms such as, for instance, the “memoir-with-recipes” or “food memoir” (Brien, Rutherford and Williamson). Are such books to be classified as autobiography or put in the “cookery/food & drink” category? We mention in passing the further confusion caused by novels with a subtitle of A Biography, such as Virginia Woolf’s Orlando.
The fifth methodological problem that needs to be mentioned is the increasing globalisation of the publishing industry, which raises questions about the validity of the majority of studies available (including those cited herein) which are nationally based. Whether book sales reflect what is actually read (and by whom) raises, of course, another set of questions altogether. Methodology In our exploration, we were fundamentally concerned with two questions. Is life writing as popular as claimed? And, if it is, is this a new phenomenon? To answer these questions, we examined a range of available sources. We began with the non-fiction bestseller lists in Publishers Weekly (a respected American trade magazine aimed at publishers, librarians, booksellers, and literary agents that claims to be international in scope) from their inception in 1912 to the present time. We hoped that this data could provide a longitudinal perspective. The term bestseller was coined by Publishers Weekly when it began publishing its lists in 1912; although the first list of popular American books actually appeared in The Bookman (New York) in 1895, itself based on lists appearing in London’s The Bookman since 1891 (Bassett and Walter 206). The Publishers Weekly lists are the best source of longitudinal information as the currently widely cited New York Times listings did not appear till 1942, with the Wall Street Journal a late entry into the field in 1994. We then examined a number of sources of more recent statistics. We looked at the bestseller lists from the USA-based Amazon.com online bookseller; recent research on bestsellers in Britain; and lists from Nielsen BookScan Australia, which claims to tally some 85% or more of books sold in Australia, wherever they are published. In addition to the reservations expressed above, caveats must be aired in relation to these sources. While Publishers Weekly claims to be an international publication, it largely reflects the North American publishing scene and especially that of the USA. Although available internationally, Amazon.com also has its own national sites—such as Amazon.co.uk—not considered here. It also caters to a “specific computer-literate, credit-able clientele” (Gutjahr 219) and has an unashamedly commercial focus, within which all the information generated must be considered. In our analysis of the material studied, we will use “life writing” as a genre term. When it comes to analysis of the lists, we have broken down the genre of life writing into biography and autobiography, incorporating memoir, letters, and diaries under autobiography. This is consistent with the use of the terminology in BookScan. Although we have broken down the genre in this way, it is the overall picture with regard to life writing that is our concern. It is beyond the scope of this paper to offer a detailed analysis of whether, within life writing, further distinctions should be drawn. Publishers Weekly: 1912 to 2006 1912 saw the first list of the 10 bestselling non-fiction titles in Publishers Weekly. It featured two life writing texts, being headed by an autobiography, The Promised Land by Russian Jewish immigrant Mary Antin, and concluding with Albert Bigelow Paine’s six-volume biography, Mark Twain. The Publishers Weekly lists do not categorise non-fiction titles by either form or subject, so the classifications below are our own with memoir classified as autobiography.
In a decade-by-decade tally of these listings, there were 3 biographies and 20 autobiographies in the lists between 1912 and 1919; 24 biographies and 21 autobiographies in the 1920s; 13 biographies and 40 autobiographies in the 1930s; 8 biographies and 46 autobiographies in the 1940s; 4 biographies and 14 autobiographies in the 1950s; 11 biographies and 13 autobiographies in the 1960s; 6 biographies and 11 autobiographies in the 1970s; 3 biographies and 19 autobiographies in the 1980s; 5 biographies and 17 autobiographies in the 1990s; and 2 biographies and 7 autobiographies from 2000 up until the end of 2006. See Appendix 1 for the relevant titles and authors. Breaking down the most recent figures for 1990–2006, we find a not radically different range of figures and trends across years in the contemporary environment. The validity of looking only at the top ten books sold in any year is, of course, questionable, as are all the issues regarding sources discussed above. But one thing is certain in terms of our inquiry. There is no upwards curve obvious here. If anything, the decade break-down suggests that sales are trending downwards. This is in keeping with the findings of Michael Korda, in his history of twentieth-century bestsellers. He suggests a consistent longitudinal picture across all genres: In every decade, from 1900 to the end of the twentieth century, people have been reliably attracted to the same kind of books […] Certain kinds of popular fiction always do well, as do diet books […] self-help books, celebrity memoirs, sensationalist scientific or religious speculation, stories about pets, medical advice (particularly on the subjects of sex, longevity, and child rearing), folksy wisdom and/or humour, and the American Civil War (xvii). Amazon.com since 2000 The USA-based Amazon.com online bookselling site provides listings of its own top 50 bestsellers since 2000, although only the top 14 bestsellers are recorded for 2001. As fiction and non-fiction are not separated out on these lists and no genre categories are specified, we have again made our own decisions about what books fall into the category of life writing. Generally, we erred on the side of inclusion. (See Appendix 2.) However, when it came to books dealing with political events, we excluded books dealing with specific aspects of political practice/policy. This meant excluding books on, for instance, George Bush’s so-called ‘war on terror,’ of which there were a number of bestsellers listed. In summary, these listings reveal that of the top 364 books sold by Amazon from 2000 to 2007, 46 (or some 12.6%) were, according to our judgment, either biographical or autobiographical texts. This is not far from the 10% of the 1912 Publishers Weekly listing, although, as above, the proportion of bestsellers that can be classified as life writing varied dramatically from year to year, with no discernible pattern of peaks and troughs. This proportion tallied to 4% auto/biographies in 2000, 14% in 2001, 10% in 2002, 18% in 2003 and 2004, 4% in 2005, 14% in 2006 and 20% in 2007. This could suggest a rising trend, although it does not offer any consistent trend data to suggest sales figures may either continue to grow, or fall again, in 2008 or afterwards. Looking at the particular texts in these lists (see Appendix 2) also suggests that there is no general trend in the popularity of life writing in relation to other genres.
For instance, in these listings in Amazon.com, life writing texts only rarely figure in the top 10 books sold in any year. So rarely indeed, that from 2001 there were only five in this category. In 2001, John Adams by David McCullough was the best selling book of the year; in 2003, Hillary Clinton’s autobiographical Living History was 7th; in 2004, My Life by Bill Clinton reached number 1; in 2006, Nora Ephron’s I Feel Bad About My Neck: and Other Thoughts on Being a Woman was 9th; and in 2007, Ishmael Beah’s discredited A Long Way Gone: Memoirs of a Boy Soldier came in at 8th. Apart from McCullough’s biography of Adams, all the above are autobiographical texts, while the focus on leading political figures is notable. Britain: Feather and Woodbridge With regard to the British situation, we did not have actual lists and relied on recent analysis. John Feather and Hazel Woodbridge find considerably higher levels for life writing in Britain than above, with, from 1998 to 2005, 28% of British published non-fiction comprising autobiography, while 8% of hardback and 5% of paperback non-fiction was biography (2007). Furthermore, although Feather and Woodbridge agree with commentators that life writing is currently popular, they do not agree that this is a growth state, finding the popularity of life writing “essentially unchanged” since their previous study, which covered 1979 to the early 1990s (Feather and Reid). Australia: Nielsen BookScan 2006 and 2007 In the Australian publishing industry, where producing books remains an ‘expensive, risky endeavour which is increasingly market driven’ (Galligan 36) and ‘an inherently complex activity’ (Carter and Galligan 4), the most recent Australian Bureau of Statistics figures reveal that the total numbers of books sold in Australia has remained relatively static over the past decade (130.6 million in the financial year 1995–96 and 128.8 million in 2003–04) (ABS). During this time, however, sales volumes of non-fiction publications have grown markedly, with a trend towards “non-fiction, mass market and predictable” books (Corporall 41) resulting in general non-fiction sales in 2003–2004 outselling general fiction by factors as high as ten depending on the format—hard- or paperback, and trade or mass market paperback (ABS 2005). However, while non-fiction has increased in popularity in Australia, the same does not seem to hold true for life writing. Here, in utilising data for the top 5,000 selling non-fiction books in both 2006 and 2007, we are relying on Nielsen BookScan’s categorisation of texts as either biography or autobiography. In 2006, no works of life writing made the top 10 books sold in Australia. In looking at the top 100 books sold for 2006, in some cases the subjects of these works vary markedly from those extracted from the Amazon.com listings. In Australia in 2006, life writing makes its first appearance at number 14 with convicted drug smuggler Schapelle Corby’s My Story. This is followed by another My Story at 25, this time by retired Australian army chief, Peter Cosgrove. Jonestown: The Power and Myth of Alan Jones comes in at 34 for the Australian broadcaster’s biographer Chris Masters; the biography, The Innocent Man by John Grisham at 38 and Li Cunxin’s autobiographical Mao’s Last Dancer at 45.
Australian Susan Duncan’s memoir of coping with personal loss, Salvation Creek: An Unexpected Life makes 50; bestselling USA travel writer Bill Bryson’s autobiographical memoir of his childhood The Life and Times of the Thunderbolt Kid 69; Mandela: The Authorised Portrait by Rosalind Coward, 79; and Joanne Lees’s memoir of dealing with her kidnapping, the murder of her partner and the justice system in Australia’s Northern Territory, No Turning Back, 89. These books reveal a market preference for autobiographical writing, and an almost even split between Australian and overseas subjects in 2006. 2007 similarly saw no life writing in the top 10. The books in the top 100 sales reveal a downward trend, with fewer titles making this band overall. In 2007, Terri Irwin’s memoir of life with her famous husband, wildlife warrior Steve Irwin, My Steve, came in at number 26; rugby league footballer Andrew Johns’s memoir of mental illness, The Two of Me, at 37; Ayaan Hirsi Ali’s autobiography Infidel at 39; John Grogan’s biography/memoir, Marley and Me: Life and Love with the World’s Worst Dog, at 42; Sally Collings’s biography of the inspirational young survivor Sophie Delezio, Sophie’s Journey, at 51; and Elizabeth Gilbert’s hybrid food, self-help and travel memoir, Eat, Pray, Love: One Woman’s Search for Everything at 82. Mao’s Last Dancer, published the year before, remained in the top 100 in 2007 at 87. When moving to a consideration of the top 5,000 books sold in Australia in 2006, BookScan reveals only 62 books categorised as life writing in the top 1,000, and only 222 in the top 5,000 (with 34 titles between 1,000 and 1,999, 45 between 2,000 and 2,999, 48 between 3,000 and 3,999, and 33 between 4,000 and 5,000). 2007 shows a similar total of 235 life writing texts in the top 5,000 bestselling books (75 titles in the first 1,000, 27 between 1,000 and 1,999, 51 between 2,000 and 2,999, 39 between 3,000 and 3,999, and 43 between 4,000 and 5,000). In both years, 2006 and 2007, life writing thus not only constituted only some 4% of the bestselling 5,000 titles in Australia, it also showed only minimal change between these years and, therefore, no significant growth. Conclusions Our investigation using various instruments that claim to reflect levels of book sales reveals that Western readers’ willingness to purchase published life writing has not changed significantly over the past century. We find no evidence of either a short, or longer, term growth or boom in sales in such books. Instead, it appears that what has been widely heralded as a new golden age of life writing may well be more the result of an expanded understanding of what is included in the genre than an increased interest in it by either book readers or publishers. What recent years do appear to have seen, however, is a significantly increased interest by public commentators, critics, and academics in this genre of writing. We have also discovered that the issue of our current obsession with the lives of others tends to be discussed in academic as well as popular fora as if what applies to one sub-genre or production form applies to another: if biography is popular, then autobiography will also be, and vice versa. If reality television programming is attracting viewers, then readers will be flocking to life writing as well. Our investigation reveals that such propositions are questionable, and that there is significant research to be completed in mapping such audiences against each other.
This work has also highlighted the difficulty of separating out the categories of written texts in publishing studies, firstly in terms of determining what falls within the category of life writing as distinct from other forms of non-fiction (the hybrid problem) and, secondly, in terms of separating out the categories within life writing. Although we have continued to use the terms biography and autobiography as sub-genres, we are aware that they are less useful as descriptors than they are often assumed to be. In order to obtain a more complete and accurate picture, publishing categories may need to be agreed upon, redefined and utilised across the publishing industry and within academia. This is of particular importance in the light of the suggestions (from total sales volumes) that the audiences for books are limited, and therefore the rise of one sub-genre may be directly responsible for the fall of another. Bair argues, for example, that in the 1980s and 1990s, the popularity of what she categorises as memoir had direct repercussions on the numbers of birth-to-death biographies that were commissioned, contracted, and published as “sales and marketing staffs conclude[d] that readers don’t want a full-scale life any more” (17). Finally, although we have highlighted the difficulty of using publishing statistics when there is no common understanding as to what such data is reporting, we hope this study shows that the utilisation of such material does add a depth to such enquiries, especially in interrogating the anecdotal evidence that is often quoted as data in publishing and other studies. Appendix 1 Publishers Weekly listings 1990–1999 1990 included two autobiographies, Bo Knows Bo by professional athlete Bo Jackson (with Dick Schaap) and Ronald Reagan’s An American Life: An Autobiography. In 1991, there were further examples of life writing with unimaginative titles, Me: Stories of My Life by Katharine Hepburn, Nancy Reagan: The Unauthorized Biography by Kitty Kelley, and Under Fire: An American Story by Oliver North with William Novak; as indeed there were again in 1992 with It Doesn’t Take a Hero: The Autobiography of Norman Schwarzkopf, Sam Walton: Made in America, the autobiography of the founder of Wal-Mart, Diana: Her True Story by Andrew Morton, Every Living Thing, yet another veterinary outpouring from James Herriot, and Truman by David McCullough. In 1993, radio shock-jock Howard Stern was successful with the autobiographical Private Parts, as was Betty Eadie with her detailed recounting of her alleged near-death experience, Embraced by the Light. Eadie’s book remained on the list in 1994 next to Don’t Stand Too Close to a Naked Man, comedian Tim Allen’s autobiography. Flag-waving titles continue in 1995 with Colin Powell’s My American Journey, and Miss America, Howard Stern’s follow-up to Private Parts. 1996 saw two autobiographical works, basketball superstar Dennis Rodman’s Bad as I Wanna Be and figure-skater Ekaterina Gordeeva’s (with EM Swift) My Sergei: A Love Story. In 1997, Diana: Her True Story returns to the top 10, joining Frank McCourt’s Angela’s Ashes and prolific biographer Kitty Kelley’s The Royals, while in 1998, there is only the part-autobiography, part travel-writing A Pirate Looks at Fifty, by musician Jimmy Buffett. There is no biography or autobiography included in either the 1999 or 2000 top 10 lists in Publishers Weekly, nor in that for 2005.
In 2001, David McCullough’s biography John Adams and Jack Welch’s business memoir Jack: Straight from the Gut featured. In 2002, Let’s Roll! Lisa Beamer’s tribute to her husband, one of the heroes of 9/11, written with Ken Abraham, joined Rudolph Giuliani’s autobiography, Leadership. 2003 saw Hillary Clinton’s autobiography Living History and Paul Burrell’s memoir of his time as Princess Diana’s butler, A Royal Duty, on the list. In 2004, it was Bill Clinton’s turn with My Life. In 2006, we find John Grisham’s true crime (arguably a biography), The Innocent Man, at the top, Grogan’s Marley and Me at number three, and the autobiographical The Audacity of Hope by Barack Obama in fourth place. Appendix 2 Amazon.com listings since 2000 In 2000, there were only two auto/biographies in the top Amazon 50 bestsellers with Lance Armstrong’s It’s Not about the Bike: My Journey Back to Life about his battle with cancer at 20, and Dave Eggers’s self-consciously fictionalised memoir, A Heartbreaking Work of Staggering Genius at 32. In 2001, only the top 14 bestsellers were recorded. At number 1 is John Adams by David McCullough and, at 11, Jack: Straight from the Gut by former General Electric CEO Jack Welch. In 2002, Leadership by Rudolph Giuliani was at 12; Master of the Senate: The Years of Lyndon Johnson by Robert Caro at 29; Portrait of a Killer: Jack the Ripper by Patricia Cornwell at 42; Blinded by the Right: The Conscience of an Ex-Conservative by David Brock at 48; and Louis Gerstner’s autobiographical Who Says Elephants Can’t Dance: Inside IBM’s Historic Turnaround at 50. In 2003, Living History by Hillary Clinton was 7th; Benjamin Franklin: An American Life by Walter Isaacson 14th; Dereliction of Duty: The Eyewitness Account of How President Bill Clinton Endangered America’s Long-Term National Security by Robert Patterson 20th; Under the Banner of Heaven: A Story of Violent Faith by Jon Krakauer 32nd; Leap of Faith: Memoirs of an Unexpected Life by Queen Noor of Jordan 33rd; Kate Remembered, Scott Berg’s biography of Katharine Hepburn, 37th; Who’s your Caddy?: Looping for the Great, Near Great and Reprobates of Golf by Rick Reilly 39th; The Teammates: A Portrait of a Friendship about a winning baseball team by David Halberstam 42nd; and Every Second Counts by Lance Armstrong 49th. In 2004, My Life by Bill Clinton was the best selling book of the year; American Soldier by General Tommy Franks was 16th; Kevin Phillips’s American Dynasty: Aristocracy, Fortune and the Politics of Deceit in the House of Bush 18th; Timothy Russert’s Big Russ and Me: Father and Son: Lessons of Life 20th; Tony Hendra’s Father Joe: The Man who Saved my Soul 23rd; Ron Chernow’s Alexander Hamilton 27th; Cokie Roberts’s Founding Mothers: The Women Who Raised our Nation 31st; Kitty Kelley’s The Family: The Real Story of the Bush Dynasty 42nd; and Chronicles, Volume 1 by Bob Dylan was 43rd. In 2005, auto/biographical texts were well down the list with only The Year of Magical Thinking by Joan Didion at 45 and The Glass Castle: A Memoir by Jeannette Walls at 49.
In 2006, there was a resurgence of life writing with Nora Ephron’s I Feel Bad About My Neck: and Other Thoughts on Being a Woman at 9; Grisham’s The Innocent Man at 12; Bill Buford’s food memoir Heat: an Amateur’s Adventures as Kitchen Slave, Line Cook, Pasta-Maker, and Apprentice to a Dante-Quoting Butcher in Tuscany at 23; more food writing with Julia Child’s My Life in France at 29; Immaculée Ilibagiza’s Left to Tell: Discovering God amidst the Rwandan Holocaust at 30; CNN anchor Anderson Cooper’s Dispatches from the Edge: A Memoir of War, Disasters and Survival at 43; and Isabella Hatkoff’s Owen & Mzee: The True Story of a Remarkable Friendship (between a baby hippo and a giant tortoise) at 44. In 2007, Ishmael Beah’s discredited A Long Way Gone: Memoirs of a Boy Soldier came in at 8; Walter Isaacson’s Einstein: His Life and Universe 13; Ayaan Hirsi Ali’s autobiography of her life in Muslim society, Infidel, 18; The Reagan Diaries 25; Jesus of Nazareth by Pope Benedict XVI 29; Mother Teresa: Come Be My Light 36; Clapton: The Autobiography 40; Tina Brown’s The Diana Chronicles 45; Tony Dungy’s Quiet Strength: The Principles, Practices & Priorities of a Winning Life 47; and Daniel Tammet’s Born on a Blue Day: Inside the Extraordinary Mind of an Autistic Savant at 49. Acknowledgements A sincere thank you to Michael Webster at RMIT for assistance with access to Nielsen BookScan statistics, and to the reviewers of this article for their insightful comments. Any errors are, of course, our own. References Australian Broadcasting Commission (ABC). “About Us.” Australian Story 2008. 1 June 2008. ‹http://www.abc.net.au/austory/aboutus.htm>. Australian Bureau of Statistics. “1363.0 Book Publishers, Australia, 2003–04.” 2005. 1 June 2008 ‹http://www.abs.gov.au/ausstats/abs@.nsf/mf/1363.0>. Bair, Deirdre. “Too Much S & M.” Sydney Morning Herald 10–11 Sept. 2005: 17. Bassett, Troy J., and Christina M. Walter. “Booksellers and Bestsellers: British Book Sales as Documented by The Bookman, 1891–1906.” Book History 4 (2001): 205–36. Brien, Donna Lee, Leonie Rutherford, and Rosemary Williamson. “Hearth and Hotmail: The Domestic Sphere as Commodity and Community in Cyberspace.” M/C Journal 10.4 (2007). 1 June 2008 ‹http://journal.media-culture.org.au/0708/10-brien.php>. Carter, David, and Anne Galligan. “Introduction.” Making Books: Contemporary Australian Publishing. St Lucia: U of Queensland P, 2007. 1–14. Corporall, Glenda. Project Octopus: Report Commissioned by the Australian Society of Authors. Sydney: Australian Society of Authors, 1990. Dempsey, John. “Biography Rewrite: A&E’s Signature Series Heads to Sib Net.” Variety 4 Jun. 2006. 1 June 2008 ‹http://www.variety.com/article/VR1117944601.html?categoryid=1238&cs=1>. Donaldson, Ian. “Matters of Life and Death: The Return of Biography.” Australian Book Review 286 (Nov. 2006): 23–29. Douglas, Kate. “‘Blurbing’ Biographical: Authorship and Autobiography.” Biography 24.4 (2001): 806–26. Eliot, Simon. “Very Necessary but not Sufficient: A Personal View of Quantitative Analysis in Book History.” Book History 5 (2002): 283–93. Feather, John, and Hazel Woodbridge. “Bestsellers in the British Book Industry.” Publishing Research Quarterly 23.3 (Sept. 2007): 210–23. Feather, JP, and M Reid. “Bestsellers and the British Book Industry.” Publishing Research Quarterly 11.1 (1995): 57–72. Galligan, Anne. “Living in the Marketplace: Publishing in the 1990s.” Publishing Studies 7 (1999): 36–44. Grossman, Lev. “Time’s Person of the Year: You.” Time 13 Dec. 2006.
Online edition. 1 June 2008 ‹http://www.time.com/time/magazine/article/0%2C9171%2C1569514%2C00.html>. Gutjahr, Paul C. “No Longer Left Behind: Amazon.com, Reader Response, and the Changing Fortunes of the Christian Novel in America.” Book History 5 (2002): 209–36. Hamilton, Nigel. Biography: A Brief History. Cambridge, MA: Harvard UP, 2007. Kaplan, Justin. “A Culture of Biography.” The Literary Biography: Problems and Solutions. Ed. Dale Salwak. Basingstoke: Macmillan, 1996. 1–11. Korda, Michael. Making the List: A Cultural History of the American Bestseller 1900–1999. New York: Barnes & Noble, 2001. Miller, Laura J. “The Bestseller List as Marketing Tool and Historical Fiction.” Book History 3 (2000): 286–304. Morreale, Joanne. “Revisiting The Osbournes: The Hybrid Reality-Sitcom.” Journal of Film and Video 55.1 (Spring 2003): 3–15. Rak, Julie. “Bio-Power: CBC Television’s Life & Times and A&E Network’s Biography on A&E.” LifeWriting 1.2 (2005): 1–18. Starck, Nigel. “Capturing Life—Not Death: A Case For Burying The Posthumous Parallax.” Text: The Journal of the Australian Association of Writing Programs 5.2 (2001). 1 June 2008 ‹http://www.textjournal.com.au/oct01/starck.htm>.

19

Potts, Graham. ""I Want to Pump You Up!" Lance Armstrong, Alex Rodriguez, and the Biopolitics of Data- and Analogue-Flesh." M/C Journal 16, no.6 (November 6, 2013). http://dx.doi.org/10.5204/mcj.726.

Full text

APA, Harvard, Vancouver, ISO, and other styles

Abstract:

The copyrighting of digital augmentations (our data-flesh), their privatization and ownership by others from a vast distance that is simultaneously instantly telematically surmountable started simply enough. It was the initially innocuous corporatization of language and semiotics that started the deeper ontological flip, which placed the posthuman bits and parts over the posthuman that thought that it was running things. The posthumans in question, myself included, didn't help things much when, for instance, we all clicked an unthinking or unconcerned "yes" to Facebook® or Gmail®'s "terms and conditions of use" policies that give them the real ownership and final say over those data-based augments of sociality, speech, and memory. Today there is growing popular concern (or at least acknowledgement) over the surveillance of these augmentations by government, especially after the Edward Snowden NSA leaks. The same holds true for the dataveillance of data-flesh (i.e. Gmail® or Facebook® accounts) by private corporations for reasons of profit and/or at the behest of governments for reasons of "national security." While drawing a picture of this (bodily) state, of the intrusion through language of brands into our being and their coterminous policing of intelligible and iterative body boundaries and extensions, I want to address the next step in copyrighted augmentation, one that is current practice in professional sport, and part of the burgeoning "anti-aging" industry, with rewriting of cellular structure and hormonal levels, for a price, on the open market. What I want to problematize is the contradiction between the rhetorical moralizing against upgrading the analogue-flesh, especially with respect to celebrity sports stars like Lance Armstrong and Alex Rodriguez, all the while the "anti-aging" industry does the same without censure. Indeed, it does so within the context of the contradictory social messaging and norms that our data-flesh and electric augmentations receive to constantly upgrade. I pose the question of the contradiction between the messages given to our analogue-flesh and data-flesh in order to examine the specific site of commentary on professional sports stars and their practices, but also to point to the ethical gap that exists not just for (legal) performance-enhancing drugs (PEDs), but also to show the link to privatized and copyrighted genomic testing, the dataveillance of this information, and subsequent augmentations that may be undertaken because of the results. Copyrighted Language and Semiotics as Gateway Drug The corporatization of language and semiotics came about with an intrusion of exclusively held signs from the capitalist economy into language. This makes sense if one wants to make surplus value greater: stamp a name onto something, especially a base commodity like a food product, and build up the name of that stamp, however one will, so that that name has perceived value in and of itself, and then charge as much as one can for it. Such is the story of the lack of real correlation between the price of Starbucks Coffee® and coffee as a commodity, set by Starbucks® on the basis of the cultural worth of the symbols and signs associated with it, rather than by what they pay for the labor and production costs prior to its branding. But what happens to these legally protected stamps once they start acting as more than just a sign and referent to a subsection of a specific commodity or thing? Once the stamp has worth and a life that is socially determined?
What happens when these stamps get verbed, adjectived, and nouned? Naomi Klein, in the book that the New York Times referred to as a "movement bible" for the anti-globalization forces of the late 1990s, said "logos, by the force of ubiquity, have become the closest thing we have to an international language, recognized and understood in many more places than English" (xxxvi). But there is an inherent built-in tension of copyrighted language and semiotics that illustrates the coterminous problems with data- and analogue-flesh augments. "We have almost two centuries' worth of brand-name history under our collective belt, coalescing to create a sort of global pop-cultural Morse code. But there is just one catch: while we may all have the code implanted in our brains, we're not really allowed to use it" (Klein 176). Companies want their "brands to be the air you breathe in - but don't dare exhale" or otherwise try to engage in a two-way dialogue that alters the intended meaning (Klein 182). Private signs power first-world and BRIC capitalism, language, and bodies. I do not have a coffee in the morning; I have Starbucks®. I do not speak on a cellular phone; I speak iPhone®. I am not using my computer right now; I am writing MacBook Air®. I do not look something up, search it, or research it; I Google® it. Klein was writing before the everyday uptake of sophisticated miniaturized and mobile computing and communication devices. With the digitalization of our senses and electronic limbs this viral invasion of language became material, affecting both our data- and analogue-flesh. The trajectory? First we used it; then we wore it as culturally and socially demarcating clothing; and finally we no longer used copyrighted speech terms: it became an always-present augmentation, an adjective to the lexicon body of language, and thereby out of democratic semiotic control. Today Twitter® is our (140 character limited) medium of speech. Skype® is our sense of sight, the way we have "real" face-to-face communication. Yelp® has extended our sense of taste and smell through restaurant reviews. The iPhone® is our sense of hearing. And OkCupid® and/or Grindr® and other sites and apps have become the skin of our sexual organs (and the site where they first meet). Today, love at first sight happens through .jpeg extensions; our first sexual experience ranked on a scale of risk determined by the type of video feed file format used: was it "protected" enough to stop its "spread"? In this sense the corporatization of language and semiotics acted as the gateway drug to corporatized digital-flesh; from use of something that is external to us to an augmentation that is part of us and indeed may be in excess of us or any notion of a singular liberal subject. Replacement of Analogue-Flesh? Arguably, this could be viewed as the coming to be of the full replacement of the fleshy analogue body by what are, or started as, digital augmentations. Is this what Marshall McLuhan meant when he spoke of the "electronic exteriorization of the central nervous system" through the growing complexity of our "electric extensions"? McLuhan's work that spoke of the "global village" enabled by new technologies is usually read as a euphoric celebration of the utopic possibilities of interconnectivity.
What these misreadings overlook is the darker side of his thought, where the "cultural probe" picks up the warning signals of the change to come, so that a Christian-inspired project, a cultural Noah’s Ark, can be created to save the past from the future to come (Coupland). Jean Baudrillard, Paul Virilio, and Guy Debord have analyzed this replacement of the real and the changes to the relations between people—one I am arguing is branded/restricted—by offering us the terms simulacrum (Baudrillard), substitution (Virilio), and spectacle (Debord). The commonality which links Baudrillard and Virilio, but not Debord, is that the former two do not explicitly situate their critique as being within the loss of the real that they then describe. Baudrillard expresses that he can have a 'cool detachment' from his subject (Forget Foucault/Forget Baudrillard), while Virilio's is a Catholic moralist's cry lamenting the disappearance of the heterogeneous experiential dimensions in transit along the various axes of space and time. What differentiates Debord is that he had no qualms positioning his own person and his text, The Society of the Spectacle (SotS), as within its own subject matter - a critique that is limited, and acknowledged as such, by the blindness of its own inescapable horizon. This Revolt Will Be Copyrighted Yet today the analogue - at the least - performs a revolt in or possibly in excess of the spectacle that seeks its containment. How and at what site is the revolt by the analogue-flesh most viewable? Ironically, in the actions of celebrity professional sports stars and the Celebrity Class in general. Today it revolts against copyrighted data-flesh with copyrighted analogue-flesh. This is even the case when the specific site of contestation is (at least the illusion of) immortality, where the runaway digital always felt it held the trump card. A regimen of Human Growth Hormone (HGH) and other PEDs purports to do the same thing, if not better, at the cellular level, than the endless youth paraded in the unaging photo employed by the Facebook or Grindr Bodies®. But with the everyday use and popularization of drugs and enhancement supplements like HGH and related PEDs there is something more fundamental at play than the economic juggernaut that is the Body Beautiful; more than fleshy jealousy of Photoshopped® electronic skins. This drug use represents the logical extension of the ethics that drive our tech-wired lives. We are told daily to upgrade: our sexual organs (OkCupid® or Grindr®) for a better, more accurate match; our memory (Google® services) for largeness and safe portability; and our hearing and sight (iPhone® or Skype®) for increased connectivity, engaging the "real" (that we have lost). These upgrades are controlled and copyrighted, but that which grows the economy is an especially favored moral act in an age of austerity. Why should it be surprising, then, that with the economic backing of key players of Google®—kingpin of the global for-profit dataveillance racket—for $99.95 23andMe® will send one a home DNA test kit, which once returned will be analyzed for genetic issues, with a personalized web-interface, including "featured links"? Analogue-flesh fights back with willing copyrighted dataveillance of its genetic code. The test and the personalized results allow for augmentations of the Angelina Jolie type: private testing for genetic markers, a double mastectomy provided by private healthcare, followed by copyrighted replacement flesh.
This is where we find the biopolitics of data- and analogue-flesh, led forth, in an ironic turn, by the Celebrity Class, who depend for their income on the lives of their posthuman bodies. This is a complete reversal of the course Debord charts out for them: The celebrity, the spectacular representation of a living human being, embodies this banality by embodying the image of a possible role. Being a star means specializing in the seemingly lived; the star is the object of identification with the shallow seeming life that has to compensate for the fragmented productive specializations which are actually lived. (SotS) While the electronic global village was to have left the flesh-and-blood as waste, today there is resistance by the analogue from where we would least expect it - attempts to catch up and replant itself as ontologically prior to the digital through legal medical supplementation; to make the posthuman the posthuman. We find the Celebrity Class at the forefront of the resistance, of making our posthuman bodies as controlled augmentations of a posthuman. But there is a definite contradiction as well, specifically in the press coverage of professional sports. The axiomatic ethical and moral sentiment of our age to always upgrade data-flesh and analogue-flesh is contradicted in professional sports by the recent suspensions of Lance Armstrong and Alex Rodriguez and the political and pundit critical commentary on their actions. Nancy Reagan to the Curbside: An Argument for Lance Armstrong and Alex Rodriguez's "Just Say Yes to Drugs" Campaign Probably to the complete shock of most of my family, friends, students, and former lovers who may be reading this, I actually follow sports reporting in great detail and have done so for years. That I never speak of any sports in my everyday interactions, haven't played a team or individual sport since I could speak (and thereby use my voice to inform my parents that I was refusing to participate), and even decline amateur or minor league play, like throwing a ball of any kind at a family BBQ, leaves me to, like Judith Butler, "give an account of oneself." And this accounting for my sports addiction is not incidental or insignificant with respect either to how the posthuman present can move from a state of posthumanism to one of posthumanism, or to my specific interpellation into (and excess) in either of those worlds. Recognizing that I will not overcome my addiction without admitting my problem, this paper is thus a first-step public acknowledgement: I have been seeing "Dr. C" for a period of three years, and together, through weekly appointments, we have been working through this issue of mine. (Now for the sake of avoiding the cycle of lying that often accompanies addiction I should probably add that Dr. C is a chiropractor who I see for back and nerve damage issues, and the talk therapy portion, a safe space to deal with the sports addiction, was an organic outgrowth of the original therapy structure). My data-flesh that had me wired in and sitting all the time had wreaked havoc on the analogue-flesh. My copyrighted augments were demanding that I do something to remedy a situation where I was unable to be sitting and wired in all the time. Part of the treatment involved the insertion of many acupuncture needles in various parts of my body, and then having an electric current run through them for a sustained period of time.
Ironically, as it was the wired augmentations that demanded this, due to my immobility at this time - one doesn't move with acupuncture needles deep within the body - I was forced away from my devices and into unmediated conversation with Dr. C about sports, celebrity sports stars, and the recent (argued) infractions by Armstrong and Rodriguez. Now I say "argued" because in the first place, is what A-Rod and Armstrong did, or are accused of doing, the use of PEDs, HGH, and all the rest (cf. Lupica; Thompson, and Vinton), really a crime? Are they on their way, or are there real threats of jail and criminal prosecution? And in the most important sense, and despite all the rhetoric, are they really going against prevailing social norms with respect to medical enhancement? No, no, and no. What is peculiar about the "witch-hunt" of A-Rod and Armstrong - their words - is that we are undertaking it in the first place, while high-end boutique medical clinics (and internet pharmacies) offer the same treatment for analogue-flesh. Fixes for the human in posthuman; ways of keeping the human up to speed; arguably the moral equivalent, if done so with free will, of upgrading the software for one's iOS device. If the critiques of Baudrillard and Virilio are right, we seem to find nothing wrong with crippling our physical bodies and social skills by living through computers and telematic technologies, and obsess over the next upgrade that will make us (more) faster and quicker (than the other or others), while we righteously deny the same process to the flesh for those who, in Debord's description, are the most complicit in the spectacle, to the supposedly most posthuman of us - those that have become pure spectacle (Debord), pure simulation (Baudrillard), a total substitution (Virilio). But it seems that celebrities, and sports celebrities in specific, haven't gone along for the ride of never-ending play of their own signifiers at the expense of doing away with the real; they were not, in Debord's words, content with "specializing in the seemingly lived"; they wanted, conversely, to specialize in the most maximally lived flesh, right down to cellular regeneration towards genetic youth, which is the strongest claim in favor of taking HGH. It looks like they were prepared to, in the case of Armstrong, engage in the "most sophisticated, professionalized and successful doping program that sport has ever seen" in the name of the flesh (BBC). But a doping program that can, for the most part, be legally obtained as treatment, and in the same city as A-Rod plays in and is now suspended for his "crimes" to boot (NY Vitality). This total incongruence between what is desired, sought, and obtained legally by members of their socioeconomic class, and many classes below as well, which is a direct outgrowth of the moral and ethical axiomatic of the day, is why A-Rod and Armstrong are so bemused, indignant, and angry, if not in a state of outright denial that they did anything that was wrong, even while they admit, explicitly, that yes, they did what they are accused of doing: taking the drugs. Perhaps another way is needed to look at the unprecedentedly "harsh" and "long" sentences of punishment handed out to A-Rod and Armstrong. The posthuman governing bodies of the sports of the society of the spectacle in question realize that their spectacle machines are being pushed back at. A real threat because it goes with the grain of where the rest of us, or those that can buy in at the moment, are going.
And this is where the talk therapy for my sports addiction with Dr. C falls into the story. I realized that the electrified needles were telling me that I too should put the posthuman back in control of my damaged flesh; engage in a (medically copyrighted) piece of performance philosophy and offset some of the areas of possible risk that through restricted techne 23andMe® had (arguably) found. Dr. C and I were peeved with A-Rod and Armstrong not for what they did, but what they didn't tell us. We wanted better details than half-baked admissions of moral culpability. We wanted exact details on what they'd done to keep up to their digital-flesh. Their media bodies were cultural probes, full in view, while their flesh bodies, priceless lab rats, are hidden from view (and likely to remain so due to ongoing litigation). These were, after all, big money cover-ups of (likely) the peak of posthuman science, and the lab results are now hidden behind an army of sports federations' lawyers, and agents (and A-Rod's own army since he still plays); posthuman progress covered up by posthuman rules, sages, and agents of manipulation. Massive posthuman economies of spectacle, simulation, or substitution of the real putting as much force as they can bear on resurgent posthuman flesh - a celebrity flesh those economies, posthuman economies, want to see as utterly passive like Debord, but whose actions are showing unexpected posthuman alignment with the flesh. Why are the centers of posthumanist power concerned? Because once one sees that A-Rod and Armstrong did it, once one sees that others are doing the same legally without a fuss being made, then one can see that one can do the same; make flesh-and-blood keep up, or regrow and become more organically youthful, while OkCupid® or Grindr® data-flesh gets stuck with the now lagging Photoshopped® touchups. Which just adds to my desire to get "pumped up"; add a little of A-Rod and Armstrong's concoction to my own routine; and one of a long list of reasons to throw Nancy Reagan under the bus: to "just say yes to drugs." A desire that is tempered by the recognition that the current limits of intelligibility and iteration of subjects, the work of defining the bodies that matter that is now set by copyrighted language and copyrighted electric extensions is only being challenged within this society of the spectacle by an act that may give a feeling of unease for cause. This is because it is copyrighted genetic testing and its dataveillance and manipulation through copyrighted medical technology - the various branded PEDs, HGH treatments, and their providers - that is the tool through which the flesh enacts this biopolitical "rebellion." References Baudrillard, Jean. Forget Foucault/Forget Baudrillard. Trans. Nicole Dufresne. Los Angeles: Semiotext(e), 2007. ————. Simulations. Trans. Paul Foss, Paul Patton and Philip Beitchman. Cambridge: Semiotext(e), 1983. BBC. "Lance Armstrong: Usada Report Labels Him 'a Serial Cheat.'" BBC Online 11 Oct. 2012. 1 Dec. 2013 ‹http://www.bbc.co.uk/sport/0/cycling/19903716›. Butler, Judith. Giving an Account of Oneself. New York: Fordham University Press, 2005. Clark, Taylor. Starbucked: A Double Tall Tale of Caffeine, Commerce, and Culture. New York: Back Bay, 2008. Coupland, Douglas. Marshall McLuhan. Toronto: Penguin Books, 2009. Debord, Guy. Society of the Spectacle. Detroit: Black & Red, 1977. Klein, Naomi. No Logo: Taking Aim at the Brand Bullies. Toronto: Knopf Canada, 1999. Lupica, Mike.
"Alex Rodriguez Beginning to Look a Lot like Lance Armstrong." NY Daily News. 6 Oct. 2013. 1 Dec. 2013 ‹http://www.nydailynews.com/sports/baseball/lupica-a-rod-tour-de-lance-article-1.1477544›. McLuhan, Marshall. Understanding Media: The Extensions of Man. New York: McGraw-Hill Book Company, 1964. NY Vitality. "Testosterone Treatment." NY Vitality. 1 Dec. 2013 ‹http://vitalityhrt.com/hgh.html›. Thompson, Teri, and Nathaniel Vinton. "What Does Alex Rodriguez Hope to Accomplish by Following Lance Armstrong's Legal Blueprint?" NY Daily News 5 Oct. 2013. 1 Dec. 2013 ‹http://www.nydailynews.com/sports/i-team/a-rod-hope-accomplish-lance-blueprint-article-1.1477280›. Virilio, Paul. Speed and Politics. Trans. Mark Polizzotti. New York: Semiotext(e), 1986.

20

Marshall, P. David. "Seriality and Persona." M/C Journal 17, no.3 (June 11, 2014). http://dx.doi.org/10.5204/mcj.802.

Full text

APA, Harvard, Vancouver, ISO, and other styles

Abstract:

No man [...] can wear one face to himself and another to the multitude, without finally getting bewildered as to which one may be true. (Nathaniel Hawthorne, The Scarlet Letter – as seen and pondered by Tony Soprano at Bowdoin College, The Sopranos, Season 1, Episode 5: “College”)

The fictitious is a particular and varied source of insight into the everyday world. The idea of seriality—with its variations of the serial, series, seriated—is very much connected to our patterns of entertainment. In this essay, I want to begin the process of testing what values and meanings can be drawn from the idea of seriality into comprehending the play of persona in contemporary culture. From a brief overview of the intersection of persona and seriality as well as a review of the deployment of seriality in popular culture, the article focuses on the character/person-actor relationship to demonstrate how seriality produces persona. The French term for character—personnage—will be used to underline the clear relations between characterisation, person, and persona which have been developed in recent work by Lenain and Wiame. Personnage, through its variation on the word person, helps push the analysis into fully understanding the particular and integrated configuration between a public persona and the fictional role that an actor inhabits (Heinich). There are several qualities related to persona that allow this movement from the fictional world to the everyday world to be profitable. Persona, in terms of origins, in and of itself implies performance and display. Jung, for instance, calls persona a mask where one is “acting a role” (167), while Goffman considers that performance and roles are at the centre of everyday life and everyday forms and patterns of communication. In recent work, I have used persona to describe how online culture pushes most people to construct a public identity that resembles what celebrities have had to construct for their livelihood for at least the last century (“Persona”; “Self”). My work has expanded to an investigation of how online persona relates to individual agency (“Agency”) and professional postures and positioning (Barbour and Marshall). The fictive constructions then are intensified versions of what persona is addressing: the fabrication of a role for particular directions and ends. Characters or personnages are constructed personas for very directed ends. Their limitation for the study of persona as a dimension of public culture is that they are not real; however, when one thinks of the actor who takes on this fictive identity, there is clearly a relationship between the real personality and that of the character. Moreover, as Nayar’s analysis of highly famous characters that are fictitious reveals, these celebrated characters, such as Harry Potter or Wolverine, sometimes take on a public presence in and of themselves. To capture this public movement of a fictional character, Nayar blends the term celebrity with fiction and calls these semi-public/semi-real entities “celefiction”: the characters are famous, highly visible, and move across media, information, and cultural platforms with ease and speed (18-20). Their celebrity status underlines their power to move outside of their primary text into public discourse and through public spaces—an extra-textual movement which fundamentally defines what a celebrity embodies. Seriality has to be seen as fundamental to a personnage’s power of and extension into the public world.
For instance with Harry Potter again, at least some of his recognition is dependent on the linking or seriating the related books and movies. Seriality helps organise our sense of affective connection to our popular culture. The familiarity of some element of repetition is both comforting for audiences and provides at least a sense of guarantee or warranty that they will enjoy the future text as much as they enjoyed the past related text. Seriality, though, also produces a myriad of other effects and affects which provides a useful background to understand its utility in both the understanding of character and its value in investigating contemporary public persona. Etymologically, the words “series” and seriality are from the Latin and refer to “succession” in classical usage and are identified with ancestry and the patterns of identification and linking descendants (Oxford English Dictionary). The original use of the seriality highlights its value in understanding the formation of the constitution of person and persona and how the past and ancestry connect in series to the current or contemporary self. Its current usage, however, has broadened metaphorically outwards to identify anything that is in sequence or linked or joined: it can be a series of lectures and arguments or a related mark of cars manufactured in a manner that are stylistically linked. It has since been deployed to capture the production process of various cultural forms and one of the key origins of this usage came from the 19th century novel. There are many examples where the 19th century novel was sold and presented in serial form that are too numerous to even summarise here. It is useful to use Dickens’ serial production as a defining example of how seriality moved into popular culture and the entertainment industry more broadly. Part of the reason for the sheer length of many of Charles Dickens’ works related to their original distribution as serials. In fact, all his novels were first distributed in chapters in monthly form in magazines or newspapers. A number of related consequences from Dickens’ serialisation are relevant to understanding seriality in entertainment culture more widely (Hayward). First, his novel serialisation established a continuous connection to his readers over years. Thus Dickens’ name itself became synonymous and connected to an international reading public. Second, his use of seriality established a production form that was seen to be more affordable to its audience: seriality has to be understood as a form that is closely connected to economies and markets as cultural commodities kneaded their way into the structure of everyday life. And third, seriality established through repetition not only the author’s name but also the name of the key characters that populated the cultural form. Although not wholly attributable to the serial nature of the delivery, the characters such as Oliver Twist, Ebenezer Scrooge or David Copperfield along with a host of other major and minor players in his many books become integrated into everyday discourse because of their ever-presence and delayed delivery over stories over time (see Allen 78-79). In the same way that newspapers became part of the vernacular of contemporary culture, fictional characters from novels lived for years at a time in the consciousness of this large reading public. The characters or personnages themselves became personalities that through usage became a way of describing other behaviours. 
One can think of Uriah Heep and his sheer obsequiousness in David Copperfield as a character-type that became part of popular culture, a way of thinking about and expressing a clear negative sentiment about a personality trait. In the twentieth century, serials became associated much more with book series. One of the more successful serial genres was the murder mystery. It developed what could be described as recognisable personnages that were both fictional and real. Thus, the real Agatha Christie, with her consistent and prodigious production of short whodunnit novels, was linked to her fictional Belgian detective Hercule Poirot. Variations of these serial constructions occurred in children’s fiction, the emerging science fiction genre, and westerns, with authors and characters rising to related prominence. In a similar vein, early to mid-twentieth century film produced the film serial. In its production and exhibition, the film serial was a déclassé genre in its overt emphasis on the economic quality of seriality. Thus, the film serial was generally a filler genre that was interspersed before and after a feature film in screenings (Dixon). As well as producing a familiarity with characters such as Flash Gordon, it was also instrumental in producing actors with a public profile that grew from this repetition. Flash Gordon was not just a character; he was also the actor Buster Crabbe and, over time, the association became indissoluble for audiences and actor alike. Feature film serials also developed in the first half-century of American cinema in particular, with child actors like Shirley Temple, Mickey Rooney, and Judy Garland often reprising variations of their previous roles. Seriality more or less became the standard form of delivery of broadcast media for most of the last 70 years, and this was driven by the economies of production it developed. Whether the production was news, comedy, or drama, most radio and television forms were and are variations of serials. As well as being the zenith of seriality, television serials have been the most studied form of seriality of all cultural forms and are thus the greatest source of research into what serials actually produced. The classic serial that began on radio and migrated to television was the soap opera. Although most of the long-running soap operas have now disappeared, many have endured for more than 30 years, with the American series The Guiding Light lasting 72 years and the British soap Coronation Street now in its 64th year. Australian nighttime soap operas have managed a similar longevity: Neighbours is in its 30th year, while Home and Away is in its 27th year. Much of the analysis of soap operas and serials deals with the narrative and the potential long narrative arcs related to characters and storylines. In contrast to most evening television serials historically, soap operas maintain the continuity from one episode to the next in an unbroken continuity narrative. Evening television serials, such as situation comedies, while maintaining long arcs over their run, are episodic in nature: the structure of the story is generally concluded in the given episode with at least partial closure, in a manner that is never engaged with in the never-ending soap opera serials. Although there are other cultural forms that deploy seriality in their structures—one can think of comic books and manga as two obvious other connected and highly visible serial sources—online and video games represent the other key media platform of serials in contemporary culture.
Once again, a “horizon of expectation” (Jauss and De Man 23) motivates the iteration of new versions of games by the industry. New versions of games are designed to build on gamer loyalties while augmenting the quality and possibilities of the particular game. Game culture and gamers have a different structural relationship to serials, which Denson and Jahn-Sudmann, at least, describe as digital seriality: a new version of a game is also imagined to be technologically more sophisticated in its production values, and this transformation of the similitude of game structure with innovation drives the economy of what are often described as “franchises.” New versions of Minecraft as online upgrades or Call of Duty launches draw the literal reinvestment of the gamer. New consoles provide a further push to the serialisation of games as they accentuate some transformed quality in gameplay, interaction, or quality of animated graphics. Sports franchises are perhaps the most serialised form of game: to replicate new professional seasons in each major sport, the sports game transforms with a new coterie of players each year. From these various venues, one can see the centrality of seriality in cultural forms. There is no question that one of the dimensions of seriality that transcends these cultural forms is its coordination and intersection with the development of the industrialisation of culture, and this understanding of the economic motivation behind series has been explored since some of the earliest analyses of seriality (see Hagedorn; Browne). Also, seriality has been mined extensively in terms of its production of the pleasure of repetition and transformation. In the exploration of the popular, whether in studies of readers of romance fiction (Radway) or of fans of science fiction television (Tulloch and Jenkins; Jenkins), serials have provided the resource for the exploration of the power of the audience to connect, engage, and reconstruct texts. The analysis of the serialisation of character—the production of a public personnage—and its relation to persona has, surprisingly, been understudied. While certain writers have remarked on the longevity of a certain character, such as Vicky Lord’s 40-year character on the soap opera One Life to Live, and the interesting capacity to maintain both complicated and hidden storylines (de Kosnik), and fan audience studies have looked at the parasocial-familiar relationship that fan and character construct, less has been developed about the relationship of the serial character, the actor, and a form of twinned public identity. Seriality does produce a patterning of personnage, a structure of familiarity for the audience, but also a structure of performance for the actor. For instance, in a longitudinal analysis of the character of Fu Manchu, Mayer is able to discern how a patterning of iconic form shapes, replicates, and reiterates the look of Fu Manchu across decades of films (Mayer). Similarly, there has been a certain amount of work on the “taxonomy of character,” where the serial character of a television program is analysed in terms of six parts: physical traits/appearance; speech patterns; psychological traits/habitual behaviours; interaction with other characters; environment; and biography (Pearson quoted in Lotz). From seriality what emerges is a particular kind of “type-casting,” where the actor becomes wedded to the specific iteration of the taxonomy of performance. As with other elements related to seriality, serial character performance is also closely aligned to the economic.
Previously I have described this economic patterning of performance as the “John Wayne Syndrome.” Wayne’s career developed into a form of serial performance where the individual born as Marion Morrison becomes structured into a cultural and economic category that determines the next film role. The economic weight of type also constructs the limits and range of the actor. Type or typage as a form of casting has always been an element of film and theatrical performance; but it is the seriality of performance—the actual construction of a personnage that flows between the fictional and real person—that allows an actor to claim a persona that can be exchanged within the industry. Even 15 years after his death, Wayne remained one of the most popular performers in the United States, his status unrivalled in its close definition of American value that became wedded with a conservative masculinity and politics (Wills). Type and typecasting have an interesting relationship to seriality. From Eisenstein’s original use of the term typage, where the character is chosen to fit into the meaning of the film and the image was placed into its sequence to make that meaning, it generally describes the circumscribing of the actor into their look. As Wojcik’s analysis reveals, typecasting in various periods of theatre and film acting has been seen as something to be fought for by actors (in the 1850s) and actively resisted in Hollywood in 1950 by the Screen Actors Guild in support of a greater range of roles for each actor. It is also seen as something that leads to cultural stereotypes that can reinforce the racial profiling that has haunted diverse cultures and the dangers of law enforcement for centuries (Wojcik 169-71). Early writers in the study of film acting emphasised that its difference from theatre was that in film the actor and character converged in terms of connected reality and a physicality: the film actor was less a mask and more a sense of “being” (Kracauer). Cavell’s work suggested that film, over stage performance, allowed an individuality over type to emerge (34). Thompson’s semiotic “commutation” test was another way of assessing the power of the individual “star” actor to be seen as elemental to the construction and meaning of the film role. Television produced, with regularity, character-actors where performance and identity became indissoluble, partly because of the sheer repetition and the massive visibility of these seriated performances. One of the most typecast individuals in television history was Leonard Nimoy as Spock in Star Trek: although the original Star Trek series ran for only three seasons, the physical caricature of Spock in the series as half-Vulcan and half-human made it difficult for the actor Nimoy to exit the role (Laws). Indeed, his famous autobiography riffed on this mis-identity with the forceful but still economically powerful title I Am Not Spock in 1975. When Nimoy perceived that his fans thought that he was unhappy in his role as Spock, he published a further tome—I Am Spock—that righted his relationship to his fictional identity and its continued source of roles for the previous 30 years. Although it is usually perceived as quite different in its constitution of a public identity, a very similar structure of persona developed around the American CBS news anchor Walter Cronkite.
With his status as anchor confirmed in its power and centrality to American culture by his desk reportage of the assassination and death of President Kennedy in November 1963, Cronkite went on to inhabit a persona as the most trusted man in the United States by the sheer gravitas of hosting the Evening News, stripped across every weeknight at 6:30pm for the next 19 years. In contrast to Nimoy, Cronkite became Cronkite the television news anchor, where persona, actor, and professional identity merged—at least in terms of almost all forms of the man’s visibility. From this vantage point of understanding the seriality of character/personnage and how it informs the idea of the actor, I want to provide a longer conclusion about how seriality informs the concept of persona in the contemporary moment. First of all, what this study reveals is the way in which the production of identity is overlaid onto any conception of identity itself. If we can understand persona not in any negative formulation, but rather as a form of productive performance of a public self, then it becomes very useful to see that these very visible public blendings of performance and the actor-self can make sense more generally as to how the public self is produced and constituted. My final and concluding examples will try to elucidate this insight further. In 2013, Netflix launched into the production of original drama with its release of House of Cards. The series itself was remarkable for a number of reasons. First among them, it was positioned as a quality series and clearly connected to the lineage of recent American subscription television programs such as The Sopranos, Six Feet Under, Dexter, Mad Men, The Wire, Deadwood, and True Blood, among a few others. House of Cards was an Americanised version of a celebrated British mini-series. In the American version, an ambitious party whip, Frank Underwood, manoeuvres, with ruthlessness and the calculating support of his wife, closer to the presidency and the heart and soul of American power. How the series expressed quality was at least partially in its choice of actors. The role of Frank Underwood was played by the respected film actor Kevin Spacey. His wife, Claire, was played by the equally high-profile Robin Wright. Quality was also expressed through the connection of the audience of viewers to an anti-hero: a personnage that was not filled with virtue but moved with Machiavellian acuity towards his objective of ultimate power. This idea of quality emerged in many ways from the successful construction of the character of Tony Soprano by James Gandolfini in the acclaimed HBO television series The Sopranos, which reconstructed the very conception of the family in organised crime. Tony Soprano was enacted as complex and conflicted, with a sense of right and justice, but embedded in the personnage were psychological tropes and scars, and an understanding of the need for violence to maintain influence, power, and a perverse but natural sense of order (Martin). The new television serial character now embodied a larger code and coterie of acting: from The Sopranos, there is the underlying sense and sensibility of method acting (see Vineberg; Stanislavski). Gandolfini inhabited the role of Tony Soprano and used the inner and hidden drives and motivations to become the source for the display of the character. Likewise, Spacey inhabits Frank Underwood. In that new habitus of television character, the actor becomes subsumed by the role.
Gandolfini becomes over-determined by the role, and his own identity as an actor becomes melded to it. Kevin Spacey, despite his longer and highly visible history as a film actor, is overwhelmed by the televisual role of Frank Underwood. Its serial power, where audiences connect for hours and hours, where the actor commits to weeks and weeks of shoots, and years and years of being the character—a serious character with emotional depth, with psychological motivation that rivals the most visceral of film roles—transforms the actor into a blended public person and the related personnage. This blend of fictional and public life is complex as much for the producing actor as it is for the audience that makes the habitus real. What Kevin Spacey/Frank Underwood inhabit is a blended persona, whose power is dependent on the constructed identity that is, at source, the actor’s production as much as any institutional form or any writer or director connected to making House of Cards “real.” There is no question that this serial public identity will be difficult for Kevin Spacey to disentangle when the series ends; in many ways it will be an elemental part of his continuing public identity. This is the economic power and risk of seriality. One can see similar blendings of the persona in popular music and its own form of contemporary seriality in performance. For example, Eminem is a stage name for a person sometimes called Marshall Mathers; but Eminem takes this a step further and, in its integration of the personal, produces beyond a character a real personnage, Slim Shady, to inhabit his music and its stories. To further complexify this construction, Eminem relies on the production of his stories with elements that appear to be from his everyday life (Dawkins). His characterisations, because of the emotional depth he inhabits through his rapped stories, betray a connection to his own psychological state. Following in the history of popular music performance, where the singer-songwriter’s work is seen by all to present a version of the public self that is emotionally closer to the private self, we once again see how the seriality of performance begins to produce a blended public persona. Rap music has inherited this seriality of produced identity from twentieth-century icons of the singer/songwriter and its display of the public/private self—in reverse order, from grunge to punk, from folk to blues. Finally, it is worthwhile to think of online culture in similar ways in the production of public personas. Seriality is elemental to online culture. Social media encourage the production of public identities through forms of repetition of that identity. In order to establish a public profile, social media users establish an identity with some consistency over time. The everydayness in the production of the public self online thus resembles the production and performance of seriality in fiction. Professional social media sites such as LinkedIn encourage the consistency of public identity, and this is very important in understanding the new versions of the public self that are deployed in contemporary culture. However, much like the new psychological depth that is part of the meaning of serial characters such as Frank Underwood in House of Cards, Slim Shady in Eminem, or Tony Soprano in The Sopranos, social media seriality also encourages greater revelations of the private self via Instagram and Facebook walls and images.
We are collectively reconstituted as personas online, seriated by the continuing presence of our online sites and regularly drawn to reveal more and greater depths of our character. In other words, the online persona resembles the new depth of the quality television serial personnage, with elaborate arcs and great complexity. Seriality in our public identity is also uncovered in the production of our game avatars where, in order to develop trust and connection to friends in online settings, we maintain our identity and our patterns of gameplay. At the core of this online identity is a desire for visibility, and we are drawn to be “picked up” and shared in some repeatable form across what we each perceive as a meaningful dimension of culture. Through the circulation of viral images, texts, and videos we engage in a circulation and repetition of meaning that feeds back into the constancy and value of an online identity. Through memes we replicate and seriate content that at some level seriates personas in terms of humour, connection, and value. Seriality is central to understanding the formation of our masks of public identity and is at least one valuable analytical way to understand the development of the contemporary persona. This essay represents the first foray in thinking through the relationship between seriality and persona.

References

Barbour, Kim, and P. David Marshall. “The Academic Online Constructing Persona.” First Monday 17.9 (2012).
Browne, Nick. “The Political Economy of the (Super)Text.” Quarterly Review of Film Studies 9.3 (1984): 174-82.
Cavell, Stanley. “Reflections on the Ontology of Film.” Movie Acting: The Film Reader. Ed. Pamela Robertson Wojcik. London: Routledge, 2004 (1979). 29-35.
Dawkins, Marcia Alesan. “Close to the Edge: Representational Tactics of Eminem.” The Journal of Popular Culture 43.3 (2010): 463-85.
De Kosnik, Abigail. “One Life to Live: Soap Opera Storytelling.” How to Watch Television. Ed. Ethan Thompson and Jason Mittell. New York: New York University Press, 2013. 355-63.
Denson, Shane, and Andreas Jahn-Sudmann. “Digital Seriality: On the Serial Aesthetics and Practice of Digital Games.” Journal of Computer Game Culture 7.1 (2013): 1-32.
Dixon, Wheeler Winston. “Flash Gordon and the 1930s and 40s Science Fiction Serial.” Screening the Past 11 (2011). 20 May 2014.
Goffman, Erving. The Presentation of Self in Everyday Life. Woodstock, New York: The Overlook Press, 1973.
Hagedorn, Roger. “Technology and Economic Exploitation: The Serial as a Form of Narrative Presentation.” Wide Angle 10.4 (1988): 4-12.
Hayward, Jennifer Poole. Consuming Pleasures: Active Audiences and Serial Fictions from Dickens to Soap Opera. Lexington: University Press of Kentucky, 1997.
Heinich, Nathalie. “Personne, Personnage, Personalité: L’acteur a L’ère De Sa Reproductibilité Technique.” Personne/Personnage. Eds. Thierry Lenain and Aline Wiame. Paris: Librairie Philosophique J. Vrin, 2011. 77-101.
Jauss, Hans Robert, and Paul De Man. Toward an Aesthetic of Reception. Brighton: Harvester, 1982.
Jenkins, Henry. Textual Poachers: Television Fans & Participatory Culture. New York: Routledge, 1992.
Jung, C. G., et al. Two Essays on Analytical Psychology. 2nd ed. Princeton, N.J.: Princeton University Press, 1966.
Kracauer, Siegfried. “Remarks on the Actor.” Movie Acting: The Film Reader. Ed. Pamela Robertson Wojcik. London: Routledge, 2004 (1960). 19-27.
Leonard Nimoy & Pharrell Williams: Star Trek & Creating Spock. Ep. 12. Reserve Channel. December 2013.
Lenain, Thierry, and Aline Wiame, eds. Personne/Personnage. Paris: Librairie Philosophique J. Vrin, 2011.
Lotz, Amanda D. “House: Narrative Complexity.” How to Watch Television. Ed. Ethan Thompson and Jason Mittell. New York: New York University Press, 2013. 22-29.
Marshall, P. David. “The Cate Blanchett Persona and the Allure of the Oscar.” The Conversation (2014). 4 April 2014.
Marshall, P. David. “Persona Studies: Mapping the Proliferation of the Public Self.” Journalism 15.2 (2014): 153-70.
Marshall, P. David. “Personifying Agency: The Public–Persona–Place–Issue Continuum.” Celebrity Studies 4.3 (2013): 369-71.
Marshall, P. David. “The Promotion and Presentation of the Self: Celebrity as Marker of Presentational Media.” Celebrity Studies 1.1 (2010): 35-48.
Marshall, P. David. Celebrity and Power: Fame in Contemporary Culture. 2nd ed. Minneapolis: University of Minnesota Press, 2014.
Martin, Brett. Difficult Men: Behind the Scenes of a Creative Revolution: From The Sopranos and The Wire to Mad Men and Breaking Bad. London: Faber and Faber, 2013.
Mayer, R. “Image Power: Seriality, Iconicity and the Mask of Fu Manchu.” Screen 53.4 (2012): 398-417.
Nayar, Pramod K. Seeing Stars: Spectacle, Society, and Celebrity Culture. New Delhi; Thousand Oaks, California: Sage Publications, 2009.
Nimoy, Leonard. I Am Not Spock. Millbrae, California: Celestial Arts, 1975.
Nimoy, Leonard. I Am Spock. 1st ed. New York: Hyperion, 1995.
Radway, Janice A. Reading the Romance: Women, Patriarchy, and Popular Literature. Chapel Hill: University of North Carolina Press, 1984.
Stanislavski, Constantin. Creating a Role. New York: Routledge, 1989 (1961).
Thompson, John O. “Screen Acting and the Commutation Test.” Movie Acting: The Film Reader. Ed. Pamela Robertson Wojcik. London: Routledge, 2004 (1978). 37-48.
Tulloch, John, and Henry Jenkins. Science Fiction Audiences: Watching Doctor Who and Star Trek. London; New York: Routledge, 1995.
Vineberg, Steve. Method Actors: Three Generations of an American Acting Style. New York; Toronto: Schirmer Books, 1991.
Wills, Garry. John Wayne’s America: The Politics of Celebrity. New York: Simon & Schuster, 1997.
Wojcik, Pamela Robertson. “Typecasting.” Movie Acting: The Film Reader. Ed. Pamela Robertson Wojcik. London: Routledge, 2004. 169-89.

21

Losh, Elizabeth. "Artificial Intelligence." M/C Journal 10, no.5 (October 1, 2007). http://dx.doi.org/10.5204/mcj.2710.

Full text

APA, Harvard, Vancouver, ISO, and other styles

Abstract:

On the morning of Thursday, 4 May 2006, the United States House Permanent Select Committee on Intelligence held an open hearing entitled “Terrorist Use of the Internet.” The Intelligence committee meeting was scheduled to take place in Room 1302 of the Longworth Office Building, a Depression-era structure with a neoclassical façade. Because of a dysfunctional elevator, some of the congressional representatives were late to the meeting. During the testimony about the newest political applications for cutting-edge digital technology, the microphones periodically malfunctioned, and witnesses complained of “technical problems” several times. By the end of the day it seemed that what was to be remembered about the hearing was the shocking revelation that terrorists were using videogames to recruit young jihadists. The Associated Press wrote a short, restrained article about the hearing that only mentioned “computer games and recruitment videos” in passing. Eager to have their version of the news item picked up, Reuters made videogames the focus of their coverage with a headline that announced, “Islamists Using US Videogames in Youth Appeal.” Like a game of telephone, as the Reuters videogame story was quickly re-run by several Internet news services, each iteration of the title seemed less true to the exact language of the original. One Internet news service changed the headline to “Islamic militants recruit using U.S. video games.” Fox News re-titled the story again to emphasise that this alert about technological manipulation was coming from recognised specialists in the anti-terrorism surveillance field: “Experts: Islamic Militants Customizing Violent Video Games.” As the story circulated, the body of the article remained largely unchanged, in which the Reuters reporter described the digital materials from Islamic extremists that were shown at the congressional hearing. During the segment that apparently most captured the attention of the wire service reporters, eerie music played as an English-speaking narrator condemned the “infidel” and declared that he had “put a jihad” on them, as aerial shots moved over 3D computer-generated images of flaming oil facilities and mosques covered with geometric designs. Suddenly, this menacing voice-over was interrupted by an explosion, as a virtual rocket was launched into a simulated military helicopter. The Reuters reporter shared this dystopian vision from cyberspace with Western audiences by quoting directly from the chilling commentary and describing a dissonant montage of images and remixed sound. “I was just a boy when the infidels came to my village in Blackhawk helicopters,” a narrator’s voice said as the screen flashed between images of street-level gunfights, explosions and helicopter assaults. Then came a recording of President George W. Bush’s September 16, 2001, statement: “This crusade, this war on terrorism, is going to take a while.” It was edited to repeat the word “crusade,” which Muslims often define as an attack on Islam by Christianity. According to the news reports, the key piece of evidence before Congress seemed to be a film by “SonicJihad” of recorded videogame play, which – according to the experts – was widely distributed online. 
Much of the clip takes place from the point of view of a first-person shooter, seen as if through the eyes of an armed insurgent, but the viewer also periodically sees third-person action in which the player appears as a running figure wearing a red-and-white checked keffiyeh, who dashes toward the screen with a rocket launcher balanced on his shoulder. Significantly, another of the player’s hand-held weapons is a detonator that triggers remote blasts. As jaunty music plays, helicopters, tanks, and armoured vehicles burst into smoke and flame. Finally, at the triumphant ending of the video, a green and white flag bearing a crescent is hoisted aloft into the sky to signify victory by Islamic forces. To explain the existence of this digital alternative history in which jihadists could be conquerors, the Reuters story described the deviousness of the country’s terrorist opponents, who were now apparently modifying popular videogames through their wizardry and inserting anti-American, pro-insurgency content into U.S.-made consumer technology. One of the latest video games modified by militants is the popular “Battlefield 2” from leading video game publisher, Electronic Arts Inc of Redwood City, California. Jeff Brown, a spokesman for Electronic Arts, said enthusiasts often write software modifications, known as “mods,” to video games. “Millions of people create mods on games around the world,” he said. “We have absolutely no control over them. It’s like drawing a mustache on a picture.” Although the Electronic Arts executive dismissed the activities of modders as a “mustache on a picture” that could only be considered little more than childish vandalism of their off-the-shelf corporate product, others saw a more serious form of criminality at work. Testifying experts and the legislators listening on the committee used the video to call for greater Internet surveillance efforts and electronic counter-measures. Within twenty-four hours of the sensationalistic news breaking, however, a group of Battlefield 2 fans was crowing about the idiocy of reporters. The game play footage wasn’t from a high-tech modification of the software by Islamic extremists; it had been posted on a Planet Battlefield forum the previous December, in 2005, by a game fan who had cut together regular game play with a Bush remix and a parody snippet of the soundtrack from the 2004 hit comedy film Team America. The voice describing the Black Hawk helicopters was the voice of Trey Parker of South Park cartoon fame, and – much to Parker’s amusement – even the mention of “goats screaming” did not clue spectators in to the fact of a comic source. Ironically, the moment in the movie from which the sound clip is excerpted is one about intelligence gathering. As an agent of Team America, a fictional elite U.S. commando squad, the hero of the film’s all-puppet cast, Gary Johnston, is impersonating a jihadist radical inside a hostile Egyptian tavern that is modelled on the cantina scene from Star Wars. Additional laughs come from the fact that agent Johnston is accepted by the menacing terrorist cell as “Hakmed,” despite the fact that he utters a series of improbable clichés made up of incoherent stereotypes about life in the Middle East while dressed in a disguise of shoe polish and a turban made from a bathroom towel.
The man behind the “SonicJihad” pseudonym turned out to be a twenty-five-year-old hospital administrator named Samir, and what reporters and representatives saw was nothing more exotic than game play from an add-on expansion pack of Battlefield 2, which – like other versions of the game – allows first-person shooter play from the position of the opponent as a standard feature. While SonicJihad initially joined his fellow gamers in ridiculing the mainstream media, he also expressed astonishment and outrage about a larger politics of reception. In one interview he argued that the media illiteracy of Reuters potentially enabled a whole series of category errors, in which harmless gamers could be demonised as terrorists. It wasn’t intended for the purpose what it was portrayed to be by the media. So no I don’t regret making a funny video . . . why should I? The only thing I regret is thinking that news from Reuters was objective and always right. The least they could do is some online research before publishing this. If they label me al-Qaeda just for making this silly video, that makes you think, what is this al-Qaeda? And is everything al-Qaeda? Although Sonic Jihad dismissed his own work as “silly” or “funny,” he expected considerably more from a credible news agency like Reuters: “objective” reporting, “online research,” and fact-checking before “publishing.” Within the week, almost all of the salient details in the Reuters story were revealed to be incorrect. SonicJihad’s film was not made by terrorists or for terrorists: it was not created by “Islamic militants” for “Muslim youths.” The videogame it depicted had not been modified by a “tech-savvy militant” with advanced programming skills. Of course, what is most extraordinary about this story isn’t just that Reuters merely got its facts wrong; it is that a self-identified “parody” video was shown to the august House Intelligence Committee by a team of well-paid “experts” from the Science Applications International Corporation (SAIC), a major contractor with the federal government, as key evidence of terrorist recruitment techniques and abuse of digital networks. Moreover, this story of media illiteracy unfolded in the context of a fundamental Constitutional debate about domestic surveillance via communications technology and the further regulation of digital content by lawmakers. Furthermore, the transcripts of the actual hearing showed that much more than simple gullibility or technological ignorance was in play. Based on their exchanges in the public record, elected representatives and government experts appear to be keenly aware that the digital discourses of an emerging information culture might be challenging their authority and that of the longstanding institutions of knowledge and power with which they are affiliated. These hearings can be seen as representative of a larger historical moment in which emphatic declarations about prohibiting specific practices in digital culture have come to occupy a prominent place at the podium, news desk, or official Web portal. This environment of cultural reaction can be used to explain why policy makers’ reaction to terrorists’ use of networked communication and digital media actually tells us more about our own American ideologies about technology and rhetoric in a contemporary information environment. 
When the experts come forward at the Sonic Jihad hearing to “walk us through the media and some of the products,” they present digital artefacts of an information economy that mirrors many of the features of our own consumption of objects of electronic discourse, which seem dangerously easy to copy and distribute and thus also create confusion about their intended meanings, audiences, and purposes. From this one hearing we can see how the reception of many new digital genres plays out in the public sphere of legislative discourse. Web pages, videogames, and Weblogs are mentioned specifically in the transcript. The main architecture of the witnesses’ presentation to the committee is organised according to the rhetorical conventions of a PowerPoint presentation. Moreover, the arguments made by expert witnesses about the relationship of orality to literacy or of public to private communications in new media are highly relevant to how we might understand other important digital genres, such as electronic mail or text messaging. The hearing also invites consideration of privacy, intellectual property, and digital “rights,” because moral values about freedom and ownership are alluded to by many of the elected representatives present, albeit often through the looking glass of user behaviours imagined as radically Other. For example, terrorists are described as “modders” and “hackers” who subvert those who properly create, own, legitimate, and regulate intellectual property. To explain embarrassing leaks of infinitely replicable digital files, witness Ron Roughead says, “We’re not even sure that they don’t even hack into the kinds of spaces that hold photographs in order to get pictures that our forces have taken.” Another witness, Undersecretary of Defense for Policy and International Affairs, Peter Rodman claims that “any video game that comes out, as soon as the code is released, they will modify it and change the game for their needs.” Thus, the implication of these witnesses’ testimony is that the release of code into the public domain can contribute to political subversion, much as covert intrusion into computer networks by stealthy hackers can. However, the witnesses from the Pentagon and from the government contractor SAIC often present a contradictory image of the supposed terrorists in the hearing transcripts. Sometimes the enemy is depicted as an organisation of technological masterminds, capable of manipulating the computer code of unwitting Americans and snatching their rightful intellectual property away; sometimes those from the opposing forces are depicted as pre-modern and even sub-literate political innocents. In contrast, the congressional representatives seem to focus on similarities when comparing the work of “terrorists” to the everyday digital practices of their constituents and even of themselves. According to the transcripts of this open hearing, legislators on both sides of the aisle express anxiety about domestic patterns of Internet reception. Even the legislators’ own Web pages are potentially disruptive electronic artefacts, particularly when the demands of digital labour interfere with their duties as lawmakers. Although the subject of the hearing is ostensibly terrorist Websites, Representative Anna Eshoo (D-California) bemoans the difficulty of maintaining her own official congressional site. 
As she observes, “So we are – as members, I think we’re very sensitive about what’s on our Website, and if I retained what I had on my Website three years ago, I’d be out of business. So we know that they have to be renewed. They go up, they go down, they’re rebuilt, they’re – you know, the message is targeted to the future.” In their questions, lawmakers identify Weblogs (blogs) as a particular area of concern as a destabilising alternative to authoritative print sources of information from established institutions. Representative Alcee Hastings (D-Florida) compares the polluting power of insurgent bloggers to that of influential online muckrakers from the American political Right. Hastings complains of “garbage on our regular mainstream news that comes from blog sites.” Representative Heather Wilson (R-New Mexico) attempts to project a media-savvy persona by bringing up the “phenomenon of blogging” in conjunction with her questions about jihadist Websites in which she notes how Internet traffic can be magnified by cooperative ventures among groups of ideologically like-minded content-providers: “These Websites, and particularly the most active ones, are they cross-linked? And do they have kind of hot links to your other favorite sites on them?” At one point Representative Wilson asks witness Rodman if he knows “of your 100 hottest sites where the Webmasters are educated? What nationality they are? Where they’re getting their money from?” In her questions, Wilson implicitly acknowledges that Web work reflects influences from pedagogical communities, economic networks of the exchange of capital, and even potentially the specific ideologies of nation-states. It is perhaps indicative of the government contractors’ anachronistic worldview that the witness is unable to answer Wilson’s question. He explains that his agency focuses on the physical location of the server or ISP rather than the social backgrounds of the individuals who might be manufacturing objectionable digital texts. The premise behind the contractors’ working method – surveilling the technical apparatus not the social network – may be related to other beliefs expressed by government witnesses, such as the supposition that jihadist Websites are collectively produced and spontaneously emerge from the indigenous, traditional, tribal culture, instead of assuming that Iraqi insurgents have analogous beliefs, practices, and technological awareness to those in first-world countries. The residual subtexts in the witnesses’ conjectures about competing cultures of orality and literacy may tell us something about a reactionary rhetoric around videogames and digital culture more generally. According to the experts before Congress, the Middle Eastern audience for these videogames and Websites is limited by its membership in a pre-literate society that is only capable of abortive cultural production without access to knowledge that is archived in printed codices. Sometimes the witnesses before Congress seem to be unintentionally channelling the ideas of the late literacy theorist Walter Ong about the “secondary orality” associated with talky electronic media such as television, radio, audio recording, or telephone communication. Later followers of Ong extend this concept of secondary orality to hypertext, hypermedia, e-mail, and blogs, because they similarly share features of both speech and written discourse. 
Although Ong’s disciples celebrate this vibrant reconnection to a mythic, communal past of what Kathleen Welch calls “electric rhetoric,” the defence industry consultants express their profound state of alarm at the potentially dangerous and subversive character of this hybrid form of communication. The concept of an “oral tradition” is first introduced by the expert witnesses in the context of modern marketing and product distribution: “The Internet is used for a variety of things – command and control,” one witness states. “One of the things that’s missed frequently is how and – how effective the adversary is at using the Internet to distribute product. They’re using that distribution network as a modern form of oral tradition, if you will.” Thus, although the Internet can be deployed for hierarchical “command and control” activities, it also functions as a highly efficient peer-to-peer distributed network for disseminating the commodity of information. Throughout the hearings, the witnesses imply that unregulated lateral communication among social actors who are not authorised to speak for nation-states or to produce legitimated expert discourses is potentially destabilising to political order. Witness Eric Michael describes the “oral tradition” and the conventions of communal life in the Middle East to emphasise the primacy of speech in the collective discursive practices of this alien population: “I’d like to point your attention to the media types and the fact that the oral tradition is listed as most important. The other media listed support that. And the significance of the oral tradition is more than just – it’s the medium by which, once it comes off the Internet, it is transferred.” The experts go on to claim that this “oral tradition” can contaminate other media because it functions as “rumor,” the traditional bane of the stately discourse of military leaders since the classical era. The oral tradition now also has an aspect of rumor. A[n] event takes place. There is an explosion in a city. Rumor is that the United States Air Force dropped a bomb and is doing indiscriminate killing. This ends up being discussed on the street. It ends up showing up in a Friday sermon in a mosque or in another religious institution. It then gets recycled into written materials. Media picks up the story and broadcasts it, at which point it’s now a fact. In this particular case that we were telling you about, it showed up on a network television, and their propaganda continues to go back to this false initial report on network television and continue to reiterate that it’s a fact, even though the United States government has proven that it was not a fact, even though the network has since recanted the broadcast. In this example, many-to-many discussion on the “street” is formalised into a one-to many “sermon” and then further stylised using technology in a one-to-many broadcast on “network television” in which “propaganda” that is “false” can no longer be disputed. This “oral tradition” is like digital media, because elements of discourse can be infinitely copied or “recycled,” and it is designed to “reiterate” content. In this hearing, the word “rhetoric” is associated with destructive counter-cultural forces by the witnesses who reiterate cultural truisms dating back to Plato and the Gorgias. 
For example, witness Eric Michael initially presents “rhetoric” as the use of culturally specific and hence untranslatable figures of speech, but he quickly moves to an outright castigation of the entire communicative mode. “Rhetoric,” he tells us, is designed to “distort the truth,” because it is a “selective” assembly or a “distortion.” Rhetoric is also at odds with reason, because it appeals to “emotion” and a romanticised Weltanschauung oriented around discourses of “struggle.” The film by SonicJihad is chosen as the final clip by the witnesses before Congress, because it allegedly combines many different types of emotional appeal, and thus it conveniently ties together all of the themes that the witnesses present to the legislators about unreliable oral or rhetorical sources in the Middle East: And there you see how all these products are linked together. And you can see where the games are set to psychologically condition you to go kill coalition forces. You can see how they use humor. You can see how the entire campaign is carefully crafted to first evoke an emotion and then to evoke a response and to direct that response in the direction that they want. Jihadist digital products, especially videogames, are effective means of manipulation, the witnesses argue, because they employ multiple channels of persuasion and carefully sequenced and integrated subliminal messages. To understand the larger cultural conversation of the hearing, it is important to keep in mind that the related argument that “games” can “psychologically condition” players to be predisposed to violence is one that was important in other congressional hearings of the period, as well one that played a role in bills and resolutions that were passed by the full body of the legislative branch. In the witness’s testimony an appeal to anti-game sympathies at home is combined with a critique of a closed anti-democratic system abroad in which the circuits of rhetorical production and their composite metonymic chains are described as those that command specific, unvarying, robotic responses. This sharp criticism of the artful use of a presentation style that is “crafted” is ironic, given that the witnesses’ “compilation” of jihadist digital material is staged in the form of a carefully structured PowerPoint presentation, one that is paced to a well-rehearsed rhythm of “slide, please” or “next slide” in the transcript. The transcript also reveals that the members of the House Intelligence Committee were not the original audience for the witnesses’ PowerPoint presentation. Rather, when it was first created by SAIC, this “expert” presentation was designed for training purposes for the troops on the ground, who would be facing the challenges of deployment in hostile terrain. According to the witnesses, having the slide show showcased before Congress was something of an afterthought. Nonetheless, Congressman Tiahrt (R-KN) is so impressed with the rhetorical mastery of the consultants that he tries to appropriate it. As Tiarht puts it, “I’d like to get a copy of that slide sometime.” From the hearing we also learn that the terrorists’ Websites are threatening precisely because they manifest a polymorphously perverse geometry of expansion. 
For example, one SAIC witness before the House Committee compares the replication and elaboration of digital material online to a “spiderweb.” Like Representative Eshoo’s site, he also notes that the terrorists’ sites go “up” and “down,” but the consultant is left to speculate about whether or not there is any “central coordination” to serve as an organising principle and to explain the persistence and consistency of messages despite the apparent lack of a single authorial ethos to offer a stable, humanised, point of reference. In the hearing, the oft-cited solution to the problem created by the hybridity and iterability of digital rhetoric appears to be “public diplomacy.” Both consultants and lawmakers seem to agree that the damaging messages of the insurgents must be countered with U.S. sanctioned information, and thus the phrase “public diplomacy” appears in the hearing seven times. However, witness Roughhead complains that the protean “oral tradition” and what Henry Jenkins has called the “transmedia” character of digital culture, which often crosses several platforms of traditional print, projection, or broadcast media, stymies their best rhetorical efforts: “I think the point that we’ve tried to make in the briefing is that wherever there’s Internet availability at all, they can then download these – these programs and put them onto compact discs, DVDs, or post them into posters, and provide them to a greater range of people in the oral tradition that they’ve grown up in. And so they only need a few Internet sites in order to distribute and disseminate the message.” Of course, to maintain their share of the government market, the Science Applications International Corporation also employs practices of publicity and promotion through the Internet and digital media. They use HTML Web pages for these purposes, as well as PowerPoint presentations and online video. The rhetoric of the Website of SAIC emphasises their motto “From Science to Solutions.” After a short Flash film about how SAIC scientists and engineers solve “complex technical problems,” the visitor is taken to the home page of the firm that re-emphasises their central message about expertise. The maps, uniforms, and specialised tools and equipment that are depicted in these opening Web pages reinforce an ethos of professional specialisation that is able to respond to multiple threats posed by the “global war on terror.” By 26 June 2006, the incident finally was being described as a “Pentagon Snafu” by ABC News. From the opening of reporter Jake Tapper’s investigative Webcast, established government institutions were put on the spot: “So, how much does the Pentagon know about videogames? Well, when it came to a recent appearance before Congress, apparently not enough.” Indeed, the very language about “experts” that was highlighted in the earlier coverage is repeated by Tapper in mockery, with the significant exception of “independent expert” Ian Bogost of the Georgia Institute of Technology. If the Pentagon and SAIC deride the legitimacy of rhetoric as a cultural practice, Bogost occupies himself with its defence. In his recent book Persuasive Games: The Expressive Power of Videogames, Bogost draws upon the authority of the “2,500 year history of rhetoric” to argue that videogames represent a significant development in that cultural narrative. 
Given that Bogost and his Watercooler Games Weblog co-editor Gonzalo Frasca were actively involved in the detective work that exposed the depth of professional incompetence involved in the government’s line-up of witnesses, it is appropriate that Bogost is given the final words in the ABC exposé. As Bogost says, “We should be deeply bothered by this. We should really be questioning the kind of advice that Congress is getting.” Bogost may be right that Congress received terrible counsel on that day, but a close reading of the transcript reveals that elected officials were much more than passive listeners: in fact they were lively participants in a cultural conversation about regulating digital media. After looking at the actual language of these exchanges, it seems that the persuasiveness of the misinformation from the Pentagon and SAIC had as much to do with lawmakers’ preconceived anxieties about practices of computer-mediated communication close to home as it did with the contradictory stereotypes that were presented to them about Internet practices abroad. In other words, lawmakers found themselves looking into a fun house mirror that distorted what should have been familiar artefacts of American popular culture because it was precisely what they wanted to see. References ABC News. “Terrorist Videogame?” Nightline Online. 21 June 2006. 22 June 2006 <http://abcnews.go.com/Video/playerIndex?id=2105341>. Bogost, Ian. Persuasive Games: The Expressive Power of Videogames. Cambridge, MA: MIT Press, 2007. Game Politics. “Was Congress Misled by ‘Terrorist’ Game Video? We Talk to Gamer Who Created the Footage.” 11 May 2006. <http://gamepolitics.livejournal.com/285129.html#cutid1>. Jenkins, Henry. Convergence Culture: Where Old and New Media Collide. New York: New York UP, 2006. julieb. “David Morgan Is a Horrible Writer and Should Be Fired.” Online posting. 5 May 2006. Dvorak Uncensored Cage Match Forums. <http://cagematch.dvorak.org/index.php/topic,130.0.html>. Mahmood. “Terrorists Don’t Recruit with Battlefield 2.” GGL Global Gaming. 16 May 2006 <http://www.ggl.com/news.php?NewsId=3090>. Morgan, David. “Islamists Using U.S. Video Games in Youth Appeal.” Reuters online news service. 4 May 2006 <http://today.reuters.com/news/ArticleNews.aspx?type=topNews&storyID=2006-05-04T215543Z_01_N04305973_RTRUKOC_0_US-SECURITY-VIDEOGAMES.xml&pageNumber=0&imageid=&cap=&sz=13&WTModLoc=NewsArt-C1-ArticlePage2>. Ong, Walter J. Orality and Literacy: The Technologizing of the Word. London/New York: Methuen, 1982. Parker, Trey. Online posting. 7 May 2006. 9 May 2006 <http://www.treyparker.com>. Plato. “Gorgias.” Plato: Collected Dialogues. Princeton: Princeton UP, 1961. Shrader, Katherine. “Pentagon Surfing Thousands of Jihad Sites.” Associated Press 4 May 2006. SonicJihad. “SonicJihad: A Day in the Life of a Resistance Fighter.” Online posting. 26 Dec. 2005. Planet Battlefield Forums. 9 May 2006 <http://www.forumplanet.com/planetbattlefield/topic.asp?fid=13670&tid=1806909&p=1>. Tapper, Jake, and Audrey Taylor. “Terrorist Video Game or Pentagon Snafu?” ABC News Nightline 21 June 2006. 30 June 2006 <http://abcnews.go.com/Nightline/Technology/story?id=2105128&page=1>. U.S. Congressional Record. Panel I of the Hearing of the House Select Intelligence Committee, Subject: “Terrorist Use of the Internet for Communications.” Federal News Service. 4 May 2006. Welch, Kathleen E. Electric Rhetoric: Classical Rhetoric, Oralism, and the New Literacy. Cambridge, MA: MIT Press, 1999.
Citation reference for this article MLA Style Losh, Elizabeth. "Artificial Intelligence: Media Illiteracy and the SonicJihad Debacle in Congress." M/C Journal 10.5 (2007). [your date of access] <http://journal.media-culture.org.au/0710/08-losh.php>. APA Style Losh, E. (Oct. 2007) "Artificial Intelligence: Media Illiteracy and the SonicJihad Debacle in Congress," M/C Journal, 10(5). Retrieved [your date of access] from <http://journal.media-culture.org.au/0710/08-losh.php>.

22

Brabazon, Tara. "A Red Light Sabre to Go, and Other Histories of the Present." M/C Journal 2, no.4 (June 1, 1999). http://dx.doi.org/10.5204/mcj.1761.

Full text

APA, Harvard, Vancouver, ISO, and other styles

Abstract:

If I find out that you have bought a $90 red light sabre, Tara, well there's going to be trouble. -- Kevin Brabazon A few Saturdays ago, my 71-year-old father tried to convince me of imminent responsibilities. As I am considering the purchase of a house, there are mortgages, bank fees and years of misery to endure. Unfortunately, I am not an effective Big Picture Person. The lure of the light sabre is almost too great. For 30-year-old Generation Xers like myself, it is more than a cultural object. It is a textual anchor, and a necessary component to any future history of the present. Revelling in the aura of the Australian release for Star Wars: The Phantom Menace, this paper investigates popular memory, an undertheorised affiliation between popular culture and cultural studies.1 The excitement encircling the Star Wars prequel has been justified in terms of 'hype' or marketing. Such judgements frame the men and women queuing for tickets, talking Yodas and light sabres as fools or duped souls who need to get out more. My analysis explores why Star Wars has generated this enthusiasm, and how cultural studies can mobilise this passionate commitment to consider notions of popularity, preservation and ephemerality. We'll always have Tatooine. Star Wars has been a primary popular cultural social formation for a generation. The stories of Luke Skywalker, Princess Leia, Han Solo, Chewbacca, Darth Vader, Yoda, C-3PO and R2D2 offer an alternative narrative for the late 1970s and 1980s. It was a comfort to have the Royal Shakespearian tones of Alec Guinness confirming that the Force would be with us, through economic rationalism, unemployment, Pauline Hanson and Madonna discovering yoga. The Star Wars Trilogy, encompassing A New Hope, The Empire Strikes Back and Return of the Jedi, was released between 1977 and 1983. These films have rarely slipped from public attention, being periodically 'brought back' through new cinematic and video releases. The currency of Star Wars is matched with the other great popular cultural formations of the post-war period: the James Bond series and Star Trek. One reason for the continued success of these programmes is that other writers, film makers and producers cannot leave these texts alone. Bond survives not only through Pierce Brosnan's good looks, but the 'Hey Baby' antics of Austin Powers. Star Trek, through four distinct series, has become an industry that will last longer than Voyager's passage back from the Delta Quadrant. Star Wars, perhaps even more effectively than the other popular cultural heavyweights, has enmeshed itself into other filmic and televisual programming. Films like Spaceballs and television quizzes on Good News Week keep the knowledge system and language current and pertinent.2 Like Umberto Eco realised of Casablanca, Star Wars is "a living example of living textuality" (199). Both films are popular because of imperfections and intertextual archetypes, forming a filmic quilt of sensations and affectivities. Viewers are aware that "the cliches are talking among themselves" (Eco 209). As these cinematic texts move through time, the depth and commitment of these (con)textual dialogues are repeated and reinscribed. To hold on to a memory is to isolate a moment or an image and encircle it with meaning. Each day we experience millions of texts: some are remembered, but most are lost. Some popular cultural texts move from ephemera to popular memory to history.
In moving beyond individual reminiscences -- the personal experiences of our lifetime -- we enter the sphere of popular culture. Collective or popular memory is a group or community experience of a textualised reality. For example, during the Second World War, there were many private experiences, but certain moments arch beyond the individual. Songs by Vera Lynn are fully textualised experiences that become the fodder for collective memory. Similarly, Star Wars provides a sense-making mechanism for the 1980s. Like all popular culture, these texts allow myriad readership strategies, but there is collective recognition of relevance and importance. Popular memory is such an important site because it provides us, as cultural critics, with a map of emotionally resonant sites of the past, moments that are linked with specific subjectivities and a commonality of expression. While Star Wars, like all popular cultural formations, has a wide audience, there are specific readings that are pertinent for particular groups. To unify a generation around cultural texts is an act of collective memory. As Harris has suggested, "sometimes, youth does interesting things with its legacy and creatively adapts its problematic into seemingly autonomous cultural forms" (79). Generation X refers to an age cohort born between the mid-1960s and the mid-1970s. Finally cultural studies theorists have found a Grail subculture. Being depthless, ambivalent, sexually repressed and social failures, Xers are a cultural studies dream come true. They were the children of the media revolution. Star Wars is integral to this textualised database. A fan on the night of the first screening corrected a journalist: "we aren't Generation X, we are the Star Wars generation" (Brendon, in Miller 9). An infatuation and reflexivity with the media is the single framework of knowledge in which Xers operate. This shared understanding is the basis for comedy, and particularly revealed (in Australia) in programmes like The Panel and Good News Week. Television themes, lines of film dialogue and contemporary news broadcasts are the basis of the game show. The aesthetics of life transforms television into a real. Or, put another way, "individual lives may be fragmented and confused but McDonald's is universal" (Hopkins 17). A group of textual readers share a literacy, a new way of reading the word and world of texts. Nostalgia is a weapon. The 1990s has been a decade of revivals: from Abba to skateboards, an era of retro reinscription has challenged linear theories of history and popular culture. As Timothy Carter reveals, "we all loved the Star Wars movies when we were younger, and so we naturally look forward to a continuation of those films" (9). The 1980s has often been portrayed as a bad time, of Thatcher and Reagan, cold war brinkmanship, youth unemployment and HIV. For those who were children and (amorphously phrased) 'young adults' of this era, the popular memory is of fluorescent fingerless gloves, Ray Bans, 'Choose Life' t-shirts and bubble skirts. It was an era of styling mousse, big hair, the Wham tan, Kylie and Jason and Rick Astley's dancing. Star Wars action figures gave the films a tangibility, holding the future of the rebellion in our hands (literally). These memories clumsily slop into the cup of the present. 
The problem with 'youth' is that it is semiotically too rich: the expression is understood, but not explained, by discourses as varied as the educational system, family structures, leisure industries and legal, medical and psychological institutions. It is a term of saturation, where normality is taught, and deviance is monitored. All cultural studies theorists carry the baggage of the Birmingham Centre into any history of youth culture. The taken-for-granted 'youth as resistance' mantra, embodied in Resistance through Rituals and Subculture: The Meaning of Style, transformed young people into the ventriloquist's puppet of cultural studies. The strings of the dancing, smoking, swearing and drinking puppet took many years to cut. The feminist blade of Angela McRobbie did some damage to the fraying filaments, as did Dick Hebdige's reflexive corrections in Hiding in the Light. However, the publications, promotion and pedagogy of Gen X ended the theoretical charade. Gen X, the media sophisticates, played with popular culture, rather than 'proper politics.' In Coupland's Generation X, Claire, one of the main characters believed that "Either our lives become stories, or there's just no way to get through them." ... We know that this is why the three of us left our lives behind us and came to the desert -- to tell stories and to make our own lives worthwhile tales in the process. (8) Television and film are part of this story telling process. This intense connection generated an ironic and reflexive literacy in the media. Television became the basis for personal pleasures and local resistances, resulting in a disciplined mobilisation of popular cultural surfaces. Even better than the real thing. As the youngest of Generation Xers are now in their late twenties, they have moved from McJobs to careers. Robert Kizlik, a teacher trainer at an American community college expressed horror as the lack of 'commonsensical knowledge' from his new students. He conducted a survey for teachers training in the social sciences, assessing their grasp of history. There was one hundred percent recognition of such names as Madonna, Mike Tyson, and Sharon Stone, but they hardly qualify as important social studies content ... . I wondered silently just what it is that these students are going to teach when they become employed ... . The deeper question is not that we have so many high school graduates and third and fourth year college students who are devoid of basic information about American history and culture, but rather, how, in the first place, these students came to have the expectations that they could become teachers. (n. pag.) Kizlik's fear is that the students, regardless of their enthusiasm, had poor recognition of knowledge he deemed significant and worthy. His teaching task, to convince students of the need for non-popular cultural knowledges, has resulted in his course being termed 'boring' or 'hard'. He has been unable to reconcile the convoluted connections between personal stories and televisual narratives. I am reminded (perhaps unhelpfully) of one of the most famous filmic teachers, Mr Holland. Upon being attacked by his superiors for using rock and roll in his classes, he replied that he would use anything to instil in his students a love of music. Working with, rather than against, popular culture is an obvious pedagogical imperative. George Lucas has, for example, confirmed the Oprahfied spirituality of the current age. 
Obviously Star Wars utilises fables, myths3 and fairy tales to summon the beautiful Princess, the gallant hero and the evil Empire, but has become something more. Star Wars slots cleanly into an era of Body Shop Feminism, John Gray's gender politics and Rikki Lake's relationship management. Brian Johnson and Susan Oh argued that the film is actually a new religion. A long time ago in a galaxy far far away -- late 1970s California -- the known universe of George Lucas came into being. In the beginning, George created Star Wars. And the screen was without form, and void. And George said, 'Let there be light', and there was Industrial Light and Magic. And George divided the light from the darkness, with light sabres, and called the darkness the Evil Empire.... And George saw that it was good. (14) The writers underestimate the profound emotional investment placed in the trilogy by millions of people. Genesis narratives describe the Star Wars phenomenon, but do not analyse it. The reason why the films are important is not only because they are a replacement for religion. Instead, they are an integrated component of popular memory. Johnson and Oh have underestimated the influence of pop culture as "the new religion" (14). It is not a form of cheap grace. The history of ideas is neither linear nor traceable. There is no clear path from Plato to Prozac or Moses to Mogadon. Obi-Wan Kenobi is not a personal trainer for the ailing spirituality of our age. It was Ewan McGregor who fulfilled the Xer dream to be the young Obi Wan. As he has stated, "there is nothing cooler than being a Jedi knight" (qtd. in Grant 15). Having survived feet sawing in Shallow Grave and a painfully large enema in Trainspotting, there are few actors who are better prepared to carry the iconographic burden of a Star Wars prequel. Born in 1971, he is the Molly Ringwall of the 1990s. There is something delicious about the new Obi Wan, that hails what Hicks described as "a sense of awareness and self- awareness, of detached observation, of not taking things seriously, and a use of subtle dry humour" (79). The metaphoric light sabre was passed to McGregor. The pull of the dark side. When fans attend The Phantom Menace, they tend to the past, as to a loved garden. Whether this memory is a monument or a ruin depends on the preservation of the analogue world in the digital realm. The most significant theoretical and discursive task in the present is to disrupt the dual ideologies punctuating the contemporary era: inevitable technological change and progress.4 Only then may theorists ponder the future of a digitised past. Disempowered groups, who were denied a voice and role in the analogue history of the twentieth century, will have inequalities reified and reinforced through the digital archiving of contemporary life. The Web has been pivotal to the new Star Wars film. Lucasfilm has an Internet division and an official Website. Between mid November and May, this site has been accessed twenty million times (Gallott 15). Other sites, such as TheForce.net and Countdown to Star Wars, are a record of the enthusiasm and passion of fans. As Daniel Fallon and Matthew Buchanan have realised, "these sites represent the ultimate in film fandom -- virtual communities where like-minded enthusiasts can bathe in the aura generated by their favourite masterpiece" (27). Screensavers, games, desktop wallpaper, interviews and photo galleries have been downloaded and customised. 
Some ephemeral responses to The Phantom Menace have been digitally recorded. Yet this moment of audience affectivity will be lost without a consideration of digital memory. The potentials and problems of the digital and analogue environments need to be oriented into critical theories of information, knowledge, entertainment and pleasure. The binary language of computer-mediated communication allows a smooth transference of data. Knowledge and meaning systems are not exchanged as easily. Classifying, organising and preserving information make it useful. Archival procedures have been both late and irregular in their application.5 Bocher and Ihlenfeldt assert that 2500 new web sites are coming on-line every day ("A Higher Signal-to-Noise Ratio"). The difficulties and problems confronting librarians and archivists who wish to preserve digital information are revealed in the Australian government's PADI (Preserving Access to Digital Information) Site. Compared with an object in a museum which may lie undisturbed for years in a storeroom, or a book on a shelf, or even Egyptian hieroglyphics carved on the wall of a tomb, digital information requires much more active maintenance. If we want access to digital information in the future, we must plan and act now. (PADI, "Why Preserve Access to Digital Information?") The speed of digitisation means that responsibility for preserving cultural texts, and the skills necessary to enact this process, is increasing the pressure facing information professionals. An even greater difficulty when preserving digital information is what to keep, and what to release to the ephemeral winds of cyberspace. 'Qualitative criteria' construct an historical record that restates the ideologies of the powerful. Concerns with quality undermine the voices of the disempowered, displaced and decentred. The media's instability through technological obsolescence adds a time imperative that is absent from other archival discussions.6 While these problems have always taken place in the analogue world, there was a myriad of alternative sites where ephemeral material was stored, such as the family home. Popular cultural information will suffer most from the 'blind spots' of digital archivists. While libraries rarely preserve the ephemera of a time, many homes (including mine) preserve the 'trash' of a culture. A red light sabre, toy dalek, Duran Duran posters and a talking Undertaker are all traces of past obsessions and fandoms. Passion evaporates, and interests morph into new trends. These objects remain in attics, under beds, in boxes and sheds throughout the world. Digital documents necessitate a larger project of preservation, with great financial (and spatial) commitments of technology, software and maintenance. Libraries rarely preserve the ephemera -- the texture and light -- of the analogue world. The digital era reduces the number of fan-based archivists. Subsequently forfeited is the spectrum of interests and ideologies that construct the popular memory of a culture. Once bits replace atoms, the recorded world becomes structured by digital codes. Only particular texts will be significant enough to store digitally. Samuel Florman stated that "in the digital age nothing need be lost; do we face the prospect of drowning in trivia as the generations succeed each other?" (n. pag.) The trivia of academics may be the fodder (and pleasures) of everyday life. Digitised preservation, like analogue preservation, can never 'represent' plural paths through the past.
There is always a limit and boundary to what is acceptable obsolescence. The Star Wars films suggests that "the whole palette of digital technology is much more subtle and supple; if you can dream it, you can see it" (Corliss 65). This film will also record how many of the dreams survive and are archived. Films, throughout the century, have changed the way in which we construct and remember the past. They convey an expressive memory, rather than an accurate history. Certainly, Star Wars is only a movie. Yet, as Rushkoff has suggested, "we have developed a new language of references and self-references that identify media as a real thing and media history as an actual social history" (32). The build up in Australia to The Phantom Menace has been wilfully joyful. This is a history of the present, a time which I know will, in retrospect, be remembered with great fondness. It is a collective event for a generation, but it speaks to us all in different ways. At ten, it is easy to be amazed and enthralled at popular culture. By thirty, it is more difficult. When we see Star Wars, we go back to visit our memories. With red light sabre in hand, we splice through time, as much as space. Footnotes The United States release of the film occurred on 19 May 1999. In Australia, the film's first screenings were on 3 June. Many cinemas showed The Phantom Menace at 12:01 am, (very) early Thursday morning. The three main players of the GNW team, Paul McDermott, Mikey Robbins and Julie McCrossin, were featured on the cover of Australia's Juice magazine in costumes from The Phantom Menace, being Obi-Wan, Yoda and Queen Amidala respectively. Actually, the National Air and Space Museum had a Star Wars exhibition in 1997, titled "Star Wars: The Magic of Myth". For example, Janet Collins, Michael Hammond and Jerry Wellington, in Teaching and Learning with the Media, stated that "the message is simple: we now have the technology to inform, entertain and educate. Miss it and you, your family and your school will be left behind" (3). Herb Brody described the Net as "an overstuffed, underorganised attic full of pictures and documents that vary wildly in value", in "Wired Science". The interesting question is, whose values will predominate when the attic is being cleared and sorted? This problem is extended because the statutory provision of legal deposit, which obliges publishers to place copies of publications in the national library of the country in which the item is published, does not include CD-ROMs or software. References Bocher, Bob, and Kay Ihlenfeldt. "A Higher Signal-to-Noise Ratio: Effective Use of WebSearch Engines." State of Wisconsin Department of Public Instruction Website. 13 Mar. 1998. 15 June 1999 <http://www.dpi.state.wi.us/dpi/dlcl/lbstat/search2.php>. Brody, Herb. "Wired Science." Technology Review Oct. 1996. 15 June 1999 <http://www.techreview.com/articles/oct96/brody.php>. Carter, Timothy. "Wars Weary." Cinescape 39 (Mar./Apr. 1999): 9. Collins, Janet, Michael Hammond, and Jerry Wellington. Teaching and Learning with Multimedia. London: Routledge, 1997. Corliss, Richard. "Ready, Set, Glow!" Time 18 (3 May 1999): 65. Count Down to Star Wars. 1999. 15 June 1999 <http://starwars.countingdown.com/>. Coupland, Douglas. Generation X. London: Abacus, 1991. Eco, Umberto. Travels in Hyper-Reality. London: Picador, 1987. Fallon, Daniel, and Matthew Buchanan. "Now Screening." Australian Net Guide 4.5 (June 1999): 27. Florman, Samuel. "From Here to Eternity." MIT's Technology Review 100.3 (Apr. 1997). 
Gallott, Kirsten. "May the Web Be with you." Who Weekly 24 May 1999: 15. Grant, Fiona. "Ewan's Star Soars!" TV Week 29 May - 4 June 1999: 15. Hall, Stuart, and Tony Jefferson, eds. Resistance through Rituals. London: Hutchinson, 1976. Harris, David. From Class Struggle to the Politics of Pleasure: the Effects of Gramscianism on Cultural Studies. London: Routledge, 1992. Hebdige, Dick. Hiding in the Light. London: Routledge, 1988. Hopkins, Susan. "Generation Pulp." Youth Studies Australia Spring 1995. Johnson, Brian, and Susan Oh. "The Second Coming: as the Newest Star Wars Film Illustrates, Pop Culture Has Become a New Religion." Maclean's 24 May 1999: 14-8. Juice 78 (June 1999). Kizlik, Robert. "Generation X Wants to Teach." International Journal of Instructional Media 26.2 (Spring 1999). Lucasfilm Ltd. Star Wars: Welcome to the Official Site. 1999. 15 June 1999 <http://www.starwars.com/>. Miller, Nick. "Generation X-Wing Fighter." The West Australian 4 June 1999: 9. PADI. "What Digital Information Should be Preserved? Appraisal and Selection." Preserving Access to Digital Information (PADI) Website. 11 March 1999. 15 June 1999 <http://www.nla.gov.au/padi/what.php>. PADI. "Why Preserve Access to Digital Information?" Preserving Access to Digital Information (PADI) Website. <http://www.nla.gov.au/padi/why.php>. Rushkoff, Douglas. Media Virus. Sydney: Random House, 1994. Citation reference for this article MLA style: Tara Brabazon. "A Red Light Sabre to Go, and Other Histories of the Present." M/C: A Journal of Media and Culture 2.4 (1999). [your date of access] <http://www.uq.edu.au/mc/9906/sabre.php>. Chicago style: Tara Brabazon, "A Red Light Sabre to Go, and Other Histories of the Present," M/C: A Journal of Media and Culture 2, no. 4 (1999), <http://www.uq.edu.au/mc/9906/sabre.php> ([your date of access]). APA style: Tara Brabazon. (1999) A red light sabre to go, and other histories of the present. M/C: A Journal of Media and Culture 2(4). <http://www.uq.edu.au/mc/9906/sabre.php> ([your date of access]).

23

Cheong, Pauline Hope. "Faith Tweets: Ambient Religious Communication and Microblogging Rituals." M/C Journal 13, no.2 (May 3, 2010). http://dx.doi.org/10.5204/mcj.223.

Full text

APA, Harvard, Vancouver, ISO, and other styles

Abstract:

There’s no reason to think that Jesus wouldn’t have Facebooked or twittered if he came into the world now. Can you imagine his killer status updates? Reverend Schenck, New York, All Saints Episcopal Church (Mapes) The fundamental problem of religious communication is how best to represent and mediate the sacred. (O’Leary 787) What would Jesus tweet? Historically, the quest for sacred connections has relied on the mediation of faith communication via technological implements, from the use of the drum to mediate the Divine, to the use of the mechanical clock by monks as reminders to observe the canonical hours of prayer (Mumford). Today, religious communication practices increasingly implicate Web 2.0, or interactive, user-generated content like blogs (Cheong, Halavis & Kwon), and microblogs like “tweets” of no more than 140 characters sent via Web-based applications like text messaging, instant messaging, e-mail, or on the Web. According to the Pew Internet and American Life Project’s latest report in October 2009, 19% of online adults said that they used a microblogging service to send messages from a computer or mobile device to family and friends who have signed up to receive them (Fox, Zickuhr & Smith). The ascendency of microblogging leads to interesting questions of how new media use alters spatio-temporal dynamics in peoples’ everyday consciousness, including ways in which tweeting facilitates ambient religious interactions. The notion of ambient strikes a particularly resonant chord for religious communication: many faith traditions advocate the practice of sacred mindfulness, and a consistent piety in light of holy devotion to an omnipresent and omniscient Divine being. This paper examines how faith believers appropriate the emergent microblogging practices to create an encompassing cultural surround to include microblogging rituals which promote regular, heightened prayer awareness. Faith tweets help constitute epiphany and a persistent sense of sacred connected presence, which in turn rouses an identification of a higher moral purpose and solidarity with other local and global believers. Amidst ongoing tensions about microblogging, religious organisations and their leadership have also begun to incorporate Twitter into their communication practices and outreach, to encourage the extension of presence beyond the church walls. Faith Tweeting and Mobile Mediated Prayers Twitter’s Website describes itself as a new media service that help users communicate and stay connected through the exchange of quick, frequent answers to the question, “What are you doing?” Some evangelical Christian groups harness these coincident messaging flows to create meaningful pathways for personal, intercessory and synchronised prayer. Using hashtags in a Twitter post creates a community convention or grouping around faith ideas and allows others to access them. Popular faith related hashtags include #twurch (Twitter + church), #prayer, #JIL (Jesus is Lord) and #pray4 (as in, #pray4 my mother). Just as mobile telephony assists distal family members to build “connected presence” (Christensen), I suggest that faith tweets stimulating mobile mediated prayers help build a sense of closeness and “religious connected presence” amongst the distributed family of faith believers, to recreate and reaffirm Divine and corporeal bonds. Consider the Calvin Institute of Worship’s set up of six different Twitter feeds to “pray the hours”. 
Praying the hours is an ancient practice of praying set prayers throughout certain times of the day, as marked in the Book of Common Prayer in the Christian tradition. Inspired by the Holy Scripture’s injunction to “pray without ceasing” ( 1 Thessalonians 5:17), users can sign up to receive hourly personal or intercessory prayers sent in brief verses or view a Tweetgrid with prayer feeds, to prompt continuous prayer or help those who are unsure of what words to pray. In this way, contemporary believers may reinvent the century-old practice of constant faith mediation as Twitter use helps to reintegrate scripture into people’s daily lives. Faith tweets that goad personal and intercessory prayer also makes ambient religious life salient, and preserves self-awareness of sanctified moments during normal, everyday activities. Furthermore, while the above “praying the hours” performance promotes a specific integration of scripture or prayer into individuals’ daily rhythms, other faith tweets are more focused on evangelism: to reach others through recurrent prayers or random inspirational messages sent throughout the day. For instance, as BBC News reports, religious leaders such as Cardinal Brady, head of Ireland’s Catholic Church, encourage parishioners to use Twitter to spread “the gift of prayer”, as they microblog their daily prayers for their friends and family. Cardinal Brady commented that, “such a sea of prayer is sure to strengthen our sense of solidarity with one another and remind us those who receive them that others really do care" (emphasis mine). Indeed, Cardinal Brady’s observation is instructive to the “Twitness” of faithful microbloggers who desire to shape the blogosphere, and create new faith connections. “JesusTweeters” is a faith-based social networking site, and a service which allows users to send out messages from any random tweet from the Bible Tweet Library, or their own personal messages on a scheduled basis. The site reports that over 500 members of JesusTweeters, each with an average of 500 followers, have signed up to help “spread the Word” worldwide through Twitter. This is an interesting emergent form of Twitter action, as it translates to more than 2.5 million faith tweets being circulated online daily. Moreover, Twitter encourages ‘connected presence’ whereby the use of microblogging enables online faith believers to enjoy an intimate, ‘always on’ virtual presence with their other congregational members during times of physical absence. In the recently released e-book The Reason Your Church Must Twitter, subtitled Making Your Ministry Contagious, author and self-proclaimed ‘technology evangelist’ Anthony Coppedge advocates churches to adopt Twitter as part of their overall communication strategy to maintain relational connectedness beyond the boundaries of established institutional practices. In his book, Coppedge argues that Twitter can be used as a “megaphone” for updates and announcements or as a “conversation” to spur sharing of ideas and prayer exchanges. In line with education scholars who promote Twitter as a pedagogical tool to enhance free-flowing interactions outside of the classroom (Dunlap & Lowenthal), Coppedge encourages pastors to tweet “life application points” from their sermons to their congregational members throughout the week, to reinforce the theme of their Sunday lesson. 
Ministry leaders are also encouraged to adopt Twitter to “become highly accessible” to members and communicate with their volunteers, in order to build stronger ecumenical relationships. Communication technology scholar Michele Jackson notes that Twitter is a form of visible “lifelogging” as interactants self-disclose their lived-in moments (731). In the case of faith tweets, co-presence is constructed when instantaneous Twitter updates announce new happenings on the church campus, shares prayer requests, confirms details of new events and gives public commendations to celebrate victories of staff members. In this way, microblogging helps to build a portable church where fellow believers can connect to each-other via the thread of frequent, running commentaries of their everyday lives. To further develop ‘connected presence’, a significant number of Churches have also begun to incorporate real-time Twitter streams during their Sunday services. For example, to stimulate congregational members’ sharing of their spontaneous reactions to the movement of the Holy Spirit, Westwind Church in Michigan has created a dozen “Twitter Sundays” where members are free to tweet at any time and at any worship service (Rochman). At Woodlands Church in Houston, a new service was started in 2009 which encourages parishioners to tweet their thoughts, reflections and questions throughout the service. The tweets are reviewed by church staff and they are posted as scrolling visual messages on a screen behind the pastor while he preaches (Patel). It is interesting to note that recurring faith tweets spatially filling the sanctuary screens blurs the visual hierarchies between the pastor as foreground and congregations as background to the degree that tweet voices from the congregation are blended into the church worship service. The interactive use of Twitter also differs from the forms of personal silent meditation and private devotional prayer that, traditionally, most liturgical church services encourage. In this way, key to new organisational practices within religious organisations is what some social commentators are now calling “ambient intimacy”, an enveloping social awareness of one’s social network (Pontin). Indeed, several pastors have acknowledged that faith tweets have enabled them to know their congregational members’ reflections, struggles and interests better and thus they are able to improve their teaching and caring ministry to meet congregants’ evolving spiritual needs (Mapes).Microblogging Rituals and Tweeting Tensions In many ways, faith tweets can be comprehended as microblogging rituals which have an ambient quality in engendering individuals’ spiritual self and group consciousness. The importance of examining emergent cyber-rituals is underscored by Stephen O’Leary in his 1996 seminal article on Cyberspace as Sacred Space. Writing in an earlier era of digital connections, O’Leary discussed e-mail and discussion forum cyber-rituals and what ritual gains in the virtual environment aside from its conventional physiological interactions. Drawing from Walter Ong’s understanding of the “secondary orality” accompanying the shift to electronic media, he argued that cyber-ritual as performative utterances restructure and reintegrate the minds and emotions of their participants, such that they are more aware of their interior self and a sense of communal group membership. 
Here, the above illustrative examples show how Twitter functions as the context for contemporary, mediated ritual practices to help believers construct a connected presence and affirm their religious identities within an environment where wired communication is a significant part of everyday life. To draw from Walter Ong’s words, microblogging rituals create a new textual and visual “sensorium” that has insightful implications for communication and media scholars. Faith tweeting by restructuring believers’ consciousness and generating a heightened awareness of relationship between the I, You and the Thou opens up possibilities for community building and revitalised religiosity to counteract claims of secularisation in technologically advanced and developed countries. “Praying the hours” guided by scripturally inspired faith tweets, for example, help seekers and believers experience epiphany and practice their faith in a more holistic way as they de-familarize mundane conditions and redeem a sense of the sacred from their everyday surrounds. Through the intermittent sharing of intercessory prayer tweets, faithful followers enact prayer chains and perceive themselves to be immersed in invariable spiritual battle to ward off evil ideology or atheistic beliefs. Moreover, the erosion of the authority of the church is offset by changed leadership practices within religious organisations which have experimented and actively incorporated Twitter into their daily institutional practices. To the extent that laity are willing to engage, creative practices to encourage congregational members to tweet during and after the service help revivify communal sentiments and a higher moral purpose through identification and solidarity with clergy leaders and other believers. Yet this ambience has its possible drawbacks as some experience tensions in their perception and use of Twitter as new technology within the church. Microblogging rituals may have negative implications for individual believers and religious organisations as they can weaken or pervert the existing relational links. As Pauline Cheong and Jessie Poon have pointed out, use of the Internet within religious organisations may bring about an alternative form of “perverse religious social capital building” as some clergy view that online communication detracts from real time relations and physical rituals. Indeed, some religious leaders have already articulated their concerns about Twitter and new tensions they experience in balancing the need to engage with new media audiences and the need for quiet reflection that spiritual rites such as confession of sins and the Holy Communion entail. According to the critics of faith tweeting, microblogging is time consuming and contributes to cognitive overload by taking away one’s attention to what is noteworthy at the moment. For Pastor Hayes of California for example, Twitter distracts his congregation’s focus on the sermon and thus he only recommends his members to tweet after the service. In an interview with the Houston Chronicle, he said: “If two people are talking at the same time, somebody’s not listening”, and “You cannot do two things at once and expect you’re not going to miss something” (Patel). 
Furthermore, similar to prior concerns voiced with new technologies, there are concerns over inappropriate tweet content that can comprise of crudity, gossip, malevolent and hate messages, which may be especially corrosive to faith communities that strive to model virtues like love, temperance and truth-telling (Vitello). In turn, some congregational members are also experiencing frustrations as they negotiate church boundaries and other members’ disapproval of their tweeting practices during service and church events. Censure of microblogging has taken the form of official requests for tweeting members to leave the sanctuary, to less formal social critique and the application of peer pressure to halt tweeting during religious proceedings and activities (Mapes). As a result of these connectivity tensions, varying recommendations have been recently published as fresh efforts to manage religious communication taking place in ambience. For instance, Coppedge recommends every tweeting church to include Twitter usage in their “church communications policy” to promote accountability within the organisation. The policy should include guidelines against excessive use of Twitter as spam, and for at least one leader to subscribe and monitor every Twitter account used. Furthermore, the Interpreter magazine of the United Methodist Church worldwide featured recommendations by Rev. Safiyah Fosua who listed eight important attributes for pastors wishing to incorporate Twitter during their worship services (Rice). These attributes are: highly adaptive; not easily distracted; secure in their presentation style; not easily taken aback when people appear to be focused on something other than listenin; into quality rather than volume; not easily rattled by things that are new; secure enough as a preacher to let God work through whatever is tweeted even if it is not the main points of the sermon; and carried on the same current the congregation is travelling on. For the most part, these attributes underscore how successful (read wired) contemporary religious leaders should be tolerant of ambient religious communication and of blurring hierarchies of information control when faced with microblogging and the “inexorable advance of multimodal connectedness” (Schroeder 1). To conclude, the rise of faith tweeting opens up a new portal to investigate accretive changes to culture as microblogging rituals nurture piety expressed in continuous prayer, praise and ecclesial updates. The emergent Twitter sensorium demonstrates the variety of ways in which religious adherents appropriate new media within the ken and tensions of their daily lives. References BBC News. “Twitter Your Prayer says Cardinal.” 27 April 2009. ‹http://news.bbc.co.uk/go/pr/fr/-/2/hi/uk_news/northern_ireland/8020285.stm›. Cheong, P.H., A. Halavis and K. Kwon. “The Chronicles of Me: Understanding Blogging as a Religious Practice. Journal of Media and Religion 7 (2008): 107-131. Cheong, P.H., and J.P.H. Poon. “‘WWW.Faith.Org’: (Re)structuring Communication and Social Capital Building among Religious Organizations.” Information, Communication and Society 11.1 (2008): 89-110. Christensen, Toke Haunstrup. “‘Connected Presence’ in Distributed Family Life.” New Media and Society 11 (2009): 433-451. Coppedge, Anthony. “The Reason Your Church Must Twitter: Making Your Ministry Contagious.” 2009. ‹http://www.twitterforchurches.com/›. Dunlap, Joanna, and Patrick Lowenthal. 
“Tweeting the Night Away: Using Twitter to Enhance Social Presence.” Journal of Information Systems Education 20.2 (2009): 129-135. Fox, Susannah, Kathryn Zickuhr, and Aaron Smith. “Twitter and Status Updating" Pew Internet & American Life Project, 2009. Oct. 2009 ‹http://www.pewinternet.org/~/media//Files/Reports/2009/PIP_Twitter_Fall_2009_web.pdf›. Jackson, Michele. “The Mash-Up: A New Archetype for Communication.” Journal of Computer-Mediated Communication 14.3 (2009): 730-734. Mapes, Diane. “Holy Twitter! Tweeting from the Pews.” 2009. 3 June 2009 ‹http://www.nbcwashington.com/.../Holy_Twitter__Tweeting_from_the_pews.html›. Mumford, Lewis. Technics and Civilization. New York: Harcourt, 1934. Patel, Purva. “Tweeting during Church Services Gets Blessing of Pastors.” Houston Chronicle (2009). 10 Oct. 2009 ‹http://www.chron.com/disp/story.mpl/metropolitan/6662287.html›. O’Leary, Stephen. ”Cyberspace as Sacred Space: Communicating Religion on Computer Networks.” Journal of the American Academy of Religion 64.4 (1996): 781-808. Pontin, Jason. “Twitter and Ambient Intimacy: How Evan Williams Helped Create the New Social Medium of Microblogging.” MIT Review 2007. 15 Nov. 2009 ‹http://www.technologyreview.com/communications/19713/?a=f›. Rice, Kami. “The New Worship Question: To Tweet or Not to Tweet.” Interpreter Magazine (Nov.-Dec. 2009). ‹http://www.interpretermagazine.org/interior.asp?ptid=43&mid=13871›. Rochman, Bonnie. “Twittering in Church, with the Pastor’s O.K.” Time 3 May 2009. ‹http://www.time.com/time/business/article/0,8599,1895463,00.html›. Schroeder, Ralph. “Mobile Phones and the Inexorable Advance of Multimodal Connectedness.” New Media and Society 12.1 (2010): 75-90. Vitello, Paul. “Lead Us to Tweet, and Forgive the Trespassers.” New York Times 5 July 2009. ‹http://www.nytimes.com/2009/07/05/technology/internet/05twitter.html›.

24

Pace, Steven. "Revisiting Mackay Online." M/C Journal 22, no.3 (June 19, 2019). http://dx.doi.org/10.5204/mcj.1527.

Full text

APA, Harvard, Vancouver, ISO, and other styles

Abstract:

Introduction In July 1997, the Mackay campus of Central Queensland University hosted a conference with the theme Regional Australia: Visions of Mackay. It was the first academic conference to be held at the young campus, and its aim was to provide an opportunity for academics, business people, government officials, and other interested parties to discuss their visions for the development of Mackay, a regional community of 75,000 people situated on the Central Queensland coast (Danaher). I delivered a presentation at that conference and authored a chapter in the book that emerged from its proceedings. The chapter entitled “Mackay Online” explored the potential impact that the Internet could have on the Mackay region, particularly in the areas of regional business, education, health, and entertainment (Pace). Two decades later, how does the reality compare with that vision? Broadband Blues At the time of the Visions of Mackay conference, public commercial use of the Internet was in its infancy. Many Internet services and technologies that users take for granted today were uncommon or non-existent then. Examples include online video, video-conferencing, Voice over Internet Protocol (VoIP), blogs, social media, peer-to-peer file sharing, payment gateways, content management systems, wireless data communications, smartphones, mobile applications, and tablet computers. In 1997, most users connected to the Internet using slow dial-up modems with speeds ranging from 28.8 Kbps to 33.6 Kbps. 56 Kbps modems had just become available. Lamenting these slow data transmission speeds, I looked forward to a time when widespread availability of high-bandwidth networks would allow the Internet’s services to “expand to include electronic commerce, home entertainment and desktop video-conferencing” (Pace 103). Although that future eventually arrived, I incorrectly anticipated how it would arrive. In 1997, Optus and Telstra were engaged in the rollout of hybrid fibre coaxial (HFC) networks in Sydney, Melbourne, and Brisbane for the Optus Vision and Foxtel pay TV services (Meredith). These HFC networks had a large amount of unused bandwidth, which both Telstra and Optus planned to use to provide broadband Internet services. Telstra's Big Pond Cable broadband service was already available to approximately one million households in Sydney and Melbourne (Taylor), and Optus was considering extending its cable network into regional Australia through partnerships with smaller regional telecommunications companies (Lewis). These promising developments seemed to point the way forward to a future high-bandwidth network, but that was not the case. A short time after the Visions of Mackay conference, Telstra and Optus ceased the rollout of their HFC networks in response to the invention of Asymmetric Digital Subscriber Line (ADSL), a technology that increases the bandwidth of copper wire and enables Internet connections of up to 6 Mbps over the existing phone network. ADSL was significantly faster than a dial-up service, it was broadly available to homes and businesses across the country, and it did not require enormous investment in infrastructure. However, ADSL could not offer speeds anywhere near the 27 Mbps of the HFC networks. When it came to broadband provision, Australia seemed destined to continue playing catch-up with the rest of the world.
According to data from the Organisation for Economic Cooperation and Development (OECD), in 2009 Australia ranked 18th in the world for broadband penetration, with 24.1 percent of Australians having a fixed-line broadband subscription. Statistics like these eventually prompted the federal government to commit to the deployment of a National Broadband Network (NBN). In 2009, the Kevin Rudd Government announced that the NBN would combine fibre-to-the-premises (FTTP), fixed wireless, and satellite technologies to deliver Internet speeds of up to 100 Mbps to 90 percent of Australian homes, schools, and workplaces (Rudd).The rollout of the NBN in Mackay commenced in 2013 and continued, suburb by suburb, until its completion in 2017 (Frost, “Mackay”; Garvey). The rollout was anything but smooth. After a change of government in 2013, the NBN was redesigned to reduce costs. A mixed copper/optical technology known as fibre-to-the-node (FTTN) replaced FTTP as the preferred approach for providing most NBN connections. The resulting connection speeds were significantly slower than the 100 Mbps that was originally proposed. Many Mackay premises could only achieve a maximum speed of 40 Mbps, which led to some overcharging by Internet service providers, and subsequent compensation for failing to deliver services they had promised (“Optus”). Some Mackay residents even complained that their new NBN connections were slower than their former ADSL connections. NBN Co representatives claimed that the problems were due to “service providers not buying enough space in the network to provide the service they had promised to customers” (“Telcos”). Unsurprisingly, the number of complaints about the NBN that were lodged with the Telecommunications Industry Ombudsman skyrocketed during the last six months of 2017. Queensland complaints increased by approximately 40 percent when compared with the same period during the previous year (“Qld”).Despite the challenges presented by infrastructure limitations, the rollout of the NBN was a boost for the Mackay region. For some rural residents, it meant having reliable Internet access for the first time. Frost, for example, reports on the experiences of a Mackay couple who could not get an ADSL service at their rural home because it was too far away from the nearest telephone exchange. Unreliable 3G mobile broadband was the only option for operating their air-conditioning business. All of that changed with the arrival of the NBN. “It’s so fast we can run a number of things at the same time”, the couple reported (“NBN”).Networking the NationOne factor that contributed to the uptake of Internet services in the Mackay region after the Visions of Mackay conference was the Australian Government’s Networking the Nation (NTN) program. When the national telecommunications carrier Telstra was partially privatised in 1997, and further sold in 1999, proceeds from the sale were used to fund an ambitious communications infrastructure program named Networking the Nation (Department of Communications, Information Technology and the Arts). The program funded projects that improved the availability, accessibility, affordability, and use of communications facilities and services throughout regional Australia. 
Eligibility for funding was limited to not-for-profit organisations, including local councils, regional development organisations, community groups, local government associations, and state and territory governments.In 1998, the Mackay region received $930,000 in Networking the Nation funding for Mackay Regionlink, a project that aimed to provide equitable community access to online services, skills development for local residents, an affordable online presence for local business and community organisations, and increased external awareness of the Mackay region (Jewell et al.). One element of the project was a training program that provided basic Internet skills to 2,168 people across the region over a period of two years. A second element of the project involved the establishment of 20 public Internet access centres in locations throughout the region, such as libraries, community centres, and tourist information centres. The centres provided free Internet access to users and encouraged local participation and skill development. More than 9,200 users were recorded in these centres during the first year of the project, and the facilities remained active until 2006. A third element of the project was a regional web portal that provided a free easily-updated online presence for community organisations. The project aimed to have every business and community group in the Mackay region represented on the website, with hosting fees for the business web pages funding its ongoing operation and development. More than 6,000 organisations were listed on the site, and the project remained financially viable until 2005.The availability, affordability and use of communications facilities and services in Mackay increased significantly during the period of the Regionlink project. Changes in technology, services, markets, competition, and many other factors contributed to this increase, so it is difficult to ascertain the extent to which Mackay Regionlink fostered those outcomes. However, the large number of people who participated in the Regionlink training program and made use of the public Internet access centres, suggests that the project had a positive influence on digital literacy in the Mackay region.The Impact on BusinessThe Internet has transformed regional business for both consumers and business owners alike since the Visions of Mackay conference. When Mackay residents made a purchase in 1997, their choice of suppliers was limited to a few local businesses. Today they can shop online in a global market. Security concerns were initially a major obstacle to the growth of electronic commerce. Consumers were slow to adopt the Internet as a place for doing business, fearing that their credit card details would be vulnerable to hackers once they were placed online. After observing the efforts that finance and software companies were making to eliminate those obstacles, I anticipated that it would only be a matter of time before online transactions became commonplace:Consumers seeking a particular product will be able to quickly find the names of suitable suppliers around the world, compare their prices, and place an order with the one that can deliver the product at the cheapest price. (Pace 106)This expectation was soon fulfilled by the arrival of online payment systems such as PayPal in 1998, and online shopping services such as eBay in 1997. eBay is a global online auction and shopping website where individuals and businesses buy and sell goods and services worldwide. 
The eBay service is free to use for buyers, but sellers are charged modest fees when they make a sale. It exemplifies the notion of “friction-free capitalism” articulated by Gates (157).In 1997, regional Australian business owners were largely sceptical about the potential benefits the Internet could bring to their businesses. Only 11 percent of Australian businesses had some form of web presence, and less than 35 percent of those early adopters felt that their website was significant to their business (Department of Industry, Science and Tourism). Anticipating the significant opportunities that the Internet offered Mackay businesses to compete in new markets, I recommended that they work “towards the goal of providing products and services that meet the needs of international consumers as well as local ones” (107). In the two decades that have passed since that time, many Mackay businesses have been doing just that. One prime example is Big on Shoes (bigonshoes.com.au), a retailer of ladies’ shoes from sizes five to fifteen (Plane). Big on Shoes has physical shopfronts in Mackay and Moranbah, an online store that has been operating since 2009, and more than 12,000 followers on Facebook. This speciality store caters for women who have traditionally been unable to find shoes in their size. As the store’s customer base has grown within Australia and internationally, an unexpected transgender market has also emerged. In 2018 Big on Shoes was one of 30 regional businesses featured in the first Facebook and Instagram Annual Gift Guide, and it continues to build on its strengths (Cureton).The Impact on HealthThe growth of the Internet has improved the availability of specialist health services for people in the Mackay region. Traditionally, access to surgical services in Mackay has been much more limited than in metropolitan areas because of the shortage of specialists willing to practise in regional areas (Green). In 2003, a senior informant from the Royal Australasian College of Surgeons bluntly described the Central Queensland region from Mackay to Gladstone as “a black hole in terms of surgery” (Birrell et al. 15). In 1997 I anticipated that, although the Internet would never completely replace a visit to a local doctor or hospital, it would provide tools that improve the availability of specialist medical services for people living in regional areas. Using these tools, doctors would be able to “analyse medical images captured from patients living in remote locations” and “diagnose patients at a distance” (Pace 108).These expectations have been realised in the form of Queensland Health’s Telehealth initiative, which permits medical specialists in Brisbane and Townsville to conduct consultations with patients at the Mackay Base Hospital using video-conference technology. Telehealth reduces the need for patients to travel for specialist advice, and it provides health professionals with access to peer support. Averill (7), for example, reports on the experience of a breast cancer patient at the Mackay Base Hospital who was able to participate in a drug trial with a Townsville oncologist through the Telehealth network. Mackay health professionals organised the patient’s scans, administered blood tests, and checked her lymph nodes, blood pressure and weight. Townsville health professionals then used this information to advise the Mackay team about her ongoing treatment. The patient expressed appreciation that the service allowed her to avoid the lengthy round-trip to Townsville. 
Prior to being offered the Telehealth option, she had refused to participate in the trial because “the trip was just too much of a stumbling block” (Averill 7). The Impact on Media and Entertainment: The field of media and entertainment is another aspect of regional life that has been reshaped by the Internet since the Visions of Mackay conference. Most of these changes have been equally apparent in both regional and metropolitan areas. Over the past decade, the way individuals consume media has been transformed by new online services offering user-generated video, video-on-demand, and catch-up TV. These developments were among the changes I anticipated in 1997: “The convergence of television and the Internet will stimulate the creation of new services such as video-on-demand. Today television is a synchronous media—programs are usually viewed while they are being broadcast. When high-quality video can be transmitted over the information superhighway, users will be able to watch what they want, when and where they like. […] Newly released movies will continue to be rented, but probably not from stores. Instead, consumers will shop on the information superhighway for movies that can be delivered on demand.” In the mid-2000s, free online video-sharing services such as YouTube and Vimeo began to emerge. These websites allow users to freely upload, view, share, comment on, and curate online videos. Subscription-based streaming services such as Netflix and Amazon Prime have also become increasingly popular since that time. These services offer online streaming of a library of films and television programs for a fee of less than 20 dollars per month. Computers, smart TVs, Blu-ray players, game consoles, mobile phones, tablets, and other devices provide a multitude of ways of accessing streaming services. Some of these devices cost less than 100 dollars, while higher-end electronic devices include the capability as a bundled feature. Netflix became available in Mackay at the time of its Australian launch in 2015. The growth of streaming services greatly reduced the demand for video rental shops in the region, and all of them closed down as a result. The last remaining video rental store in Mackay closed its doors in 2018 after trading for 26 years (“Last”). Some of the most dramatic transformations that have occurred in the field of media and entertainment were not anticipated in 1997. The rise of mobile technology, including wireless data communications, smartphones, mobile applications, and tablet computers, was largely unforeseen at that time. Some Internet luminaries such as Vinton Cerf expected that mobile access to the Internet via laptop computers would become commonplace (Lange), but this view did not encompass the evolution of smartphones, and it was not widely held. Similarly, the rise of social media services and the impact they have had on the way people share content and communicate was generally unexpected. In some respects, these phenomena resemble the Black Swan events described by Nassim Nicholas Taleb (xvii)—surprising events with a major effect that are often inappropriately rationalised after the fact. 
They remind us of how difficult it is to predict the future media landscape by extrapolating from things we know, while failing to take into consideration what we do not know. The Challenge for Mackay: In 1997, when exploring the potential impact that the Internet could have on the Mackay region, I identified a special challenge that the community faced if it wanted to be competitive in this new environment: “The region has traditionally prospered from industries that control physical resources such as coal, sugar and tourism, but over the last two decades there has been a global ‘shift away from physical assets and towards information as the principal driver of wealth creation’ (Petre and Harrington 1996). The risk for Mackay is that its residents may be inclined to believe that wealth can only be created by means of industries that control physical assets. The community must realise that its value-added information is at least as precious as its abundant natural resources.” (110) The Mackay region has not responded well to this challenge, as evidenced by measures such as the Knowledge City Index (KCI), a collection of six indicators that assess how well a city is positioned to grow and advance in today’s technology-driven, knowledge-based economy. A 2017 study used the KCI to conduct a comparative analysis of 25 Australian cities (Pratchett, Hu, Walsh, and Tuli). Mackay rated reasonably well in the areas of Income and Digital Access. But the city’s ratings were “very limited across all the other measures of the KCI”: Knowledge Capacity, Knowledge Mobility, Knowledge Industries and Smart Work (44). The need to be competitive in a technology-driven, knowledge-based economy is likely to become even more pressing in the years ahead. The 2017 World Energy Outlook Report estimated that China’s coal use is likely to have peaked in 2013 amid a rapid shift toward renewable energy, which means that demand for Mackay’s coal will continue to decline (International Energy Agency). The sugar industry is in crisis, finding itself unable to diversify its revenue base or increase production enough to offset falling global sugar prices (Rynne). The region’s biggest tourism drawcard, the Great Barrier Reef, continues to be degraded by mass coral bleaching events and ongoing threats posed by climate change and poor water quality (Great Barrier Reef Marine Park Authority). All of these developments have disturbing implications for Mackay’s regional economy and its reliance on coal, sugar, and tourism. Diversifying the local economy through the introduction of new knowledge industries would be one way of preparing the Mackay region for the impact of new technologies and the economic challenges that lie ahead. References: Averill, Zizi. “Webcam Consultations.” Daily Mercury 22 Nov. 2018: 7. Birrell, Bob, Lesleyanne Hawthorne, and Virginia Rapson. The Outlook for Surgical Services in Australasia. Melbourne: Monash University Centre for Population and Urban Research, 2003. Cureton, Aidan. “Big Shoes, Big Ideas.” Daily Mercury 8 Dec. 2018: 12. Danaher, Geoff, ed. Visions of Mackay: Conference Papers. Rockhampton: Central Queensland UP, 1998. Department of Communications, Information Technology and the Arts. Networking the Nation: Evaluation of Outcomes and Impacts. Canberra: Australian Government, 2005. Department of Industry, Science and Tourism. Electronic Commerce in Australia. Canberra: Australian Government, 1998. Frost, Pamela. “Mackay Is Up with Switch to Speed to NBN.” Daily Mercury 15 Aug. 2013: 8. ———. 
“NBN Boost to Business.” Daily Mercury 29 Oct. 2013: 3. Garvey, Cas. “NBN Rollout Hit, Miss in Mackay.” Daily Mercury 11 Jul. 2017: 6. Gates, Bill. The Road Ahead. New York: Viking Penguin, 1995. Great Barrier Reef Marine Park Authority. Reef Blueprint: Great Barrier Reef Blueprint for Resilience. Townsville: Great Barrier Reef Marine Park Authority, 2017. Green, Anthony. “Surgical Services and Referrals in Rural and Remote Australia.” Medical Journal of Australia 177.2 (2002): 110–11. International Energy Agency. World Energy Outlook 2017. France: IEA Publications, 2017. Jewell, Roderick, Mary O’Flynn, Fiorella De Cindio, and Margaret Cameron. “RCM and MRL—A Reflection on Two Approaches to Constructing Communication Memory.” Constructing and Sharing Memory: Community Informatics, Identity and Empowerment. Eds. Larry Stillman and Graeme Johanson. Newcastle: Cambridge Scholars Publishing, 2007. 73–86. Lange, Larry. “The Internet: Where’s It All Going?” Information Week 17 Jul. 1995: 30. “Last Man Standing Shuts Doors after 26 Years of Trade.” Daily Mercury 28 Aug. 2018: 7. Lewis, Steve. “Optus Plans to Share Cost Burden.” Australian Financial Review 22 May 1997: 26. Meredith, Helen. “Time Short for Cable Modem.” Australian Financial Review 10 Apr. 1997: 42. “Optus Offers Comp for Slow NBN.” Daily Mercury 10 Nov. 2017: 15. Organisation for Economic Cooperation and Development. “Fixed Broadband Subscriptions.” OECD Data, n.d. <https://data.oecd.org/broadband/fixed-broadband-subscriptions.htm>. Pace, Steven. “Mackay Online.” Visions of Mackay: Conference Papers. Ed. Geoff Danaher. Rockhampton: Central Queensland University Press, 1998. 111–19. Petre, Daniel, and David Harrington. The Clever Country? Australia’s Digital Future. Sydney: Lansdown Publishing, 1996. Plane, Melanie. “A Shoe-In for Big Success.” Daily Mercury 9 Sep. 2017: 6. Pratchett, Lawrence, Richard Hu, Michael Walsh, and Sajeda Tuli. The Knowledge City Index: A Tale of 25 Cities in Australia. Canberra: University of Canberra neXus Research Centre, 2017. “Qld Customers NB-uN Happy Complaints about NBN Service Double in 12 Months.” Daily Mercury 17 Apr. 2018: 1. Rudd, Kevin. “Media Release: New National Broadband Network.” Parliament of Australia Press Release, 7 Apr. 2009 <https://parlinfo.aph.gov.au/parlInfo/search/display/display.w3p;query=Id:"media/pressrel/PS8T6">. Rynne, David. “Revitalising the Sugar Industry.” Sugar Policy Insights Feb. 2019: 2–3. Taleb, Nassim Nicholas. The Black Swan: The Impact of the Highly Improbable. New York: Random House, 2007. Taylor, Emma. “A Dip in the Pond.” Sydney Morning Herald 16 Aug. 1997: 12. “Telcos and NBN Co in a Crisis.” Daily Mercury 27 Jul. 2017: 6.

25

Burns, Alex. "Select Issues with New Media Theories of Citizen Journalism." M/C Journal 10, no.6 (April1, 2008). http://dx.doi.org/10.5204/mcj.2723.

Full text

APA, Harvard, Vancouver, ISO, and other styles

Abstract:

“Journalists have to begin a new type of journalism, sometimes being the guide on the side of the civic conversation as well as the filter and gatekeeper.” (Kolodzy 218) “In many respects, citizen journalism is simply public journalism removed from the journalism profession.” (Barlow 181) 1. Citizen Journalism — The Latest Innovation? New Media theorists such as Dan Gillmor, Henry Jenkins, Jay Rosen and Jeff Howe have recently touted Citizen Journalism (CJ) as the latest innovation in 21st century journalism. “Participatory journalism” and “user-driven journalism” are other terms to describe CJ, which its proponents argue is a disruptive innovation (Christensen) to the agenda-setting media institutions, news values and “objective” reportage. In this essay I offer a “contrarian” view, informed by two perspectives: (1) a three-stage model of theory-building (Carlile & Christensen) to evaluate the claims made about CJ; and (2) self-reflexive research insights (Etherington) from editing the US-based news site Disinformation between November 1999 and February 2008. New media theories can potentially create “cognitive dissonance” (Festinger) when their explanations of CJ practices are compared with what actually happens (Feyerabend). First I summarise Carlile & Christensen’s model and the dangers of “bad theory” (Ghoshal). Next I consider several problems in new media theories about CJ: the notion of ‘citizen’, new media populism, parallels in event-driven and civic journalism, and mergers and acquisitions. Two ‘self-reflexive’ issues are considered: ‘pro-ams’ or ‘professional amateurs’ as a challenge to professional journalists, and CJ’s deployment in new media operations and production environments. Finally, some exploratory questions are offered for future researchers. 2. An Evaluative Framework for New Media Theories on Citizen Journalism Paul Carlile and Clayton M. Christensen’s model offers one framework with which to evaluate new media theories on CJ. This framework is used below to highlight select issues and gaps in CJ’s current frameworks and theories. Carlile & Christensen suggest that robust theory-building emerges via three stages: Descriptive, Categorisation and Normative (Carlile & Christensen). There are three sub-stages in Descriptive theory-building; namely, the observation of phenomena, inductive classification into schemas and taxonomies, and correlative relationships to develop models (Carlile & Christensen 2-5). Once causation is established, Normative theory evolves through deductive logic which is subject to Kuhnian paradigm shifts and Popperian falsifiability (Carlile & Christensen 6). Its proponents situate CJ as a Categorisation or new journalism agenda that poses a Normative challenged and Kuhnian paradigm shift to traditional journalism. Existing CJ theories jump from the Descriptive phase of observations like “smart mobs” in Japanese youth subcultures (Rheingold) to make broad claims for Categorisation such as that IndyMedia, blogs and wiki publishing systems as new media alternatives to traditional media. CJ theories then underpin normative beliefs, values and worldviews. Correlative relationships are also used to differentiate CJ from the demand side of microeconomic analysis, from the top-down editorial models of traditional media outlets, and to adopt a vanguard stance. 
To support this, CJ proponents cite research on emergent collective behaviour such as the “wisdom of crowds” hypothesis (Surowiecki) or peer-to-peer network “swarms” (Pesce) to provide scientific justification for their Normative theories. However, further evaluative research is needed for three reasons: the emergent collective behaviour hypothesis may not actually inform CJ practices, existing theories may have “correlation not cause” errors, and the link may be due to citation network effects between CJ theorists. Collectively, this research base also frames CJ as an “ought to” Categorisation and then proceeds to Normative theory-building (Carlile & Christensen 7). However, I argue below that this Categorisation may be premature: its observations and correlative relationships might reinforce a ‘weak’ Normative theory with limited generalisation. CJ proponents seem to imply that it can be applied anywhere and under any condition—a “statement of causality” that almost makes it a fad (Carlile & Christensen 8). CJ that relies on Classification and Normative claims will be problematic without a strong grounding in Descriptive observation. To understand what’s potentially at stake for CJ’s future consider the consider the parallel debate about curricula renewal for the Masters of Business Administration in the wake of high-profile corporate collapses such as Enron, Worldcom, HIH and OneTel. The MBA evolved as a sociological and institutional construct to justify management as a profession that is codified, differentiated and has entry barriers (Khurana). This process might partly explain the pushback that some media professionals have to CJ as one alternative. MBA programs faced criticism if they had student cohorts with little business know-how or experiential learning (Mintzberg). Enron’s collapse illustrated the ethical dilemmas and unintended consequences that occurred when “bad theories” were implemented (Ghoshal). Professional journalists are aware of this: MBA-educated managers challenged the “craft” tradition in the early 1980s (Underwood). This meant that journalism’s ‘self-image’ (Morgan; Smith) is intertwined with managerial anxieties about media conglomerates in highly competitive markets. Ironically, as noted below, Citizen Journalists who adopt a vanguard position vis-a-vis media professionals step into a more complex game with other players. However, current theories have a naïve idealism about CJ’s promise of normative social change in the face of Machiavellian agency in business, the media and politics. 3. Citizen Who? Who is the “citizen” in CJ? What is their self-awareness as a political agent? CJ proponents who use the ‘self-image’ of ‘citizen’ draw on observations from the participatory vision of open source software, peer-to-peer networks, and case studies such as Howard Dean’s 2004 bid for the Democrat Party nominee in the US Presidential election campaign (Trippi). Recent theorists note Alexander Hamilton’s tradition of civic activism (Barlow 178) which links contemporary bloggers with the Federalist Papers and early newspaper pamphlets. One unsurfaced assumption in these observations and correlations is that most bloggers will adopt a coherent political philosophy as informed citizens: a variation on Lockean utilitarianism, Rawlsian liberalism or Nader consumer activism. 
To date there is little discussion about how political philosophy could deepen CJ’s ‘self-image’: how to critically evaluate sources, audit and investigation processes, or strategies to deal with elites, deterrence and power. For example, although bloggers kept Valerie Plame’s ‘outing’ as a covert intelligence operative highly visible in the issues-attention cycle, it was agenda-setting media like The New York Times who the Bush Administration targeted to silence (Pearlstine). To be viable, CJ needs to evolve beyond a new media populism, perhaps into a constructivist model of agency, norms and social change (Finnemore). 4. Citizen Journalism as New Media Populism Several “precursor trends” foreshadowed CJ notably the mid-1990s interest in “cool-hunting” by new media analysts and subculture marketeers (Gibson; Gladwell). Whilst this audience focus waned with the 1995-2000 dotcom bubble it resurfaced in CJ and publisher Tim O’Reilly’s Web 2.0 vision. Thus, CJ might be viewed as new media populism that has flourished with the Web 2.0 boom. Yet if the boom becomes a macroeconomic bubble (Gross; Spar) then CJ could be written off as a “silver bullet” that ultimately failed to deliver on its promises (Brooks, Jr.). The reputations of uncritical proponents who adopted a “true believer” stance would also be damaged (Hoffer). This risk is evident if CJ is compared with a parallel trend that shares its audience focus and populist view: day traders and technical analysts who speculate on financial markets. This parallel trend provides an alternative discipline in which the populism surfaced in an earlier form (Carlile & Christensen 12). Fidelity’s Peter Lynch argues that stock pickers can use their Main Street knowledge to beat Wall Street by exploiting information asymmetries (Lynch & Rothchild). Yet Lynch’s examples came from the mid-1970s to early 1980s when indexed mutual fund strategies worked, before deregulation and macroeconomic volatility. A change in the Web 2.0 boom might similarly trigger a reconsideration of Citizen Journalism. Hedge fund maven Victor Niederhoffer contends that investors who rely on technical analysis are practicing a Comtean religion (Niederhoffer & Kenner 72-74) instead of Efficient Market Hypothesis traders who use statistical arbitrage to deal with ‘random walks’ or Behavioural Finance experts who build on Amos Tversky and Daniel Kahneman’s Prospect Theory (Kahneman & Tversky). Niederhoffer’s deeper point is that technical analysts’ belief that the “trend is your friend” is no match for the other schools, despite a mini-publishing industry and computer trading systems. There are also ontological and epistemological differences between the schools. Similarly, CJ proponents who adopt a ‘Professional Amateur’ or ‘Pro-Am’ stance (Leadbeater & Miller) may face a similar gulf when making comparisons with professional journalists and the production environments in media organisations. CJ also thrives as new media populism because of institutional vested interests. When media conglomerates cut back on cadetships and internships CJ might fill the market demand as one alternative. New media programs at New York University and others can use CJ to differentiate themselves from “hyperlocal” competitors (Christensen; Slywotzky; Christensen, Curtis & Horn). This transforms CJ from new media populism to new media institution. 5. 
Parallels: Event-driven & Civic Journalism For new media programs, CJ builds on two earlier traditions: the Event-driven journalism of crises like the 1991 Gulf War (Wark) and the Civic Journalism school that emerged in the 1960s social upheavals. Civic Journalism’s awareness of minorities and social issues provides the character ethic and political philosophy for many Citizen Journalists. Jay Rosen and others suggest that CJ is the next-generation heir to Civic Journalism, tracing a thread from the 1968 Chicago Democratic Convention to IndyMedia’s coverage of the 1999 “Battle in Seattle” (Rosen). Rosen’s observation could yield an interesting historiography or genealogy. Events such as the Southeast Asian tsunami on 26 December 2004 or Al Qaeda’s London bombings on 7 July 2005 are cited as examples of CJ as event-driven journalism and “pro-am collaboration” (Kolodzy 229-230). Having covered these events and Al Qaeda’s attacks on 11th September 2001, I have a slightly different view: this was more a variation on “first responder” status and handicam video footage that journalists have sourced for the past three decades when covering major disasters. This different view means that the “salience of categories” used to justify CJ and “pro-am collaboration” these events does not completely hold. Furthermore, when Citizen Journalism proponents tout Flickr and Wikipedia as models of real-time media they are building on a broader phenomenon that includes CNN’s Gulf War coverage and Bloomberg’s dominance of financial news (Loomis). 6. The Mergers & Acquisitions Scenario CJ proponents often express anxieties about the resilience of their outlets in the face of predatory venture capital firms who initiate Mergers & Acquisitions (M&A) activities. Ironically, these venture capital firms have core competencies and expertise in the event-driven infrastructure and real-time media that CJ aspires to. Sequoia Capital and other venture capital firms have evaluative frameworks that likely surpass Carlile & Christensen in sophistication, and they exploit parallels, information asymmetries and market populism. Furthermore, although venture capital firms such as Union Street Ventures have funded Web 2.0 firms, they are absent from the explanations of some theorists, whose examples of Citizen Journalism and Web 2.0 success may be the result of survivorship bias. Thus, the venture capital market remains an untapped data source for researchers who want to evaluate the impact of CJ outlets and institutions. The M&A scenario further problematises CJ in several ways. First, CJ is framed as “oppositional” to traditional media, yet this may be used as a stratagem in a game theory framework with multiple stakeholders. Drexel Burnham Lambert’s financier Michael Milken used market populism to sell ‘high-yield’ or ‘junk’ bonds to investors whilst disrupting the Wall Street establishment in the late 1980s (Curtis) and CJ could fulfil a similar tactical purpose. Second, the M&A goal of some Web 2.0 firms could undermine the participatory goals of a site’s community if post-merger integration fails. Jason Calacanis’s sale of Weblogs, Inc to America Online in 2005 and MSNBC’s acquisition of Newsvine on 5 October 2007 (Newsvine) might be success stories. However, this raises issues of digital “property rights” if you contribute to a community that is then sold in an M&A transaction—an outcome closer to business process outsourcing. 
Third, media “buzz” can create an unrealistic vision when a CJ site fails to grow beyond its start-up phase. Backfence.com’s demise as a “hyperlocal” initiative (Caverly) is one cautionary event that recalls the 2000 dotcom crash. The M&A scenarios outlined above are market dystopias for CJ purists. The major lesson for CJ proponents is to include other market players in hypotheses about causation and correlation factors. 7. ‘Pro-Ams’ & Professional Journalism’s Crisis CJ emerged during a period when Professional Journalism faced a major crisis of ‘self-image’. The Demos report The Pro-Am Revolution (Leadbeater & Miller) popularised the notion of ‘professional amateurs’ which some CJ theorists adopt to strengthen their categorisation. In turn, this triggers a response from cultural theorists who fear bloggers are new media’s barbarians (Keen). I concede Leadbeater and Miller have identified an important category. However, how some CJ theorists then generalise from ‘Pro-Ams’ illustrates the danger of ‘weak’ theory referred to above. Leadbeater and Miller’s categorisation does not really include a counter-view on the strengths of professionals, as illustrated in humanistic consulting (Block), professional service firms (Maister; Maister, Green & Galford), and software development (McConnell). The signs of professionalism these authors mention include a commitment to learning and communal verification, mastery of a discipline and domain application, awareness of methodology creation, participation in mentoring, and cultivation of ethical awareness. Two key differences are discernment and quality of attention, as illustrated in how the legendary Hollywood film editor Walter Murch used Apple’s Final Cut Pro software to edit the 2003 film Cold Mountain (Koppelman). ‘Pro-Ams’ might not aspire to these criteria but Citizen Journalists shouldn’t throw out these standards, either. Doing so would be making the same mistake of overconfidence that technical analysts make against statistical arbitrageurs. Key processes—fact-checking, sub-editing and editorial decision-making—are invisible to the end-user, even if traceable in a blog or wiki publishing system, because of the judgments involved. One post-mortem insight from Assignment Zero was that these processes were vital to create the climate of authenticity and trust to sustain a Citizen Journalist community (Howe). CJ’s trouble with “objectivity” might also overlook some complexities, including the similarity of many bloggers to “noise traders” in financial markets and to op-ed columnists. Methodologies and reportage practices have evolved to deal with the objections that CJ proponents raise, from New Journalism’s radical subjectivity and creative non-fiction techniques (Wolfe & Johnson) to Precision Journalism that used descriptive statistics (Meyer). Finally, journalism frameworks could be updated with current research on how phenomenological awareness shapes our judgments and perceptions (Thompson). 8. Strategic Execution For me, one of CJ’s major weaknesses as a new media theory is its lack of “rich description” (Geertz) about the strategic execution of projects. As Disinfo.com site editor I encountered situations ranging from ‘denial of service’ attacks and spam to site migration, publishing systems that go offline, and ensuring an editorial consistency. Yet the messiness of these processes is missing from CJ theories and accounts. 
Theories that included this detail as “second-order interactions” (Carlile & Christensen 13) would offer a richer view of CJ. Many CJ and Web 2.0 projects fall into the categories of mini-projects, demonstration prototypes and start-ups, even when using a programming language such as Ajax or Ruby on Rails. Whilst the “bootstrap” process is a benefit, more longitudinal analysis and testing needs to occur, to ensure these projects are scalable and sustainable. For example, South Korea’s OhmyNews is cited as an exemplar that started with “727 citizen reporters and 4 editors” and now has “38,000 citizen reporters” and “a dozen editors” (Kolodzy 231). How does OhmyNews’s mix of hard and soft news change over time? Or, how does OhmyNews deal with a complex issue that might require major resources, such as security negotiations between North and South Korea? Such examples could do with further research. We need to go beyond “the vision thing” and look at the messiness of execution for deeper observations and counterintuitive correlations, to build new descriptive theories. 9. Future Research This essay argues that CJ needs re-evaluation. Its immediate legacy might be to splinter ‘journalism’ into micro-trends: Washington University’s Steve Boriss proclaims “citizen journalism is dead. Expert journalism is the future.” (Boriss; Mensching). The half-lives of such micro-trends demand new categorisations, which in turn prematurely feeds the theory-building cycle. Instead, future researchers could reinvigorate 21st century journalism if they ask deeper questions and return to the observation stage of building descriptive theories. In closing, below are some possible questions that future researchers might explore: Where are the “rich descriptions” of journalistic experience—“citizen”, “convergent”, “digital”, “Pro-Am” or otherwise in new media? How could practice-based approaches inform this research instead of relying on espoused theories-in-use? What new methodologies could be developed for CJ implementation? What role can the “heroic” individual reporter or editor have in “the swarm”? Do the claims about OhmyNews and other sites stand up to longitudinal observation? Are the theories used to justify Citizen Journalism’s normative stance (Rheingold; Surowiecki; Pesce) truly robust generalisations for strategic execution or do they reflect the biases of their creators? How could developers tap the conceptual dimensions of information technology innovation (Shasha) to create the next Facebook, MySpace or Wikipedia? References Argyris, Chris, and Donald Schon. Theory in Practice. San Francisco: Jossey-Bass Publishers, 1976. Barlow, Aaron. The Rise of the Blogosphere. Westport, CN: Praeger Publishers, 2007. Block, Peter. Flawless Consulting. 2nd ed. San Francisco, CA: Jossey-Bass/Pfeiffer, 2000. Boriss, Steve. “Citizen Journalism Is Dead. Expert Journalism Is the Future.” The Future of News. 28 Nov. 2007. 20 Feb. 2008 http://thefutureofnews.com/2007/11/28/citizen-journalism-is-dead- expert-journalism-is-the-future/>. Brooks, Jr., Frederick P. The Mythical Man-Month: Essays on Software Engineering. Rev. ed. Reading, MA: Addison-Wesley Publishing Company, 1995. Campbell, Vincent. Information Age Journalism: Journalism in an International Context. New York: Arnold, 2004. Carlile, Paul R., and Clayton M. Christensen. “The Cycles of Building Theory in Management Research.” Innosight working paper draft 6. 6 Jan. 2005. 19 Feb. 2008 http://www.innosight.com/documents/Theory%20Building.pdf>. Caverly, Doug. 
“Hyperlocal News Site Takes A Hit.” WebProNews.com 6 July 2007. 19 Feb. 2008 http://www.webpronews.com/topnews/2007/07/06/hyperlocal-news- sites-take-a-hit>. Chenoweth, Neil. Virtual Murdoch: Reality Wars on the Information Superhighway. Sydney: Random House Australia, 2001. Christensen, Clayton M. The Innovator’s Dilemma: When New Technologies Cause Great Firms to Fail. Boston, MA: Harvard Business School Press, 1997. Christensen, Clayton M., Curtis Johnson, and Michael Horn. Disrupting Class: How Disruptive Innovation Will Change the Way the World Learns. New York: McGraw-Hill, 2008. Curtis, Adam. The Mayfair Set. London: British Broadcasting Corporation, 1999. Etherington, Kim. Becoming a Reflexive Researcher: Using Ourselves in Research. London: Jessica Kingsley Publishers, 2004. Festinger, Leon. A Theory of Cognitive Dissonance. Stanford, CA: Stanford University Press, 1962. Feyerabend, Paul. Against Method. 3rd ed. London: Verso, 1993. Finnemore, Martha. National Interests in International Society. Ithaca, NY: Cornell University Press, 1996. Geertz, Clifford. The Interpretation of Cultures. New York: Basic Books, 1973. Ghoshal, Sumantra. “Bad Management Theories Are Destroying Good Management Practices.” Academy of Management Learning & Education 4.1 (2005): 75-91. Gibson, William. Pattern Recognition. London: Viking, 2003. Gladwell, Malcolm. “The Cool-Hunt.” The New Yorker Magazine 17 March 1997. 20 Feb. 2008 http://www.gladwell.com/1997/1997_03_17_a_cool.htm>. Gross, Daniel. Pop! Why Bubbles Are Great for the Economy. New York: Collins, 2007. Hoffer, Eric. The True Believer. New York: Harper, 1951. Howe, Jeff. “Did Assignment Zero Fail? A Look Back, and Lessons Learned.” Wired News 16 July 2007. 19 Feb. 2008 http://www.wired.com/techbiz/media/news/2007/07/assignment_ zero_final?currentPage=all>. Kahneman, Daniel, and Amos Tversky. Choices, Values and Frames. Cambridge: Cambridge UP, 2000. Keen, Andrew. The Cult of the Amateur. New York: Doubleday Currency, 2007. Khurana, Rakesh. From Higher Aims to Hired Hands. Princeton, NJ: Princeton UP, 2007. Kolodzy, Janet. Convergence Journalism: Writing and Reporting across the News Media. Oxford: Rowman & Littlefield, 2006. Koppelman, Charles. Behind the Seen: How Walter Murch Edited Cold Mountain Using Apple’s Final Cut Pro and What This Means for Cinema. Upper Saddle River, NJ: New Rider, 2004. Leadbeater, Charles, and Paul Miller. “The Pro-Am Revolution”. London: Demos, 24 Nov. 2004. 19 Feb. 2008 http://www.demos.co.uk/publications/proameconomy>. Loomis, Carol J. “Bloomberg’s Money Machine.” Fortune 5 April 2007. 20 Feb. 2008 http://money.cnn.com/magazines/fortune/fortune_archive/2007/04/16/ 8404302/index.htm>. Lynch, Peter, and John Rothchild. Beating the Street. Rev. ed. New York: Simon & Schuster, 1994. Maister, David. True Professionalism. New York: The Free Press, 1997. Maister, David, Charles H. Green, and Robert M. Galford. The Trusted Advisor. New York: The Free Press, 2004. Mensching, Leah McBride. “Citizen Journalism on Its Way Out?” SFN Blog, 30 Nov. 2007. 20 Feb. 2008 http://www.sfnblog.com/index.php/2007/11/30/940-citizen-journalism- on-its-way-out>. Meyer, Philip. Precision Journalism. 4th ed. Lanham, MD: Rowman & Littlefield, 2002. McConnell, Steve. Professional Software Development. Boston, MA: Addison-Wesley, 2004. Mintzberg, Henry. Managers Not MBAs. San Francisco, CA: Berrett-Koehler, 2004. Morgan, Gareth. Images of Organisation. Rev. ed. Thousand Oaks, CA: Sage, 2006. Newsvine. “Msnbc.com Acquires Newsvine.” 7 Oct. 
2007. 20 Feb. 2008 http://blog.newsvine.com/_news/2007/10/07/1008889-msnbccom- acquires-newsvine>. Niederhoffer, Victor, and Laurel Kenner. Practical Speculation. New York: John Wiley & Sons, 2003. Pearlstine, Norman. Off the Record: The Press, the Government, and the War over Anonymous Sources. New York: Farrar, Straus & Giroux, 2007. Pesce, Mark D. “Mob Rules (The Law of Fives).” The Human Network 28 Sep. 2007. 20 Feb. 2008 http://blog.futurestreetconsulting.com/?p=39>. Rheingold, Howard. Smart Mobs: The Next Social Revolution. Cambridge MA: Basic Books, 2002. Rosen, Jay. What Are Journalists For? Princeton NJ: Yale UP, 2001. Shasha, Dennis Elliott. Out of Their Minds: The Lives and Discoveries of 15 Great Computer Scientists. New York: Copernicus, 1995. Slywotzky, Adrian. Value Migration: How to Think Several Moves Ahead of the Competition. Boston, MA: Harvard Business School Press, 1996. Smith, Steve. “The Self-Image of a Discipline: The Genealogy of International Relations Theory.” Eds. Steve Smith and Ken Booth. International Relations Theory Today. Cambridge, UK: Polity Press, 1995. 1-37. Spar, Debora L. Ruling the Waves: Cycles of Discovery, Chaos and Wealth from the Compass to the Internet. New York: Harcourt, 2001. Surowiecki, James. The Wisdom of Crowds. New York: Doubleday, 2004. Thompson, Evan. Mind in Life: Biology, Phenomenology, and the Sciences of Mind. Cambridge, MA: Belknap Press, 2007. Trippi, Joe. The Revolution Will Not Be Televised. New York: ReganBooks, 2004. Underwood, Doug. When MBA’s Rule the Newsroom. New York: Columbia University Press, 1993. Wark, McKenzie. Virtual Geography: Living with Global Media Events. Bloomington IN: Indiana UP, 1994. Wolfe, Tom, and E.W. Johnson. The New Journalism. New York: Harper & Row, 1973.

26

Burns, Alex. "Select Issues with New Media Theories of Citizen Journalism." M/C Journal 11, no.1 (June1, 2008). http://dx.doi.org/10.5204/mcj.30.

Full text

APA, Harvard, Vancouver, ISO, and other styles

Abstract:

“Journalists have to begin a new type of journalism, sometimes being the guide on the side of the civic conversation as well as the filter and gatekeeper.” (Kolodzy 218) “In many respects, citizen journalism is simply public journalism removed from the journalism profession.” (Barlow 181) 1. Citizen Journalism — The Latest Innovation? New Media theorists such as Dan Gillmor, Henry Jenkins, Jay Rosen and Jeff Howe have recently touted Citizen Journalism (CJ) as the latest innovation in 21st century journalism. “Participatory journalism” and “user-driven journalism” are other terms to describe CJ, which its proponents argue is a disruptive innovation (Christensen) to the agenda-setting media institutions, news values and “objective” reportage. In this essay I offer a “contrarian” view, informed by two perspectives: (1) a three-stage model of theory-building (Carlile & Christensen) to evaluate the claims made about CJ; and (2) self-reflexive research insights (Etherington) from editing the US-based news site Disinformation between November 1999 and February 2008. New media theories can potentially create “cognitive dissonance” (Festinger) when their explanations of CJ practices are compared with what actually happens (Feyerabend). First I summarise Carlile & Christensen’s model and the dangers of “bad theory” (Ghoshal). Next I consider several problems in new media theories about CJ: the notion of ‘citizen’, new media populism, parallels in event-driven and civic journalism, and mergers and acquisitions. Two ‘self-reflexive’ issues are considered: ‘pro-ams’ or ‘professional amateurs’ as a challenge to professional journalists, and CJ’s deployment in new media operations and production environments. Finally, some exploratory questions are offered for future researchers. 2. An Evaluative Framework for New Media Theories on Citizen Journalism Paul Carlile and Clayton M. Christensen’s model offers one framework with which to evaluate new media theories on CJ. This framework is used below to highlight select issues and gaps in CJ’s current frameworks and theories. Carlile & Christensen suggest that robust theory-building emerges via three stages: Descriptive, Categorisation and Normative (Carlile & Christensen). There are three sub-stages in Descriptive theory-building; namely, the observation of phenomena, inductive classification into schemas and taxonomies, and correlative relationships to develop models (Carlile & Christensen 2-5). Once causation is established, Normative theory evolves through deductive logic which is subject to Kuhnian paradigm shifts and Popperian falsifiability (Carlile & Christensen 6). Its proponents situate CJ as a Categorisation or new journalism agenda that poses a Normative challenged and Kuhnian paradigm shift to traditional journalism. Existing CJ theories jump from the Descriptive phase of observations like “smart mobs” in Japanese youth subcultures (Rheingold) to make broad claims for Categorisation such as that IndyMedia, blogs and wiki publishing systems as new media alternatives to traditional media. CJ theories then underpin normative beliefs, values and worldviews. Correlative relationships are also used to differentiate CJ from the demand side of microeconomic analysis, from the top-down editorial models of traditional media outlets, and to adopt a vanguard stance. 
To support this, CJ proponents cite research on emergent collective behaviour such as the “wisdom of crowds” hypothesis (Surowiecki) or peer-to-peer network “swarms” (Pesce) to provide scientific justification for their Normative theories. However, further evaluative research is needed for three reasons: the emergent collective behaviour hypothesis may not actually inform CJ practices, existing theories may have “correlation not cause” errors, and the link may be due to citation network effects between CJ theorists. Collectively, this research base also frames CJ as an “ought to” Categorisation and then proceeds to Normative theory-building (Carlile & Christensen 7). However, I argue below that this Categorisation may be premature: its observations and correlative relationships might reinforce a ‘weak’ Normative theory with limited generalisation. CJ proponents seem to imply that it can be applied anywhere and under any condition—a “statement of causality” that almost makes it a fad (Carlile & Christensen 8). CJ that relies on Classification and Normative claims will be problematic without a strong grounding in Descriptive observation. To understand what’s potentially at stake for CJ’s future consider the consider the parallel debate about curricula renewal for the Masters of Business Administration in the wake of high-profile corporate collapses such as Enron, Worldcom, HIH and OneTel. The MBA evolved as a sociological and institutional construct to justify management as a profession that is codified, differentiated and has entry barriers (Khurana). This process might partly explain the pushback that some media professionals have to CJ as one alternative. MBA programs faced criticism if they had student cohorts with little business know-how or experiential learning (Mintzberg). Enron’s collapse illustrated the ethical dilemmas and unintended consequences that occurred when “bad theories” were implemented (Ghoshal). Professional journalists are aware of this: MBA-educated managers challenged the “craft” tradition in the early 1980s (Underwood). This meant that journalism’s ‘self-image’ (Morgan; Smith) is intertwined with managerial anxieties about media conglomerates in highly competitive markets. Ironically, as noted below, Citizen Journalists who adopt a vanguard position vis-a-vis media professionals step into a more complex game with other players. However, current theories have a naïve idealism about CJ’s promise of normative social change in the face of Machiavellian agency in business, the media and politics. 3. Citizen Who? Who is the “citizen” in CJ? What is their self-awareness as a political agent? CJ proponents who use the ‘self-image’ of ‘citizen’ draw on observations from the participatory vision of open source software, peer-to-peer networks, and case studies such as Howard Dean’s 2004 bid for the Democrat Party nominee in the US Presidential election campaign (Trippi). Recent theorists note Alexander Hamilton’s tradition of civic activism (Barlow 178) which links contemporary bloggers with the Federalist Papers and early newspaper pamphlets. One unsurfaced assumption in these observations and correlations is that most bloggers will adopt a coherent political philosophy as informed citizens: a variation on Lockean utilitarianism, Rawlsian liberalism or Nader consumer activism. 
To date there is little discussion about how political philosophy could deepen CJ’s ‘self-image’: how to critically evaluate sources, audit and investigation processes, or strategies to deal with elites, deterrence and power. For example, although bloggers kept Valerie Plame’s ‘outing’ as a covert intelligence operative highly visible in the issues-attention cycle, it was agenda-setting media like The New York Times who the Bush Administration targeted to silence (Pearlstine). To be viable, CJ needs to evolve beyond a new media populism, perhaps into a constructivist model of agency, norms and social change (Finnemore). 4. Citizen Journalism as New Media Populism Several “precursor trends” foreshadowed CJ notably the mid-1990s interest in “cool-hunting” by new media analysts and subculture marketeers (Gibson; Gladwell). Whilst this audience focus waned with the 1995-2000 dotcom bubble it resurfaced in CJ and publisher Tim O’Reilly’s Web 2.0 vision. Thus, CJ might be viewed as new media populism that has flourished with the Web 2.0 boom. Yet if the boom becomes a macroeconomic bubble (Gross; Spar) then CJ could be written off as a “silver bullet” that ultimately failed to deliver on its promises (Brooks, Jr.). The reputations of uncritical proponents who adopted a “true believer” stance would also be damaged (Hoffer). This risk is evident if CJ is compared with a parallel trend that shares its audience focus and populist view: day traders and technical analysts who speculate on financial markets. This parallel trend provides an alternative discipline in which the populism surfaced in an earlier form (Carlile & Christensen 12). Fidelity’s Peter Lynch argues that stock pickers can use their Main Street knowledge to beat Wall Street by exploiting information asymmetries (Lynch & Rothchild). Yet Lynch’s examples came from the mid-1970s to early 1980s when indexed mutual fund strategies worked, before deregulation and macroeconomic volatility. A change in the Web 2.0 boom might similarly trigger a reconsideration of Citizen Journalism. Hedge fund maven Victor Niederhoffer contends that investors who rely on technical analysis are practicing a Comtean religion (Niederhoffer & Kenner 72-74) instead of Efficient Market Hypothesis traders who use statistical arbitrage to deal with ‘random walks’ or Behavioural Finance experts who build on Amos Tversky and Daniel Kahneman’s Prospect Theory (Kahneman & Tversky). Niederhoffer’s deeper point is that technical analysts’ belief that the “trend is your friend” is no match for the other schools, despite a mini-publishing industry and computer trading systems. There are also ontological and epistemological differences between the schools. Similarly, CJ proponents who adopt a ‘Professional Amateur’ or ‘Pro-Am’ stance (Leadbeater & Miller) may face a similar gulf when making comparisons with professional journalists and the production environments in media organisations. CJ also thrives as new media populism because of institutional vested interests. When media conglomerates cut back on cadetships and internships CJ might fill the market demand as one alternative. New media programs at New York University and others can use CJ to differentiate themselves from “hyperlocal” competitors (Christensen; Slywotzky; Christensen, Curtis & Horn). This transforms CJ from new media populism to new media institution. 5. 
Parallels: Event-driven & Civic Journalism For new media programs, CJ builds on two earlier traditions: the Event-driven journalism of crises like the 1991 Gulf War (Wark) and the Civic Journalism school that emerged in the 1960s social upheavals. Civic Journalism’s awareness of minorities and social issues provides the character ethic and political philosophy for many Citizen Journalists. Jay Rosen and others suggest that CJ is the next-generation heir to Civic Journalism, tracing a thread from the 1968 Chicago Democratic Convention to IndyMedia’s coverage of the 1999 “Battle in Seattle” (Rosen). Rosen’s observation could yield an interesting historiography or genealogy. Events such as the Southeast Asian tsunami on 26 December 2004 or Al Qaeda’s London bombings on 7 July 2005 are cited as examples of CJ as event-driven journalism and “pro-am collaboration” (Kolodzy 229-230). Having covered these events and Al Qaeda’s attacks on 11th September 2001, I have a slightly different view: this was more a variation on “first responder” status and handicam video footage that journalists have sourced for the past three decades when covering major disasters. This different view means that the “salience of categories” used to justify CJ and “pro-am collaboration” these events does not completely hold. Furthermore, when Citizen Journalism proponents tout Flickr and Wikipedia as models of real-time media they are building on a broader phenomenon that includes CNN’s Gulf War coverage and Bloomberg’s dominance of financial news (Loomis). 6. The Mergers & Acquisitions Scenario CJ proponents often express anxieties about the resilience of their outlets in the face of predatory venture capital firms who initiate Mergers & Acquisitions (M&A) activities. Ironically, these venture capital firms have core competencies and expertise in the event-driven infrastructure and real-time media that CJ aspires to. Sequoia Capital and other venture capital firms have evaluative frameworks that likely surpass Carlile & Christensen in sophistication, and they exploit parallels, information asymmetries and market populism. Furthermore, although venture capital firms such as Union Street Ventures have funded Web 2.0 firms, they are absent from the explanations of some theorists, whose examples of Citizen Journalism and Web 2.0 success may be the result of survivorship bias. Thus, the venture capital market remains an untapped data source for researchers who want to evaluate the impact of CJ outlets and institutions. The M&A scenario further problematises CJ in several ways. First, CJ is framed as “oppositional” to traditional media, yet this may be used as a stratagem in a game theory framework with multiple stakeholders. Drexel Burnham Lambert’s financier Michael Milken used market populism to sell ‘high-yield’ or ‘junk’ bonds to investors whilst disrupting the Wall Street establishment in the late 1980s (Curtis) and CJ could fulfil a similar tactical purpose. Second, the M&A goal of some Web 2.0 firms could undermine the participatory goals of a site’s community if post-merger integration fails. Jason Calacanis’s sale of Weblogs, Inc to America Online in 2005 and MSNBC’s acquisition of Newsvine on 5 October 2007 (Newsvine) might be success stories. However, this raises issues of digital “property rights” if you contribute to a community that is then sold in an M&A transaction—an outcome closer to business process outsourcing. 
Third, media “buzz” can create an unrealistic vision when a CJ site fails to grow beyond its start-up phase. Backfence.com’s demise as a “hyperlocal” initiative (Caverly) is one cautionary event that recalls the 2000 dotcom crash. The M&A scenarios outlined above are market dystopias for CJ purists. The major lesson for CJ proponents is to include other market players in hypotheses about causation and correlation factors. 7. ‘Pro-Ams’ & Professional Journalism’s Crisis CJ emerged during a period when Professional Journalism faced a major crisis of ‘self-image’. The Demos report The Pro-Am Revolution (Leadbeater & Miller) popularised the notion of ‘professional amateurs’ which some CJ theorists adopt to strengthen their categorisation. In turn, this triggers a response from cultural theorists who fear bloggers are new media’s barbarians (Keen). I concede Leadbeater and Miller have identified an important category. However, how some CJ theorists then generalise from ‘Pro-Ams’ illustrates the danger of ‘weak’ theory referred to above. Leadbeater and Miller’s categorisation does not really include a counter-view on the strengths of professionals, as illustrated in humanistic consulting (Block), professional service firms (Maister; Maister, Green & Galford), and software development (McConnell). The signs of professionalism these authors mention include a commitment to learning and communal verification, mastery of a discipline and domain application, awareness of methodology creation, participation in mentoring, and cultivation of ethical awareness. Two key differences are discernment and quality of attention, as illustrated in how the legendary Hollywood film editor Walter Murch used Apple’s Final Cut Pro software to edit the 2003 film Cold Mountain (Koppelman). ‘Pro-Ams’ might not aspire to these criteria but Citizen Journalists shouldn’t throw out these standards, either. Doing so would be making the same mistake of overconfidence that technical analysts make against statistical arbitrageurs. Key processes—fact-checking, sub-editing and editorial decision-making—are invisible to the end-user, even if traceable in a blog or wiki publishing system, because of the judgments involved. One post-mortem insight from Assignment Zero was that these processes were vital to create the climate of authenticity and trust to sustain a Citizen Journalist community (Howe). CJ’s trouble with “objectivity” might also overlook some complexities, including the similarity of many bloggers to “noise traders” in financial markets and to op-ed columnists. Methodologies and reportage practices have evolved to deal with the objections that CJ proponents raise, from New Journalism’s radical subjectivity and creative non-fiction techniques (Wolfe & Johnson) to Precision Journalism that used descriptive statistics (Meyer). Finally, journalism frameworks could be updated with current research on how phenomenological awareness shapes our judgments and perceptions (Thompson). 8. Strategic Execution For me, one of CJ’s major weaknesses as a new media theory is its lack of “rich description” (Geertz) about the strategic execution of projects. As Disinfo.com site editor I encountered situations ranging from ‘denial of service’ attacks and spam to site migration, publishing systems that go offline, and ensuring an editorial consistency. Yet the messiness of these processes is missing from CJ theories and accounts. 
Theories that included this detail as “second-order interactions” (Carlile & Christensen 13) would offer a richer view of CJ. Many CJ and Web 2.0 projects fall into the categories of mini-projects, demonstration prototypes and start-ups, even when using a programming language such as Ajax or Ruby on Rails. Whilst the “bootstrap” process is a benefit, more longitudinal analysis and testing needs to occur, to ensure these projects are scalable and sustainable. For example, South Korea’s OhmyNews is cited as an exemplar that started with “727 citizen reporters and 4 editors” and now has “38,000 citizen reporters” and “a dozen editors” (Kolodzy 231). How does OhmyNews’s mix of hard and soft news change over time? Or, how does OhmyNews deal with a complex issue that might require major resources, such as security negotiations between North and South Korea? Such examples could do with further research. We need to go beyond “the vision thing” and look at the messiness of execution for deeper observations and counterintuitive correlations, to build new descriptive theories. 9. Future Research This essay argues that CJ needs re-evaluation. Its immediate legacy might be to splinter ‘journalism’ into micro-trends: Washington University’s Steve Boriss proclaims “citizen journalism is dead. Expert journalism is the future.” (Boriss; Mensching). The half-lives of such micro-trends demand new categorisations, which in turn prematurely feeds the theory-building cycle. Instead, future researchers could reinvigorate 21st century journalism if they ask deeper questions and return to the observation stage of building descriptive theories. In closing, below are some possible questions that future researchers might explore: Where are the “rich descriptions” of journalistic experience—“citizen”, “convergent”, “digital”, “Pro-Am” or otherwise in new media? How could practice-based approaches inform this research instead of relying on espoused theories-in-use? What new methodologies could be developed for CJ implementation? What role can the “heroic” individual reporter or editor have in “the swarm”? Do the claims about OhmyNews and other sites stand up to longitudinal observation? Are the theories used to justify Citizen Journalism’s normative stance (Rheingold; Surowiecki; Pesce) truly robust generalisations for strategic execution or do they reflect the biases of their creators? How could developers tap the conceptual dimensions of information technology innovation (Shasha) to create the next Facebook, MySpace or Wikipedia? References Argyris, Chris, and Donald Schon. Theory in Practice. San Francisco: Jossey-Bass Publishers, 1976. Barlow, Aaron. The Rise of the Blogosphere. Westport, CN: Praeger Publishers, 2007. Block, Peter. Flawless Consulting. 2nd ed. San Francisco, CA: Jossey-Bass/Pfeiffer, 2000. Boriss, Steve. “Citizen Journalism Is Dead. Expert Journalism Is the Future.” The Future of News. 28 Nov. 2007. 20 Feb. 2008 < http://thefutureofnews.com/2007/11/28/citizen-journalism-is-dead- expert-journalism-is-the-future/ >. Brooks, Jr., Frederick P. The Mythical Man-Month: Essays on Software Engineering. Rev. ed. Reading, MA: Addison-Wesley Publishing Company, 1995. Campbell, Vincent. Information Age Journalism: Journalism in an International Context. New York: Arnold, 2004. Carlile, Paul R., and Clayton M. Christensen. “The Cycles of Building Theory in Management Research.” Innosight working paper draft 6. 6 Jan. 2005. 19 Feb. 2008 < http://www.innosight.com/documents/Theory%20Building.pdf >. Caverly, Doug. 
“Hyperlocal News Site Takes A Hit.” WebProNews.com 6 July 2007. 19 Feb. 2008 < http://www.webpronews.com/topnews/2007/07/06/hyperlocal-news- sites-take-a-hit >. Chenoweth, Neil. Virtual Murdoch: Reality Wars on the Information Superhighway. Sydney: Random House Australia, 2001. Christensen, Clayton M. The Innovator’s Dilemma: When New Technologies Cause Great Firms to Fail. Boston, MA: Harvard Business School Press, 1997. Christensen, Clayton M., Curtis Johnson, and Michael Horn. Disrupting Class: How Disruptive Innovation Will Change the Way the World Learns. New York: McGraw-Hill, 2008. Curtis, Adam. The Mayfair Set. London: British Broadcasting Corporation, 1999. Etherington, Kim. Becoming a Reflexive Researcher: Using Ourselves in Research. London: Jessica Kingsley Publishers, 2004. Festinger, Leon. A Theory of Cognitive Dissonance. Stanford, CA: Stanford University Press, 1962. Feyerabend, Paul. Against Method. 3rd ed. London: Verso, 1993. Finnemore, Martha. National Interests in International Society. Ithaca, NY: Cornell University Press, 1996. Geertz, Clifford. The Interpretation of Cultures. New York: Basic Books, 1973. Ghoshal, Sumantra. “Bad Management Theories Are Destroying Good Management Practices.” Academy of Management Learning & Education 4.1 (2005): 75-91. Gibson, William. Pattern Recognition. London: Viking, 2003. Gladwell, Malcolm. “The Cool-Hunt.” The New Yorker Magazine 17 March 1997. 20 Feb. 2008 < http://www.gladwell.com/1997/1997_03_17_a_cool.htm >. Gross, Daniel. Pop! Why Bubbles Are Great for the Economy. New York: Collins, 2007. Hoffer, Eric. The True Believer. New York: Harper, 1951. Howe, Jeff. “Did Assignment Zero Fail? A Look Back, and Lessons Learned.” Wired News 16 July 2007. 19 Feb. 2008 < http://www.wired.com/techbiz/media/news/2007/07/assignment_ zero_final?currentPage=all >. Kahneman, Daniel, and Amos Tversky. Choices, Values and Frames. Cambridge: Cambridge UP, 2000. Keen, Andrew. The Cult of the Amateur. New York: Doubleday Currency, 2007. Khurana, Rakesh. From Higher Aims to Hired Hands. Princeton, NJ: Princeton UP, 2007. Kolodzy, Janet. Convergence Journalism: Writing and Reporting across the News Media. Oxford: Rowman & Littlefield, 2006. Koppelman, Charles. Behind the Seen: How Walter Murch Edited Cold Mountain Using Apple’s Final Cut Pro and What This Means for Cinema. Upper Saddle River, NJ: New Rider, 2004. Leadbeater, Charles, and Paul Miller. “The Pro-Am Revolution”. London: Demos, 24 Nov. 2004. 19 Feb. 2008 < http://www.demos.co.uk/publications/proameconomy >. Loomis, Carol J. “Bloomberg’s Money Machine.” Fortune 5 April 2007. 20 Feb. 2008 < http://money.cnn.com/magazines/fortune/fortune_archive/2007/04/16/ 8404302/index.htm >. Lynch, Peter, and John Rothchild. Beating the Street. Rev. ed. New York: Simon & Schuster, 1994. Maister, David. True Professionalism. New York: The Free Press, 1997. Maister, David, Charles H. Green, and Robert M. Galford. The Trusted Advisor. New York: The Free Press, 2004. Mensching, Leah McBride. “Citizen Journalism on Its Way Out?” SFN Blog, 30 Nov. 2007. 20 Feb. 2008 < http://www.sfnblog.com/index.php/2007/11/30/940-citizen-journalism- on-its-way-out >. Meyer, Philip. Precision Journalism. 4th ed. Lanham, MD: Rowman & Littlefield, 2002. McConnell, Steve. Professional Software Development. Boston, MA: Addison-Wesley, 2004. Mintzberg, Henry. Managers Not MBAs. San Francisco, CA: Berrett-Koehler, 2004. Morgan, Gareth. Images of Organisation. Rev. ed. Thousand Oaks, CA: Sage, 2006. Newsvine. 
“Msnbc.com Acquires Newsvine.” 7 Oct. 2007. 20 Feb. 2008 < http://blog.newsvine.com/_news/2007/10/07/1008889-msnbccom-acquires-newsvine >. Niederhoffer, Victor, and Laurel Kenner. Practical Speculation. New York: John Wiley & Sons, 2003. Pearlstine, Norman. Off the Record: The Press, the Government, and the War over Anonymous Sources. New York: Farrar, Straus & Giroux, 2007. Pesce, Mark D. “Mob Rules (The Law of Fives).” The Human Network 28 Sep. 2007. 20 Feb. 2008 < http://blog.futurestreetconsulting.com/?p=39 >. Rheingold, Howard. Smart Mobs: The Next Social Revolution. Cambridge, MA: Basic Books, 2002. Rosen, Jay. What Are Journalists For? Princeton, NJ: Yale UP, 2001. Shasha, Dennis Elliott. Out of Their Minds: The Lives and Discoveries of 15 Great Computer Scientists. New York: Copernicus, 1995. Slywotzky, Adrian. Value Migration: How to Think Several Moves Ahead of the Competition. Boston, MA: Harvard Business School Press, 1996. Smith, Steve. “The Self-Image of a Discipline: The Genealogy of International Relations Theory.” Eds. Steve Smith and Ken Booth. International Relations Theory Today. Cambridge, UK: Polity Press, 1995. 1-37. Spar, Debora L. Ruling the Waves: Cycles of Discovery, Chaos and Wealth from the Compass to the Internet. New York: Harcourt, 2001. Surowiecki, James. The Wisdom of Crowds. New York: Doubleday, 2004. Thompson, Evan. Mind in Life: Biology, Phenomenology, and the Sciences of Mind. Cambridge, MA: Belknap Press, 2007. Trippi, Joe. The Revolution Will Not Be Televised. New York: ReganBooks, 2004. Underwood, Doug. When MBA’s Rule the Newsroom. New York: Columbia University Press, 1993. Wark, McKenzie. Virtual Geography: Living with Global Media Events. Bloomington, IN: Indiana UP, 1994. Wolfe, Tom, and E.W. Johnson. The New Journalism. New York: Harper & Row, 1973.

27

Droumeva, Milena. "Curating Everyday Life: Approaches to Documenting Everyday Soundscapes." M/C Journal 18, no.4 (August 10, 2015). http://dx.doi.org/10.5204/mcj.1009.

Full text

APA, Harvard, Vancouver, ISO, and other styles

Abstract:

In the last decade, the cell phone’s transformation from a tool for mobile telephony into a multi-modal, computational “smart” media device has engendered a new kind of emplacement, and the ubiquity of technological mediation into the everyday settings of urban life. With it, a new kind of media literacy has become necessary for participation in the networked social publics (Ito; Jenkins et al.). Increasingly, the way we experience our physical environments, make sense of immediate events, and form impressions is through the lens of the camera and through the ear of the microphone, framed by the mediating possibilities of smartphones. Adopting these practices as a kind of new media “grammar” (Burn 29)—a multi-modal language for public and interpersonal communication—offers new perspectives for thinking about the way in which mobile computing technologies allow us to explore our environments and produce new types of cultural knowledge. Living in the Social Multiverse Many of us are concerned about new cultural practices that communication technologies bring about. In her now classic TED talk “Connected but alone?” Sherry Turkle talks about the world of instant communication as having the illusion of control through which we micromanage our immersion in mobile media and split virtual-physical presence. According to Turkle, what we fear is, on the one hand, being caught unprepared in a spontaneous event and, on the other hand, missing out or not documenting or recording events—a phenomenon that Abha Dawesar calls living in the “digital now.” There is, at the same time, a growing number of ways in which mobile computing devices connect us to new dimensions of everyday life and everyday experience: geo-locative services and augmented reality, convergent media and instantaneous participation in the social web. These technological capabilities arguably shift the nature of presence and set the stage for mobile users to communicate the flow of their everyday life through digital storytelling and media production. According to a Digital Insights survey on social media trends (Bennett), more than 500 million tweets are sent per day and 5 Vines are tweeted every second; 100 hours of video are uploaded to YouTube every minute; more than 20 billion photos have been shared on Instagram to date; and close to 7 million people actively produce and publish content using social blogging platforms. There are more than 1 billion smartphones in the US alone, and most social media platforms are primarily accessed using mobile devices. The question is: how do we understand the enormity of these statistics as a coherent new media phenomenon and as a predominant form of media production and cultural participation? More importantly, how do mobile technologies re-mediate the way we see, hear, and perceive our surrounding environment as part of the cultural circuit of capturing, sharing, and communicating with and through media artefacts? Such questions have furnished communication theory even before McLuhan’s famous tagline “the medium is the message”. Much of the discourse around communication technology and the senses has been marked by distinctions between “orality” and “literacy” understood as forms of collective consciousness engendered by technological shifts. 
Leveraging Jonathan Sterne’s critique of this “audio-visual litany”, an exploration of convergent multi-modal technologies allows us to focus instead on practices and techniques of use, considered as both perceptual and cultural constructs that reflect and inform social life. Here in particular, a focus on sound—or aurality—can help provide a fresh new entry point into studying technology and culture. The phenomenon of everyday photography is already well conceptualised as a cultural expression and a practice connected with identity construction and interpersonal communication (Pink, Visual). Much more rarely do we study the act of capturing information using mobile media devices as a multi-sensory practice that entails perceptual techniques as well as aesthetic considerations, and as something that in turn informs our unmediated sensory experience. Daisuke and Ito argue that—in contrast to hobbyist high-quality photographers—users of camera phones redefine the materiality of urban surroundings as “picture-worthy” (or not) and elevate the “mundane into a photographic object.” Indeed, whereas traditionally recordings and photographs hold institutional legitimacy as reliable archival references, the proliferation of portable smart technologies has transformed user-generated content into the gold standard for authentically representing the everyday. Given that visual approaches to studying these phenomena are well underway, this project takes a sound studies perspective, focusing on mediated aural practices in order to explore the way people make sense of their everyday acoustic environments using mobile media. Curation, in this sense, is a metaphor for everyday media production, illuminated by the practice of listening with mobile technology. Everyday Listening with Technology: A Case Study The present conceptualisation of curation emerged out of a participant-driven qualitative case study focused on using mobile media to make sense of urban everyday life. The study comprised 10 participants using iPod Touches (a device equivalent to an iPhone, without the phone part) to produce daily “aural postcards” of their everyday soundscapes and sonic experiences, over the course of two to four weeks. This work was further informed by, and updates, sonic ethnography approaches nascent in the World Soundscape Project, and the field of soundscape studies more broadly. Participants were asked to fill out a questionnaire about their media and technology use, in order to establish their participation in new media culture and correlate that to the documentary styles used in their aural postcards. With regard to capturing sonic material, participants were given open-ended instructions as to content and location, and encouraged to use the full capabilities of the device—that is, to record audio, video, and images, and to use any applications on the device. Specifically, I drew their attention to a recording app (Recorder) and a decibel measurement app (dB), which combines a photo with a static readout of ambient sound levels. One way most participants described the experience of capturing sound in a collection of recordings for a period of time was as making a “digital scrapbook” or a “media diary.” Even though they had recorded individual (often unrelated) soundscapes, almost everyone felt that the final product came together as a stand-alone collection—a kind of gallery of personalised everyday experiences that participants, if anything, wished to further organise, annotate, and flesh out. 
Examples of aural postcard formats used by participants: decibel photographs of everyday environments and a comparison audio recording of rain on a car roof with and without wipers (in the middle). Working with 139 aural postcards comprising more than 250 audio files and 150 photos and videos, the first step in the analysis was to articulate approaches to media documentation in terms of format, modality, and duration as deliberate choices in conversation with dominant media forms that participants regularly consume and are familiar with. Ambient sonic recordings (audio-only) comprised a large chunk of the data, and within this category there were two approaches: the sonic highlight, a short vignette of a given soundscape with minimal or no introduction or voice-over; and the process recording, featuring the entire duration of an unfolding soundscape or event. Live commentaries, similar to the conventions set forth by radio documentaries, represented voice-over entries at the location of the sound event, sometimes stationary and often in motion as the event unfolded. Voice memos described verbal reflections, pre- or post- sound event, with no discernable ambience—that is, participants intended them to serve as reflective devices rather than as part of the event. Finally, a number of participants also used the sound level meter app, which allowed them to generate visual records of the sonic levels of a given environment or location in the form of sound level photographs. Recording as a Way of Listening In their community soundwalking practice, Förnstrom and Taylor refer to recording sound in everyday settings as taking world experience, mediating it through one’s body and one’s memories and translating it into approximate experience. The media artefacts generated by participants as part of this study constitute precisely such ‘approximations’ of everyday life accessed through aural experience and mediated by the technological capabilities of the iPod. Thinking of aural postcards along this technological axis, the act of documenting everyday soundscapes involves participants acting as media producers, ‘framing’ urban everyday life through a mobile documentary rubric. In the process of curating these documentaries, they have to make decisions about the significance and stylistic framing of each entry and the message they wish to communicate. In order to bring the scope of these curatorial decisions into dialogue with established media forms, in this work’s analysis I combine Bill Nichols’s classification of documentary modes in cinema with Karin Bijsterveld’s concept of soundscape ‘staging’ to characterise the various approaches participants took to the multi-modal curation of their everyday (sonic) experience. In her recent book on the staging of urban soundscapes in both creative and documentary/archival media, Bijsterveld describes the representation of sound as particular ‘dramatisations’ that construct different kinds of meanings about urban space and engender different kinds of listening positions. Nichols’s articulation of cinematic documentary modes helps detail ways in which the author’s intentionality is reflected in the styling, design, and presentation of filmic narratives. Michel Chion’s discussion of cinematic listening modes further contextualises the cultural construction of listening that is a central part of both design and experience of media artefacts. 
The conceptual lens is especially relevant to understanding mobile curation of mediated sonic experience as a kind of mobile digital storytelling. Working across all postcards, settings, and formats, the following four themes capture some of the dominant stylistic dimensions of mobile media documentation. The exploratory approach describes a methodology for representing everyday life as a flow, predominantly through ambient recordings of unfolding processes that participants referred to in the final discussion as a ‘turn it on and forget it’ approach to recording. As a stylistic method, the exploratory approach aligns most closely with Nichols’s poetic and observational documentary modes, combining a ‘window to the world’ aesthetic with minimal narration, striving to convey the ‘inner truth’ of phenomenal experience. In terms of listening modes reflected in this approach, exploratory aural postcards most strongly engage causal listening, to use Chion’s framework of cinematic listening modes. By and large, the exploratory approach describes incidental documentaries of routine events: soundscapes that are featured as a result of greater attentiveness and investment in the sonic aspects of everyday life. The entries created using this approach reflect a process of discovering (seeing and hearing) the ordinary as extra-ordinary; re-experiencing sometimes mundane and routine places and activities with a fresh perspective; and actively exploring hidden characteristics, nuances of meaning, and significance. For instance, in the following example, one participant explores a new neighborhood while on a work errand. The narrative approach to creating aural postcards stages sound as a springboard for recollecting memories and storytelling through reflecting on associations with other soundscapes, environments, and interactions. Rather than highlighting place, routine, or sound itself, this methodology constructs sound as a window into the identity and inner life of the recordist, mobilising most strongly a semantic listening mode through association and narrative around sound’s meaning in context (Chion 28). This approach combines a subjective narrative development with a participatory aesthetic that draws the listener into the unfolding story. This approach is also performative, in that it stages sound as a deeply subjective experience and approaches the narrative from a personally significant perspective. Most often this type of sound staging was curated using voice memo narratives about a particular sonic experience in conjunction with an ambient sonic highlight, or as a live commentary. Recollections typically emerged from incidental encounters, or in the midst of other observations about sound. In the following example a participant reminisces about the sound of wind, which, interestingly, she did not record: Today I have been listening to the wind. It’s really rainy and windy outside today and it was reminding me how much I like the sound of wind. And you know when I was growing up on the wide prairies, we sure had a lot of wind and sometimes I kind of miss the sound of it… (Participant 1) The aesthetic approach describes instances where the creation of aural postcards was motivated by a reduced listening position (Chion 29)—driven primarily by the qualities and features of the soundscape itself. 
This curatorial practice for staging mediated aural experience combines a largely subjective approach to documenting with an absence of traditional narrative development and an affective and evocative aesthetic. Where the exploratory documentary approach seeks to represent place, routine, environment, and context through sonic characteristics, the aesthetic approach features sound first and foremost, aiming to represent and comment on sound qualities and characteristics in a more ‘authentic’ manner. The media formats most often used in conjunction with this approach were the incidental ambient sonic highlight and the live commentary. In the following example we have the sound of coffee being made as an important domestic ritual where important auditory qualities are foregrounded: That’s the sound of a stovetop percolator which I’ve been using for many years and I pretty much know exactly how long it takes to make a pot of coffee by the sound that it makes. As soon as it starts gurgling I know I have about a minute before it burns. It’s like the coffee calls and I come. (Participant 6) The analytical approach characterises entries that stage mediated aural experience as a way of systematically and inductively investigating everyday phenomena. It is a conceptual and analytical experimental methodology employed to move towards confirming or disproving a ‘hypothesis’ or forming a theory about sonic relations developed in the course of the study. As such, this approach most strongly aligns with Chion’s semantic listening mode, with the addition of the interactive element of analytical inquiry. In this context, sound is treated as a variable to be measured, compared, researched, and theorised about in an explicit attempt to form conclusions about social relationships, personal significance, place, or function. This analytical methodology combines an explicit and critical focus on the process of documenting itself (whether it be measuring decibels or systematically attending to sonic qualities) with a distinctive analytical synthesis that presents as ‘formal discovery’ or even ‘truth.’ In using this approach, participants most often mobilised the format of short sonic highlights and follow-up voice memos. While these aural postcards typically contained sound level photographs (decibel measurement values), in some cases the inquiry and subsequent conclusions were made inductively through sustained observation of a series of soundscapes. The following example is by a participant who exclusively recorded and compared various domestic spaces in terms of sound levels, comparing and contrasting them using voice memos. This is a sound level photograph of his home computer system: So I decided to record sitting next to my computer today just because my computer is loud, so I wanted to see exactly how loud it really was. But I kept the door closed just to be sort of fair, see how quiet it could possibly get. I think it peaked at 75 decibels, and that’s like, I looked up a decibel scale, and apparently a lawn mower is like 90 decibels. (Participant 2) Mediated Curation as a New Media Cultural Practice? One aspect of adopting the metaphor of ‘curation’ towards everyday media production is that it shifts the critical discourse on aesthetic expression from the realm of specialised expertise to general practice (“Everyone’s a photographer”). 
The act of curation is filtered through the aesthetic and technological capabilities of the smartphone, a device that has become co-constitutive of our routine sensorial encounters with the world. Revisiting McLuhan-inspired discourses on communication technologies stages the iPhone not as a device that itself shifts consciousness but as an agent in a media ecology co-constructed by the forces of use and design—a “crystallization of cultural practices” (Sterne). As such, mobile technology is continuously re-crystallised as design ‘constraints’ meet both normative and transgressive user approaches to interacting with everyday life. The concept of ‘social curation’ already exists in commercial discourse for social web marketing (O’Connell; Allton). High-traffic, wide-integration web services such as Digg and Pinterest, as well as older portals such as Reddit, all work on the principles of arranging user-generated, web-aggregated, and re-purposed content around custom themes. From a business perspective, the notion of ‘social curation’ captures, unsurprisingly, only the surface level of consumer behaviour rather than the kinds of values and meaning that this process holds for people. In the more traditional sense, art curation involves aesthetic, pragmatic, epistemological, and communication choices about the subject of (re)presentation, including considerations such as manner of display, intended audience, and affective and phenomenal impact. In his 2012 book tracing the discourse and culture of curating, Paul O’Neill proposes that over the last few decades the role of the curator has shifted from one of arts administrator to important agent in the production of cultural experiences, an influential cultural figure in her own right, independent of artistic content (88). Such discursive shifts in the formulation of ‘curatorship’ can easily be transposed from a specialised to a generalised context of cultural production, in which everyone with the technological means to capture, share, and frame the material and sensory content of everyday life is a curator of sorts. Each of us is an agent with a unique aesthetic and epistemological perspective, regardless of the content we curate. The entire communicative exchange is necessarily located within a nexus of new media practices as an activity that simultaneously frames a cultural construction of sensory experience and serves as a cultural production of the self. To return to the question of listening and a sound studies perspective into mediated cultural practices, technology has not single-handedly changed the way we listen and attend to everyday experience, but it has certainly influenced the range and manner in which we make sense of the sensory ‘everyday’. Unlike acoustic listening, mobile digital technologies prompt us to frame sonic experience in a multi-modal and multi-medial fashion—through the microphone, through the camera, and through the interactive, analytical capabilities of the device itself. Each decision for sensory capture as a curatorial act is both epistemological and aesthetic; it implies value of personal significance and an intention to communicate meaning. The occurrences that are captured constitute impressions, highlights, significant moments, emotions, reflections, experiments, and creative efforts—very different knowledge artefacts from those produced through textual means. 
Framing phenomenal experience—in this case, listening—in this way is, I argue, a core characteristic of a more general type of new media literacy and sensibility: that of multi-modal documenting of sensory materialities, or the curation of everyday life. References Allton, Mike. “5 Cool Content Curation Tools for Social Marketers.” Social Media Today. 15 Apr. 2013. 10 June 2015 ‹http://socialmediatoday.com/mike-allton/1378881/5-cool-content-curation-tools-social-marketers›. Bennett, Shea. “Social Media Stats 2014.” Mediabistro. 9 June 2014. 20 June 2015 ‹http://www.mediabistro.com/alltwitter/social-media-statistics-2014_b57746›. Bijsterveld, Karin, ed. Soundscapes of the Urban Past: Staged Sound as Mediated Cultural Heritage. Bielefeld: Transcript-Verlag, 2013. Burn, Andrew. Making New Media: Creative Production and Digital Literacies. New York, NY: Peter Lang Publishing, 2009. Daisuke, Okabe, and Mizuko Ito. “Camera Phones Changing the Definition of Picture-worthy.” Japan Media Review. 8 Aug. 2015 ‹http://www.dourish.com/classes/ics234cw04/ito3.pdf›. Chion, Michel. Audio-Vision: Sound on Screen. New York, NY: Columbia UP, 1994. Förnstrom, Mikael, and Sean Taylor. “Creative Soundwalks.” Urban Soundscapes and Critical Citizenship Symposium. Limerick, Ireland. 27–29 March 2014. Ito, Mizuko, ed. Hanging Out, Messing Around, and Geeking Out: Kids Living and Learning with New Media. Cambridge, MA: The MIT Press, 2010. Jenkins, Henry, Ravi Purushotma, Margaret Weigel, Katie Clinton, and Alice J. Robison. Confronting the Challenges of Participatory Culture: Media Education for the 21st Century. White Paper prepared for the MacArthur Foundation, 2006. McLuhan, Marshall. Understanding Media: The Extensions of Man. New York: McGraw-Hill, 1964. Nichols, Bill. Introduction to Documentary. Bloomington & Indianapolis, Indiana: Indiana UP, 2001. Nielsen. “State of the Media – The Social Media Report.” Nielsen 4 Dec. 2012. 12 May 2015 ‹http://www.nielsen.com/us/en/insights/reports/2012/state-of-the-media-the-social-media-report-2012.html›. O’Connell, Judy. “Social Content Curation – A Shift from the Traditional.” 8 Aug. 2011. 11 May 2015 ‹http://judyoconnell.com/2011/08/08/social-content-curation-a-shift-from-the-traditional/›. O’Neill, Paul. The Culture of Curating and the Curating of Culture(s). Cambridge, MA: MIT Press, 2012. Pink, Sarah. Doing Visual Ethnography. London, UK: Sage, 2007. ———. Situating Everyday Life. London, UK: Sage, 2012. Sterne, Jonathan. The Audible Past: Cultural Origins of Sound Reproduction. Durham, NC: Duke UP, 2003. Schafer, R. Murray, ed. World Soundscape Project. European Sound Diary (reprinted). Vancouver: A.R.C. Publications, 1977. Turkle, Sherry. “Connected But Alone?” TED Talk, Feb. 2012. 8 Aug. 2015 ‹http://www.ted.com/talks/sherry_turkle_alone_together?language=en›.

28

Taveira, Rodney. "Don DeLillo, 9/11 and the Remains of Fresh Kills." M/C Journal 13, no.4 (August 19, 2010). http://dx.doi.org/10.5204/mcj.281.

Full text

APA, Harvard, Vancouver, ISO, and other styles

Abstract:

It’s a portrait of grief, to be sure, but it puts grief in the air, as a cultural atmospheric, without giving us anything to mourn. — Tom Junod, “The Man Who Invented 9/11”. The nearly decade-long attempt by families of 9/11 victims to reclaim the remains of their relatives involves rhetorics of bodilessness, waste, and virtuality that offer startling illustrations of what might be termed “the poetics of grief.” After combining as the WTC Families for Proper Burial Inc. in 2002, the families sued the city of New York in 2005. They lost and the case has been under appeal since 2008. WTC Families is asking for nearly one million tons of material to be moved from the Fresh Kills landfill on Staten Island in order to sift it for human remains. These remains will then be reclaimed and interred: Proper Burial. But the matter is far less definitive. When a judge hearing the appeal asked how one would prove someone’s identity, the city’s lawyer replied, “You have to be able to particularise and say it’s your body. All that’s left here is a bunch of undifferentiated dust.” The reply “elicited gasps and muttered ‘no’s’ from a crowd whose members wore laminated photos of deceased victims” (Hughes). These laminated displays are an attempt by WTC Families to counteract the notion of the victims as “undifferentiated dust”; the protected, hermetic images are testimony to painful uncertainty, an (always) outmoded relic of the evidentiary self. In the face of such uncertainty, it was not only court audiences who waited for a particular response to the terrorist attacks. Adam Hirsch, reviewer for the New York Sun, claimed that “the writer whose September 11 novel seemed most necessary was Don DeLillo. Mr. DeLillo, more than any other novelist, has always worked at the intersection of public terror and private fear.” DeLillo’s prescience regarding the centrality of terrorism in American culture was noted by many critics in the aftermath of the attack on the World Trade Centre. The novelist even penned an essay for Harper’s in which he reflected on the role of the novelist in the new cultural landscape of the post-9/11 world. In an online book club exchange for Slate, Meghan O’Rourke says, “DeLillo seemed eerily primed to write a novel about the events of September 11. … Rereading some of his earlier books, including the terrorism-riddled Mao II, I wondered, half-seriously, if Mohamed Atta and crew had been studying DeLillo.” If there was any writer who might have been said to have seen it coming it was DeLillo. The World Trade Center had figured in his novels before the 9/11 attacks. The twin towers are a primary landmark in Underworld, gracing the cover of the novel in ghostly black and white. In Players (1977), a Wall Street worker becomes involved in a terrorist plot to bomb the New York Stock Exchange and his wife works in the WTC for the “Grief Management Council”—“Where else would you stack all this grief?” (18). Classifications. As the WTC Families for Proper Burial Inc. trial demonstrates, the reality of the terrorist attacks of September 11 offered an altogether more macabre and less poetic reality than DeLillo’s fiction had depicted. The Fresh Kills landfill serves in Underworld as a metaphor for the accumulated history of Cold War America in the last half-century. 
Taking in the “man-made mountain,” waste management executive Brian Glassic thinks, “It was science fiction and prehistory”; seeing the World Trade Center in the distance, “he sensed a poetic balance between that idea and this one” (Underworld 184). But the poetic balance DeLillo explores in the 1997 novel has been sundered by the obliteration of the twin towers. Fresh Kills and the WTC are now united by a disquieting grief. The landfill, which closed in 2001, was forced to reopen when the towers collapsed to receive their waste. Fresh Kills bears molecular witness to this too-big collective trauma. “‘They commingled it, and then they dumped it,’ Mr. Siegel [lawyer for WTC Families] said of the remains being mixed with household trash, adding that a Fresh Kills worker had witnessed city employees use that mixture to fill potholes” (Hughes). The revelation is obscene: Are we walking and driving over our dead? The commingling of rubble and human remains becomes a collective (of) contamination too toxic, too overwhelming for conventional comprehension. “You can’t even consider the issue of closure until this issue has been resolved,” says the lawyer representing WTC Families (Hartocollis). Nick Shay, Underworld’s main character, is another waste executive who travels the world to observe ways of dealing with garbage. Of shopping with his wife, Nick says, “Marion and I saw products as garbage even when they sat gleaming on store shelves, yet unbought. We didn’t say, What kind of casserole will that make? We said, What kind of garbage will that make?” (121). This attests to the virtuality of waste, a potentiality of the products – commercial, temporal, biological – that comprise the stuff of contemporary American culture. Synecdoche and metonymy both, waste becomes the ground of hysteron proteron, the rhetorical figure that disorders time and makes the future always present. Like (its) Fresh Kills, waste is science fiction and prehistory. Repeating the apparent causal and temporal inversion of hysteron proteron, Nick’s son Jeff uses his home computer to access a simultaneous future and past that is the internal horizon of Underworld’s historical fiction. Jeff has previously been using his computer to search for something in the video footage of the “Texas Highway Killer,” a serial murderer who randomly shoots people on Texan highways. Jeff tries to resolve the image so that the pixels will yield more, exposing their past and future. “He was looking for lost information. He enhanced and super-slowed, trying to find some pixel in the data swarm that might provide a clue to the identity of the shooter” (118). Searching for something more, something buried, Jeff, like WTC Families, is attempting to redeem the artifactual and the overlooked by reconfiguring them as identity. DeLillo recognises this molecular episteme through the “dot theory of reality”: “Once you get inside a dot, you gain access to hidden information, you slide inside the smallest event. This is what technology does. It peels back the shadows and redeems the dazed and rambling past. It makes reality come true” (177). Like the gleaming supermarket products Nick and Marion see as garbage, the unredeemed opens onto complex temporal and rhetorical orders. Getting inside garbage is like getting “inside a dot.” This approach is not possible for the unplanned waste of 9/11. 
Having already lost its case, WTC Families will almost certainly lose its appeal because its categories and its means are unworkable and inapplicable: they cannot particularise. Premonitions. In his 9/11 essay “In the Ruins of the Future,” published in Harper’s a few months after the attacks, DeLillo says “We are all breathing the fumes of lower Manhattan where traces of the dead are everywhere, in the soft breeze off the river, on rooftops and windows, in our hair and on our clothes” (39). DeLillo’s portrait of molecular waste adumbrates the need to create “counternarratives.” Until the events of 11 September 2001 the American narrative was that of the Cold War, and thus also the narrative of Underworld; one for which DeLillo claims the Bush administration was feeling nostalgic. “This is over now,” he says. “The narrative ends in the rubble and it is left to us to create the counternarrative” (34). DeLillo was already at work on a narrative of his own at the time of the terrorist attacks. As Joseph Conte notes, when the World Trade Center was attacked, “DeLillo, had nearly finished drafting his thirteenth novel, Cosmopolis [… and] shared in the collective seizure of the American mind” (179). And while it was released in 2003, DeLillo sets the novel in 2000 on “a day in April.” If the millennium, the year 2000, has been, as Boxall claims, the horizon of DeLillo’s writing, the tagging of this “day in April” at the beginning of the novel signals Cosmopolis as a limit-work (4). 9/11 functions as a felt absence in the novel, a binding thing floating in the air, like the shirt that DeLillo will use to begin and end Falling Man; a story that will ‘go beyond’ the millennial limit, a story that is, effectively, the counternarrative of which DeLillo speaks in his 9/11 essay. Given the timing of the terrorist attacks in New York, and DeLillo’s development of his novel, it is extraordinary to consider just how Cosmopolis reflects on its author’s position as a man who should have “seen it coming.” The billionaire protagonist Eric Packer traverses Manhattan by car, his journey a bifurcation between sophistication and banality. Along the way he has an onanistic sexual encounter whilst having his prostate examined, hacks into and deletes his wife’s old money European fortune, loses his own self-made wealth by irrationally betting against the rise of the yen, kills a man, and shoots himself in the hand in front of his assassin. Eric actively moves toward his own death. Throughout Eric’s journey the socially binding integrity of the present and the future is teased apart. He continually sees images of future events before they occur – putting his hand on his chin, a bomb explosion, and finally, his own murder – via video screens in his car and wristwatch. These are, as Conte rightly notes, repeated instances of hysteron proteron (186). His corpse does not herald obsolescence but begins the true life of waste: virtual information. Or, as Eric’s “Chief of Theory” asks, “Why die when you can live on a disk?” (106). There are shades here of Jeff’s pixelated excursion into the video footage of the Texas Highway Killer: “Once you get inside a dot, you gain access to hidden information.” Life at this level is not only virtual, it is particularised, a point (or a collection of points) Eric comes to grasp during the protracted scene in which he watches himself die: “The stuff he sneezes when he sneezes, this is him” (207). 
In Falling Man, the work in which DeLillo engages directly with the 9/11 attack, the particularised body recurs in various forms. First there is the (now iconic) falling man: the otherwise unknown victim of the terrorist attack who leapt from the WTC and whose descent was captured in a photograph by Richard Drew. This figure was named (particularised) by Tom Junod (who provides the epigraph for this essay) as “The Falling Man.” In DeLillo’s novel another Falling Man, a performance artist, re-enacts the moment by jumping off buildings, reiterating the photograph (back) into a bodily performance. In these various incarnations the falling man is serially particularised: photographed, named, then emulated. The falling man is a single individual, and multiple copies. He lives on long after death and so does his trauma. He represents the poetic expression of collective grief. Particularised bodies also infect the terror narrative of Falling Man at a molecular level. Falling Man’s terrorist, Hammad, achieves a similar life-after-death by becoming “organic shrapnel.” The surviving victims of the suicide bomb attack, months later, begin to display signs of the suicide bombers in lumps and sores emerging from their bodies, too-small bits of the attacker forever incorporated. Hammad is thus paired with the victims of the crash in a kind of disseminative and absorptive (rhetorical) structure. “The world changes first in the mind of the man who wants to change it. The time is coming, our truth, our shame, and each man becomes the other, and the other still another, and then there is no separation” (80). Revisions. The traces of American culture that were already contained in the landfill in Underworld have now become the resting place of the dust and the bodies of the trauma of 9/11. Rereading DeLillo’s magnum opus one cannot help but be struck by the new resonance of Fresh Kills. The landfill showed him smack-on how the waste stream ended, where all the appetites and hankerings, the sodden second thoughts came runneling out, the things you wanted ardently and then did not…. He knew the stench must ride the wind into every dining room for miles around. When people heard a noise at night, did they think the heap was coming down around them, sliding toward their homes, an omnivorous movie terror filling their doorways and windows? The wind carried the stink across the kill…. The biggest secrets are the ones spread before us. (184-5). The landfill looms large on the landscape, a huge pile of evidence for the mass trauma of what remains, those that remain, and what may come—waste in all its virtuality. The “omnivorous movie terror filling their doorways and windows” is a picture of dust-blanketed Downtown NYC that everybody, everywhere, continually saw. The mediatory second sight of sifting the landfill, of combing the second site of the victims for its “sodden second thoughts,” is at once something “you wanted ardently and then did not.” The particles are wanted as a distillate, produced by the frameline of an intentional, processual practice that ‘edits’ 9/11 and its aftermath into a less unacceptable sequence that might allow the familiar mourning ritual of burying a corpse. WTC Families Inc. is seeking to throw the frame of human identity around the unincorporated particles of waste in the Fresh Kills landfill, an unbearably man-made, million-ton mountain. 
This operation is an attempt to immure the victims and their families from the attacks and their afterlife as waste or recycled material, refusing the ever-present virtual life of waste that always accompanied them. Of course, even if WTC Families is granted its wish to sift Fresh Kills, how can it differentiate its remains from those of the 9/11 attackers? The latter have a molecular, virtual afterlife in the present and the living, lumpy reminders that surface as foreign bodies. Resisting the city’s drive to rebuild and move on, WTC Families for Proper Burial Inc. is absorbed with the classification of waste rather than its deployment. In spite of the group’s failed court action, the Fresh Kills site will still be dug over: a civil works project by the NYC Department of Parks & Recreation will reclaim the landfill and rename it “Freshkills Park,” a re-creational area to be twice the size of Central Park—as DeLillo foresaw, “The biggest secrets are the ones spread before us.” References Boxall, Peter. Don DeLillo: The Possibility of Fiction. London: Routledge, 2006. Conte, Joseph M. “Writing amid the Ruins: 9/11 and Cosmopolis”. The Cambridge Companion to Don DeLillo. Ed. John N. Duvall. Cambridge: Cambridge University Press, 2008. 179-192. Cowart, David. Don DeLillo: The Physics of Language. Athens: University of Georgia Press, 2003. DeLillo, Don. Players. London: Vintage, 1991. ———. Mao II. London: Vintage, 1992. ———. Underworld. London: Picador, 1997. ———. “In the Ruins of the Future”. Harper’s. Dec. 2001: 33-40. ———. Cosmopolis. London: Picador, 2003. ———. Falling Man. New York: Scribner, 2007. Hartocollis, Anemona. “Landfill Has 9/11 Remains, Medical Examiner Wrote”. 24 Mar. 2007. The New York Times. 7 Mar. 2009 ‹http://www.nytimes.com/2007/03/24/nyregion/24remains.html›. Hirsch, Adam. “DeLillo Confronts September 11”. 2 May 2007. The New York Sun. 10 May 2007 ‹http://www.nysun.com/arts/delillo-confronts-september-11/53594/›. Hughes, C. J. “9/11 Families Press Judges on Sifting at Landfill”. 16 Dec. 2009. The New York Times. 17 Dec. 2009 ‹http://www.nytimes.com/2009/12/17/nyregion/17sift.html›. Junod, Tom. “The Man Who Invented 9/11”. 7 May 2007. Rev. of Falling Man by Don DeLillo. Esquire. 28 May 2007 ‹http://www.esquire.com/fiction/book-review/delillo›. O’Rourke, Meghan. “DeLillo Seemed Almost Eerily Primed to Write a Novel about 9/11”. 23 May 2007. Slate.com. 28 May 2007 ‹http://www.slate.com/id/2166831/entry/2166848/›.

29

Wang, Jing. "The Coffee/Café-Scape in Chinese Urban Cities." M/C Journal 15, no.2 (May 2, 2012). http://dx.doi.org/10.5204/mcj.468.

Full text

APA, Harvard, Vancouver, ISO, and other styles

Abstract:

Introduction. In this article, I set out to accomplish two tasks. The first is to map coffee and cafés in Mainland China in different historical periods. The second is to focus on coffee and cafés in the socio-cultural milieu of contemporary China in order to understand the symbolic value of the emerging coffee/café-scape. Cafés, rather than coffee, are at the centre of this current trend in contemporary Chinese cities. With instant coffee dominating as a drink, the Chinese have developed a cultural and social demand for cafés, but have not yet developed coffee palates. Historical Coffee Map In 1901, coffee was served in a restaurant in the city of Tianjin. This restaurant, named Kiessling, was run by a German chef, a former soldier who came to China with the Eight-Nation Alliance. At that time, coffee was reserved mostly for foreign politicians and military officials as well as wealthy businessmen—very few ordinary Chinese drank it. (For more history of Kiessling, including pictures and videos, see Kiessling). Another group of coffee consumers were from the cultural elites—the young revolutionary intellectuals and writers with overseas experience. It was almost a fashion among the literary elite to spend time in cafés. However, this was negatively judged as “Western” and “bourgeois.” For example, in 1932, Lu Xun, one of the most important twentieth-century Chinese writers, commented on the café fashion during the 1920s (133-36), and listed the reasons why he would not visit one. He did not drink coffee because it was “foreigners’ food”, and he was too busy writing for the kind of leisure enjoyed in cafés. Moreover, he did not, he wrote, have the nerve to go to a café, and particularly not the Revolutionary Café that was popular among cultural celebrities at that time. He claimed that the “paradise” of the café was for genius, and for handsome revolutionary writers (who he described as having red lips and white teeth, whereas his teeth were yellow). His final complaint was that even if he went to the Revolutionary Café, he would hesitate going in (Lu Xun 133-36). From Lu Xun’s list, we can recognise his nationalism and resistance to what were identified as Western foods and lifestyles. It is easy to also feel his dissatisfaction with those dilettante revolutionary intellectuals who spent time in cafés, talking and enjoying Western food, rather than working. In contrast to Lu Xun’s resistance to coffee and café culture, another well-known writer, Zhang Ailing, frequented cafés when she lived in Shanghai from the 1920s to 1950s. She wrote about the smell of cakes and bread sold in Kiessling’s branch store located right next to her parents’ house (Yuyue). Born into a wealthy family, exposed to Western culture and food at a very young age, Zhang Ailing liked to spend her social and writing time in cafés, ordering her favourite cakes, hot chocolate, and coffee. When she left Shanghai and immigrated to the USA, coffee was an important part of her writing life: the smell and taste reminding her of old friends and Shanghai (Chunzi). However, during Zhang’s time, it was still a privileged and elite practice to patronise a café when these were located in foreign settlements with foreign chefs, and served mainly foreigners, wealthy businessmen, and cultural celebrities. After 1949, when the Chinese Communist Party established the People’s Republic of China, until the late 1970s, there were no coffee shops in Mainland China. 
It was only when Deng Xiaoping suggested neo-liberalism as a so-called “reform-and-open-up” economic policy that foreign commerce and products were again seen in China. In 1988, ten years after the implementation of Deng Xiaoping’s policy, the Nestlé coffee company made the first inroads into the mainland market, featuring homegrown coffee beans in Yunnan province (China Beverage News; Dong; ITC). Nestlé’s bottled instant coffee found its way into the Chinese market, avoiding a direct challenge to the tea culture. Nestlé packaged its coffee to resemble health food products and marketed it as a holiday gift suitable for friends and relatives. As a symbol of modernity and “the West”, coffee-as-gift meshed with the traditional Chinese cultural custom that values gift giving. It also satisfied a collective desire for foreign products (and contact with foreign cultures) during the economic reform era. Even today, with its competitively low price, instant coffee dominates coffee consumption at home, in the workplace, and on Chinese airlines. While Nestlé aimed their product at native Chinese consumers, the multinational companies who later entered China’s coffee market, such as Sara Lee, mainly targeted international hotels such as IHG, Marriott, and Hyatt. The multinationals also favoured coffee shops like Kommune in Shanghai that offered more sophisticated kinds of coffee to foreign consumers and China’s upper class (Byers). If Nestlé introduced coffee to ordinary Chinese families, it was Starbucks who introduced the coffee-based “third space” to urban life in contemporary China on a significant scale. Differing from the cafés before 1949, Starbucks stores are accessible to ordinary Chinese citizens. The first in Mainland China opened in Beijing’s China World Trade Center in January 1999, targeting mainly white-collar workers and foreigners. Starbucks coffee shops provide a space for informal business meetings, chatting with friends, and relaxing and, with its 500th store opened in 2011, dominate the field in China. Starbucks are located mainly in the central business districts and airports, and the company plans to have 1,500 sites by 2015 (Starbucks). Despite this massive presence, Starbucks constitutes only part of the café-scape in contemporary Chinese cities. There are two other kinds of cafés. One type is usually located in universities or residential areas and is frequented mainly by students or locals working in cultural professions. A representative of this kind is Sculpting in Time Café. In November 1997, two years before the opening of the first Starbucks in Beijing, two newlywed college graduates opened the first small Sculpting in Time Café near Beijing University’s East Gate. This has been expanded into a chain, and boasts 18 branches on the Mainland. (For more about its history, see Sculpting in Time Café). Interestingly, both Starbucks and Sculpting in Time Café acquired their names from literature, Starbucks from Moby Dick, and Sculpting in Time from the Russian filmmaker Andrei Tarkovsky’s film diary of the same name. For Chinese students of literature and the arts, drinking coffee is less about acquiring more energy to accomplish their work, and more about entering a sensual world, where the aroma of coffee mixes with the sounds from the coffee machine and music, as well as the lighting of the space. More importantly, cafés with this ambience become, in themselves, cultural sites associated with literature, films, and music. 
Owners of this kind of café are often lovers of foreign literatures, films, and cultures, and their cafés host various cultural events, including forums, book clubs, movie screenings, and music clubs. Generally speaking, coffee served in this kind of café is simpler than in the kind discussed below. This third type of café includes those located in tourist and entertainment sites such as art districts, bar areas, and historical sites, and which are frequented by foreign and native tourists, artists and other cultural workers. If Starbucks cultivates a fast-paced business/professional atmosphere, and Sculpting in Time Cafés an artsy and literary atmosphere, this third kind of café is more like an upscale “bar” with trained baristas serving complicated coffees and emphasising their flavour. These coffee shops are more expensive than the other kinds, with an average price three times that of Starbucks. Currently, cafés of this type are found only in “first-tier” cities and usually located in art districts and tourist areas—such as Beijing’s 798 Art District and Nanluo Guxiang, Shanghai’s Tai Kang Road (a.k.a. “the art street”), and Hangzhou’s Westlake area. While Nestlé and Starbucks use coffee beans grown in Yunnan province, these “art cafés” are more inclined to use imported coffee beans from suppliers like Sara Lee. Coffee and Cafés in Contemporary China After just ten years, there are hundreds of cafés in Chinese cities. Why has there been such a demand for coffee or, more accurately, cafés, in such a short period of time? The first reason is the lack of “third space” environments in Mainland China. Before cafés appeared in the late 1990s, stores like KFC (which opened its first store in 1987) and McDonald’s (with its first store opened in 1990) filled this role for urban residents, providing locations where customers could experience Western food, meet friends, work, or read. In fact, KFC and McDonald’s were once very popular with college students looking for a place to study. Both stores had relatively clean food environments and good lighting. They also had air conditioning in the summer and heating in the winter, which are not provided in most Chinese university dormitories. However, since neither chain was set up to be a café and customers occupying seats for long periods while ordering minimal amounts of food or drink affected profits, staff members began to indirectly ask customers to leave after dining. At the same time, as more people were able to afford to eat at KFC and McDonald’s, their fast foods were also becoming more and more popular, especially among young people. As a consequence, both types of chain restaurant were becoming noisy and crowded and, thus, no longer ideal for reading, studying, or meeting with friends. Although tea has been a traditional drink in Chinese culture, traditional teahouses were expensive places more suitable for business meetings or for the cultural or intellectual elite. Since almost every family owns a tea set and can readily purchase tea, friends and family would usually make and consume tea at home. In recent years, however, new kinds of teahouses have emerged, similar in style to cafés, targeting the younger generation with more affordable prices and a wider range of choices, so the lack of a “third space” does not fully explain the café boom. Another factor affecting the popularity of cafés has been the development and uptake of Internet technology, including the increasing use of laptops and wireless Internet in recent years. 
The Internet has been available in China since the late 1990s, while computers and then laptops entered ordinary Chinese homes in the early twenty-first century. The IT industry has created not only a new field of research and production, but has also fostered new professions and demands. Particularly, in recent years in Mainland China, a new socially acceptable profession—freelancing in such areas as graphic design, photography, writing, film, music, and the fashion industry—has emerged. Most freelancers’ work is computer- and Internet-based. Cafés provide suitable working space, with wireless service, and the bonus of coffee that is, first of all, somatically stimulating. In addition, the emergence of the creative and cultural industries (which are supported by the Chinese government) has created work for these freelancers and, arguably, an increasing demand for café-based third spaces where such people can meet, talk and work. Furthermore, the flourishing of cafés in first-tier cities is part of the “aesthetic economy” (Lloyd 24) that caters to the making and selling of lifestyle experience. Alongside foreign restaurants, bars, galleries, and design firms, cafés contribute to city branding, and link a city to the global urban network. Cafés, like restaurants, galleries and bars, provide a space for the flow of global commodities, as well as for the human flow of tourists, travelling artists, freelancers, and cultural specialists. Finally, cafés provide a type of service that contributes to friendly owner/waiter-customer relations. During the planned-economy era, most stores and hotels in China were State-owned, staff salaries were not related to individual performance, and indifferent (and even unfriendly) service was common. During the economic reform era, privately owned stores and shops began to replace State-owned ones. At the same time, a large number of people from the countryside flowed into the cities seeking opportunities. Most had little if any professional training and so could only find work in factories or in the service industry. However, most café employees are urban, with better educational backgrounds, and many were already familiar with coffee culture. In addition, café owners, particularly those of places like Sculpting in Time Cafe, often invest in creating a positive, community atmosphere, learning about their customers and sharing personal experiences with their regular clients. This leads to my next point—the generation of the 1980s’ need for a social community. Cafés’ Symbolic Value—Community A demand for a sense of community among the generation of the 1980s is a unique socio-cultural phenomenon in China, which paradoxically co-exists with their desire for individualism. The Chinese government introduced the “One Child Policy” in 1979 to slow the rapid population growth in China, and the generations born under this policy are often called “the lonely generations,” with both parents working full-time. At the same time, they are “the generation of me,” labelled as spoiled, self-centred, and obsessed with consumption (de Kloet; Liu; Rofel; Wang). The individuals of this generation, now aged in their 20s and 30s, constitute the primary consumers of coffee in China. Whereas individualism is an important value to them, a sense of community is also desirable in order to compensate for their lack of siblings. Furthermore, the 1980s’ generation has also benefitted from the university expansion policy implemented in 1999. 
Since then, China has witnessed a surge of university students and graduates who not only received scientific and other course-based knowledge, but also had a better chance to be exposed to foreign cultures through their books, music, and movies. With this interesting tension between individualism and collectivism, the atmosphere provided by cafés has fostered a series of curious temporary communities built on cultural and culinary taste. Interestingly, it has become an aspiration of many young college students and graduates to open a community-space style café in a city. One of the best examples is the new Henduoren’s (Many People’s) Café. This was a project initiated by Wen Erniu, a recent college graduate who wanted to open a café in Beijing but did not have sufficient funds to do so. She posted a message on the Internet, asking people to invest a minimum of US$316 to open a café with her. With 78 investors, the café opened in September 2011 in Beijing (see pictures of Henduoren’s Café). In an interview with the China Daily, Wen Erniu stated that, “To open a cafe was a dream of mine, but I could not afford it […] We thought opening a cafe might be many people’s dream […] and we could get together via the Internet to make it come true” (quoted in Liu 2011). Conclusion: Café Culture and (Instant) Coffee in China There is a Chinese saying that if you hate someone, just persuade him or her to open a coffee shop. Since cafés provide spaces where one can spend a relatively long time for little financial outlay, owners have to increase prices to cover their expenses. This can result in fewer customers. In response, cafés—particularly those with cultural and literary ambience—host cultural events to attract people, and/or they offer food and wine along with coffee. The high prices, however, remain. In fact, the average price of coffee in China is often higher than in Europe and North America. For example, a medium Starbucks’ caffè latte in China averaged around US$4.40 in 2010, according to the price list of a Starbucks outlet in Shanghai—and the price has recently increased again (Xinhua 2012). This partially explains why instant coffee is still so popular in China. A bag of instant Nestlé coffee cost only some US$0.25 in a Beijing supermarket in 2010, and requires only hot water, which is accessible free almost everywhere in China, in any restaurant, office building, or household. As an habitual, addictive treat, however, coffee has not yet become a customary, let alone necessary, drink for most Chinese. Moreover, while many, especially those of the older generations, could discern the quality and varieties of tea, very few can judge the quality of the coffee served in cafés. As a result, few Mainland Chinese coffee consumers have a purely somatic demand for coffee—craving its smell or taste—and the highly sweetened and creamed instant coffee offered by companies like Nestlé or Maxwell has largely shaped the current Chinese palate for coffee. Ben Highmore has proposed that “food spaces (shops, restaurants and so on) can be seen, for some social agents, as a potential space where new ‘not-me’ worlds are encountered” (396). He continues to expand that “how these potential spaces are negotiated—the various affective registers of experience (joy, aggression, fear)—reflect the multicultural shapes of a culture (its racism, its openness, its acceptance of difference)” (396). 
Cafés in contemporary China provide spaces where one encounters and constructs new “not-me” worlds, and more importantly, new “with-me” worlds. While café-going communicates an appreciation and desire for new lifestyles and new selves, it can be hoped that in the near future, coffee will also be appreciated for its smell, taste, and other benefits. Of course, it is also necessary that future Chinese coffee consumers also recognise the rich and complex cultural, political, and social issues behind the coffee economy in the era of globalisation. References Byers, Paul [former Managing Director, Sara Lee’s Asia Pacific]. Pers. comm. Apr. 2012. China Beverage News. “Nestlé Acquires 70% Stake in Chinese Mineral Water Producer.” (2010). 31 Mar. 2012 ‹http://chinabevnews.wordpress.com/2010/02/21/nestle-acquires-70-stake-in-chinese-mineral-water-producer›. Chunzi. 张爱玲地图[The Map of Eileen Chang]. 汉语大词典出版 [Hanyu Dacidian Chubanshe], 2003. de Kloet, Jeroen. China with a Cut: Globalization, Urban Youth and Popular Music. Amsterdam: Amsterdam UP, 2010. Dong, Jonathan. “A Caffeinated Timeline: Developing Yunnan’s Coffee Cultivation.” China Brief (2011): 24-26. Highmore, Ben. “Alimentary Agents: Food, Cultural Theory and Multiculturalism.” Journal of Intercultural Studies, 29.4 (2008): 381-98. ITC (International Trade Center). The Coffee Sector in China: An Overview of Production, Trade And Consumption, 2010. Liu, Kang. Globalization and Cultural Trends in China. Honolulu: University of Hawai’i Press, 2004. Liu, Zhihu. “From Virtual to Reality.” China Daily (Dec. 2011) 31 Mar. 2012 ‹http://www.chinadaily.com.cn/life/2011-12/26/content_14326490.htm›. Lloyd, Richard. Neobohemia: Art and Commerce in the Postindustrial City. London: Routledge, 2006. Lu, Xun. “Geming Kafei Guan [Revolutionary Café]”. San Xian Ji. Taibei Shi: Feng Yun Shi Dai Chu Ban Gong Si: Fa Xing Suo Xue Wen Hua Gong Si, Mingguo 78 (1989): 133-36. Rofel, Lisa. Desiring China: Experiments in Neoliberalism, Sexuality, and Public Culture. Durham and London: Duke UP, 2007: 1-30. “Starbucks Celebrates Its 500th Store Opening in Mainland China.” Starbucks Newsroom (Oct. 2011) 31 Mar. 2012. ‹http://news.starbucks.com/article_display.cfm?article_id=580›. Wang, Jing. High Culture Fever: Politics, Aesthetics, and Ideology in Deng’s China. Berkeley, Los Angeles, London: U of California P, 1996. Xinhua. “Starbucks Raises Coffee Prices in China Stores.” Xinhua News (Jan. 2012). 31 Mar. 2012 ‹http://news.xinhuanet.com/english/china/2012-01/31/c_131384671.htm›. Yuyue. Ed. “On the History of the Western-Style Restaurants: Aileen Chang A Frequent Customer of Kiessling.” China.com.cn (2010). 31 Mar. 2012 ‹http://www.china.com.cn/culture/txt/2010-01/30/content_19334964.htm›.

30

Newman, James. "Save the Videogame! The National Videogame Archive: Preservation, Supersession and Obsolescence." M/C Journal 12, no.3 (July15, 2009). http://dx.doi.org/10.5204/mcj.167.

Full text

APA, Harvard, Vancouver, ISO, and other styles

Abstract:

Introduction In October 2008, the UK’s National Videogame Archive became a reality and, after years of negotiation, preparation and planning, this partnership between Nottingham Trent University’s Centre for Contemporary Play research group and The National Media Museum accepted its first public donations to the collection. These first donations came from Sony Computer Entertainment Europe’s London Studios, who presented the original, pre-production PlayStation 2 EyeToy camera (complete with its hand-written #1 sticker), and Harmonix, who crossed the Atlantic to deliver prototypes of the Rock Band drum kit and guitar controllers along with a slew of games. Since then, we have been inundated with donations, enquiries and volunteers offering their services and it is clear that we have exciting and challenging times ahead of us at the NVA as we seek to continue our collecting programme and preserve, conserve, display and interpret these vital parts of popular culture. This essay, however, is not so much a document of these possible futures for our research or the challenges we face in moving forward as it is a discussion of some of the issues that make game preservation a vital and timely undertaking. In briefly telling the story of the genesis of the NVA, I hope to draw attention to some of the peculiarities (in both senses) of the situation in which videogames currently exist. While considerable attention has been paid to the preservation and curation of new media arts (e.g. Cook et al.), comparatively little work has been undertaken in relation to games. Surprisingly, the games industry has been similarly neglectful of the histories of gameplay and gamemaking. Throughout our research, it has become abundantly clear that even those individuals and companies most intimately associated with the development of this form do not hold their corporate and personal histories in the high esteem we expected (see also Lowood et al.). And so, despite the well-worn bluster of an industry that proclaims itself as culturally significant as Hollywood, it is surprisingly difficult to find a definitive copy of the boxart of the final release of a Triple-A title let alone any of the pre-production materials. Through our journeys in the past couple of years, we have encountered shoeboxes under CEOs’ desks and proud parents’ collections of tapes and press cuttings. These are the closest things to a formalised archive that we currently have for many of the biggest British game development and publishing companies. Not only is this problematic in and of itself as we run the risk of losing titles and documents forever as well as the stories locked up in the memories of key individuals who grow ever older, but also it is symptomatic of an industry that, despite its public proclamations, neither places a high value on its products as popular culture nor truly recognises their impact on that culture. While a few valorised, still-ongoing, franchises like the Super Mario and Legend of Zelda series are repackaged and (digitally) re-released so as to provide continuity with current releases, a huge number of games simply disappear from view once their short period of retail limelight passes. 
Indeed, my argument in this essay rests to some extent on the admittedly polemical, and maybe even antagonistic, assertion that the past business and marketing practices of the videogames industry are partly to blame for the comparatively underdeveloped state of game preservation and the seemingly low cultural value placed on old games within the mainstream marketplace. Small wonder, then, that archives and formalised collections are not widespread. However antagonistic this point may seem, this essay does not set out merely to criticise the games industry. Indeed, it is important to recognise that the success and viability of projects such as the NVA is derived partly from close collaboration with industry partners. As such, it is my hope that in addition to contributing to the conversation about the importance and need for formalised strategies of game preservation, this essay goes some way to demonstrating the necessity of universities, museums, developers, publishers, advertisers and retailers tackling these issues in partnership. The Best Game Is the Next Game As will be clear from these opening paragraphs, this essay is primarily concerned with ‘old’ games. Perhaps surprisingly, however, we shall see that ‘old’ games are frequently not that old at all as even the shiniest, and newest of interactive experiences soon slip from view under the pressure of a relentless industrial and institutional push towards the forthcoming release and the ‘next generation’. More surprising still is that ‘old’ games are often difficult to come by as they occupy, at best, a marginalised position in the contemporary marketplace, assuming they are even visible at all. This is an odd situation. Videogames are, as any introductory primer on game studies will surely reveal, big business (see Kerr, for instance, as well as trade bodies such as ELSPA and The ESA for up-to-date sales figures). Given the videogame industry seems dedicated to growing its business and broadening its audiences (see Radd on Sony’s ‘Game 3.0’ strategy, for instance), it seems strange, from a commercial perspective if no other, that publishers’ and developers’ back catalogues are not being mercilessly plundered to wring the last pennies of profit from their IPs. Despite being cherished by players and fans, some of whom are actively engaged in their own private collecting and curation regimes (sometimes to apparently obsessive excess as Jones, among others, has noted), videogames have, nonetheless, been undervalued as part of our national popular cultural heritage by institutions of memory such as museums and archives which, I would suggest, have largely ignored and sometimes misunderstood or misrepresented them. Most of all, however, I wish to draw attention to the harm caused by the videogames industry itself. Consumers’ attentions are focused on ‘products’, on audiovisual (but mainly visual) technicalities and high-definition video specs rather than on the experiences of play and performance, or on games as artworks or artefact. Most damagingly, however, by constructing and contributing to an advertising, marketing and popular critical discourse that trades almost exclusively in the language of instant obsolescence, videogames have been robbed of their historical value and old platforms and titles are reduced to redundant, legacy systems and easily-marginalised ‘retro’ curiosities. 
The vision of inevitable technological progress that the videogames industry trades in reminds us of Paul Duguid’s concept of ‘supersession’ (see also Giddings and Kennedy, on the ‘technological imaginary’). Duguid identifies supersession as one of the key tropes in discussions of new media. The reductive idea that each new form subsumes and replaces its predecessor means that videogames are, to some extent, bound up in the same set of tensions that undermine the longevity of all new media. Chun rightly notes that, in contrast with more open terms like multimedia, ‘new media’ has always been somewhat problematic. Unaccommodating, ‘it portrayed other media as old or dead; it converged rather than multiplied; it did not efface itself in favor of a happy if redundant plurality’ (1). The very newness of new media and of videogames as the apotheosis of the interactivity and multimodality they promise (Newman, "In Search"), their gleam and shine, is quickly tarnished as they are replaced by ever-newer, ever more exciting, capable and ‘revolutionary’ technologies whose promise and moment in the limelight is, in turn, equally fleeting. As Franzen has noted, obsolescence and the trail of abandoned, superseded systems are a natural, even planned-for, product of an infatuation with the newness of new media. For Kline et al., the obsession with obsolescence leads to the characterisation of the videogames industry as a ‘perpetual innovation economy’ whose institutions ‘devote a growing share of their resources to the continual alteration and upgrading of their products’. However, it is my contention here that the supersessionary tendency exerts a more serious impact on videogames than some other media partly because the apparently natural logic of obsolescence and technological progress goes largely unchecked and partly because there remain few institutions dedicated to considering and acting upon game preservation. The simple fact, as Lowood et al. have noted, is that material damage is being done as a result of this manufactured sense of continual progress and immediate, irrefutable obsolescence. By focusing on the upcoming new release and the preview of what is yet to come, by exciting gamers about what is in development, and by demonstrating the manifest ways in which the sheen of the new inevitably tarnishes the old, the industry ensures that that which is replaced is fit only for the bargain bin or the budget-priced collection download, and as such, it is my position that we are systematically undermining and perhaps even eradicating the possibility of a thorough and well-documented history for videogames. This is a situation that we at the National Videogame Archive, along with colleagues in the emerging field of game preservation (e.g. the International Game Developers Association Game Preservation Special Interest Group, and the Keeping Emulation Environments Portable project) are, naturally, keen to address. Chief amongst our concerns is better understanding how it has come to be that, in 2009, game studies scholars and colleagues from across the memory and heritage sectors are still only at the beginning of the process of considering game preservation. The IGDA Game Preservation SIG was founded only five years ago and its ‘White Paper’ (Lowood et al.) has just been published. 
Surprisingly, despite the importance of videogames within popular culture and the emergence and consolidation of the industry as a potent creative force, there remains comparatively little academic commentary or investigation into the specific situation and life-cycles of games or the demands that they place upon archivists and scholars of digital histories and cultural heritage. As I hope to demonstrate in this essay, one of the key tasks of the project of game preservation is to draw attention to the consequences of the concentration, even fetishisation, of the next generation, the new and the forthcoming. The focus on what I have termed ‘the lure of the imminent’ (e.g. Newman, Playing), the fixation on not only the present but also the as-yet-unreleased next generation, has contributed to the normalisation of the discourses of technological advancement and the inevitability and finality of obsolescence. The conflation of gameplay pleasure and cultural import with technological – and indeed, usually visual – sophistication gives rise to a context of endless newness, within which there appears to be little space for the ‘outdated’, the ‘superseded’ or the ‘old’. In a commercial and cultural space in which so little value is placed upon anything but the next game, we risk losing touch with the continuities of development and the practices of play while simultaneously robbing players and scholars of the critical tools and resources necessary for contextualised appreciation and analysis of game form and aesthetics, for instance (see Monnens, "Why", for more on the value of preserving ‘old’ games for analysis and scholarship). Moreover, we risk losing specific games, platforms, artefacts and products as they disappear into the bargain bucket or crumble to dust as media decay, deterioration and ‘bit rot’ (Monnens, "Losing") set in. Space does not here permit a discussion of the scope and extent of the preservation work required (for instance, the NVA sets its sights on preserving, documenting, interpreting and exhibiting ‘videogame culture’ in its broadest sense and recognises the importance of videogames as more than just code and as enmeshed within complex networks of productive, consumptive and performative practices). Neither is it my intention to discuss here the specific challenges and numerous issues associated with archival and exhibition tools such as emulation which seek to rebirth code on up-to-date, manageable, well-supported hardware platforms but which are frequently insensitive to the specificities and nuances of the played experience (see Newman, "On Emulation", for some further notes on videogame emulation, archiving and exhibition and Takeshita’s comments in Nutt on the technologies and aesthetics of glitches, for instance). Each of these issues is vitally important and will doubtless become a part of the forthcoming research agenda for game preservation scholars. My focus here, however, is rather more straightforward and foundational, and though it is deliberately controversial, it is my hope that it casts some light on some ingrained assumptions about videogames and the magnitude and urgency of the game preservation project. Videogames Are Disappearing? 
At a time when retailers’ shelves struggle under the weight of newly-released titles and digital distribution systems such as Steam, the PlayStation Network, Xbox Live Marketplace, WiiWare, DSiWare et al. bring new ways to purchase and consume playable content, it might seem strange to suggest that videogames are disappearing. In addition to what we have perhaps come to think of as the ‘usual suspects’ in the hardware and software publishing marketplace, over the past year or so Apple have, unexpectedly and perhaps even surprising themselves, carved out a new gaming platform with the iPhone/iPod Touch and have dramatically simplified the notoriously difficult process of distributing mobile content with the iTunes App Store. In the face of this apparent glut of games and the emergence and (re)discovery of new markets with the iPhone, Wii and Nintendo DS, videogames seem an ever more vital and visible part of popular culture. Yet, for all their commercial success and seeming penetration, the simple fact is that they are disappearing. And at an alarming rate. Addressing the IGDA community of game developers and producers, Henry Lowood makes the point with admirable clarity (see also Ruggill and McAllister): If we fail to address the problems of game preservation, the games you are making will disappear, perhaps within a few decades. You will lose access to your own intellectual property, you will be unable to show new developers the games you designed or that inspired you, and you may even find it necessary to re-invent a bunch of wheels. (Lowood et al. 1) For me, this point hit home most persuasively a few years ago when, along with Iain Simons, I was invited by the British Film Institute to contribute a book to their ‘Screen Guides’ series. 100 Videogames (Newman and Simons) was an intriguing prospect that provided us with the challenge and opportunity to explore some of the key moments in videogaming’s forty year history. However, although the research and writing processes proved to be an immensely pleasurable and rewarding experience that we hope culminated in an accessible, informative volume offering insight into some well-known (and some less-well known) games, the project was ultimately tinged with more than a little disappointment and frustration. Assuming our book had successfully piqued the interest of our readers into rediscovering games previously played or perhaps investigating games for the first time, what could they then do? Where could they go to find these games in order to experience their delights (or their flaws and problems) at first hand? Had our volume been concerned with television or film, as most of the Screen Guides are, then online and offline retailers, libraries, and even archives for less widely-available materials, would have been obvious ports of call. For the student of videogames, however, the choices are not so much limited as practically non-existent. It is only comparatively recently that videogame retailers have shifted away from an almost exclusive focus on new releases and the zeitgeist platforms towards a recognition of old games and systems through the creation of the ‘pre-owned’ marketplace. The ‘pre-owned’ transaction is one in which old titles may be traded in for cash or against the purchase of new releases of hardware or software. Surely, then, this represents the commercial viability of classic games and is a recognition on the part of retail that the new release is not the only game in town. 
Yet, if we consider more carefully the ‘pre-owned’ model, we find a few telling points. First, there is cold economic sense to the pre-owned business model. In their financial statements for FY08, GAME revealed that the service ‘isn’t just a key part of its offer to consumers, but it also represents an “attractive” gross margin of 39 per cent’ (French). Second, and most important, the premise of the pre-owned business as it is communicated to consumers still offers nothing but primacy to the new release. That one would trade-in one’s old games in order to consume these putatively better new ones speaks eloquently in the language of obsolescence and what Dovey and Kennedy have called the ‘technological imaginary’. The wire mesh buckets of old, pre-owned games are not displayed or coded as treasure troves for the discerning or completist collector but rather are nothing more than bargain bins. These are not classic games. These are cheap games. Cheap because they are old. Cheap because they have had their day. This is a curious situation that affects videogames most unfairly. Of course, my caricature of the videogame retailer is still incomplete as a good deal of the instantly visible shopfloor space is dedicated neither to pre-owned nor new releases but rather to displays of empty boxes often sporting unfinalised, sometimes mocked-up, boxart flaunting titles available for pre-order. Titles you cannot even buy yet. In the videogames marketplace, even the present is not exciting enough. The best game is always the next game. Importantly, retail is not alone in manufacturing this sense of dissatisfaction with the past and even the present. The specialist videogames press plays at least as important a role in reinforcing and normalising the supersessionary discourse of instant obsolescence by fixing readers’ attentions and expectations on the just-visible horizon. Examining the pages of specialist gaming publications reveals them to be something akin to Futurist paeans dedicating anything from 70 to 90% of their non-advertising pages to previews and interviews with developers about still-in-development titles (see Newman, Playing, for more on the specialist gaming press’ love affair with the next generation and the NDA scoop). Though a small number of publications specifically address retro titles (e.g. Imagine Publishing’s Retro Gamer), most titles are essentially vehicles to promote current and future product lines, with many magazines operating as delivery devices for cover-mounted CDs/DVDs offering teaser videos or playable demos of forthcoming titles to further whet the appetite. Manufacturing a sense of excitement might seem wholly natural and perhaps even desirable in helping to maintain a keen interest in gaming culture but the imbalance of popular coverage has a potentially deleterious effect on the status of superseded titles. Xbox World 360’s magnificently-titled ‘Anticip–O–Meter’ ™ does more than simply build anticipation. Like regular features that run under headings such as ‘The Next Best Game in The World Ever is…’, it seeks to author not so much excitement about the imminent release as a dissatisfaction with the present with which unfavourable comparisons are inevitably drawn. The current or previous crop of (once new, let us not forget) titles are not simply superseded but rather are reinvented as yardsticks to judge the prowess of the even newer and unarguably ‘better’. 
As Ashton has noted, the continual promotion of the impressiveness of the next generation requires a delicate balancing act and a selective, institutionalised system of recall and forgetting that recovers the past as a suite of (often technical) benchmarks (twice as many polygons, higher resolution, etc.). In the absence of formalised and systematic collecting, these obsoleted titles run the risk of being forgotten forever once they no longer serve the purpose of demonstrating the comparative advancement of their successors. The Future of Videogaming’s Past Even if we accept the myriad claims of game studies scholars that videogames are worthy of serious interrogation in and of themselves and as part of a multifaceted, transmedial supersystem, we might be tempted to think that the lack of formalised collections, archival resources and readily available ‘old/classic’ titles at retail is of no great significance. After all, as Jones has observed, the videogame player is almost primed to undertake this kind of activity as gaming can, at least partly, be understood as the act and art of collecting. Games such as Animal Crossing make this tendency most manifest by challenging their players to collect objects and artefacts – from natural history through to works of visual art – so as to fill the initially-empty in-game Museum’s cases. While almost all videogames from The Sims to Katamari Damacy can be considered to engage their players in collecting and collection management work to some extent, Animal Crossing is perhaps the most pertinent example of the indivisibility of the gamer/archivist. Moreover, the permeability of the boundary between the fan’s collection of toys, dolls, posters and the other treasured objects of merchandising and the manipulation of inventories, acquisitions and equipment lists that we see in the menus and gameplay imperatives of videogames ensures an extensiveness and scope of fan collecting and archival work. Similarly, the sociality of fan collecting and the value placed on private hoarding, public sharing and the processes of research ‘…bridges to new levels of the game’ (Jones 48). Perhaps we should be as unsurprised that their focus on collecting makes videogames similar to eBay as we are to the realisation that eBay with its competitiveness, its winning and losing states, and its inexorable countdown timer, is nothing if not a game? We should be wary, however, of overstating the positive effects of fandom on the fate of old games. Alongside eBay’s veneration of the original object, p2p and bittorrent sites reduce the videogame to its barest. Quite apart from the (il)legality of emulation and videogame ripping and sharing (see Conley et al.), the existence of ‘ROMs’ and the technicalities of their distribution reveals much about the peculiar tension between the interest in old games and their putative cultural and economic value. (St)ripped down to the barest of code, ROMs deny the gamer the paratextuality of the instruction manual or boxart. In fact, divorced from their context and robbed of their materiality, ROMs perhaps serve to make the original game even more distant. More tellingly, ROMs are typically distributed by the thousand in zipped files. And so, in just a few minutes, entire console back-catalogues – every game released in every territory – are available for browsing and playing on a PC or Mac. 
The completism of the collections allows detailed scrutiny of differences in Japanese versus European releases, for instance, and can be seen as a vital investigative resource. However, that these ROMs are packaged into collections of many thousands speaks implicitly of these games’ perceived value. In a similar vein, the budget-priced retro re-release collection helps to diminish the value of each constituent game and serves to simultaneously manufacture and highlight the manifestly unfair comparison between these intriguingly retro curios and the legitimately full-priced games of now and next. Customer comments at Amazon.co.uk demonstrate the way in which historical and technological comparisons are now solidly embedded within the popular discourse (see also Newman 2009b). Leaving feedback on Sega’s PS3/Xbox 360 Sega MegaDrive Ultimate Collection, customers berate the publisher for the apparently meagre selection of titles on offer. Interestingly, this charge seems based less on the quality, variety or range of the collection than on jarring technological schisms and a clear sense of these titles being of necessarily and inevitably diminished monetary value. Comments range from outraged consternation, ‘Wtf, only 40 games?’, ‘I wont be getting this as one disc could hold the entire arsenal of consoles and games from commodore to sega saturn(Maybe even Dreamcast’ through to more detailed analyses that draw attention to the number of bits and bytes but that notably neglect any consideration of gameplay, experientiality, cultural significance or, heaven forbid, fun. “Ultimate” Collection? 32Mb of games on a Blu-ray disc?…here are 40 Megadrive games at a total of 31 Megabytes of data. This was taking the Michael on a DVD release for the PS2 (or even on a UMD for the PSP), but for a format that can store 50 Gigabytes of data, it’s an insult. Sega’s entire back catalogue of Megadrive games only comes to around 800 Megabytes - they could fit that several times over on a DVD. The ultimate consequence of these different but complementary attitudes to games that fix attentions on the future and package up decontextualised ROMs by the thousand or even collections of 40 titles on a single disc (selling for less than half the price of one of the original cartridges) is a disregard – perhaps even a disrespect – for ‘old’ games. Indeed, it is this tendency, this dominant discourse of inevitable, natural and unimpeachable obsolescence and supersession, that provided one of the prime motivators for establishing the NVA. As Lowood et al. note in the title of the IGDA Game Preservation SIG’s White Paper, we need to act to preserve and conserve videogames ‘before it’s too late’. References Ashton, D. ‘Digital Gaming Upgrade and Recovery: Enrolling Memories and Technologies as a Strategy for the Future.’ M/C Journal 11.6 (2008). 13 Jun 2009 ‹http://journal.media-culture.org.au/index.php/mcjournal/article/viewArticle/86›. Buffa, C. ‘How to Fix Videogame Journalism.’ GameDaily 20 July 2006. 13 Jun 2009 ‹http://www.gamedaily.com/articles/features/how-to-fix-videogame-journalism/69202/?biz=1›. ———. ‘Opinion: How to Become a Better Videogame Journalist.’ GameDaily 28 July 2006. 13 Jun 2009 ‹http://www.gamedaily.com/articles/features/opinion-how-to-become-a-better-videogame-journalist/69236/?biz=1›. ———. ‘Opinion: The Videogame Review – Problems and Solutions.’ GameDaily 2 Aug. 2006. 
13 Jun 2009 ‹http://www.gamedaily.com/articles/features/opinion-the-videogame-review-problems-and-solutions/69257/?biz=1›. ———. ‘Opinion: Why Videogame Journalism Sucks.’ GameDaily 14 July 2006. 13 Jun 2009 ‹http://www.gamedaily.com/articles/features/opinion-why-videogame-journalism-sucks/69180/?biz=1›. Cook, Sarah, Beryl Graham, and Sarah Martin, eds. Curating New Media. Gateshead: BALTIC, 2002. Duguid, Paul. ‘Material Matters: The Past and Futurology of the Book.’ In Geoffrey Nunberg, ed. The Future of the Book. Berkeley, CA: University of California Press, 1996. 63–101. French, Michael. ‘GAME Reveals Pre-Owned Trading Is 18% of Business.’ MCV 22 Apr. 2009. 13 Jun 2009 ‹http://www.mcvuk.com/news/34019/GAME-reveals-pre-owned-trading-is-18-per-cent-of-business›. Giddings, Seth, and Helen Kennedy. ‘Digital Games as New Media.’ In J. Rutter and J. Bryce, eds. Understanding Digital Games. London: Sage. 129–147. Gillen, Kieron. ‘The New Games Journalism.’ Kieron Gillen’s Workblog 2004. 13 June 2009 ‹http://gillen.cream.org/wordpress_html/?page_id=3›. Jones, S. The Meaning of Video Games: Gaming and Textual Strategies. New York: Routledge, 2008. Kerr, A. The Business and Culture of Digital Games. London: Sage, 2006. Lister, Martin, John Dovey, Seth Giddings, Ian Grant and Kevin Kelly. New Media: A Critical Introduction. London and New York: Routledge, 2003. Lowood, Henry, Andrew Armstrong, Devin Monnens, Zach Vowell, Judd Ruggill, Ken McAllister, and Rachel Donahue. Before It's Too Late: A Digital Game Preservation White Paper. IGDA, 2009. 13 June 2009 ‹http://www.igda.org/wiki/images/8/83/IGDA_Game_Preservation_SIG_-_Before_It%27s_Too_Late_-_A_Digital_Game_Preservation_White_Paper.pdf›. Monnens, Devin. ‘Why Are Games Worth Preserving?’ In Before It's Too Late: A Digital Game Preservation White Paper. IGDA, 2009. 13 June 2009 ‹http://www.igda.org/wiki/images/8/83/IGDA_Game_Preservation_SIG_-_Before_It%27s_Too_Late_-_A_Digital_Game_Preservation_White_Paper.pdf›. ———. ‘Losing Digital Game History: Bit by Bit.’ In Before It's Too Late: A Digital Game Preservation White Paper. IGDA, 2009. 13 June 2009 ‹http://www.igda.org/wiki/images/8/83/IGDA_Game_Preservation_SIG_-_Before_It%27s_Too_Late_-_A_Digital_Game_Preservation_White_Paper.pdf›. Newman, J. ‘In Search of the Videogame Player: The Lives of Mario.’ New Media and Society 4.3 (2002): 407-425. ———. ‘On Emulation.’ The National Videogame Archive Research Diary, 2009. 13 June 2009 ‹http://www.nationalvideogamearchive.org/index.php/2009/04/on-emulation/›. ———. ‘Our Cultural Heritage – Available by the Bucketload.’ The National Videogame Archive Research Diary, 2009. 10 Apr. 2009 ‹http://www.nationalvideogamearchive.org/index.php/2009/04/our-cultural-heritage-available-by-the-bucketload/›. ———. Playing with Videogames. London: Routledge, 2008. ———, and I. Simons. 100 Videogames. London: BFI Publishing, 2007. Nutt, C. ‘He Is 8-Bit: Capcom's Hironobu Takeshita Speaks.’ Gamasutra 2008. 13 June 2009 ‹http://www.gamasutra.com/view/feature/3752/›. Radd, D. ‘Gaming 3.0. Sony’s Phil Harrison Explains the PS3 Virtual Community, Home.’ Business Week 9 Mar. 2007. 13 June 2009 ‹http://www.businessweek.com/innovate/content/mar2007/id20070309_764852.htm?chan=innovation_game+room_top+stories›. Ruggill, Judd, and Ken McAllister. ‘What If We Do Nothing?’ Before It's Too Late: A Digital Game Preservation White Paper. IGDA, 2009. 13 June 2009. 
‹http://www.igda.org/wiki/images/8/83/IGDA_Game_Preservation_SIG_-_Before_It%27s_Too_Late_-_A_Digital_Game_Preservation_White_Paper.pdf›. 16-19.

31

Bender, Stuart Marshall. "You Are Not Expected to Survive: Affective Friction in the Combat Shooter Game Battlefield 1." M/C Journal 20, no.1 (March15, 2017). http://dx.doi.org/10.5204/mcj.1207.

Full text

APA, Harvard, Vancouver, ISO, and other styles

Abstract:

Introduction I stumble to my feet breathing heavily and, over the roar of a tank, a nearby soldier yells right into my face: “We’re surrounded! We have to hold this line!” I follow him, moving past burning debris and wounded men being helped walk back in the opposite direction. Shells explode around me, a whistle sounds, and then the Hun attack; shadowy figures that I fire upon as they approach through the battlefield fog and smoke. I shoot some. I take cover behind walls as others fire back. I reload the weapon. I am hit by incoming fire, and a red damage indicator appears onscreen, so I move to a better cover position. As I am hit again and again, the image becomes blurry and appears as if in slow-motion, the sound also becoming muffled. As an enemy wielding a flame-thrower appears and blasts me with thick fire, my avatar gasps and collapses. The screen fades to black.So far, so very normal in the World War One themed first-person shooter Battlefield 1 (Electronic Arts 2016). But then the game does something unanticipated. I expect to reappear—or respawn—in the same scenario to play better, to stay in the fight longer. Instead, the camera view switches to an external position, craning upwards cinematically from my character’s dying body. Text superimposed over the view indicates the minimalist epitaph: “Harvey Nottoway 1889-1918.” The camera view then races backwards, high over the battlefield and finally settles into position behind a mounted machine-gun further back from the frontline as the enemy advances closer. Immediately I commence shooting, mowing down German troops as they enter our trenches. Soon I am hit and knocked away from the machine-gun. Picking up a shotgun I start shooting the enemy at close-quarters, until I am once again overrun and my character collapses. Now the onscreen text states I was playing as “Dean Stevenson 1899-1918.”I have attempted this prologue to the Battlefield 1 campaign a number of times. No matter how skilfully I play, or how effectively I simply run away and hide from the combat, this pattern continues: the structure of the game forces the player’s avatar to be repeatedly killed in order for the narrative to progress. Over a series of player deaths, respawning as an entirely new character each time, the combat grows in ferocity and the music also becomes increasingly frenetic. The fighting turns to hand-to-hand combat, or shovel-to-head combat to be more precise, and eventually an artillery barrage wipes everybody out (Figure 1). At this point, the prologue is complete and the gamer may continue in a variety of single-player episodes in different theatres of WW1, each of which is structured according to the normal rules of combat games: when your avatar is killed, you respawn at the most recent checkpoint for a follow-up attempt.What are we to make of this alternative narrative structure deployed by the opening episode of Battlefield 1? In contrast to the normal video-game affordances of re-playability until completion, this narrative necessitation of death is in some ways motivated by the onscreen text that introduces the prologue: “What follows is frontline combat. You are not expected to survive.” Certainly it is true that the rest of the game (either single-player or in its online multiplayer deathmatch mode) follows the predictable pattern of dying, replaying, completing. 
Nor would we expect Battlefield 1 to be motivated primarily by a kind of historical fidelity, given that an earlier instalment in the series, Battlefield 1942 (2002), was described by one reviewer as: a comic book version of WWII. The fact that any player can casually hop into a tank, drive around, hop out and pick off an enemy soldier with a sniper rifle, hop into a plane, parachute out, and then call in artillery fire (within the span of a few minutes) should tell you a lot about the game. (Osborne) However, what is happening in this will-to-die structure of the game’s prologue represents an alternative and affectively unsettling game experience, both in its ludological structure and in its affective impact. Defamiliarization and Humanization Drawing upon a phenomenology of game-play, whereby the scholar examines the game “as played” (see Atkins and Kryzwinska; Keogh; Wilson) to consider how the text reveals itself to the player, I argue that the introductory single-player episode of Battlefield 1 functions to create a defamiliarizing effect on the player. Defamiliarization, the Russian Formalist term for the effect created by art when some unusual aspect of a text challenges accepted perceptions and/or representations (Schklovski; Thompson), is a remarkably common effect created by the techniques used in combat cinema and video-games. This is unsurprising. After all, warfare is one of the very examples Schklovski uses as something that audiences have developed habituated responses to and which artworks must defamiliarize. The effect may be created by many techniques in a text, and in certain cases a work may defamiliarize even its own form. For instance, recent work on the violence in Saving Private Ryan shows that during the lengthy Omaha Beach sequence, the most vivid instances of violence—including the famous shot of a soldier picking up his dismembered arm—occur well after the audience has potentially become inured to the onslaught of the earlier frequent, but less graphic, carnage (Bender Film Style and WW2). To make these moments stand out with equivalent horrific impact against the background of the Normandy beach bloodbath, Spielberg also treats them with a stuttered frame effect and accompanying audio distortion, motivated (to use a related Formalist term) by the character’s apparent concussion and temporary disorientation. Effectively a sequence of point of view shots then, this moment in Private Ryan has become a model for many other war texts, and indeed the player’s death in the opening sequence of Battlefield 1 is portrayed using a very similar (though not identical) audio-visual treatment (Figure 2). Although the Formalists never played videogames, recent scholarship has approached the medium from a similar perspective. For example, Brendan Keogh has focused on the challenges to traditional videogame pleasure generated by the 2012 dystopian shooter Spec Ops: The Line. Keogh notes that the game developers intended to create displeasure and “[force] the player to consider what is obscured in the pixilation of war” by, for instance, having them kill fellow American troops in order for the game narrative to continue (Keogh 9). 
In addition, the game openly taunts the player’s expectations of entertainment-based, uncritical run-and-gun gameplay with onscreen text during level loading periods such as “Do you feel like a hero yet?” (8). These kinds of challenges to the expectations of entertainment in combat shooters are found also in one sequence from the 2009 game Call of Duty: Modern Warfare 2 in which the player—as an undercover operative—is forced to participate in a terrorist attack in which civilians are killed (Figure 3). While playing that level, titled “No Russian,” Timothy Welsh argues: “The player may shoot the unarmed civilians or not; the level still creeps slowly forward regardless” (Welsh 409). In Welsh’s analysis, this level emerges as an unusual attempt by a popular video game to “humanize” the non-playing characters that are ordinarily gunned down without any critical and self-reflective thought by the player in most shooter games. The player is forced into a scenario in which they must make a highly difficult ethical choice, but the game will show civilians being killed either way. In contrast to the usual criticisms of violent video games—e.g., that they may be held responsible for school shootings, increased adolescent aggression and so on—the “No Russian” sequence drew dramatic complaints of being a “terrorist simulator” (Welsh 389). But for Welsh this ethical choice facing the player, to shoot or not to shoot civilians, raises the game to a textual experience offering self-inspection. As in the fictional theme park of Westworld (HBO 2016), it does not really matter to the digital victim if a player kills them, but it should—and does—matter to the player. There are no external consequences to killing a computer game character composed only of pixels, or killing/raping a robot in the Westworld theme park; however, there are internal consequences: it makes you a killer, or a rapist (see Harris and Bloom). Thus, from the perspective of defamiliarization, the game can be regarded as creating the effect that Matthew Payne has labelled “critical displeasure.” Writing about the way this is created by Spec Ops, Payne argues that: the result is a game that wields its affective distance as a critique of the necessary illusion that all military shooters trade in, but one that so few acknowledge. In particular, the game’s brutal mise-en-scène, its intertextual references to other war media, and its real and imagined opportunities for player choice, create a discordant feeling that lays bare the ease with which most video war games indulge in their power fantasies. (Payne 270) There is, then, a minor tradition of alternative military-themed video game works that attempt to invite or enable the player to conduct a kind of ethical self-examination around their engagement with interactive representations of war via particular incursions of realism. The critical displeasure invoked by texts such as Spec Ops and the “No Russian” level of Call of Duty is particularly interesting in light of another military game that was ultimately cancelled by the publisher after it received public criticism. Titled Six Days in Fallujah, the game was developed with the participation of Marines who had fought in that real life battle and aimed to depict the events as they unfolded in 2004 during the campaign in Iraq. 
As Justin Rashid argues: the controversy that arose around Six Days in Fallujah was, of course, a result of the view that commercial video games can only ever be pure entertainment; games do not have the authority or credibility to be part of a serious debate. (Rashid 17) On this basis, perhaps a criterial attribute of an acceptable alternative military game is that there is enough familiarity to evoke some critical distance, but not so much familiarity that the player must think about legitimately real-life consequences and impact. After all, Call of Duty was a successful release, even amid the controversy of “No Russian.” This makes sense as the level does not really challenge the overall enjoyment of the game. The novelty of the level, on the one hand, is that it is merely one part of the general narrative and cannot be regarded as representative of the whole game experience. On the other hand, because none of the events and scenarios have a clear indexical relationship to real-world terrorist attacks (at least prior to the Brussels attack in 2016), it is easy to play the ethical choice of shooting or not shooting civilians as a mental exercise rather than a reflection on something that really happened. This is the same lesson learned by the developers of the 2010 game Medal of Honor who ultimately changed the name of the enemy soldiers from “The Taliban” to “OPFOR” (standing in for a generic “Opposing Forces”) after facing pressure from the US and UK Military who claimed that the multiplayer capacities of the game enabled players to play as the Taliban (see Rashid). Conclusion: Affective Friction in Battlefield 1 In important ways, then, these game experiences are precursors to Battlefield 1’s single player prologue. However, the latter does not attempt a wholesale deconstruction of the genre—as does Spec Ops—or represent an attempt to humanise (or perhaps re-humanise) the non-playable victim characters as Welsh suggests “No Russian” attempts to do. Battlefield 1’s opening structure of death-and-respawn-as-different-character can be read as humanizing the player’s avatar. But most importantly, I take Battlefield’s initially unusual gameplay as an aesthetic attempt to set a particular tone for the game. Motivated by the general cultural attitude of deferential respect for the Great War, Battlefield 1 takes an almost austere stance toward the violence depicted, paradoxically even as this impact is muted in the later gameplay structured according to normal multiplayer deathmatch rules of run-and-gun killing. The futility implied by the player’s constant dying is clearly motivated by an attempt at realism as one of the cultural memories of World War One is the sheer likelihood of being killed, whether as a frontline soldier or a citizen of a country engaged in combat (see Kramer). For Battlefield 1, the repeated dying is really part of the text’s aesthetic engagement. For this reason I prefer the term affective friction rather than critical displeasure. The austere tone of the game is indicated early, just prior to the prologue gameplay, with onscreen text that reads: Battlefield 1 is based on events that unfolded over 100 years ago / More than 60 million soldiers fought in “The War to End All Wars” / It ended nothing. / Yet it changed the world forever. At a simple level, the player’s experience of being killed in order for the next part of the narrative to progress evokes this sense of futility. 
There have been real responses indicating this; for instance, one reviewer argues that the structure is “a powerful treatment” (Howley). But there is potential for increased engagement with the game itself as the structure breaks the replay-cycle of usual games. For instance, another reviewer responds to the overall single-player campaign by suggesting “It is not something you can sit down and play through and not experience on a higher level than just clicking a mouse and tapping a keyboard” (Simpson). This affective friction amplifies, and draws attention to, the other advances in violent stylistics presented in the game. For instance, although the standard onscreen visual distortions are used to show character damage and the direction from which the attack came, the game does use slow-motion to draw out the character’s death. In addition, the game features incidental battlefield details of shell-shock, such as soldiers simply holding their heads in their hands, frozen as the battle rages around them (Figure 4). The presence of flame-thrower troops, and subsequently the depictions of characters running as they burn to death, are also significant developments in violent aesthetics from earlier games. These elements of violence are constitutive of the affective friction. We may marvel at the technical achievement of such real-time rendering of dynamic fire and the artistic care given to animate deaths and shell-shock depictions. But simultaneously, these “violent delights”—to borrow from Westworld’s citation of Shakespeare—are innovations upon the depictions of earlier games, even contemporary combat games. Indeed, one critic has almost ashamedly noted: “For a game about one of the most horrific wars in human history, it sure is pretty” (Kain). These violent depictions show a continuation in the tradition of increased detail which has been linked to a model of “reported realism” as a means of understanding audiences’ claims of realism in combat films and modern videogames as a result primarily of their hypersaturated audio-visual texture (Bender "Blood Splats"). Here, saturation refers not to the specific technical quality of colour saturation but to the densely layered audio-visual structure often found in contemporary films and videogames. For example, thick mixing of soundtracks, details of gore, and nuanced movements (particularly of dying characters) all contribute to a hypersaturated aesthetic which tends to prompt audiences to make claims of realism for a combat text regardless of whether or not these viewers/players have any real world referent for comparison. Of course, there are likely to be players who will simply blast through any shooter game, giving no regard to the critical displeasure offered by Spec Ops’ narrative choices or the ethical dilemma of “No Russian.” There are also likely to be players who bypass the single-player campaign altogether and only bother with the multiplayer deathmatch experience, which functions in the same way as it does in other shooter games, including the previous Battlefield games. But perhaps the value of this game’s attempt at alternative storytelling, with its emphasis on tone and affect, is that even the “kill-em-all” player may experience a momentary impact from the violence depicted. This is particularly important given that, to borrow from Stephanie Fisher’s argument in regard to WW2 games, many young people encounter the history of warfare through such popular videogames (Fisher). 
In the centenary period of World War One, especially in Australia amid the present Anzac commemorative moment, such games offer young audiences an opportunity to engage with the significance of the events. As a side-note, the later part of the single-player campaign even has a Gallipoli sequence, though the narrative of this component is designed as an action-hero adventure. Indeed, this is one example of how the alternative dying-to-continue structure of the prologue creates an affective friction against the normal gameplay and narratives that feature in the rest of the text. The ambivalent ways in which this unsettling opening scenario impacts on the remainder of the game-play, including for instance its depiction of PTSD, are illustrated by some industry reviewers. As one reviewer argues, the game does generate the feeling that “war isn’t fun — except when it is” (Plante). From this view, the cognitive challenge created by the will to die in the prologue creates an affective friction with the normalised entertainment inherent in the game’s multiplayer run-and-gun components that dominate the rest of Battlefield 1’s experience. Therefore, although Battlefield 1 ultimately proves to be an entertainment-oriented combat shooter, it is significant that the developers of this major commercial production decided to include an experimental structure for the prologue as a way of generating tone and affect in a fresh way. References Atkins, Barry, and Tanya Kryzwinska. "Introduction: Videogame, Player, Text." Videogame, Player, Text. Eds. Atkins, Barry and Tanya Kryzwinska. Manchester: Manchester University Press, 2007. Bender, Stuart Marshall. "Blood Splats and Bodily Collapse: Reported Realism and the Perception of Violence in Combat Films and Videogames." Projections 8.2 (2014): 1-25. Bender, Stuart Marshall. Film Style and the World War II Combat Film. Newcastle, UK: Cambridge Scholars Publishing, 2013. Fisher, Stephanie. "The Best Possible Story? Learning about WWII from FPS Video Games." Guns, Grenades, and Grunts: First-Person Shooter Games. Eds. Gerald A. Voorhees, Josh Call and Katie Whitlock. New York: Continuum, 2012. 299-318. Harris, Sam, and Paul Bloom. "Waking Up with Sam Harris #56 – Abusing Dolores." Sam Harris 12 Dec. 2016. Howley, Daniel. "Review: Beautiful Battlefield 1 Gives the War to End All Wars Its Due Respect." Yahoo! 2016. Kain, Erik. "'Battlefield 1' Is Stunningly Beautiful on PC." Forbes 2016. Keogh, Brendan. Spec Ops: The Line's Conventional Subversion of the Military Shooter. Paper presented at DiGRA 2013: Defragging Game Studies. Kramer, Alan. Dynamic of Destruction: Culture and Mass Killing in the First World War. Oxford: Oxford University Press, 2007. Osborne, Scott. "Battlefield 1942 Review." GameSpot 2002. Payne, Matthew Thomas. "War Bytes: The Critique of Militainment in Spec Ops: The Line." Critical Studies in Media Communication 31.4 (2014): 265-82. Plante, Chris. "Battlefield 1 Is Excellent Because the Series Has Stopped Trying to Be Call of Duty." The Verge 2016. Rashid, Justin. Terrorism in Video Games and the Storytelling War against Extremism. Paper presented at Hawaii International Conference on Arts and Humanities, 9-12 Jan. 2011. Schklovski, Viktor. "Sterne's Tristram Shandy: Stylistic Commentary." Trans. Lee T. Lemon and Marion J. Reis. Russian Formalist Criticism: Four Essays. Lincoln: University of Nebraska Press, 1965. 25-60. Simpson, Campbell. "Battlefield 1 Isn't a Game: It's a History Lesson." Kotaku 2016. Thompson, Kristin. 
Breaking the Glass Armor: Neoformalist Film Analysis. New Jersey: Princeton University Press, 1988. Welsh, Timothy. "Face to Face: Humanizing the Digital Display in Call of Duty: Modern Warfare 2." Guns, Grenades, and Grunts: First-Person Shooter Games. Eds. Gerald A. Voorhees, Josh Call, and Katie Whitlock. New York: Continuum, 2012. 389-414. Wilson, Jason Anthony. "Gameplay and the Aesthetics of Intimacy." PhD diss. Brisbane: Griffith University, 2007.

32

Bowles, Kate. "Academia 1.0: Slow Food in a Fast Food Culture? (A Reply to John Hartley)." M/C Journal 12, no.3 (July15, 2009). http://dx.doi.org/10.5204/mcj.169.

Full text

APA, Harvard, Vancouver, ISO, and other styles

Abstract:

"You could think of our kind of scholarship," he said, "as something like 'slow food' in a fast-food culture."— Ivan Kreilkamp, co-editor of Victorian Studies(Chronicle of Higher Education, March 2009) John Hartley’s entertaining and polemical defense of a disappearing art form (the print copy journal designed to be ripped eagerly from its envelope and read from cover to cover like a good book) came my way via the usual slightly disconcerting M/C Journal overture: I believe that your research interests and background make you a potential expert reviewer of the manuscript, "LAMENT FOR A LOST RUNNING ORDER? OBSOLESCENCE AND ACADEMIC JOURNALS," which has been submitted to the '' [sic] issue of M/C Journal. The submission's extract is inserted below, and I hope that you will consider undertaking this important task for us. Automated e-mails like these keep strange company, with reminders about overdue library items and passwords about to expire. Inevitably their tone calls to mind the generic flattery of the internet scam that announces foreign business opportunities or an unexpectedly large windfall from a deceased relative. At face value, this e-mail confirms John Hartley’s suspicions about the personalised craft of journal curation. Journal editing, he implies, is going the way of drywalling and smithying—by the time we realise these ancient and time-intensive skills have been lost, it’ll be too late. The usual culprit is to the fore—the internet—and the risk presented by obsolescence is very significant. At stake is the whole rich and messy infrastructure of academic professional identity: scholarly communication, goodwill, rank, trust, service to peers, collegiality, and knowledge itself. As a time-poor reader of journals both online and in print I warmed to this argument, and enjoyed reading about the particularities of journal editing: the cultivation and refinement of a specialised academic skill set involving typefaces, cover photographs and running order. Journal editors are our creative directors. Authors think selfishly and not always consistently about content, position and opportunity, but it’s the longer term commitment of editors to taking care of their particular shingle in the colourful and crowded bazaar of scholarly publishing, that keeps the market functioning in a way that also works for inspectors and administrators. Thinking of all the print journals I’ve opened and shut and put on shelves (sometimes still in their wrappers) and got down again, and photocopied, and forgotten about, I realised that I do retain a dim sense of their look and shape, and that in practical ways this often helps me remember what was in them. Nevertheless, even having been through the process he describes, whereby “you have to log on to some website and follow prompts in order to contribute both papers and the assessment of papers; interactions with editors are minimal,” I came to the conclusion that he had underestimated the human in the practice of refereeing. I wasn’t sure made me an expert reviewer for this piece, except perhaps that in undertaking the review itself I was practising a kind of expertise that entitled me to reflect on what I was doing. 
So as a way of wrestling with the self-referentiality of the process of providing an anonymous report on an article whose criticism of blind refereeing I shared, I commented on the corporeality and collegiality of the practice: I knew who I was writing about (and to), and I was conscious of both disagreeing and wondering how to avoid giving offence. I was also cold in my office, and wondering about a coffee. “I suspect the cyborg reviewer is (like most cyborgs) a slightly romantic, or at least rhetorical, fantasy,” I added, a bit defensively. “Indeed, the author admits to practising editorship via a form of human intersubjectivity that involves email, so the mere fact that the communication in some cases is via a website doesn’t seem to render the human obsolete.” The cyborg reviewer wasn’t the only thing bothering me about the underlying assumptions concerning electronic scholarly publishing, however. The idea that the electronic disaggregation of content threatens the obsolescence of the print journal and its editor is a little disingenuous. Keyword searches do grab articles independently of issues, it’s true, but it’s a stretch to claim that this functionality is what’s turning diligent front-to-back readers and library flaneurs into the kinds of online mercenaries we mean when we say “users”. Quite the opposite: journal searches are highly seductive invitations to linger and explore. Setting out from the starting point of a single article, readers can now follow a citation trail, or chase up other articles by the same author or on similar topics, all the while keeping in plain sight the running order that was designed by the editors as an apt framework for the piece when it first appeared. Journal publishers have the keenest investment in nurturing the distinctive brand of each of their titles, and as a result the journal name is never far from view. Even the cover photo and layout is now likely to be there somewhere, and to crop up often as readers retrace their steps and set out again in another direction. So to propose that online access makes the syntactical form of a journal issue irrelevant to readers is to underestimate both the erotics of syntax, and the capacity of online readers to cope with a whole new libidinous economy of searching characterised by multiple syntactical options. And if readers are no longer sequestered within the pages of an individual hard copy journal—there really is a temptation to mention serial monogamy here—their freedom to operate more playfully only draws attention to the structural horizontalities of the academic public sphere, which is surely the basis of our most durable claims to profess expertise. Precisely because we are hyperlinked together across institutions and disciplines, we can justly argue that we are perpetually peer-reviewing each other, in a fairly disinterested fashion, and no longer exclusively in the kinds of locally parochial clusters that have defined (and isolated) the Australian academy. So although disaggregation irritates journal editors, a more credible risk to their craft comes from the disintermediation of scholarly communication that is one of the web’s key affordances. The shift towards user generated content, collaboratively generated, openly accessible and instantly shareable across many platforms, does make traditional scholarly publishing, with its laborious insistence on double blind refereeing, look a bit retro. 
How can this kind of thing not become obsolete given how long it takes for new ideas to make their way into print, what with all that courtly call and response between referees, editors and authors, and the time consumed in arranging layout and running order and cover photos? Now that the hegemons who propped up the gold standard journals are blogging and podcasting their ideas, sharing their bookmarks, and letting us know what they’re doing by the hour on Twitter, with presumably no loss of quality to their intellectual presence, what kind of premium or scarcity value can we place on the content they used to submit to print and online journals? So it seems to me that the blogging hegemon is at least as much of a problem for the traditional editor as the time challenged browser hoping for a quick hit in a keyword search. But there are much more complicated reasons why the journal format itself is not at risk, even from www.henryjenkins.org. Indeed, new “traditional” journals are being proposed and launched all the time. The mere award of an A* for the International Journal of Cultural Studies in the Australian journal rankings (Australian Research Council) confirms that journals are persistently evaluated in their own right, that the brand of the aggregating instrument still outranks the bits and pieces of disaggregated content, and that the relative standing of different journals depends precisely on the quantification of difficulty in meeting the standards (or matching the celebrity status) of their editors, editorial boards and peer reviewing panels. There’s very little indication in this process that either editors or reviewers are facing obsolescence; too many careers still depend on their continued willingness to stand in the way of the internet’s capacity to let anyone have a go at presenting ideas and research in the public domain. As the many inputs to the ERA exercise endlessly, and perhaps a bit tediously, confirmed, it’s the reputation of editors and their editorial practices that signals the exclusivity of scholarly publishing: in the era of wikis and blogs, an A* journal is one club that’s not open to all. Academia 1.0 is resilient for all these straightforward reasons. Not only in Australia, tenure and promotion depend on it. As a result, since the mid 1990s, editors, publishers, librarians and other stakeholders in scholarly communication have been keeping a wary eye on the pace and direction of change to either its routines or its standards. Their consistent attention has been on the proposition that the risk comes from something loosely defined as “digital”. But as King, Tenopir and Clark point out in their study of journal readership in the sciences, the relevance of journal content itself has been extensively disputed and investigated across the disciplines since the 1960s. Despite the predictions of many authors in the 1990s that electronic publishing and pre-publishing would challenge the professional supremacy of the print journal, it seems just as likely that the simple convenience of filesharing has made more vetted academic material available, more easily, to more readers. As they note in a waspish footnote, even the author of one of the most frequently cited predictions that scholarly journals were on the way out had to modify his views, “perhaps due to the fact that his famous 1996 [sic] article "Tragic Loss or Good Riddance? 
The Impending Demise of Traditional Scholarly Journals" has had thousands of hits or downloads on his server alone.” (King et al.; see also Odlyzko, "Tragic Loss" and "Rapid Evolution"). In other words, all sides now seem to agree that “digital” has proved to be both opportunity and threat to scholarly publication. Odlyzko’s prediction of the disappearance of the print journal and its complex apparatus of self-perpetuation was certainly premature in 1996. So is John Hartley right that it’s time to ask the question again? Earlier this year, the Chronicle of Higher Education’s article “Humanities Journals Confront Identity Crisis”, which covered much of the same ground, generated brisk online discussion among journal editors in the humanities (Howard; see also the EDITOR-L listserv archive). The article summarised the views of a number of editors of “traditional” journals, and offset these with the views of a group representing the Council of Editors of Learned Journals, canvassing the possibility that scholarly publishing could catch up to the opportunities that we tend to shorthand as “web 2.0”. The short-lived CELJ blog discussion led by Jo Guldi in February 2009 proposed four principles we might expect to shape the future of scholarly publishing in the humanities: technical interoperability, which is pretty uncontroversial; the expansion of scholarly curation to a role in managing and making sense of “the noise of the web”; diversification of content types and platforms; and a more inclusive approach to the contribution of non-academic experts. (Guldi et al.) Far from ceding the inexorability of their own obsolescence, the four authors of this blog (each of them journal editors) have re-imagined the craft of editing, and have drafted an ambitious but also quite achievable manifesto for the renovation of scholarly communication. This is focused on developing a new and more confident role for the academy in the next phase of the development of the knowledge-building capacity of the web. Rather than confining themselves to being accessed only by their professional peers (and students) via university libraries in hardcopy or via institutional electronic subscription, scholars should be at the forefront of the way knowledge is managed and developed in the online public sphere. This would mean developing metrics that worked as well for delicious and diigo as they do for journal rankings; and it would mean a more upfront contribution to quality assurance and benchmarking of information available on the web, including information generated from outside the academy. This resonates with John Hartley’s endorsement of wiki-style open refereeing, which as an idea contains a substantial backwards nod to Ginsparg’s system of pre-publication of the early 1990s (see Ginsparg). It also suggests a more sophisticated understanding of scholarly collaboration than the current assumption that this consists exclusively of a shift to multiply-authored content, the benefit of which has tended to divide scholars in the humanities (Young). But it was not as a reviewer or an author that this article really engaged me in thinking about the question of human obsolescence. Recently I’ve been studying the fragmentation, outsourcing and automation of work processes in the fast food industry or, as it calls itself, the Quick Service Restaurant trade. 
I was drawn into this study by thinking about the complex reorganisation of time and communication brought about by the partial technologisation of the McDonalds drive-thru in Australia. Now that drive-thru orders are taken through a driveway speaker, the order window (and its operator) have been rendered obsolete, and this now permanently closed window is usually stacked high with cardboard boxes. Although the QSR industry in the US has experimented with outsourcing ordering to call centres at other locations (“May I take your order?”), in Australia the task itself has simply been added to the demands of customer engagement at the paying window, with the slightly odd result that the highest goal of customer service at this point is to be able to deal simultaneously with two customers at two different stages of the drive-thru process—the one who is ordering three Happy Meals and a coffee via your headset, and the one who is sitting in front of you holding out money—without offending or confusing either. This formal approval of a shift from undivided customer attention to the time-efficiency of multitasking is a small but important reorientation of everyday service culture, making one teenager redundant and doubling the demands placed on the other. The management of quick service restaurant workers and their productivity offers us a new perspective on the pressures we are experiencing in the academic labour market. Like many of my colleagues, I have been watching with a degree of ambivalence the way in which the national drive to quantify excellence in research in Australia has resulted in some shallow-end thinking about how to measure what it is that scholars do, and how to demonstrate that we are doing it competitively. Our productivity is shepherded by the constant recalibration of our workload, conceived as a bundle of discrete and measurable tasks, by anxious institutions trying to stay ahead in the national game of musical chairs, which only offers a limited number of seats at the research table—while still keeping half an eye on their enterprise bargaining obligations. Or, as the Quick Service Restaurant sector puts it: Operational margins are narrowing. While you need to increase the quality, speed and accuracy of service, the reality is that you also need to control labor costs. If you reduce unnecessary labor costs and improve workforce productivity, the likelihood of expanding your margins increases. Noncompliance can cost you. (Kronos) In their haste to increase quality, speed and accuracy of academic work, while lowering labor costs and fending off the economic risk of noncompliance, our institutions have systematically overlooked the need to develop meaningful ways to accommodate the significant scholarly work of reading, an activity that takes real time, and that in its nature is radically incompatible with the kinds of multitasking we are all increasingly using to manage the demands placed on us. Without a measure of reading, we fall back on the exceptionally inadequate proxy of citation. As King et al. point out, citation typically skews towards a small number of articles, and the effect of using this as a measure of reading is to suggest that the majority of articles are never read at all. 
Their long-term studies of what scientists read, and why, have been driven by the need to challenge this myth, and they have demonstrated that while journals might not be unwrapped and read with quite the Christmas-morning eagerness that John Hartley describes, their content is eventually read more than once, and often more than once by the same person. Both electronic scholarly publishing, and digital redistribution of material originally published in print, have greatly assisted traditional journals in acquiring something like the pass-on value of popular magazines in dentists’ waiting rooms. But for all this to work, academics have to be given time to sit and read, and as it would be absurd to try to itemise and remunerate this labour specifically, then this time needs to be built into the normative workload for anyone who is expected to engage in any of the complex tasks involved in the collaborative production of knowledge. With that in mind, I concluded my review on what I hoped was a constructive note of solidarity. “What’s really under pressure here—forms of collegiality, altruism and imaginative contributions to a more outward-facing type of scholarship—is not at risk from search engines, it seems to me. What is being pressured into obsolescence, risking subscriptions to journals as much as purchases of books, is the craft and professional value placed on reading. This pressure is not coming from the internet, but from all the other bureaucratic rationalities described in this paper, that for the time being do still value journals selectively above other kinds of public contribution, but fail to appreciate the labour required to make them appear in any form, and completely overlook the labour required to absorb their contents and respond.” For obvious reasons, my warm thanks are due to John Hartley and to the two editors of this M/C Journal issue for their very unexpected invitation to expand on my original referee’s report. References Australian Research Council. “The Excellence in Research for Australia (ERA) Initiative: Journal Lists.” 2009. 3 July 2009 ‹http://www.arc.gov.au/era/era_journal_list.htm›. Ginsparg, Paul. “Can Peer Review be Better Focused?” 2003. 1 July 2009 ‹http://people.ccmr.cornell.edu/~ginsparg/blurb/pg02pr.html›. Guldi, Jo, Michael Widner, Bonnie Wheeler, and Jana Argersinger. The Council of Editors of Learned Journals Blog. 2009. 1 July 2009 ‹http://thecelj.blogspot.com›. Howard, Jennifer. “Humanities Journals Confront Identity Crisis.” The Chronicle of Higher Education 27 Mar. 2009. 1 July 2009 ‹http://chronicle.com/free/v55/i29/29a00102.htm›. King, Donald, Carol Tenopir, and Michael Clarke. "Measuring Total Reading of Journal Articles." D-Lib Magazine 12.10 (2006). 1 July 2009 ‹http://www.dlib.org/dlib/october06/king/10king.html›. Kronos Incorporated. “How Can You Reduce Your Labor Costs without Sacrificing Speed of Service?” (2009). 1 July 2009 ‹http://www.qsrweb.com/white_paper.php?id=1738&download=1›. “May I Take Your Order? Local McDonald's Outsources to a Call Center.” Billings Gazette, Montana, 5 July 2006. SharedXpertise Forum. 1 July 2009 ‹http://www.sharedxpertise.org/file/3433/mcdonalds-outsourcing-to-call-center.html›. Odlyzko, Andrew. “The Rapid Evolution of Scholarly Publishing.” Learned Publishing 15.1 (2002): 7-19. ———. “Tragic Loss or Good Riddance? The Impending Demise of Traditional Scholarly Journals.” International Journal of Human-Computer Studies 42 (1995): 71-122. Young, Jeffrey. 
“Digital Humanities Scholars Collaborate More on Journal Articles than 'Traditional' Researchers.” The Chronicle of Higher Education 27 April 2009. 1 July 2009 ‹http://chronicle.com/wiredcampus/article/3736/digital-humanities-scholars-collaborate-more-on-journal-articles-than-on-traditional-researchers›.

33

Holloway, Donell Joy, Lelia Green, and Danielle Brady. "FireWatch: Creative Responses to Bushfire Catastrophes." M/C Journal 16, no.1 (March 19, 2013). http://dx.doi.org/10.5204/mcj.599.

Full text

APA, Harvard, Vancouver, ISO, and other styles

Abstract:

Introduction Bushfires have taken numerous lives and destroyed communities throughout Australia over many years. Catastrophic fire weather alerts have occurred during the Australian summer of 2012–13, and long-term forecasts predict increased bushfire events throughout several areas of Australia. This article highlights how organisational and individual responses to bushfire in Australia often entail creative responses—either improvised responses at the time of bushfire emergencies or innovative (organisational, strategic, or technological) changes which help protect the community from, or mitigate against, future bushfire catastrophes. These improvised or innovative responses include emergency communications systems, practices, and devices. This article reports on findings from a research project funded by the Australian Research Council titled Using Community Engagement and Enhanced Visual Information to Promote FireWatch Satellite Communications as a Support for Collaborative Decision-making. FireWatch is a Web-based public information product based on near real time satellite data produced by the West Australian (WA) Government entity, Landgate. The project researches ways in which remote and regional publics can be engaged and mobilised through the development of a more user-friendly FireWatch site to make fire information accessible and usable, allowing a community-focused response to risk. The significance of the research project is evident both in how it addresses the important and life-threatening challenge of bushfires; and also in how Australia’s increasingly hot, dry, long summers are adding to historically-established risks. This innovative project uses an iterative, participatory design process incorporating action-research practices. This will ensure that the new FireWatch interface is redesigned, tested, observed, and reflected upon multiple times—and will incorporate the collective creativity of users, designers, and researchers. The qualitative findings reported on in this article are based on 19 interviews with community members in the town of Kununurra in the remote Kimberley region of WA. The findings are positioned within a reconceptualised framework in which creativity is viewed as an essential component of successful emergency responses. This includes, we argue, two critical aspects of creativity: improvisation during a catastrophic event; and ongoing innovation to improve future responses to catastrophes—including communication practices and technologies. This shifts the discourse within the literature in relation to the effective management and community responses to the changing phenomenon of fire catastrophes. Findings from the first round of interviews, and results of enquiries into previous bushfires in Australia, are used to highlight how these elements of creativity often entail a collective creativity on the part of emergency responders or the community in general. An additional focus is on the importance of the critical use of communication during a bushfire event. Improvisation The notion of "improvisation" is often associated with artistic performance. Nonetheless, improvisation is also integral to making effectual responses during natural catastrophes. 
“Extreme events present unforeseen conditions and problems, requiring a need for adaptation, creativity, and improvisation while demanding efficient and rapid delivery of services under extreme conditions” (Harrald 257). Catastrophes present us with unexpected scenarios and require rapid, on the spot problem solving and “even if you plan for a bushfire it is not going to go to plan. When the wind changes direction there has to be a new plan” (Jeff. Personal Interview. 2012). Jazz musicians or improvisational actors “work to build their knowledge across a range of fields, and this knowledge provides the elements for each improvisational outcome” (Kendra and Wachtendorf 2). Similarly, emergency responders’ knowledge and preparation can be drawn “upon in the ambiguous and dynamic conditions of a disaster where not every need has been anticipated or accounted for” (Kendra and Wachtendorf 2). Individuals and community organisations not associated with emergency services also improvise in a creative and intuitive manner in the way they respond to catastrophes (Webb and Chevreau). For example, during the 9/11 terrorism catastrophe in the USA an assorted group of boat owners rapidly self-organised to evacuate Lower Manhattan. On their return trips, they carried emergency personnel and supplies to the area (Kendra and Wachtendorf 5). An interviewee in our study also recalls bush fire incidents where creative problem solving and intuitive decision-making are called for. “It’s like in a fire, you have to be thinking fast. You need to be semi self-sufficient until help arrives. But without doing anything stupid and creating a worse situation” (Kelly. Personal Interview. 2012). Kelly then describes the rapid community response she witnessed during a recent fire on the outskirts of Kununurra, WA. Everyone had to be accounted for, moving cars, getting the tractors out, protecting the bores because you need the water. It happens really fast and it is a matter of rustling everyone up with the machinery. (2012) In this sense, the strength of communities in responding to catastrophes or disasters “results largely from the abilities of [both] individuals and organisations to adapt and improvise under conditions of uncertainty” (Webb and Chevreau 67). These improvised responses frequently involve a collective creativity—where groups of neighbours or emergency workers act in response to the unforeseen, often in a unified and self-organising manner. Innovation Catastrophes also stimulate change and innovation for the future. Disasters create a new environment that must be explored, assessed, and comprehended. Disasters change the physical and social landscape, and thereby require a period of exploration, learning, and the development of new approaches. (Kendra and Wachtendorf 6) These new approaches can include organisational change, new response strategies, and technologies and communication improvements. Celebrated inventor Benjamin Franklin, for instance, facilitated the formation of the first Volunteer Fire department in the 1850s as a response to previous urban fire catastrophes in the USA (Mumford 258). This organisational innovation continues to play an instrumental part in modern fire fighting practices. Indeed, people living in rural and remote areas of Australia are heavily reliant on volunteer groups, due to the sparse population and vast distances that need to be covered. As with most inventions and innovations, new endeavours aimed at improving responses to catastrophes do not occur in a vacuum. 
They “are not just accidents, nor the inscrutable products of sporadic genius, but have abundant and clear causes in prior scientific and technological development” (Gilfillan 61). Likewise, the development of our user-friendly and publicly available FireWatch site relies on the accumulation of preceding inventions and innovations. This includes the many years spent developing the existing FireWatch site, a site dense in information of significant value to scientists, foresters, land managers, and fire experts. Communications Often overlooked in discussions regarding emergency communications are the microgeographical exchanges that occur in response to the threat of natural disasters. This is where neighbours fill the critical period before emergency service responders can appear on site. In this situation, it is often local knowledge that underpins improvised grassroots communication networks that inform and organise the neighbourhood. During a recent bushfire on peri-rural blocks on the outskirts of Kununurra, neighbours went into action before emergency services volunteers could respond. We phoned around and someone would phone and call in. Instead of 000 being rung ten times, make sure that one person rang it in. 40 channel [CB Radio] was handy – two-way communication, four wheelers – knocking on doors making sure everyone is out of the house, just in case. (Jane. Personal Interview. 2012) Similarly, individuals and community groups have been able to inform and assist each other on a larger scale via social network technologies (SNTs). This creative application of SNTs began after the 9/11 terror attacks in 2001 when individuals created wikis in order to find missing persons (Palen and Liu). Twitter has experienced considerable growth and was used freely during the 2009 Black Saturday fires in Australia. Studies of tweeting activity during these fires indicate that “tweets made during Black Saturday are laden with actionable factual information which contrasts with earlier claims that tweets are of no value made of mere random personal notes” (Sinnappan et al. n.p.). Traditionally, official alerts and warnings have been provided to the public via television and radio. However, several inquiries into the recent bushfires within Australia show concern “with the way in which fire agencies deliver information to community members during a bushfire...[and in order to] improve community safety from bushfire, systems need to be implemented that enable community members to communicate information to fire agencies, making use of local knowledge” (Elsworth et al. 8). Technological and social developments over the last decade mean the public no longer relies on a single source of official information (Sorensen and Sorensen). Therefore, SNTs such as Twitter and Facebook are being used by the media and emergency authorities to make information available to the public. These SNTs are dynamic, in that there can be a two-way flow of information between the public and emergency organisations. Nonetheless, there has been limited use of SNTs by emergency agencies to source information posted by in situ residents, in order to help in decision-making (Freeman). 
Organisational use of multiple communication channels and platforms to inform citizens about bushfire emergencies ensures a greater degree of coverage—in case of communication systems breakdowns or difficulties—as in the telephone alert system breakdown in Kelmscott-Roleystone, WA or a recent fire in Warrnambool, Victoria which took out the regional telephone exchange making telephone calls, mobiles, landlines, and the Internet non-operational (Johnson). The new FireWatch site will provide an additional information option for rural and remote Australians who often rely on visual sightings and on word-of-mouth to be informed about fires in their region. “The neighbour came over and said - there is a fire, we’d better get our act together because it is going to hit us. No sooner than I turned around, I thought shit, here it comes” (Richard. Personal Interview. 2012). The FireWatch Project The FireWatch project involves the redevelopment of an existing FireWatch website to extend the usability of the product from experts to ordinary users in order to facilitate community-based decision-making and action both before and during bushfire emergencies. To this purpose, the project has been broken down into two distinct, yet interdependent, strands. The community strand involves collaboration within a community (in this case the Kununurra community) in order to carry out a community-centred approach to further development of the site. The design strand involves the development of an intuitive and accessible Web presentation of complex information in clear, unambiguous ways to inform action in stressful circumstances. At this stage, a first round of 19 semi-structured interviews with stakeholders has been conducted in Kununurra to determine fire-related information-seeking behaviours, attitudes to mediated information services in the region, as well as user feedback on a prototype website developed in the design strand of the project. Stakeholders included emergency services personnel (paid and volunteer), shire representatives, tourism operators, small business operators (including tourism operators), a forest manager, a mango farmer, an Indigenous ranger team manager as well as general community members. Interviewees reported dissatisfaction with current information systems. They gave positive feedback about the website prototype. “It’s very much, very easy to follow” (David. Personal Interview. 2012). “It looks so much better than [the old site]. You couldn’t get in that close on [the other site]. It is fantastic” (Lance. Personal Interview. 2012). They also added thought-provoking contributions to the design of the website (to be discussed later). Residents of Kununurra who were interviewed for this research project found bushfire warning communications unsatisfactory, especially during a recent fire on the outskirts of town. People who called 000 had difficulties passing the information on, having to explain exactly where Kununurra was and the location of fires to operators not familiar with the area. When asked how the Kununurra community gets their fire information a Shire representative explained: That is not very good at the moment. The only other way we can think about it is perhaps more updates on things like Facebook, perhaps on a website, but with this current fire there really wasn’t a lot of information and a lot of people didn’t know what was going on. 
We [the shire] knew because we were talking to the [fire] brigades and to FESA [Fire and Emergency Services Authority] but most residents didn’t have any idea and it looks pretty bad. (Ginny. Personal Interview. 2012) All being well, the new user-friendly FireWatch site will add another platform through which fire information messages are transmitted. Community members will be offered continuously streamed bushfire location information, which is independent of any emergency services communication systems. In particular, rural and remote areas of Australia will have fire information at the ready. The participatory methodology used in the design of the new FireWatch website makes use of collaborative creativity, whereby users’ vision of the website and context are incorporated. This iterative process “creates an equal evolving participatory process between user and designer towards sharing values and knowledge and creating new domains of collective creativity” (Park 2012). The rich and sometimes contradictory suggestions made by interviewees in this project often reflected individual visions of the tasks and information required, and individual preferences regarding the delivery of this information. “I have been thinking about how could this really work for me? I can give you feedback on what has happened in the past but how could it work for me in the future?” (Keith. Personal Interview. 2012). Keith and other community members interviewed in Kununurra indicated a variety of extra functions on the site not expected by the product designers. Some of these unexpected functions were common to most interviewees such as the great importance placed on the inclusion of a satellite view option on the site map (example shown in Figure 1). Jeremy, a member of an Indigenous ranger unit in the Kununurra area, was very keen to incorporate the satellite view options on the site. He explained that some of the older rangers: can’t use GPSs and don’t know time zones or what zones to put in, so they’ll use a satellite-style view. We’ll have Google Earth up on one [screen], and also our [own] imagery up on another [screen] and go that way. Be scrolling in and see – we’ve got a huge fire scar for 2011 around here; another guy will be on another computer zoning in and say, I think it is here. It’s quite simplistic but it works. (Personal Interview. 2012) In the case above, where rangers are already switching between computer screens to incorporate a satellite view into their planning, the importance of a satellite view layer on the FireWatch website makes user context an essential part of the design process. Incorporating many layers on one screen, as recommended by participants, also ensures a more elegant solution to an existing problem. Figure 1: Satellite view in the Kununurra area showing features such as gorges, rivers, escarpments and dry riverbeds. This research project will involve further consultation with participants (both online and offline) regarding bushfire safety communications in their region, as well as the further design of the site. The website will be available over multiple devices (for example desktops, smart phones, and hand held tablet devices) and will be launched late this year. Further work will also be carried out to determine if social media is appropriate for this community of users in order to build awareness and share information regarding the site. Conclusion Community members improvise and self-organise when communicating fire information and organising help for each other. 
This can happen at a microgeographical (neighbourhood) level or on a wider level via social networking sites. Organisations also develop innovative communication systems or devices as a response to the threat of bushfires. Communication innovations, such as the use of Twitter and Facebook by fire emergency services, have been appropriated and fine-tuned by these organisations. Other innovations such as the user-friendly FireWatch site rely on previous technological developments in satellite-delivered imagery—as well as community input regarding the design and use of the site. Our early research into community members’ fire-related information-seeking behaviours and attitudes to mediated information services in the region of Kununurra has found unexpectedly creative responses, which range from collective creativity on the part of emergency responders or the community in general during events to creative use of existing information and communication networks. We intend to utilise this creativity in re-purposing FireWatch alongside the creative work of the designers in the project. Although it is commonplace to think of graphic design and new technology as incorporating creativity, it is rarely acknowledged how frequently these innovations harness everyday perspectives from non-professionals. In the case of the FireWatch developments, the creativity of designers and technologists has been informed by the creative responses of members of the public who are best placed to understand the challenges posed by restricted information flows on the ground in times of crisis. In these situations, people respond not only with new ideas for the future but with innovative responses in the present as they communicate with each other to deal with the challenge of a fast-moving and unpredictable situation. Such improvisation, honed through close awareness of the contours and parameters of both community and communication, is one of the ways through which people help keep themselves and each other safe in the face of dramatic developments. References Elsworth, G., K. Stevens, J. Gilbert, H. Goodman, and A. Rhodes. "Evaluating the Community Safety Approach to Bushfires in Australia: Towards an Assessment of What Works and How." Biennial Conference of the European Evaluation Society, Lisbon, Oct. 2008. Freeman, Mark. "Fire, Wind and Water: Social Networks in Natural Disasters." Journal of Cases on Information Technology (JCIT) 13.2 (2011): 69–79. Gilfillan, S. Colum. The Sociology of Invention. Chicago: Follett Publishing, 1935. Harrald, John R. "Agility and Discipline: Critical Success Factors for Disaster Response." The Annals of the American Academy of Political and Social Science 604.1 (2006): 256–72. Johnson, Peter. "Australia Unprepared for Bushfire”. Australian Broadcasting Corporation 17 Dec. 2012. 3 Jan. 2013 ‹http://www.abc.net.au/environment/articles/2012/12/17/3654075.htm›. Keelty, Mick J. "A Shared Responsibility: the Report of the Perth Hills Bushfires February 2011". Department of Premier and Cabinet, Government of Western Australia, Perth. Kendra, James, and Tricia Wachtendorf. "Improvisation, Creativity, and the Art of Emergency Management." NATO Advanced Research Workshop on Understanding and Responding to Terrorism: A Multi-Dimensional Approach. Washington, DC, 8-9 Sep. 2006. ———. "Creativity in Emergency Response after the World Trade Centre Attack". Amud Conference of the International Emergency Management Society. University of Delaware. 14-17 May 2002. Mumford, Michael D. 
"Social Innovation: Ten Cases from Benjamin Franklin." Creativity Research Journal 14.2 (2002): 253–66.Palen, Leysia, and Sophia.B. Liu. "Citizen Communications in Crisis: Anticipating a Future of ICT-Supported Public Participation." Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. San Jose, 28 Apr. - 3 May 2007.Park, Ji Yong. "Design Process Excludes Users: The Co-Creation Activities between User and Designer." Digital Creativity 23.1 (2012): 79–92. Sinnappan, Suku, Cathy Farrell, and Elizabeth Stewart. "Priceless Tweets! A Study on Twitter Messages Posted During Crisis: Black Saturday." Proceedings of 21st Australasian Conference on Information Systems (ACIS 2010). Brisbane, Australia, 1-3 Dec 2010.Sorensen, John H., and Barbara Vogt Sorensen. "Community Processes: Warning and Evacuation." Handbook of Disaster Research. Eds. Havidán Rodríguez, Enrico Louis Quarantelli, and Russell Rowe Dynes. New York: Springer, 2007. 183–99.Webb, Gary R., and Francois-Regis Chevreau. "Planning to Improvise: The Importance of Creativity and Flexibility in Crisis Response." International Journal of Emergency Management 3.1 (2006): 66–72.

34

Mules, Warwick. "Virtual Culture, Time and Images." M/C Journal 3, no.2 (May 1, 2000). http://dx.doi.org/10.5204/mcj.1839.

Full text

APA, Harvard, Vancouver, ISO, and other styles

Abstract:

Introduction The proliferation of electronic images and audiovisual forms, together with the recent expansion of Internet communication makes me wonder about the adequacy of present theoretical apparatus within the humanities and communication disciplines to explain these new phenomena and their effects on human life. As someone working roughly within a cultural and media studies framework, I have long harboured suspicions about the ability of concepts such as text, discourse and representation to give an account of the new media which does not simply reduce them to another version of earlier media forms. Many of these concepts were established during the 1970s and 80s, in the development of poststructuralism and its linguistic bias towards the analysis of literary and print media text. The application of these concepts to an electronic medium based on the visual image rather than the printed word seems somewhat perverse, and needs to be replaced by the application of other concepts drawn from a paradigm more suited for the purpose. In this brief essay, I want to explore some of the issues involved in thinking about a new cultural paradigm based on the photovisual/electronic image, to describe and critique the transformation of culture currently taking place through the accelerated uptake of new televisual, audiovisual and computer technologies. I am reminded here of the existential philosopher Heidegger's words about technology: 'the essence of technology is by no means anything technological' (Heidegger 4). For Heidegger, technology is part of the 'enframing' of the beingness which humans inhabit in various ways (Dasein). But technology itself does not constitute this beingness. This is good news for those of us (like myself) who have only a general and non-technical knowledge of the new technologies currently sweeping the globe, but who sense their profound effects on the human condition. Indeed, it suggests that technical knowledge in itself is insufficient and even inadequate to formulate appropriate questions about the relationship between technology and human being, and to the capacities of humans to respond to, and transform their technologically mediated situations. We need a new way of understanding human being as mediated by technologies, which takes into account the specific technological form in which mediation occurs today. To do this, we need new ways of conceptualising culture, and the specific kind of human subjectivity made possible within a culture conditioned by electronic media. From Material to Virtual Culture The concept of culture, as it has been predominantly understood in the humanities and associated disciplines, is based on the idea of physical presence. That is to say, culture is understood in terms of the various representations and practices that people experience within social and historical contexts defined by the living presence of one human being to another. The paradigm case here is speech-based linguistics in which all forms of communication are understood in terms of an innate subjectivity, expressed in the act of communicating something to someone else. Although privileging the site and moment of co-presence, this model does not require the speakers to be immediately present to each other in face-to-face situations, but asks only that co-presence be the ideal upon which successful acts of communication take place. 
As French philosopher Jacques Derrida has consistently argued over the last thirty years, all forms of western discourse, in one way or another, have been based on this kind of understanding of the way meanings and expressions of subject identity take place (Derrida 27ff.). A good case in point is the introductory essay by John Frow and Meaghan Morris to their edited text book Australian Cultural Studies: A Reader, where culture is defined as "a contested and conflictual set of practices of representation bound up with the processes of formation and re-formation of social groups" (xx). If culture is defined in terms of the agonistic formation of social groups through practices of representation, then there can be no way of thinking about culture outside the social as the privileged domain of human interaction. Culture is reduced to the social as a kind of paradigm limit, which is, in turn, characterised by the formation of social groups fixed in time and space. Even when an effort is made to indicate that social groups are themselves culturally constituted, as Frow and Morris go on to say, the social is nevertheless invoked again as an underlying presumption: "the social processes by which the categories of the real and of group existence are formed" (xx). In this model, social groups are formed by social processes. The task of representation and signification (the task of culture) is to draw the group together, no matter how widespread or dispersed, to make it coherent and identifiably different from other groups. Under these terms, the task of cultural analysis is to describe how this process takes place. This 'material' approach to culture normalises the social at the expense of the cultural, underpinned by a 'metaphysics of presence' whereby meaning and identity are established within a system of differential values (difference) by fixing human subjectivity in space and time. I argue that the uptake of new communication technologies makes this concept of culture obsolete. Culture now has to be understood in terms of 'virtual presence' in which the physical context of human existence is simultaneously 'doubled' and indeed proliferated into a virtual reality, with effective force in the 'real' world. From this perspective, we need to rethink culture so that it is no longer understood in terms of differential meanings, identities, texts, discourses and representational forms, but rather as a new kind of ontology involving the 'being' of human subjects and their relations to each other in deterritorialised fields of mediated co-presence, where the real and the virtual enmesh and interact. In this case, the laws governing physical presence no longer apply since it is possible to be 'here' and 'there' at the same time. We need a new approach and a new set of analytical terms to account for this new phenomenon. Virtual Culture and the Time of Human Presence In his well known critique of modern culture, Walter Benjamin invents the concept of the 'dialectical image' to define the visual concreteness of the everyday world and its effect on human consciousness. Dialectical images operate through an instantaneous flash of vision which breaks through everyday reality, allowing an influx of otherness to flood present awareness in a transformation of the past into the present: "the past can be seized only as an image which flashes up at the instant when it can be recognized and is never seen again" (Benjamin, Theses 255). 
Bypassing discourse, language and meaning, dialectical images invoke the eternal return -- the affirmation of the present as an ever-constant repetition of temporality -- as the 'ground' of history, progress and the future. Modern technology and its infinite power of reproduction has created the condition under which the image separates from its object, thereby releasing materiality from its moribund state in the past (Benjamin, The Work of Art). The ground of temporality is thus rendered virtual and evanescent, involving a 'deterritorialisation' of human experience from its ego-attachment to the present; an experience which Benjamin understands in repressed mythical terms. For Benjamin, the exemplary modern technology is photography. A photograph 'destroys' the originariness of the object, by robbing it of aura, or "the unique phenomenon of a distance, however close it may be" (Benjamin, The Work of Art 222). The photographic image is thus dialectical because it collapses the distance between the object and its image, thereby undermining the ontological space between the past and the present which might otherwise grant to the object a unique being in the presence of the viewer. But all 'things' also have their images, which can be separated and dispersed through space and time. Benjamin's approach to culture, where time surpasses space, and where the reproduced image takes priority over the real, now appears strangely prophetic. By suggesting that images are somehow directly and concretely affective in the constitution of human temporality, Benjamin has anticipated the current 'postmodern' condition in which the electronic image has become enmeshed in everyday life. As Paul Virilio argues, new communication technologies accelerate the transmission of images to such a rate that the past is collapsed into the present, creating an overpowering sense of immediacy: the speed of new optoelectronic and electroacoustic milieu becomes a final void (the void of the quick), a vacuum that no longer depends on the interval between places or things and so on the world's very extension, but on the interface of an instantaneous transmission of remote appearances, on a geographic and geometric retention in which all volume, all relief vanish. (33) Distance is now experienced in terms of its virtual proximity to the perceiving subject, in which space is no longer understood in terms of Newtonian extension, but as collapsed or compressed temporality, defined by the speed of light. In this Einsteinian world, human interaction is no longer governed by the law of non-contradiction which demands that one thing cannot be something else or somewhere else at the same time, and instead becomes 'interfacial', where the image-double enmeshes with its originary being as a co-extensive ontology based on "trans-appearance", or the effective appearance on a single horizon of two things from different space and time zones: "the direct transparence of space that enables each of us to perceive our immediate neighbours is completed by the indirect transparence of the speed-time of the electromagnetic waves that transmit our images and our voices" (Virilio 37). Like the light from some distant star which reaches earth millions of years after its explosive death, we now live in a world of remote and immediately past events, whose effects are constantly felt in real time. 
In this case the present is haunted by its past, creating a doppelgänger effect in which human being is doubled with its image in a co-extensive existence across space and time. Body Doubles Here we can no longer speak of the image as a representation, or even a signification, since the image is no longer secondary to the thing from which it is separated, nor is it a sign of anything else. Rather, we need to think of the possibility of a kind of 'image-event', incorporating both the physical reality of the human body and its image, stretched through time and space. French theorists Gilles Deleuze and Félix Guattari have developed an entire theoretical scheme to define and describe this kind of phenomenon. At one point in their magnum opus, A Thousand Plateaus: Capitalism and Schizophrenia, they introduce the concept of haecceity: a body is not defined by the form that determines it nor as a determinate substance or subject nor by the organs it possesses or the function it fulfils. On the plane of consistency, a body is defined by a longitude and a latitude: in other words the sum total of the material elements belonging to it under given relations of movement and rest, speed and slowness (longitude); the sum total of the intensive affects it is capable of at a given power or degree of potential (latitude). (260) This haecceity of the human body, as "trajectory", or "interassemblage" (262) denies the priority of an originating event or substance from which its constitutive elements could be derived. For instance photographs cease to be 'indexes' of things, and become instead part of an assemblage which includes living bodies and other forms of human presence (speech, writing, expressive signs), linked contingently into assemblages through space and time. A photographic image is just as much part of the 'beingness' of something as the thing itself; things and images are part of a perpetual process of becoming; a contingent linking of bricolage with different and diverging material expressions and effects. Thinking along these lines will get us around the problem of non-contradiction (that something cannot be both 'here' and 'there' at the same time), by extending the concept of 'thing' to include all the elements of its dispersal in time and space. Here we move from the idea of a thing as unique to itself (for instance the body as human presence) and hence subject to a logic of exchange based on scarcity and lack, to the idea of a thing as 'becoming', and subject to a logic of proliferation and excess. In this case, the unique phenomenon of human presence anchored in speech can no longer be used as a focal point to fix human subjectivity, its meanings and forms of expression, since there will be many different kinds of 'presencing' of human being, through the myriad trajectories traced out in all the practices and assemblages through time and space. A Practical Approach By thinking of culture in terms of virtual presence, we can no longer assume the existence of a bedrock foundation for human interaction based on the physical proximity of individuals to each other in time and space. Rather we need to think of culture in terms the emergence of new kinds of 'beingness', which deterritorialises human presence in different ways through the mediating power of photovisual and electronic imagery. These new kinds of beingness are not really new. 
Recent writers and cultural theorists have already described in detail the emergence of a virtual culture in the nineteenth century with the invention of photography and film, as well as various viewing devices such as the stereoscope and other staging apparatuses including the panorama and diorama (Friedberg, Batchen, Crary). Analysis of virtual culture needs to identify the various trajectories along which elements are assembled into an incessant and contingent 'becoming'. In terms of photovisual and electronic media, this can take place in different ways. By tracing the effective history of an image, it is possible to locate points at which transformations from one form to another occur, indicating different effects in different contexts through time. For instance by scanning through old magazines, you might be able to trace the 'destiny' of a particular type of image, and the kinds of meanings associated with it. Keeping in mind that an image is not a representation, but a form of affect, it might be possible to identify critical points where the image turns into its other (in fashion imagery we are now confronted with images of thin bodies suddenly becoming too thin, and hence dangerously subversive). Another approach concerns the phenomenon known as the media event, in which electronic images outstrip and overdetermine physical events in real time to which they are attached. In this case an analysis of a media event would involve the description of the interaction between events and their mediated presence, as mutually effective in real time. Recent examples here include the Gulf War and other international emergencies and conflicts in the Balkans and the 1986 coup in the Philippines, where media presence enabled images to have a direct effect on the decisions and deployment of troops and strategic activities. In certain circumstances, the conduct of warfare might now take place entirely in virtual reality (Kellner). But these 'peak events' don't really exhaust the ways in which the phenomenon of the media event inhabits and affects our everyday lives. Indeed, it might be better to characterise our entire lives as conditioned to various degrees by media eventness, as we become more and more attached and dependent on electronic imagery and communication to gain our sense of place in the world. An analysis of this kind of everyday interaction is long overdue. We can learn about the virtual through our own everyday experiences. Here I am not so much thinking of experiences to be had in futuristic apparatuses such as the virtual reality body suit and other computer generated digital environments, but the kinds of experiences of the virtual described by Benjamin in his wanderings through the streets of Berlin and Paris in the 1920s (Benjamin, One Way Street). A casual walk down the main street of any town, and a perfunctory gaze in the shop windows will trigger many interesting connections between specific elements and the assemblages through which their effects are made known. On a recent trip to Bundaberg, a country town in Queensland, I came across a mechanised doll in a jewellery store display, made up in the likeness of a watchmaker working at a miniature workbench. The constant motion of the doll's arm as it moved up and down on the bench in a simulation of work repeated the electromechanical movements of the dozens of clocks and watches displayed elsewhere in the store window, suggesting a link between the human and the machine. 
Here I was presented not only with a pleasant shop display, but also with the commodification of time itself, as an endless repetition of an interval between successive actions, acted out by the doll and its perpetual movement. My pleasure at the display was channelled through the doll and his work, as a fetishised enchantment or "fairy scene" of industrialised productivity, in which the idea of time is visualised in a specific image-material form. I can imagine many other such displays in other windows in other towns and cities, all working to reproduce this particular kind of assemblage, which constantly 'pushes' the idea-image of time as commodity into the future, so long as the displays and their associated apparatuses of marketing continue in this way rather than some other way. So my suggestion then, is to open our eyes to the virtual not as a futuristic technology, but as it already shapes and defines the world around us through time. By taking the visual appearance of things as immaterial forms with material affectivity, we allow ourselves to move beyond the limitations of physical presence, which demands that one thing cannot be something else, or somewhere else at the same time. The reduction of culture to the social should be replaced by an inquiry into the proliferation of the social through the cultural, as so many experiences of the virtual in time and space. References Bataille, Georges. Visions of Excess: Selected Writings, 1927-1939.Trans. Allan Stoekl. Minneapolis: Minnesota UP, 1985. Batchen, Geoffrey. "Spectres of Cyberspace." Afterimage 23.3. Benjamin, Walter. "Theses on the Philosophy of History." Illuminations: Essays and Reflections. Trans. Hannah Arendt. New York: Schocken, 1968. 253-64. ---. "The Work of Art in the Age of Electronic Reproduction." Illuminations: Essays and Reflections. Trans. Hannah Arendt. New York: Schocken, 1968. 217-51. ---. One Way Street and Other Writings. Trans. Edmund Jephcott and Kingsley Shorter. London: Verso, 1979. Buck-Morss, Susan. The Dialectics of Seeing: Walter Benjamin and the Arcades Project. Cambridge, Mass.: MIT P, 1997. Crary, Jonathan. Techniques of the Observer: On Vision and Modernity in the Nineteenth Century. Chicago: MIT P, 1992. Derrida, Jacques. Of Grammatology. Trans. Gayatri Spivak. Baltimore: Johns Hopkins UP, 1974. Friedberg, Anne. Window Shopping: Cinema and the Postmodern. Berkeley: U of California P, 1993. Frow, John. Time & Commodity Culture: Essays in Cultural Theory and Postmodernity. Oxford: Clarendon, 1997. Frow, John, and Meaghan Morris, eds. Australian Cultural Studies: A Reader. St. Leonards, NSW: Allen and Unwin, 1993. Heidegger, Martin. "The Question Concerning Technology." The Question Concerning Technology. Trans. William Lovitt. New York: Harper. 3-35. Kellner, Douglas. "Virilio, War and Technology." Theory, Culture & Society 16.5-6 (1999): 103-25. Sean Aylward Smith. "Where Does the Body End?" M/C: A Journal of Media and Culture 2.3 (1999). 30 Apr. 2000 <http://www.uq.edu.au/mc/9905/end.php>. Virilio, Paul. Open Sky. Trans. Julie Rose. London: Verso, 1997. Zimnik, Nina. "'Give Me a Body': Deleuze's Time Image and the Taxonomy of the Body in the Work of Gabriele Leidloff." Enculturation 2.1 (1998). <http://www.uta.edu/huma/enculturation/>. Citation reference for this article MLA style: Warwick Mules. "Virtual Culture, Time and Images: Beyond Representation." M/C: A Journal of Media and Culture 3.2 (2000). [your date of access] <http://www.api-network.com/mc/0005/images.php>. 
Chicago style: Warwick Mules, "Virtual Culture, Time and Images: Beyond Representation," M/C: A Journal of Media and Culture 3, no. 2 (2000), <http://www.api-network.com/mc/0005/images.php> ([your date of access]). APA style: Warwick Mules. (2000) Virtual culture, time and images: beyond representation. M/C: A Journal of Media and Culture 3(2). <http://www.api-network.com/mc/0005/images.php> ([your date of access]).

35

Wishart, Alison. "Make It So: Harnessing Technology to Provide Professional Development to Regional Museum Workers." M/C Journal 22, no.3 (June 19, 2019). http://dx.doi.org/10.5204/mcj.1519.

Full text

APA, Harvard, Vancouver, ISO, and other styles

Abstract:

Introduction

In regional Australia and New Zealand, museums and art galleries are increasingly becoming primary sites of cultural engagement. They are one of the key tourist attractions for regional towns and expected to generate much needed tourism revenue. In 2017 in New South Wales alone, there were three million visitors to regional galleries and museums (MGNSW 13). However, apart from those (partially) funded by local councils, they are often run on donations, good will, and the enthusiasm of volunteers. Regional museums and galleries provide some paid, and more unpaid, employment for ageing populations. While two-thirds of Australia’s population lives in capital cities, the remainder who live in regional towns are likely to be in the 60+ age cohort because people are choosing to retire away from the bustling, growing cities (ABS). At last count, there were about 3000 museums and galleries in Australia with about 80% of them located in regional areas (Scott). Over the last 40 years, this figure has tripled from the 1000 regional and provincial museums estimated by Peter Pigott in his 1975 report (24). According to a 2014 survey (Shaw and Davidson), New Zealand has about 470 museums and galleries and about 70% are located outside capital cities. The vast majority, 85%, have fewer than five full-time paid staff, and more than half of these were run entirely by ageing volunteers. They are entrusted with managing the vast majority of the history and heritage collections of Australia and New Zealand. These ageing volunteers need a diverse range of skills and experience to care for and interpret collections. How do you find the time and budget for professional development for both paid staff and volunteers? Many professional development events are held in capital cities, which are often a significant distance from the regional museum—this adds substantially to the costs of attending and the time commitment required to get there. In addition, it is not uncommon for people working in regional museums to be responsible for everything—from security, collection management, conservation, research, interpretation and public programs to changing the light bulbs. While there are a large number of resources available online, following a manual is often more difficult than learning from other colleagues or learning in a more formal educational or vocational environment where you can receive timely feedback on your work. Further, a foundational level of prior knowledge and experience is often required to follow written instructions. This article will suggest some strategies for low-cost professional development and networking. It involves planning, thinking strategically and forming partnerships with others in the region. It is time to harness the power of modern communications technology and use it as a tool for professional development. Some models of professional development in regional areas that have been implemented in the past will also be reviewed. The focus for this article is on training and professional development for workers in regional museums, heritage sites and keeping places. Regional art galleries have not been included because they tend to have separate regional networks and training opportunities. For example, there are professional development opportunities provided through the Art Galleries Association of Australia and their state branches. Regional galleries are also far more likely to have one or more paid staff members (Winkworth, “Fixing the Slums” 2).
Regional Museums, Volunteers, and Social Capital

It is widely accepted that regional museums and galleries enhance social capital and reduce social isolation (Kelly 32; Burton and Griffin 328). However, while working in a regional museum or gallery can help to build friendship networks, it can also be professionally isolating. How do you benchmark what you do against other places if you are two or more hours’ drive from those places? How do you learn from other colleagues if all your colleagues are also isolated by the ‘tyranny of distance’ and struggling with the same lack of access to training? In 2017 in New South Wales alone, there were 8,629 active volunteers working in regional museums and galleries giving almost five million hours, which Museums and Galleries NSW calculated was worth over $150 million per annum in unpaid labour (MGNSW 1). Providing training and professional development to this group is an investment in Australia’s social and cultural capital.

Unlike other community-run groups, the museums and heritage places which have emerged in regional Australia and New Zealand are not part of a national or state branch network. Volunteers who work for the Red Cross, Scouts or Landcare benefit from being part of a national organisation which provides funding, support workers, a website, governance structure, marketing, political advocacy and training (Winkworth, “Let a Thousand Flowers” 11). In Australia and New Zealand, this role is undertaken by the Australian Museums and Galleries Association (AMaGA, formerly Museums Australia) and Museums Aotearoa respectively. However, both of these groups operate at the macro policy level, for example organising annual conferences, publishing a journal and developing Indigenous policy frameworks, rather than the local, practical level. In 1995, due to their advocacy work, Landcare Australia received $500 million over five years from the federal government to fund 5000 Landcare groups, which are run by 120,000 volunteers (Oppenheimer 177). They argued successfully that the sustainable development of land resources started at the local level. What do we need to do to convince government of the need for sustainable development of our local and regional museum and heritage resources?

Training for Volunteers Working in Regional Museums: The Current Situation

Another barrier to training for regional museum workers is the assumption that the 70:20:10 model of professional development should apply. That is, 70% of one’s professional development is done ‘on the job’ by completing tasks and problem-solving; 20% is achieved by learning from mentors, coaches and role models; and 10% is learnt from attending conferences and symposia and enrolling in formal courses of study. However, this model pre-supposes that there are people in your workplace whom you can learn from and who can show you how to complete a task, and that you are not destroying or damaging a precious, unique object if you happen to make a mistake.

Some museum volunteers come with skills in research, marketing, administration, customer service or photography, but very few come with specific museum skills like writing exhibition text, registering an acquisition or conserving artefacts. These skills need to be taught. As Kylie Winkworth has written, museum management now requires a [...] skills set, which is not so readily found in small communities, and which in many ways is less rewarding for the available volunteers, who may have left school at 15.
We do not expect volunteer librarians to catalogue books, which are in any case of low intrinsic value, but we still expect volunteers in their 70s and 80s to catalogue irreplaceable heritage collections and meet ever more onerous museum standards. That so many volunteers manage to do this is extraordinary (“Let a Thousand Flowers” 13). Workers in regional museums are constantly required to step outside their comfort zones and learn new skills with minimal professional support. While these challenging experiences can be very rewarding, they are also potentially damaging for our irreplaceable material cultural heritage.

Training for museum professionals has been on the agenda of the International Council of Museums (ICOM) since 1947 (Boylan 62). However, until 1996, their work focused on recommending curricula for new museum professionals and did not include life-long learning and on-going professional development. ICOM’s International Committee for the Training of Personnel (ICTOP) and the ICOM Executive have responded to this in their new curricula, the ICOM Curricula Guidelines for Professional Museum Development, but this does not address the difficulties staff or volunteers working in regional areas face in accessing training.

In some parts of Australia, there are regional support and professional development programs in place. For example, in Queensland, there is the Museum Development Officer (MDO) network. However, because of the geographic size of the state and the spread of the museums, these five regionally based staff often have 60-80 museums or keeping places in their region needing support, and so their time and expertise are spread very thinly. It is also predominantly a fee-for-service arrangement: the museums have to pay for the MDO to come and deliver training. Usually this is done by the MDO working with a local museum to apply for a Regional Arts Development Fund (RADF) grant. In Victoria there is a roving curator program through which eligible regional museums can apply to have a professional curator come and work with them for a few days to help the volunteers curate exhibitions. The roving curator can also provide advice on “develop[ing] high quality exhibitions for diverse audiences” via email, telephone and networking events. Tasmania operates a similar scheme, but its two roving curators are available for up to 25 days of work each year with eligible museums, provided the local council makes a financial contribution. The New South Wales government supports the museum advisor program, through which a museum professional will come to your museum for up to 20 days per year to give advice and hands-on training, provided your local council pays $7000 for this service, an amount that is matched by the state government. In 2010, in response to recommendations in the Dunn Report (2007), the Collections Council of Australia (CCA) established a pilot project with the City of Kalgoorlie-Boulder in Western Australia and $120,000 in funding from the Myer Foundation to trial the provision of a paid Collections Care Coordinator who would provide free training, expertise and support to local museums in the region. Tragically, CCA was de-funded by the Cultural Ministers Council the same year and the roll-out of a hub-and-spoke regional model was not supported by government due to the lack of an evidence base (Winkworth, “Let a Thousand Flowers” 18).
An evaluation of the trial project would have tested a different model of regional training and added to the evidence base. All these state-based models (except the aborted Collections Care hub in Western Australia) require small regional museums to compete with each other for access to a museum professional and to apply successfully for funding, usually from their local council or state government. If they are successful, the training that is delivered is a one-off, as they are unlikely to get a second slice of the regional pie.

An alternative to this competitive, fly-in fly-out, one-off model of professional development is to harness the technology and resources of local libraries and other cultural facilities in regional areas. This is what the Sydney Opera House Trust did in March 2019 to deliver its All about Women program of speakers via live streaming to 37 satellite sites throughout Australia and New Zealand.

Harnessing Technology and Using Regional Library Infrastructure to Provide Training: Scenario

Imagine the following scenario. It is a Monday morning in a regional library in Dubbo, New South Wales. Dubbo is 391 km, or five hours’ drive by car, from the nearest capital city (Sydney), and there are 50 regional museums within a 100 km radius. Ten people are gathered in a meeting room at the library watching a live stream of the keynote speakers who are presenting at their national museums conference. They are from five regional museums where they work as volunteers or part-time paid staff. They cannot afford to pay $2000, or more, to attend the conference, but they are happy to self-fund to drive for an hour or two to link up with other colleagues to listen to the presentations. They make notes and tweet in their questions using the conference Twitter handle and hashtag. They have not been exposed to international speakers in the industry before and the ideas presented are fresh and stimulating. When the conference breaks for morning tea, they take a break too and get to know each other over a cuppa (provided free of charge by the library). Just as the networking sessions at conferences are vitally important for the delegates, they are even more important to address social isolation amongst this group. When they reconvene, they discuss their questions and agree to email the presenters with the questions that are unresolved. After the conference keynote sessions finish, the main conference (in the capital city) disperses into parallel sessions, which are no longer available via live stream.

To make the two-hour drive more worthwhile and continue their professional development, they have arranged to hold a significance assessment workshop as well. Each museum worker has brought along photographs of one item in their collection that they want to do more research on. Some of them have also brought the object, if it is small and robust enough to travel. They have downloaded copies of Significance 2.0 and read it before they arrived. They started to write significance reports but could not fully understand how to apply some of the criteria. They cannot afford to pay for professional workshop facilitators, but they have arranged for the local studies librarian to give them an hour of free training on using the library’s resources (online and onsite) to do research on the local area and local families. They learn more about Trove, Papers Past and other research tools which are available online.
This is hands-on and computer-based skills training using their own laptops/tablets or the ones provided by the library. After the training with the librarian, they break into two groups and read each other’s significance reports and make suggestions. The day finishes with a cuppa at 2.30pm, giving them time to drive home before the sun sets. They agree to exchange email addresses so they can keep in touch. All the volunteers and staff who attended these sessions in regional areas feel energised after these meetings. They no longer feel so isolated and like they are working in the dark. They feel supported just knowing that there are other people who are struggling with the same issues and constraints as they are. They are sick of talking about the lack of budget, expertise, training and resources and want to do something with what they have.

Bert (fictional name) decides that it is worth capitalising on this success. He emails the people who came to the session in Dubbo to ask them if they would like to do it again but focus on some different training needs. He asks them to choose two of the following three professional development options. First, they can choose to watch and discuss a recording of the keynote presentations from day two of the recent national conference. The conference organisers have uploaded digital recordings of the speakers’ presentations and the question time to the AMaGA website. This is an option for local libraries that do not have sufficient bandwidth to live-stream video. The local library technician will help them cast the videos to a large screen. Second, they can each bring an object from their museum collection that they think needs conservation work. If the item is too fragile or big to move, they will bring digital photographs of it instead. Bert consulted their state-based museum and found some specialist conservators who have agreed to Skype or FaceTime them in Dubbo free of charge, to give them expert advice about how to care for their objects and, most importantly, what not to do. The IT technician at Dubbo Library can set up their meeting room so that they can cast the Skype session onto a large smart-screen TV. One week before the event, they will send a list of their objects and photographs of them to the conservator so that she can prepare, and they can make best use of her time. After this session, they will feel more confident about undertaking small cleaning and flattening treatments and know when they should not attempt a treatment themselves and need to call on the experts. Third, they could choose to have a training session with the council’s grants officer on writing grant applications. As he assesses grant applications, he can tell them what local councils look for in a successful grant application. He can also inform them about some of the grants that might be relevant to them. After the formal training, there will be an opportunity for them to exchange information about the grants they have applied for in the past (sometimes finding out what’s available can be difficult) and work in small groups to critique each other’s grant applications.

The group chooses options two and three, as they want more practical skills development. They take a break in the middle of the day for lunch, which gives them the opportunity to exchange anecdotes from their volunteer work and listen to and support each other. They feel validated and affirmed. They have gained new skills and don’t feel so isolated.
Before they leave, Alice agrees to get in touch with everyone to organise their next regional training day.

Harnessing Technology and Using Regional Library Infrastructure to Provide Training: Benefits

These scenarios need not be futuristic. The training needs are real, as is the desire to learn and the capacity of libraries to support regional groups. While funding for regional museums has stagnated or declined in recent years, libraries have been surging ahead. In August 2018, the New South Wales Government announced an “historic investment” of $60 million into all 370 public libraries that would “transform the way NSW’s public libraries deliver much-needed services, especially in regional areas” (Smith). Libraries are equipped and charged with the responsibility of enabling local community groups to make best use of their resources. Most state and national museum workers are keen to share their expertise with their regional colleagues: funding and distance are often the only barriers. These scenarios allow national conference keynote speakers to reach a much larger audience than the conference attendees. While this strategy might reduce the number of workers from regional areas who pay to attend conferences, the reality is that due to distance, other volunteer commitments, expense and family responsibilities, they probably would not attend anyway. Most regional museums and galleries and their staff might be asset-rich, but they are cash-poor, and the only way their workers get to attend conferences is if they win a bursary or grant. In 2005, Winkworth said: “the future for community museums is to locate them within local government as an integral part of the cultural, educational and economic infrastructure of the community, just like libraries and galleries” (“Fixing the Slums” 7). Fourteen years on, very little progress has been made in this direction. Those museums which have been integrated into the local council infrastructure, such as at Orange and Wagga Wagga in western New South Wales, are doing much better than those that are still stuck in ‘cultural poverty’ and trying to operate independently.

However, the co-location and convergence of museums, libraries and archives is only successful if it is well managed. Helena Robinson has examined the impact on museum collection management and interpretation of five local government funded, converged collecting institutions in Australia and New Zealand and found that the process is complex and does not necessarily result in “optimal” cross-disciplinary expertise or best practice outcomes (141–58).

Conclusion

Robinson’s research, however, did not consider community-based collecting institutions using regional libraries as sites for training and networking. By harnessing local library resources and making better use of existing communications technology, it is possible to create regional hubs for professional development and collegiate support which are not reliant on grants. If the current competitive, fly-in fly-out, self-funded model of providing professional development and support to regional museums continues, then the future for our cultural heritage collections and the dedicated volunteers who care for them is bleak. Alternatively, the scenarios I have described give regional museum workers agency to address their own professional development needs. This in no way removes the need for leadership, advocacy and coordination by national representative bodies such as AMaGA and Museums Aotearoa.
If AMaGA partnered with the Australian Library and Information Association (ALIA) to stream their conference keynote sessions to strategically located regional libraries and used some of their annual funding from the Department of Communication and the Arts to pay for museum professionals to travel to some of those sites to deliver training, they would be investing in the nation’s social and cultural capital and addressing the professional development needs of regional museum workers. This would also increase the sustainability of our cultural heritage collections, which are valuable economic assets.

References

Australian Bureau of Statistics. “2071.0—Census of Population and Housing: Reflecting Australia—Snapshot of Australia, 2016.” Canberra: Australian Bureau of Statistics, 2017. 17 Mar. 2019 <https://www.abs.gov.au/ausstats/abs@.nsf/Lookup/by%20Subject/2071.0~2016~Main%20Features~Snapshot%20of%20Australia,%202016~2>.
Boylan, Patrick. “The Intangible Heritage: A Challenge and an Opportunity for Museums and Museum Professional Training.” International Journal of Intangible Heritage 1 (2006): 53–65.
Burton, Christine, and Jane Griffin. “More than a Museum? Understanding How Small Museums Contribute to Social Capital in Regional Communities.” Asia Pacific Journal of Arts & Cultural Management 5.1 (2008): 314–32. 17 Mar. 2019 <http://apjacm.arts.unimelb.edu.au/article/view/32>.
Dunn, Anne. The Dunn Report: A Report on the Concept of Regional Collections Jobs. Adelaide: Collections Council of Australia, 2007.
ICOM Curricula Guidelines for Professional Museum Development. 2000. <http://museumstudies.si.edu/ICOM-ICTOP/comp.htm>.
Kelly, Lynda. “Measuring the Impact of Museums on Their Communities: The Role of the 21st Century Museum.” New Roles and Issues of Museums INTERCOM Symposium (2006): 25–34. 17 Mar. 2019 <https://media.australianmuseum.net.au/media/dd/Uploads/Documents/9355/impact+paper+INTERCOM+2006.bb50ba1.pdf>.
Museums and Galleries New South Wales (MGNSW). 2018 NSW Museums and Galleries Sector Census. Museums and Galleries of New South Wales. Data and Insights—Culture Counts. Sydney: MGNSW, 2019. 17 Mar. 2019 <https://mgnsw.org.au/wp-content/uploads/2019/02/2018-NSW-Museum-Gallery-Sector-Census.pdf>.
Oppenheimer, Melanie. Volunteering: Why We Can’t Survive without It. Sydney: U of New South Wales P, 2008.
Pigott, Peter. Museums in Australia 1975. Report of the Committee of Inquiry on Museums and National Collections Including the Report of the Planning Committee on the Gallery of Aboriginal Australia. Canberra: Australian Government Printing Service, 1975. 17 Mar. 2019 <https://apo.org.au/node/35268>.
Public Sector Commission, Western Australia. 70:20:10 Framework Learning Philosophy. Perth: Government of Western Australia, 2018. 17 Mar. 2019 <https://publicsector.wa.gov.au/centre-public-sector-excellence/about-centre/702010-framework>.
Robinson, Helena. “‘A Lot of People Going That Extra Mile’: Professional Collaboration and Cross-Disciplinarity in Converged Collecting Institutions.” Museum Management and Curatorship 31 (2016): 141–58.
Scott, Lee. National Operations Manager, Museums Australia. Personal communication, 22 Oct. 2018.
Shaw, Iain, and Lee Davidson. Museums Aotearoa 2014 Sector Survey Report. Wellington: Victoria U, 2014. 17 Mar. 2019 <http://www.museumsaotearoa.org.nz/sites/default/files/documents/museums_aotearoa_sector_survey_2014_report_-_final_draft_oct_2015.pdf>.
Smith, Alexandra. “NSW Libraries to Benefit from $60 Million Boost.” Sydney Morning Herald 24 Aug. 2018. 17 Mar. 2019 <https://www.smh.com.au/politics/nsw/nsw-libraries-to-benefit-from-60-million-boost-20180823-p4zzdj.html>.
Winkworth, Kylie. “Fixing the Slums of Australian Museums; or Sustaining Heritage Collections in Regional Australia.” Museums Australia Conference Paper. Canberra: Museums Australia, 2005.
———. “Let a Thousand Flowers Bloom: Museums in Regional Australia.” Understanding Museums—Australian Museums and Museology. Eds. Des Griffin and Leon Paroissien. Canberra: National Museum of Australia, 2011. 17 Mar. 2019 <https://nma.gov.au/research/understanding-museums/KWinkworth_2011.html>.

36

Simpson, Catherine. "Cars, Climates and Subjectivity: Car Sharing and Resisting Hegemonic Automobile Culture?" M/C Journal 12, no.4 (September 3, 2009). http://dx.doi.org/10.5204/mcj.176.

Full text

APA, Harvard, Vancouver, ISO, and other styles

Abstract:

Al Gore brought climate change into … our living rooms. … The 2008 oil price hikes [and the global financial crisis] awakened the world to potential economic hardship in a rapidly urbanising world where the petrol-driven automobile is still king. (Mouritz 47)

Six hundred million cars (Urry, “Climate Change” 265) traverse the world’s roads, or sit idly in garages and clog city streets. The West’s economic progress has been built in part around the success of the automotive industry, where the private car rules the spaces and rhythms of daily life. The problem of “automobile dependence” (Newman and Kenworthy) is often cited as one of the biggest challenges facing countries attempting to combat anthropogenic climate change. Sociologist John Urry has claimed that automobility is an “entire culture” that has re-defined movement in the contemporary world (Urry, Mobilities 133). As such, it is the single most significant environmental challenge “because of the intensity of resource use, the production of pollutants and the dominant culture which sustains the major discourses of what constitutes the good life” (Urry, Sociology 57-8). Climate change has forced a re-thinking of not only how we produce and dispose of cars, but also how we use them. What might a society not dominated by the private, petrol-driven car look like? Some of the pre-eminent writers on climate change futures, such as Gwynne Dyer, James Lovelock and John Urry, discuss one possibility that might emerge when oil becomes scarce: societies will descend into civil chaos, “a Hobbesian war of all against all” where “regional warlordism” and the most brutish, barbaric aspects of human nature come to the fore (Urry, “Climate Change” 261). Discussing a post-car society, John Urry also proffers another scenario in his “sociologies of the future”: an Orwellian “digital panopticon” in which other modes of transport, far more suited to a networked society, might emerge on a large scale and, in the long run, “might tip the system” into a post-car one before it is too late (Urry, “Climate Change” 261). Amongst the many options he discusses is car sharing. Since its introduction in Germany more than 30 years ago, most of the critical literature has been devoted to the planning, environmental and business innovation aspects of car sharing; however, very little has been written on its cultural dimensions. This paper analyses this small but developing trend in many Western countries, but more specifically its emergence in Sydney. The convergence of climate change discourse with that of the global financial crisis has resulted in a focus in the mainstream media, over the last few months, on technologies and practices that might save us money and also help the environment. For instance, a Channel 10 News story in May 2009 focused on the boom in car sharing in Sydney (see: http://www.youtube.com/watch?v=EPTT8vYVXro). Car sharing is an adaptive technology that doesn’t do away with the car altogether, but rather transforms the ways in which cars are used, thought about and promoted. I argue that car sharing provides a challenge to the dominant consumerist model of the privately owned car that has sustained capitalist structures for at least the last 50 years.
In addition, through looking at some marketing and promotion tactics of car sharing in Australia, I examine some emerging car sharing subjectivities that both extend and subvert the long-established discourses of the automobile’s flexibility and autonomy to tempt monogamous car buyers into becoming philandering car sharers. Much literature has emerged over the last decade devoted to the ubiquitous phenomenon of automobility. “The car is the literal ‘iron cage’ of modernity, motorised, moving and domestic,” claims Urry (“Connections” 28). Over the course of twentieth century, automobility became “the dominant form of daily movement over much of the planet (dominating even those who do not move by cars)” (Paterson 132). Underpinning Urry’s prolific production of literature is his concept of automobility. This he defines as a complex system of “intersecting assemblages” that is not only about driving cars but the nexus between “production, consumption, machinic complexes, mobility, culture and environmental resource use” (Urry, “Connections” 28). In addition, Matthew Paterson, in his Automobile Politics, asserts that “automobility” should be viewed as everything that makes driving around in a car possible: highways, parking structures and traffic rules (87). While the private car seems an inevitable outcome of a capitalistic, individualistic modern society, much work has gone into the process of naturalising a dominant notion of automobility on drivers’ horizons. Through art, literature, popular music and brand advertising, the car has long been associated with seductive forms of identity, and societies have been built around a hegemonic culture of car ownership and driving as the pre-eminent, modern mode of self-expression. And more than 50 years of a popular Hollywood film genre—road movies—has been devoted to glorifying the car as total freedom, or in its more nihilistic version, “freedom on the road to nowhere” (Corrigan). As Paterson claims, “autonomous mobility of car driving is socially produced … by a range of interventions that have made it possible” (18). One of the main reasons automobility has been so successful, he claims, is through its ability to reproduce capitalist society. It provided a commodity around which a whole set of symbols, images and discourses could be constructed which served to effectively legitimise capitalist society. (30) Once the process is locked-in, it then becomes difficult to reverse as billions of agents have adapted to it and built their lives around “automobility’s strange mixture of co-ercion and flexibility” (Urry, “Climate Change” 266). The Decline of the Car Globally, the greatest recent rupture in the automobile’s meta-narrative of success came about in October 2008 when three CEOs from the major US car firms (General Motors, Ford and Chrysler) begged the United States Senate for emergency loan funds to avoid going bankrupt. To put the economic significance of this into context, Emma Rothschild notes “when the listing of the ‘Fortune 500’ began in 1955, General Motors was the largest American corporation, and it was one of the three largest, measured in revenues, every year until 2007” (Rothschilds, “Can we transform”). Curiously, instead of focusing on the death of the car (industry), as we know it, that this scenario might inevitably herald, much of the media attention focused on the hypocrisy and environmental hubris of the fact that all the CEOs had flown in private luxury jets to Washington. 
“Couldn’t they have at least jet-pooled?” complained one Democrat Senator (Wutkowski). In their next visit to Washington, most of them drove up in experimental vehicles still in pre-production, including plug-in hybrids. Up until that point no other manufacturing industry had been bailed out in the current financial crisis. Of course it’s not the first time the automobile industries have been given government assistance. The Australian automotive industry has received on-going government subsidies since the 1980s. Most recently, PM Kevin Rudd granted a 6.2 billion dollar ‘green car’ package to Australian automotive manufacturers. His justification to the growing chorus of doubts about the economic legitimacy of such a move was: “Some might say it's not worth trying to have a car industry, that is not my view, it is not the view of the Australian government and it never will be the view of any government which I lead” (The Australian). Amongst the many reasons for the government support of these industries must include the extraordinary interweaving of discourses of nationhood and progress with the success of the car industry. As the last few months reveal, evidently the mantra still prevails of “what’s good for the country is good for GM and vice versa”, as the former CEO of General Motors, Charles “Engine” Wilson, argued back in 1952 (Hirsch). In post-industrial societies like Australia it’s not only the economic aspects of the automotive industries that are criticised. Cars seem to be slowly losing their grip on identity-formation that they managed to maintain throughout “the century of the car” (Gilroy). They are no longer unproblematically associated with progress, freedom, youthfulness and absolute autonomy. The decline and eventual death of the automobile as we know it will be long, arduous and drawn-out. But there are some signs of a post-automobile society emerging, perhaps where cars will still be used but they will not dominate our society, urban space and culture in quite the same way that they have over the last 50 years. Urry discusses six transformations that might ‘tip’ the hegemonic system of automobility into a post-car one. He mentions new fuel systems, new materials for car construction, the de-privatisation of cars, development of communications technologies and integration of networked public transport through smart card technology and systems (Urry, Mobilities 281-284). As Paterson and others have argued, computers and mobile phones have somehow become “more genuine symbols of mobility and in turn progress” than the car (157). As a result, much automobile advertising now intertwines communications technologies with brand to valorise mobility. Car sharing goes some way in not only de-privatising cars but also using smart card technology and networked systems enabling an association with mobility futures. In Automobile Politics Paterson asks, “Is the car fundamentally unsustainable? Can it be greened? Has the car been so naturalised on our mobile horizons that we can’t imagine a society without it?” (27). From a sustainability perspective, one of the biggest problems with cars is still the amount of space devoted to them; highways, garages, car parks. About one-quarter of the land in London and nearly one-half of that in Los Angeles is devoted to car-only environments (Urry, “Connections” 29). In Sydney, it is more like a quarter. We have to reduce the numbers of cars on our roads to make our societies livable (Newman and Kenworthy). 
Car sharing provokes a re-thinking of urban space. If one quarter of Sydney’s population car shared and we converted this space into green use or local market gardens, then we’d have a radically transformed city. Car sharing, not to be confused with ‘ride sharing’ or ‘car pooling,’ involves a number of people using cars that are parked centrally in dedicated car bays around the inner city. After becoming a member (much like a 6 or 12 monthly gym membership), the cars can be booked (and extended) by the hour via the web or phone. They can then be accessed via a smart card. In Sydney there are 3 car sharing organisations operating: Flexicar (http://www.flexicar.com.au/), CharterDrive (http://www.charterdrive.com.au/) and GoGet (http://www.goget.com.au/).[1] The largest of these, GoGet, has been operating for 6 years and has over 5000 members and 200 cars located predominantly in the inner city suburbs. Anecdotally, GoGet claims its membership is primarily drawn from professionals living in the inner-urban ring. Their motivation for joining is, firstly, the convenience that car sharing provides in a congested, public transport-challenged city like Sydney; secondly, the financial savings derived; and thirdly, members consider the environmental and social benefits axiomatic. [2] The promotion tactics of car sharing seems to reflect this by barely mentioning the environment but focusing on those aspects which link car sharing to futuristic and flexible subjectivities which I outline in the next section. Unlike traditional car rental, the vehicles in car sharing are scattered through local streets in a network allowing local residents and businesses access to the vehicles mostly on foot. One car share vehicle is used by 22-24 members and gets about seven cars off the street (Mehlman 22). With lots of different makes and models of vehicles in each of their fleets, Flexicar’s website claims, “around the corner, around the clock” “Flexicar offers you the freedom of driving your own car without the costs and hassles of owning one,” while GoGet asserts, “like owning a car only better.” Due to the initial lack of interest from government, all the car sharing organisations in Australia are privately owned. This is very different to the situation in Europe where governments grant considerable financial assistance and have often integrated car sharing into pre-existing public transport networks. Urry discusses the spread of car sharing across the Western world: Six hundred plus cities across Europe have developed car-sharing schemes involving 50,000 people (Cervero, 2001). Prototype examples are found such as Liselec in La Rochelle, and in northern California, Berlin and Japan (Motavalli, 2000: 233). In Deptford there is an on-site car pooling service organized by Avis attached to a new housing development, while in Jersey electric hire cars have been introduced by Toyota. (Urry, “Connections” 34) ‘Collaborative Consumption’ and Flexible, Philandering Subjectivities Car sharing shifts the dominant conception of a car from being a ‘commodity’, which people purchase and subsequently identify with, to a ‘service’ or network of vehicles that are collectively used. It does this through breaking down the one car = one person (or one family) ratio with one car instead servicing 20 or more people. One of Paterson’s biggest criticisms concerns car driving as “a form of social exclusion” (44). 
Car sharing goes some way in subverting the model of hyper-individualism that supports both hegemonic automobility and capitalist structures, whereby the private motorcar produces a “separation of individuals from one another driving in their own private universes with no account for anyone else” (Paterson 90). As a car sharer, the driver has to acknowledge that this is not their private domain, and the car no longer becomes an extension of their living room or bedroom, as is noted in much literature around car cultures (Morris, Sheller, Simpson). There are a community of people using the car, so the driver needs to be attentive to things like keeping the car clean and bringing it back on time so another person can use it. So while car sharing may change the affective relationship and self-identification with the vehicle itself, it doesn’t necessarily change the phenomenological dimensions of car driving, such as the nostalgic pleasure of driving on the open road, or perhaps more realistically in Sydney, the frustration of being caught in a traffic jam. However, the fact the driver doesn’t own the vehicle does alter their relationship to the space and the commodity in a literal as well as a figurative way. Like car ownership, evidently car sharing also produces its own set of limitations on freedom and convenience. That mobility and car ownership equals freedom—the ‘freedom to drive’—is one imaginary which car firms were able to successfully manipulate and perpetuate throughout the twentieth century. However, car sharing also attaches itself to the same discourses of freedom and pervasive individualism and then thwarts them. For instance, GoGet in Sydney have run numerous marketing campaigns that attempt to contest several ‘self-evident truths’ about automobility. One is flexibility. Flexibility (and associated convenience) was one thing that ownership of a car in the late twentieth century was firmly able to affiliate itself with. However, car ownership is now more often associated with being expensive, a hassle and a long-term commitment, through things like buying, licensing, service and maintenance, cleaning, fuelling, parking permits, etc. Cars have also long been linked with sexuality. When in the 1970s financial challenges to the car were coming as a result of the oil shocks, Chair of General Motors, James Roche stated that, “America’s romance with the car is not over. Instead it has blossomed into a marriage” (Rothschilds, Paradise Lost). In one marketing campaign GoGet asked, ‘Why buy a car when all you need is a one night stand?’, implying that owning a car is much like a monogamous relationship that engenders particular commitments and responsibilities, whereas car sharing can just be a ‘flirtation’ or a ‘one night stand’ and you don’t have to come back if you find it a hassle. Car sharing produces a philandering subjectivity that gives individuals the freedom to have lots of different types of cars, and therefore relationships with each of them: I can be a Mini Cooper driver one day and a Falcon driver the next. This disrupts the whole kind of identification with one type of car that ownership encourages. It also breaks down a stalwart of capitalism—brand loyalty to a particular make of car with models changing throughout a person’s lifetime. Car sharing engenders far more fluid types of subjectivities as opposed to those rigid identities associated with ownership of one car. 
Car sharing can also be regarded as part of an emerging phenomenon of what Rachel Botsman and Roo Rogers have called “collaborative consumption”—when a community gets together “through organized sharing, swapping, bartering, trading, gifting and renting to get the same pleasures of ownership with reduced personal cost and burden, and lower environmental impact” (www.collaborativeconsumption.com). As Urry has stated, these developments indicate a gradual transformation in current economic structures from ownership to access, as shown more generally by many services offered and accessed via the web (Urry Mobilities 283). Rogers and Botsman maintain that this has come about through the “convergence of online social networks increasing cost consciousness and environmental necessity." In the future we could predict an increasing shift to payment to ‘access’ for mobility services, rather than the outright private ownerships of vehicles (Urry, “Connections”). Networked-Subjectivities or a ‘Digital Panopticon’? Cars, no longer able on their own to signify progress in either technical or social terms, attain their symbolic value through their connection to other, now more prevalently ‘progressive’ technologies. (Paterson 155) The term ‘digital panopticon’ has often been used to describe a dystopian world of virtual surveillance through such things as web-enabled social networking sites where much information is public, or alternatively, for example, the traffic surveillance system in London whereby the public can be constantly scrutinised through the centrally monitored cameras that track people’s/vehicle’s movements on city streets. In his “sociologies of the future,” Urry maintains that one thing which might save us from descending into post-car civil chaos is a system governed by a “digital panopticon” mobility system. This would be governed by a nexus system “that orders, regulates, tracks and relatively soon would ‘drive’ each vehicle and monitor each driver/passenger” (Urry, “Connections” 33). The transformation of mobile technologies over the last decade has made car sharing, as a viable business model, possible. Through car sharing’s exploitation of an online booking system, and cars that can be tracked, monitored and traced, the seeds of a mobile “networked-subjectivity” are emerging. But it’s not just the technology people are embracing; a cultural shift is occurring in the way that people understand mobility, their own subjectivity, and more importantly, the role of cars. NETT Magazine did a feature on car sharing, and advertised it on their front cover as “GoGet’s web and mobile challenge to car owners” (May 2009). Car sharing seems to be able to tap into more contemporary understandings of what mobility and flexibility might mean in the twenty-first century. In their marketing and promotion tactics, car sharing organisations often discursively exploit science fiction terminology and generate a subjectivity much more dependent on networks and accessibility (158). In the suburbs people park their cars in garages. In car sharing, the vehicles are parked not in car bays or car parks, but in publically accessible ‘pods’, which promotes a futuristic, sci-fi experience. Even the phenomenological dimensions of swiping a smart card over the front of the windscreen to open the car engender a transformation in access to the car, instead of through a key. This is service-technology of the future while those stuck in car ownership are from the old economy and the “century of the car” (Gilroy). 
The connections between car sharing and the mobile phone and other communications technologies are part of the notion of a networked, accessible vehicle. However, the more problematic side to this is the car under surveillance. Nic Lowe, of his car sharing organisation GoGet, says: “Because you’re tagged on and we know it’s you, you are able to drive the car… every event you do is logged, so we know what time you turned the key, what time you turned it off and we know how far you drove … if a car is lost we can sound the horn to disable it remotely to prevent theft. We can track how fast you were going and even how fast you accelerated … track the kilometres for billing purposes and even find out when people are using the car when they shouldn’t be” (Mehlman 27). The possibility with the GPS technology installed in the car is being able to monitor the speeds at which people drive, thereby fining them for every minute spent going over the speed limit. While this conjures up the notion of the car under surveillance, it is also a much less bleak scenario than “a Hobbesian war of all against all”.

Conclusion: “Hundreds of Cars, No Garage”

The prospect of climate change is provoking innovation at a whole range of levels, as well as providing a re-thinking of how we use taken-for-granted technologies. Sometime this century the one tonne, privately owned, petrol-driven car will become an artefact, much like Sydney trams did last century. At this point in time, car sharing can be regarded as an emerging transitional technology to a post-car society that provides a challenge to hegemonic automobile culture. It is evidently not a radical departure from the car’s vast machinic complex and still remains a part of what Urry calls the “system of automobility”. From a pro-car perspective, its networked surveillance places constraints on the free agency of the car, while for those of the deep green variety it is, no doubt, a compromise. Nevertheless, it provides a starting point for re-thinking the foundations of the privately-owned car. While Urry makes an important point in relation to a society moving from ownership to access, he doesn’t take into account the cultural shifts occurring that are enabling car sharing to be attractive to prospective members: the notion of networked subjectivities, the discursive constructs used to establish car sharing as a thing of the future with pods and smart cards instead of garages and keys. If car sharing became mainstream it could have radical environmental impacts on things like urban space and pollution, as well as the dominant culture of “automobile dependence” (Newman and Kenworthy), as Australia attempts to move to a low carbon economy.

Notes

[1] My partner Bruce Jeffreys, together with Nic Lowe, founded Newtown Car Share in 2002, which is now called GoGet.

[2] Several layers down in the ‘About Us’ link on GoGet’s website is the following information about the environmental benefits of car sharing: “GoGet’s aim is to provide a reliable, convenient and affordable transport service that: allows people to live car-free, decreases car usage, improves local air quality, removes private cars from local streets, increases patronage for public transport, allows people to lead more active lives” (http://www.goget.com.au/about-us.html).

References

The Australian. “Kevin Rudd Throws $6.2bn Lifeline to Car Industry.” 10 Nov. 2008. <http://www.theaustralian.news.com.au/business/story/0,28124,24628026-5018011,00.html>.
Corrigan, Tim. “Genre, Gender, and Hysteria: The Road Movie in Outer Space.” A Cinema Without Walls: Movies, Culture after Vietnam. New Jersey: Rutgers University Press, 1991.
Dyer, Gwynne. Climate Wars. North Carlton: Scribe, 2008.
Featherstone, Mike. “Automobilities: An Introduction.” Theory, Culture and Society 21.4-5 (2004): 1-24.
Gilroy, Paul. “Driving while Black.” Car Cultures. Ed. Daniel Miller. Oxford: Berg, 2000.
Hirsch, Michael. “Barack the Saviour.” Newsweek 13 Nov. 2008. <http://www.newsweek.com/id/168867>.
Lovelock, James. The Revenge of Gaia: Earth’s Climate Crisis and the Fate of Humanity. Penguin, 2007.
Lovelock, James. The Vanishing Face of Gaia. Penguin, 2009.
Mehlman, Josh. “Community Driven Success.” NETT Magazine (May 2009): 22-28.
Morris, Meaghan. “Fate and the Family Sedan.” East West Film Journal 4.1 (1989): 113-134.
Mouritz, Mike. “City Views.” Fast Thinking Winter 2009: 47-50.
Newman, P., and J. Kenworthy. Sustainability and Cities: Overcoming Automobile Dependence. Washington, DC: Island Press, 1999.
Paterson, Matthew. Automobile Politics: Ecology and Cultural Political Economy. Cambridge: Cambridge University Press, 2007.
Rothschild, Emma. Paradise Lost: The Decline of the Auto-Industrial Age. New York: Random House, 1973.
Rothschild, Emma. “Can We Transform the Auto-Industrial Society?” New York Review of Books 56.3 (2009). <http://www.nybooks.com/articles/22333>.
Sheller, Mimi. “Automotive Emotions: Feeling the Car.” Theory, Culture and Society 21 (2004): 221–42.
Simpson, Catherine. “Volatile Vehicles: When Women Take the Wheel.” Womenvision. Ed. Lisa French. Melbourne: Damned Publishing, 2003. 197-210.
Urry, John. Sociology Beyond Societies: Mobilities for the 21st Century. London: Routledge, 2000.
Urry, John. “Connections.” Environment and Planning D: Society and Space 22 (2004): 27-37.
Urry, John. Mobilities. Cambridge and Malden, MA: Polity Press, 2008.
Urry, John. “Climate Change, Travel and Complex Futures.” British Journal of Sociology 59.2 (2008): 261-279.
Watts, Laura, and John Urry. “Moving Methods, Travelling Times.” Environment and Planning D: Society and Space 26 (2008): 860-874.
Wutkowski, Karey. “Auto Execs’ Private Flights to Washington Draw Ire.” Reuters News Agency 19 Nov. 2008. <http://www.reuters.com/article/newsOne/idUSTRE4AI8C520081119>.
