Hello, I’m Emily! Over the past year, I’ve been working as the Senior Accessibility Editorial Fellow for the Trans-Feminist and Queer Digital Praxis Workshop (TFQ DPW). I’ve been writing alt text for the Digital Research Ethics Collaboratory (DREC) and Cabaret Commons, starting with the image descriptions for T.L. & Jas’s wonderful Fancy Fridays calendar. As a Disabled queer woman, I work from the belief that access is not only a fundamental right but can also be fun and engaging, leading to new knowledge as more people can enter the conversation. This series will explore the history, present, and future of accessibility in digital humanities, beginning with an overview of the connections between computing and injustice.
Why should we think about accessibility in digital humanities?
While the internet has created new methods of participation and has become integrated into many people’s daily lives, the digital divide continues to deepen as those without access, or without full access, to internet technologies face new methods of exclusion. So much information is available online, but much of it remains behind barriers, from sites that don’t support screen readers to the physical demands of using computers and the internet.
In The History of Special Education: From Isolation to Integration, professor and special education researcher Margret Winzer (1993) notes that many early technologies were created either to give Disabled people access or to force assimilation. Alexander Graham Bell’s telephone came out of his work on creating a device that would “make speech visible” for children with hearing impairments by transmitting stylus writing through wires; typewriters grew out of tactile communication systems for Blind users; and one of Thomas Edison’s listed uses for the 1877 Tin Foil Phonograph was phonographic books that would speak aloud to people with visual impairments, with the first “talking books” available on records in the 1930s, leading to the popularization of long-playing records (Winzer, 1993, p.149). Professor Paul T. Jaeger (2012) tracks how, despite this history, widespread use of communication technologies has outpaced accessible versions by decades, such as “the omnipresence of television and the requirements for closed captioning,” threatening to turn physical and cognitive disabilities into social exclusion as communication technology becomes ever more pervasive (pp.39-40).
In the late 1960s, the civil rights movement and the computer revolution collided. American academic Charlton D. McIlwain’s book, Black Software: The Internet and Racial Justice, From the Afronet to Black Lives Matter, tracks the “collision course” of automation and the civil rights movement, which “left the computer revolution unscathed, and the civil rights revolution twisted and mangled up within it” (p.180). Four days before the signing of the Voting Rights Act in 1965, the US Department of Labor headlined a news release “Heads-on Collision Course for Civil Rights and Automation,” noting that these twin revolutions were “here, and here to stay” (McIlwain, 2020, p.179).
Growing out of the civil rights movements, Disability Rights activists organized larger projects: Ed Roberts and the Rolling Quads founded the first Center for Independent Living in California in 1972, and the Gang of 19 transit protest in Denver later that decade won wheelchair-accessible transit. The U.N. General Assembly’s 1975 Declaration on the Rights of Disabled Persons, followed by the United Nations International Year of Disabled Persons in 1981, brought an emerging focus to disability, which had been internationally neglected (Goggin, 2015, section 2). The eighth right of the 1975 declaration states that “Disabled persons are entitled to have their special needs taken into consideration at all stages of economic and social planning” (United Nations, 1975). While the Declaration focused more on social protection and did not specifically promote rights for communication, the 2006 United Nations Convention on the Rights of Persons with Disabilities shifted the focus to Disability Rights. Article 4 of the 2006 convention promotes the “research and development” of new technologies for mobility and communication; Article 9 encourages states to identify and eliminate barriers that keep people from full access to spaces, communication, and technology; and Article 21 calls for making information “accessible and usable” to people with disabilities, ensuring the “freedom to seek, receive and impart information and ideas on an equal basis with others and through all forms of communication of their choice” (United Nations, 2006). While these declarations are not binding—even for countries like Canada that ratified the Convention—their focus on making spaces, information, and communication accessible at the lowest possible barrier to entry aims to shift how technologies and environments are created, with accessibility considered from the beginning.
Tara McPherson (2012) notes how the creation of UNIX “operating systems” in the 1960s is deeply embedded in the contemporaneous fights for justice, pushing back against “operating systems of a larger order” (pp. 141-142). McPherson employs the metaphor of lenticular postcards, which “bring two or more images together even while suppressing their connections” (p. 143). While the images are “conjoined on a structural level (i.e., within the same card),” the lens or construction “makes simultaneously viewing the various images contained on one card nearly impossible” (McPherson, 2012, p. 144); elsewhere, she notes that the lenticular is “a covert mode of the pretense of separate but equal, remixed for midcentury America” (2003, p. 250).
In an interview with communications professor Henry Jenkins (2015), McPherson further notes the historical contiguity of computers, digital humanities, and predominant logics:
The introduction of digital computer operating systems at mid-century installed an extreme logic of modularity and seriality that “black-boxed” knowledge in a manner quite similar to emerging logics of racial visibility and racism at that time. There is something particular to the very forms of the digital that encourages just such a partitioning, a portioning off that also played out in new configurations of the city in the 1960s and 1970s, in the increasing specialization of academic fields, and even in the formation of many modes of identity politics (para 6).
Rights of access were further enshrined in the Canadian Telecommunications Act of 1993, whose section 24.1 includes access by people with disabilities among the conditions of supplying telecommunications services. Kanayama’s 2003 study of the American counterpart to this act, the Telecommunications Act of 1996, found that its implementation has favoured the telecommunications industry over access for people with disabilities, because the law merely “encourages” the “consideration” of accessibility issues (pp.191-192). Without penalties for non-compliance, this language is ultimately pro-industry: no amount of inclusion in rule-making will make a difference if there is no obligation (p.185).
Despite the history of technology, advertisements make it clear that ‘accessibility’ often means something different in the marketplace.
Vintage advertisements show that “access” has been a selling feature of computers since they first became available to consumers—but the main focus was on (white, suburban) access to games and information through the home computer, as in the Atari Home Computers advertisement that offers to “bring a world of information, education, and entertainment into your living room.” Other advertisements targeted businesses, such as the IBM 5110 Computing System advertisement promising increased productivity through the ease of automation, low cost, and “quick access to data.” Access was often framed for the consumer as low cost, as in the advertisement for the Commodore 64, which compares the Apple, TRS, and IBM computers that cost $999 and up to the Commodore’s under-$600 price tag, asking, "if personal computers are for everybody how come they're priced for nobody?" This framing problematizes the ‘accessible add-ons’ that often come at significant cost to the users for whom the base model doesn’t work. Judy Brewer, the previous Director of the Web Accessibility Initiative (WAI) at the World Wide Web Consortium (W3C), notes in her TEDx talk that “Digital accessibility goes well beyond the web,” including cellphones, robots, and other machines that highlight the porous boundary between the ‘real’ and ‘digital’ world (2:44-2:47). As the UN Convention states, and as these advertisements parallel, access also has an economic component.
As we have seen with AI, without awareness we reproduce environments that perpetuate exclusion: the act of digitalizing involves classification, which always flattens, distorts, and simplifies toward the consolidation of power. This pattern goes back to mapping the physical world, which involves what Professor Miriam Posner (2015) identifies as enshrining:
a Cartesian model of space that derives directly from a colonialist project of empire-building. This business of flattening and distorting space so that it can be graphed with latitude and longitude? That makes sense when you’re assembling an empire — which is why the Mercator projection emerged in Western Europe in the 16th century. It doesn’t help, of course, that Google Maps is owned by a corporate entity with intentions that are pretty opaque (para 5).
As our brief look at history shows, the digital and ‘real’ world are inseparable, but often viewed through a lenticular lens as separate parts. As offline logics are onlined, the relation between user and space continues to spin in the creation of self. This process is what the late Canadian philosopher Ian Hacking (2006) called a “looping effect,” in which a “classification may interact with the people classified,” a recursive loop where the self is formed in relation to its categorization. This echoes the social model of disability, in which the categorization of Disabled leads to social, political, and economic stigmatization, dis-abling people in the process.
In the act of self and future creation, we must remember the environment, taking a space-based approach to knowledge. The same method of categorizing-to-control that we witness offline is being onlined and sped up through algorithms within the online empire. Communication scholars Aparajita Bhandari and Sara Bimo (2020) have noted that the algorithmic catering of TikTok leads to the experience of “repeatedly engaging with one’s own self: intra rather than interpersonal connection” (p.3). Content creation is “heavily incentivized” through the design of the home screen, with a large, centralized “record” button, aiming to create an echo chamber of the self to “personalize” content to the watcher (2020, p.3). This method of self-building happens within an environment designed to keep you there for profit, not to better yourself or find something truly ‘new.’ The algorithmic process of siloing internet users into personalized “filter bubbles,” which author-activist Eli Pariser defines as “a unique universe of information for each of us” (2011, p.10), bolsters a supposed ‘sense of self’ that can be known, creating an emotional connection to the self that the app constructs; ultimately, the app seeks to profit from that false connection. Filter bubbles offer a lenticular view of both reality and the self—and these versions of reality and the self are aimed at capturing your time and attention through lenticular personalization.
The following images and video show two vintage computer advertisements spliced into strips and alternated, creating a physical lenticular poster by folding the paper like an accordion. The two posters show the opposing logics of accessibility in computing: a Burroughs Corporation advertisement shows a man at his computer, a genie emerging from his work proclaiming that “MAN plus a Computer equals a GIANT”; the second shows a woman in a wheelchair at a typewriter, the text reminding us that “You don’t have to walk to type.” Both posters take a transhumanist approach—combining humans with machines—and exist on the same plane, focusing on the business/productivity aspects of this relationship. However, just as the profit motive seeks to separate ideas of access and accessibility, these images are not simultaneously viewable.
So what does access look like in digital humanities?
Full access requires and creates massive system changes, as Disability Justice advocate Patty Berne (2021) states: “We are in a global system that is incompatible with life. There is no way to stop a single gear in motion — we must dismantle this machine” (para 13). This requires that datafication—making data models of the world—as well as the creation of digital (and real-world) spaces be done with people, not for or about people. In terms of communication and datafication, while we want findings to be legible and “useful,” this often involves a flattening or simplification, which must be understood and accounted for by all users. Perhaps digital humanities provide a valuable space to see that identifiers are not, as Posner (2015) states, “containers to be filled in order to produce meaning” but are actively created, performed, and reproduced (para 22). For example, in my lifetime, census, medical, and other forms have gone from a binary male/female identifier to making room for more ways of existing.
Posner (2015) sees this as requiring “dismantling and rebuilding much of the organizing logic, like the data models or databases, that underlies most of our work…It’s not only about shifting the focus of projects so that they feature marginalized communities more prominently; it’s about ripping apart and rebuilding the machinery of the archive and database so that it doesn’t reproduce the logic that got us here in the first place” (para 1). These lenticular logics partition reality into ‘understandable’ chunks, but ultimately hide the entire picture. Martin Schoeller’s National Geographic photos, alongside Lise Funderburg’s writing in “The Changing Face of America” (definitely don't get around that paywall using reader view ;), contrast the self-identification of people with the census categories that they check—showing that self-conception is significantly more nuanced than can be easily categorized. One interviewee notes that this kind of categorization reveals only a “fraction of her identity” (para 10), as her upbringing, interests, and family combine into her sense of self. Funderburg concludes, "If we can’t slot people into familiar categories, perhaps we’ll be forced to reconsider existing definitions of race and identity, presumptions about who is us and who is them” (para 11).
When it comes to destroying the machine, I’m reminded of Neurath’s Boat, a philosophical simile for knowledge by Otto Neurath (1921/1973):
We are like sailors who on the open sea must reconstruct their ship but are never able to start afresh from the bottom. Where a beam is taken away a new one must at once be put there, and for this the rest of the ship is used as support. In this way, by using the old beams and driftwood the ship can be shaped entirely anew, but only by gradual reconstruction (p.199).
Some element of the foundation must remain at any given moment, or we are left with nowhere to build from. The partitioning created through lenticular logics can never allow what is broken apart to collide in the constructed space that they share; instead, the formatting and delivery must be changed. We must decide collectively where we want to rebuild our systems of knowledge from, and which elements we want to keep. Civil rights organizer and labour leader A. Philip Randolph acknowledged this difficulty while highlighting the importance of sharing power in a 1962 conversation:
Technological change cannot be stopped.
But the great masses of the people should not be required to bear the brunt of the impact of this great automation revolution, which is shaking the world.
The march of science cannot be arrested.
But what can be done about it?
You cannot destroy the machine, you cannot stifle the invention of various geniuses in the world.
Then what is to be done? (McIlwain, 2020, p.174).
In the next portions of this series, we will take a closer look at this machine, from physical machines such as the computer, mouse, and keyboard to the internal machinations and logics of communication through these devices, such as alt text, language, information structuring, and captioning, working through how to dismantle and rebuild a machine in motion.
Bhandari, A., & Bimo, S. (2020). TikTok and the “Algorithmized Self”: A New Model of Online Interaction. AoIR Selected Papers of Internet Research, 2020. https://doi.org/10.5210/spir.v...
Berne, P. (2021, September 23). Disability justice - a working draft by Patty Berne. Sins Invalid. https://www.sinsinvalid.org/blog/disability-justice-a-working-draft-by-patty-berne
Buolamwini, J. (2016, November). How I'm fighting bias in algorithms [Video]. TED: Ideas Worth Spreading. https://www.ted.com/talks/joy_buolamwini_how_i_m_fighting_bias_in_algorithms?language=en
Funderburg, L. (2013, October 1). The changing face of America. National Geographic. https://www.nationalgeographic.com/magazine/article/changing-face-america
Hacking, I. (2006). Making up people. London Review of Books, 28(16).
Jaeger, P. T. (2012). Disability and the Internet: Confronting a digital divide. Lynne Rienner Publishers. https://doi.org/10.1515/9781626371910
Jenkins, H. (2015, March 20). Bringing critical perspectives to the digital humanities: An interview with Tara McPherson (Part three) — Pop junctions. https://henryjenkins.org/2015/03/bringing-critical-perspectives-to-the-digital-humanities-an-interview-with-tara-mcpherson-part-three.html
Kanayama, T. (2003). Leaving It Up to the Industry: People With Disabilities and the Telecommunications Act of 1996. The Information Society, 19(2), 185–194. https://doi.org/10.1080/01972240309456
Kantayya, S. (2020). Coded bias. 7th Empire Media.
McIlwain, C. D. (2020). Black software: The Internet and racial justice, from the AfroNet to Black Lives Matter. Oxford University Press.
McPherson, T. (2003) Reconstructing Dixie: Race, Place and Nostalgia in the Imagined South. Duke University Press.
McPherson, T. (2012). Why Are the Digital Humanities So White? Or Thinking the Histories of Race and Computation. In M. K. Gold (Ed.), Debates in the Digital Humanities (pp. 139–160). University of Minnesota Press. https://doi.org/10.5749/j.ctttv8hq.12
Monmonier, M. S. (2004). Rhumb lines and map wars: A social history of the Mercator projection. University of Chicago Press. https://doi.org/10.7208/9780226534329
Neurath, O. (1973). Anti-Spengler. In M. Neurath & R. S. Cohen (Eds.), Empiricism and sociology. D. Reidel. https://doi.org/10.1007/978-94-010-2525-6_6 (Original work published 1921)
Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. New York University Press.
O'Neil, L. (2023, August 16). These women tried to warn us about AI. Rolling Stone. https://www.rollingstone.com/culture/culture-features/women-warnings-ai-danger-risk-before-chatgpt-1234804367/
Pariser, E. (2011). The filter bubble : what the Internet is hiding from you. Viking.
Posner, M. (2015, August 12). The Radical Potential of the Digital Humanities: The Most Challenging Computing Problem Is the Interrogation of Power. The London School of Economics and Political Science. https://blogs.lse.ac.uk/impactofsocialsciences/2015/08/12/the-radical-unrealized-potential-of-digital-humanities/.
United Nations General Assembly. (1975, December 9). Declaration on the Rights of Disabled Persons. https://www.ohchr.org/sites/default/files/res3447.pdf
United Nations. (2006, December 13). United Nations Convention on the Rights of Persons with Disabilities. https://www.ohchr.org/en/hrbod...
Unsound: The legacy of Alexander Graham Bell. (2021, May 10). CBC. https://www.cbc.ca/radio/ideas/unsound-the-legacy-of-alexander-graham-bell-1.6020596
Winzer, M. (1993). The History of Special Education: From Isolation to Integration. Gallaudet University Press.
Vintage Poster Images
Designbeep. (2011, December 1). 55 vintage computer ads which will make you compare today and past. https://designbeep.com/2010/09/30/55-vintage-computer-ads-which-will-make-you-compare-today-and-past/
EveryBody: An Artifact History of Disability in America. (n.d.). Public service ad, 1980s. Smithsonian National Museum of American History. https://everybody.si.edu/media/690
Pennlive. (2016, March 25). Vintage video game ads bring back memories of Atari, original NES. https://www.pennlive.com/entertainment/2016/03/vintage_video_games.html