The idea for this intervention arose from workshop discussions around transparency and ethics in writing and publishing, and touched on questions raised by many participants about what actually constitutes AI when it comes to the writing process.
Participants imagined what it would mean to be able to see how much ‘AI’ goes into the writing of different texts, and to decide whether or not to consume the product, much like the nutritional labels that appear on foodstuffs in a supermarket. There was much discussion about which ingredients would put readers off a particular text, and about what levels of AI input would make it ethically acceptable to publish or read. If a novel were written with the help of ChatGPT, for example, would it carry a ‘high sugar’ type warning on the label? And what would be the ‘palm-oil’ equivalent in literary terms: the ingredient, or level of content, that for many consumers would mean an outright boycott of a product on moral or ethical grounds?
WWAI team members Billy Dixon and Evan Morgan took the idea forward and speculatively designed a nutritional label for written work that could be filled in and stuck on the back cover of a book or other text, giving the prospective consumer the information needed to make an informed decision about which texts to ingest or engage with.
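As a concrete illustration only, here is a minimal sketch of what such a ‘Text Ingredients’ label might record as data; the field names, tools and percentages are invented for the example and are not the actual label design.

```python
from dataclasses import dataclass, field

@dataclass
class TextIngredientsLabel:
    """Hypothetical back-cover label: which tools touched a text,
    and roughly how much of it each one influenced."""
    title: str
    ingredients: dict[str, float] = field(default_factory=dict)  # tool -> % of text

    def render(self) -> str:
        lines = [f"TEXT INGREDIENTS: {self.title}"]
        lines += [f"  {tool:<22} {pct:>5.1f}%"
                  for tool, pct in self.ingredients.items()]
        return "\n".join(lines)

# Invented example values, in the spirit of the workshop discussion.
label = TextIngredientsLabel("Example Novel", {
    "human drafting": 78.0,
    "ChatGPT (plot ideas)": 12.0,
    "Grammarly (copyedit)": 10.0,
})
print(label.render())
```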
The X-Ray Specs intervention was tested and critiqued at the third WWAI workshop, where participants were invited to engage with the concept through a range of texts. On the table were a novel, a magazine, a research paper, a school textbook and a poem, as well as a set of ‘Text Ingredients’ stickers, some blank and some partially filled in with criteria. The novel, The Spy by crime writer Ajay Chowdhury, was picked as an example because its author has famously been open about using AI tools such as ChatGPT to help write the book; a pre-filled Text Ingredients sticker was attached to its rear cover.
The aims of the X-Ray Specs workshopping were:
1. For each group to talk about what value (and potential risks or harms) they saw in having greater transparency around what tools or processes were used to create a text.
2. To create labels for different types of text-based media, writing down what participants thought should be publicly available knowledge about what has gone into creating each media object, and showing how the need for transparency varies across these different types and contexts of text.
Findings:
The workshop provoked some fascinating discussion around the potential uses of such a tool by different stakeholders in the book industry, as well as a considerable degree of concern and pushback against its potentially problematic deployment. Participants also raised important issues around equality, diversity and inclusion, noting that AI tools can create a more level playing field for writers with disabilities.
Concerns about over-transparency
Participants were reluctant to engage with the sticker activity and decide which text ingredients should be made transparent without first discussing under what conditions the labels would be used, who could see the information, who would be in control of the system, and the full context underpinning that transparency.
Also of concern to participants was the effect on authors if the tools they use are catalogued and shown to publishers and consumers, meaning that they could be judged on their creative process rather than just on the works they produce. Central to these concerns was the worry about the anxieties that yet more digital panopticons would induce, in a world where everything we do is already being tracked.
What is of value for particular stakeholders?
– For publishers: participants could see how consumers might want to engage with the information on the labels, and suggested Goodreads as a platform where this could work.
– For authors: participants could see value in tools that show quantified information about a work in progress and track it over time, helping authors improve their work, with the added functionality of sharing this data with, and comparing it to, that of friends and colleagues. Participants were insistent, however, that they wanted to be able to turn off this sharing at any time.
Other questions and concerns raised by participants included:
– What would I be judged for? Would the use of Grammarly mean that my work is deemed less valuable?
– What % of AI is too much? For some, 20% means something has no worth; for others the threshold would be lower or higher.
– Who is in control of this system, and where is the pressure to publish information coming from? Public/social pressure? Or publishers’ rules?
– Is it going to judge me as an author if my editor goes through my work using loads of tools that I never used?
– What would be the accuracy or margin of error if this were a detection tool? How would it be tracked: by a human, a machine, or an AI?
– Labels were perceived as inherently reductive by one participant, who wanted to think of a more elegant solution.
– Are demographics important? Maybe a man is more likely to get away with using an AI tool without being perceived as less creative?
– Accessibility, diversity, and differences in abilities: one dyslexic participant used Grammarly all the time, while another writer said they didn’t see how any author could think about using it.
Ideas for new tools/platforms/services
Tools that show how original I am, and how different my writing is from a machine’s (a minimal sketch follows this list):
– Originality/cliché-ometer score: (comparing your word combinations to others within a database)
– Humanity score: how different is what I’ve written from what ChatGPT would have written?
– ChatGPT shows me what not to write: I can use it to make sure that nothing I write is as bad as an AI’s work.
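As a rough illustration of how a cliché-ometer or humanity score might be computed, the sketch below compares a draft’s word-pair combinations against a reference text, which could stand in for the phrase database or for ChatGPT’s output on the same prompt. The scoring method and all names are assumptions for the example, not a tested design.

```python
from collections import Counter

def bigrams(text: str) -> Counter:
    """Lowercase word-pair counts for a piece of text."""
    words = text.lower().split()
    return Counter(zip(words, words[1:]))

def cliche_score(draft: str, reference: str) -> float:
    """Fraction of the draft's word pairs that also occur in the
    reference (a phrase database, or machine output on the same
    prompt). 0.0 = fully original, 1.0 = fully shared."""
    draft_bg = bigrams(draft)
    ref_bg = bigrams(reference)
    if not draft_bg:
        return 0.0
    shared = sum(n for bg, n in draft_bg.items() if bg in ref_bg)
    return shared / sum(draft_bg.values())

# A 'humanity score' could then be 1 minus the overlap with
# machine-generated text on the same prompt.
draft = "the rain fell in silver threads over the sleeping city"
machine = "the rain fell softly over the quiet city at night"
print(f"overlap with machine draft: {cliche_score(draft, machine):.0%}")
```

A real tool would need a far larger reference corpus and a more robust similarity measure (stemming, longer n-grams, embeddings), but the principle of scoring a text against a database of existing word combinations is the same.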
‘Strava’ for writers:
Participants discussed the merits of a Strava-type app for writers, which would enable them to track multiple parameters and visualise them over time, getting data-driven insights from complex analyses of their work. Writers would be able to share this data with other authors to compare figures and strategies. Examples included (a minimal sketch follows these examples):
– today you were 13% less original than your normal daily average
– 85% of the work that you write after 5pm gets cut by your editor; maybe you should finish your working hours earlier
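In the same speculative spirit, here is a minimal sketch of how such a ‘Strava for writers’ might compute the first of these daily insights, assuming hypothetical per-session metrics such as an originality score; all names and numbers are invented for illustration.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Session:
    """One hypothetical writing session's tracked metrics."""
    date: str
    words_written: int
    originality: float  # e.g. 1 - overlap with a phrase database

def originality_vs_average(history: list[Session], today: Session) -> float:
    """Percentage difference between today's originality and the
    writer's running average -- the Strava-style daily insight."""
    baseline = mean(s.originality for s in history)
    return (today.originality - baseline) / baseline * 100

history = [Session("2024-03-01", 900, 0.72),
           Session("2024-03-02", 1100, 0.68),
           Session("2024-03-03", 800, 0.75)]
today = Session("2024-03-04", 950, 0.62)

delta = originality_vs_average(history, today)
print(f"today you were {abs(delta):.0f}% "
      f"{'less' if delta < 0 else 'more'} original than your daily average")
```

The 5pm insight would work the same way: log which passages an editor later cuts, group them by time of day, and report the cut rate per time band.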
A tangential idea was a smarter kind of reference list that shows an author’s sources, letting readers look at their inspiration and helping them keep track of ideas, without relying on bookmarks that eventually stop working.