bruise.in | stop tech genocide | gg[at]bruise.in

RSS served here.

Older posts archived here.

4-Dec-25

I've been AFK. Tech work and a longer-term creative tech gig have been keeping my computer days to a minimum. I'm no longer working on the PhD but will continue to explore what I had set out to do in my own artistic practice and research. To this end I keep reading. Right now I'm part way through The Necessity of Art by Ernst Fischer. There's a line early on: "In a decaying society, art, if it is truthful, must also reflect decay. And unless it wants to break faith with its social function, art must show the world as changeable. And help to change it."

1-Jul-25

ACE (Arts Council England) recently published a report on AI use, Responsible AI at Arts Council England: artscouncil.org.uk/research-and-data/responsible-ai-arts-council-england.

This arrives in the context of more papers from the tool manufacturers showing that AI is not useful in the ways they claimed it would be. Microsoft actually have to force their staff to use their tools. As others have said, this was also the case with the metaverse. https://www.businessinsider.com/microsoft-internal-memo-using-ai-no-longer-optional-github-copilot-2025-6

The report is a document detailing their steps so far and plans for the next few months. What stood out was the seeming lack of space for AI use to be refused. It seems the purpose was really to ensure it continues to be used, but does so in a way that is somewhat known and controlled.

To this end a toolkit has been published as well. This consists of a checklist of points that should be considered before integrating AI within a project. Unfortunately, many of these points are, in my opinion, either too loose to gauge whether the AI use has met the required actions or simply impossible within the AI tools available to the majority of people (those working at the Arts Council, and the wider members of the arts industry relying on ACE funding).

A couple of examples:

  • "Understand what the tools you are using can do with your data"
  • - the terms of usage for these tools are changing continuously. For the organisational accounts discussed in the report this is presumably somewhat possible, but when considering wider use by the artists/recipients of ACE funding, this is much fuzzier.

  • "Put checks in place to test bias at all stages of the systems and processes you develop"
  • - Surely this requires a full understanding of the training data, labelling methods, etc. of the tools, as well as access to parts of the process that are simply black-boxed in most general tools.

  • "Undertake basic AI literacy training to better understand how generative AI can help and the risks it poses
    - Liaise with colleagues or peers who have experience of delivering AI projects
    - Allow time to learn and iterate throughout the project
    - Seek specialist advice where required (e.g. from an AI consultant)"
  • - Yes, let's just throw more money at all the consultants that can legitimise our continued use of these fetid tools.

  • "Use techniques to increase the relevance and accuracy of outputs (e.g. learn how to write effective prompts)"
  • - The tool probably doesn't work as expected - but don't worry, you can just write better prompts. There is such a desire to change processes and conform to these systems. The tools should be shaped to the hand.

This feels similar to the kind of reports that were published during the NFT/blockchain hype cycle: faulty argumentation to legitimise the continued use of the technology. It's depressing enough to see it fill so much of our lives, but to see 'artists' embrace it, ignoring all the costs, in the hope of staying afloat in the casino system of funding is grim. As a technician there is little more soul-destroying than installing a piece of work that was prompted into existence.

This is a footnote on page 8, the only footnote in the report: "Some of those we asked to join this group were reluctant: they either didn’t feel they had the skills or knowledge to contribute or felt strongly that AI was bad for people and planet. We did a lot of work engaging with these individuals to listen to their concerns, and to empower them to realise their perspective was a valuable one that could help to shape the organisation’s response to these technologies." Ultimately the concerns of AI being bad for people and planet are not addressed at all in this report.

AI is bad for people and planet.

AI is bad for people and planet.

AI is bad for people and planet.

AI is bad for people and planet.

The social issues are alluded to in the toolkit, in the sections on bias and data security. This is all at the point of use, though. It doesn't reflect on or contend with the way these tools have been built and the wide-ranging issues to be found there. There is, glaringly, nothing about the environmental impact of these tools. This goes against the report's claims to foreground environmental responsibility. The obvious answer is that to do so would make the whole thing even more explicitly farcical. This builds on the relatively low bar of action expected when engaging with ACE through guidance from organisations such as Julie's Bicycle, much of which seems to revolve around tools for measuring impact (which hopefully guide towards lessening it). Where is the outrage from JB over the reports into the impacts of data centre usage?

Tellingly, Microsoft Copilot is the only ACE-approved tool, and it is already provided as part of staff access to Microsoft. The argument here is that its use is covered by the terms of the organisational account and will not be used for training data. This does imply that they are paying for a contract with Microsoft, though: an organisation deserving of boycott attempts, as its profiteering from the genocide has been clearly documented. Does the fact that only one tool meets their criteria not give pause to the positivity of the report? https://www.aljazeera.com/news/2025/7/1/un-report-lists-companies-complicit-in-israels-genocide-who-are-they

https://www.ohchr.org/en/documents/country-reports/ahrc5923-economy-occupation-economy-genocide-report-special-rapporteur

The report is positioned to clarify use within ACE, but it also serves as guidance, and can be taken as an example of acceptable use, for grant recipients. Encouraging the use of generative AI tools encourages more individualist thinking and further precarity for jobs in the arts. Why hire sign language interpreters, designers, writers, etc. when the lead artist can just prompt it all? I hope those still attempting to make work are disgusted by this.

The report ends: "Arts Council England, then, is not asking: how should we use AI? but why would we use AI? The answer to that question provides the foundation for responsibly engaging with these fast-evolving and wide-reaching technologies." This phrasing again implies that use will continue, with some final breathless tech hype: fast-evolving, wide-reaching. The answer to that question should instead raze the foundation imposed by big tech and cheered on by all those who work against commonality.

Things I'd suggest reading: The AI Con by Emily M. Bender and Alex Hanna; Resisting AI by Dan McQuillan; Why We Fear AI by Hagen Blix and Ingeborg Glimmer; Predictive Capital by James Tindall. For kicks, Computer Power and Human Reason by Joseph Weizenbaum.

19-Jun-25

Some more thoughts on the updated website. Websites, for me, are an important medium for creative exploration/use. Most artist workshops I've facilitated have been around building simple sites. They are an entry point to the necessary infrastructural questioning, reorganising and imagining I think is needed. I speak the Luddite oaths.

I am mostly AFK these days - but still thumbing screens. To that end I'm working with processes a bit kinder and more open to whatever playfulness or will to write I can muster from this setting.

It is apparent this means fewer updates, rougher edges and more of this ever-present drift to text. Writing for myself. Writing from awful times.

Small packets of data. Bare markup.

A place for expression.
A place to seek employment.

(To jest the fear)
(To admit the fear)

A conversation with a friend about a recent assessment interview and the insistence that he should adopt a range of AI solutions to minimise the need for human support. Everything aimed at removing his autonomy from every possible interaction. They wish him sealed away, experiencing only a stream of hallucinated entertainment and reports of his machine's interactions with the world outside. A complete narrowing of experience to a single key press (to continue). But he fights this. We talk about others who can't. How we must articulate this all better, more urgently.

I wonder if the resistance to refusing AI (especially by artists) is partly tied up with the understanding that to refuse this really entails a greater refusal: the questioning of tech practices that are less persuasive about being the next big thing but more mundane and embedded. And what happens to the critiques when the next thing takes up the hype (quantum? another slew of AR?), each furthering fascist ends in almost exactly the same ways?

Another month without the radio stream. I've missed listening to VLF and scanning shortwave. But I have been finding and recording sounds. Another thing this awful box does so well. Listening.

6-Jun-25

I've updated my website. Removed CSS but introduced some more images. It is UNDER CONSTRUCTION. Permanently. The old version is mostly still retrievable on archive.org, and I will get more of it assembled here at some point.

Despite this update mostly removing things and disregarding most affordances of web design, it has still taken many months of stalled attempts. I expect the most frequently updated page will continue to be the reading list, but I'm eager to update the luddite bibliography as well.

This week I received Lori Emerson's Other Networks book. It is beautiful and very much recommended for anyone who is likely to have landed on this log. Full of interesting details about historical and recent networks.

Christopher Samuel's new exhibition Watch Us Lead opened in Birmingham. The exhibition highlights the experiences of disabled people of colour in the city, collecting together new drawings, audio and a stained glass window that Christopher has made, along with photographs and items from the museum collection.

Tom K Kemp's Homunculus continues at Two Queens. I need to visit again before the exhibition closes at the end of the month to catch the full film.



~gg 2025