Rather than taking the form of a conventional narrative, "STET" is presented as a short section of a scientific paper analysing the principles by which self-driving cars make decisions. The paper is interspersed with suggestions to remove or change content which the journal editor finds inappropriate, to each of which the paper's author responds STET, a proofreader's term for "let it stand".
The story is built from three linked texts: the body of the scientific paper, its footnotes, and the editorial comments and responses from the "author". On first reading, the scientific paper seems impersonal and academic. But as we read the footnotes and comments we slowly learn that the author's child was killed in a road accident involving a self-driving car, and that the author blames the AI's programming, which prioritised the life of an endangered bird above that of a human child, for the death.
The stark contrast between the technical, emotionless text of the scientific paper and the increasingly emotional tone of the footnotes and comments is highly effective, highlighting the very real consequences of apparently objective decisions.