The Ghost in the Machine Shouldn’t Be AI

Once upon a time in the 18th century, a fantastic chess-playing machine known as the Mechanical Turk was exhibited around the world, stunning audiences with its ability to beat skilled players and heads of state like Napoleon Bonaparte. Years later it transpired that the machine’s extraordinary feats were only possible because a human was hiding inside the machine, making all the moves.

Today, a similar phenomenon goes on behind the scenes in the development of artificial intelligence: Humans label much of the data used to train AI models, and they often babysit those models in the wild too, meaning our modern-day machinery isn’t as fully automated as we think. Yet now comes a twist in the tale: AI systems can produce content that is so humanlike, some of those behind-the-scenes humans are training new AI with old AI.

AI models are often described as black boxes, so what happens when one black box teaches another? The new system becomes even harder to scrutinize, and any biases in those systems become more entrenched.

A new study from academics at Switzerland’s EPFL suggested that workers on Amazon.com Inc.’s MTurk — a crowdsourcing job platform named after the original Mechanical Turk — have started using ChatGPT and other large language models to automate their work. The researchers estimated that 33% to 46% of them were using AI tools when carrying out their tasks.

Normally, companies and academics hire MTurk workers because of their ability to do things that computers cannot, like label an image, rate an ad or answer survey questions. Their work is often used to train algorithms to do things like recognize photos or read receipts.

Nearly all tasks on MTurk pay tiny amounts. West Virginia-based Sherry Stanley, who was an MTurk worker for more than seven years up until recently, said she’d seen requesters offer to pay just 50 cents for three paragraphs of written work. Turkers can raise their hourly takings from $3 to around $30 if they use specialized software to speed up their tasks.