
If you autocomplete your work you’ll never improve

I've always imagined that I have a natural talent for drawing, despite the fact that whenever I actually try to draw something I end up producing an 'artwork' that looks like the wall-scrawling of an over-enthusiastic toddler. And not the kind you can fob off as 'naïve[1]'. The kind you hurriedly crumple into a ball and throw in the bin before anyone else sees it, as if your brain just pooped itself all over the sheet.

So the news that Google has developed a tool called AutoDraw[2], which turns your hideous doodles into perfectly decent ones, should have pleased me: I am, after all, the target demographic. But after playing around with it for a while, the transformation of my malformed cats and half-arsed houses into slick clipart, which at first seemed almost magical, struck me as a sinister form of doodle totalitarianism. What was so awful about my dog-that-looked-a-bit-like-a-fox-gone-wrong? And so what if my doodle car had a boot twice as long as the bonnet? Maybe I wanted it like that. Maybe it was a special mafia car that allowed gangsters to transport several people to a whacking site at once. Suck on that, AutoBore!

What did I learn from all this? That even the crappiest doodle often has a certain charm, offering a glimpse of how the artist sees the world. Yes, it might be a world populated by people with blobby heads and dashes for fingers; but if they're playing the trombone or standing triumphantly atop a mountain of stick person corpses, that tells you something about the person who drew them (in the case of the latter image, that they're probably serving a double life sentence in a maximum security prison).

That's because our artistic style is as unique as our signature — and there's nothing we humans love more than uniqueness. It's why we've made such a big fuss about gold for thousands of years. Deep down we all know it's nothing but fancy tin, but the yellow colouring makes it seem just that little bit more special.

AutoDraw, on the other hand, promises to make everyone's drawings look competent but similar. It's based on the same technology that underpins Google's QuickDraw[3] program, a game where you have to draw something and hope the AI recognises it within 20 seconds (if you get it wrong the AI shoots a laser out of your webcam and erases all your memories, according to a piece I read on a website called 'Totally Non-Fake News About AI'). And while AutoDraw is undeniably impressive in technological terms, it's also symptomatic of a very modern malaise: the desire for instant results.

These days we don't even have to know how to spell the name of the thingy we want to Google; we can just splurge a rough approximation into the search bar and let Google figure out what we really meant to write — and then get frustrated when it doesn't instantly translate our caveman-esque 'Michal Fasbengar film watch' text grunt into showing times for Michael Fassbender's latest movie.

Likewise, if you want to be a singer you no longer need to be able to hold a note: just warble a rough approximation and auto-tune will take care of the rest. But with the exception of Cher's 'Believe[4]', the first mainstream hit to make use of auto-tuned vocals (long before it became the standard means of hiding a promising pop drone's talent deficiency), most auto-tuned vocals leave no lasting impression: they're as formulaic and bland as… well, baby formula, slipping through popular culture's digestive system with minimal fuss and effort.

And that's the real problem with technical perfection: it's seriously boring. Often it's the flaws in a piece of art or a singer's voice that make it interesting and original, not how closely it adheres to some preconceived notion of the 'standard' way of doing something. In that sense AutoDraw is like a parent who looks at their child's drawing and says smugly 'yeah, I see what you were trying to do there… here, let me' and then changes everything to make it 'better', thereby destroying the child's belief that their own work has any value.

Similarly, the writing app Hemingway[5], while useful as a rudimentary second pair of eyes for your work, will often tell you that a sentence is 'poor' simply because it doesn't fit the tool's narrow parameters for 'good' writing. But a writer's stylistic quirks and grammatical rule-bending are an intrinsic part of their voice — the very thing we often love most about their work. Hemingway is incapable of understanding that because it has no taste, no real discernment, no feeling for talent. It's simply an algorithm that judges your prose against a set of predetermined criteria and slaps a rudimentary 'score' on it, like a pretentious spellchecker.

Writers who rely solely on such tools instead of learning the intricacies of their craft are unlikely to develop the ability to tell a bad sentence from an ingenious one, or acquire their own innate sense of what works and what doesn't. After all, one person's 'Pah! My two-year-old could do better than that' is another person's 'Oh my god that is so beautiful I feel like my soul is going to explode.' Machines don't respond in the same way. Yet it's not hard to foresee a time when AI will also suggest better lines for your poem or characters for your book, which raises the question: at what point does the artwork belong more to the machine than the human? And will we start to lose faith in our creative abilities?

Augmenting our creative skills with AI also risks eroding them: whenever we turn to machines to help us with our work, we're not stretching our creative muscles as much as we could — and stretching those muscles is the only way to achieve (and retain) true mastery of something.

Nicholas Carr explores this idea in his fascinating book The Glass Cage[6], which looks at how automation can quickly erode vital skills, leaving us stranded and useless when the machines malfunction (as they generally do at some point). He uses autopilot systems as an example of this phenomenon. Almost every aspect of a flight, bar take-off and landing, is now controlled by a computer, which, Carr explains, has reduced many pilots' ability to act calmly and properly in an emergency situation. In May 2009, for example, the pilots of an Air France flight suffered a "total loss of cognitive control" when the autopilot failed, and ended up plunging the plane into the Atlantic, killing everyone on board — a tragic case of what can happen when we place too much faith in machines.

Carr also draws on studies that underscore how much our sense of happiness and fulfilment derives from performing skilled work. That applies to creative work as much as anything else: anyone who's ever tried to write, paint or play an instrument will know the thrill of writing a great sentence or capturing someone's smile after many hours of practice; it's one of the most satisfying feelings a human being can experience. And even if you subsequently mess the eyes up or leave a great big dangling modifier at the end of the third paragraph, at least the finished piece is all your own work — something you can learn from and improve. Because once you start relying on AI and algorithms to fix all your mistakes, you're ultimately enhancing them more than your own abilities.

References

  1. naïve (en.wikipedia.org)
  2. AutoDraw (www.autodraw.com)
  3. QuickDraw (quickdraw.withgoogle.com)
  4. Believe (www.youtube.com)
  5. Hemingway (www.hemingwayapp.com)
  6. The Glass Cage (www.amazon.co.uk)