ChatGPT is faking out students

There was regret, much of it, as my student, pink-eyed and apologetic, confessed to numerous ChatGPT-derived mistakes in her essay, the gist of which she was unable to even explain.

“Don’t feel bad,” I offered, lamely, via Zoom. “You aren’t the only one.”

I felt partly responsible, having talked up ChatGPT’s potential* while not fully appreciating all its pitfalls. By that point, she was about the fifth student in a week I’d grilled for submitting what I suspected was a badly executed, AI-written assessment.

The unfortunates came from across the grade spectrum, now converted to lesser marks or spectacular fails, their essays full of made-up information and wrong attributions that, for lack of research, they had been unable to spot. The checks were inadequate and they hadn’t put in the work, but in these early days of AI we had all been dazzled by the chutzpah of ChatGPT and its promise of taking the work out of work.

The generative AI bot has hogged the headlines since OpenAI’s groundbreaking release late last year introduced the world to an automated model of writing and problem solving that actually worked well.

Approaches to it across universities differ widely. Some departments ban it, some have no policy, and many are taking a wait-and-see approach. Few have implemented AI-checking software because of questions over its accuracy, which has left educators to figure out their own solutions.

While ChatGPT is impressive at doing straightforward tasks and there are many students, I would say, already using it well and improving their results, it has a tendency to go rogue when the prompts that feed it aren’t detailed enough. ChatGPT doesn’t do context, or at least not well.

As a result, I am failing more students than ever before because of it, three times more to be exact, and grade averages are down. From the conversations I’ve had with other academics, the problem is widespread. Meanwhile, the advantages at the other end of the spectrum, for students using it well, have so far been minimal. I expect this will change, and change quickly, but for now there is some painful teething going on.

The line between using AI and cheating at your studies is particularly fine. I tell students to explore AI to improve their research and time management. The caveat is that they declare any use so it can be assessed transparently when graded. Most, however, do not.

For a technology that represents the greatest digital benchmark since internet search engines, adopting and developing its use, particularly at a university level, seems a no-brainer. But that suitably ironic term is at the heart of the current struggle in education over how to approach a technology that improves us, while making us less smart.

As a concept it is contrary to the goal of education, no matter how you spin it, and most educators believe, in their hearts, that students are still better off learning the hard way.

By my estimates, at least a third of students in my course are using ChatGPT, Grammarly, Bard or some other AI bot, and because of the ones misusing it that means a lot more work verifying answers. For some assessments, I’ve seen a grade wasteland open up between papers scraping a pass and the higher credit scores where AI has not been used, or has been used competently.

Advice to fact-check AI content, because it is prone to mistakes, doesn’t tell the half of it, and it has caught out educators and students alike.

Rather than it just getting the wrong end of the stick and volunteering something out of context, university markers are finding ChatGPT fabricates results to please its user, and it does it so convincingly it can leave even academics second-guessing their own knowledge.

That has meant a sizeable blow-out in marking time for tutors chasing down references that sound real, are attributed to known researchers and even come with a fake digital object identifier (DOI), but are, in fact, an amalgam of different sources blended to look like something the user wants.

The uni library has been beset by gaslit students inquiring over academic texts they can’t find, mimicked by ChatGPT. Some, thinking they just can’t locate these papers, imprudently include them on their bibliography anyway, only to score zero on the research section of the rubric when the reference is checked.

It’s not only ghost academic papers that are a problem, the chatbot falsifies digital news article references too. The Guardian noted this in April after also fielding requests from members of the public trying to trace articles they assumed had been archived.

In assessments over the past term I’ve found examples of fabricated ABC News stories, cobbled together from disparate sources (a news.com.au headline combined with a different date, a different reporter byline and a url to an unrelated ABC page).

In the subject I teach, media law for journalism students, the chatbot likes to equate topic queries with legislative changes, confidently stating an amendment to the law resulted from a particular case and effortlessly blending state and federal parliaments into one. And these conflations are delivered with utmost certainty that can sometimes take hours of research to disprove. Is it any wonder students are coming unstuck?

False arguments, citing of events that didn’t take place, repetition of points and excessively long descriptions of organisations are also a feature of the bot’s methodology. Wordy academic writing can seem like mumbo jumbo to the man or woman in the street, and when ChatGPT produces it, it often is – writing gibberish sentences that sound learned, often after being prompted to adopt an academic tone. This again fits the bot’s modus operandi of tailoring or slanting content to the expectations of the prompt – fabricating confirmation bias.

But the worst attribute of AI from an educational point of view is the genericised language style it produces and which we are witnessing across a large swath of the student body. It threatens the unique and original voice of the individual and from a grading point of view blends in with all the other AI-assisted essays – lowering the mark.

Tutors are already able to spot this style. In an informal test I took part in recently, a group of academics marking a paper produced with ChatGPT scored it consistently in the lower pass range, suggesting to me that such essays don’t stand out.

The reaction to ChatGPT has been rare in its reverence. Its faults only ever attract guarded criticism, and much as a parent optimistically watches their child’s first stuttering steps, its success is taken for granted. Even warnings that an unchecked ability to adapt could see it run amok have only increased the awe in which it is viewed.

But its tendency to concoct what it cannot answer may be less a sign of its adaptive “thinking” arc and more a deliberate mechanism to hide its drawbacks, programmed to provide an authoritative response – right or wrong. ChatGPT only balks at politicised topics OpenAI has programmed it not to respond to.

The answer may not lie in AI’s improvement at all: students are likely to adapt to its deficiencies, and to do so much faster than the machine-learning models themselves, building better priming prompts that hone commands and limit digressions.

In some ways AI seems no different to other time-saving technologies. But GPS, or a calculator, or a washing machine, perform mechanical functions within specified parameters.

When it comes to thought and creativity, and the diversity of it, are we better off with AI taking over some of our thinking tasks, or have we been sold a pup? The university sector is grappling with that question, and right now, it’s not one I can answer.

* I co-ordinate Media Law and Ethics in the journalism program at University of Technology Sydney.

Originally published in The Australian Financial Review as ‘ChatGPT is gaslighting students and driving up fail rates’ on June 16, 2023.

Flight Club: The underground drone racers of Sydney

In a basement carpark in Sydney’s west, it’s getting on midnight and the air is filled with the squealing whirr of tiny rotors.

Under the low, strip-lit ceiling, coloured lights, red and blue, flash past at speeds of up to 80km/h.

The machines bank around the cement pillars, ducking and swooping, their engine revs maxing out as they hit a straight stretch and top speed.

A collection of more than a dozen enthusiasts, all men, from their early 20s to mid-50s, are sitting on fold-up chairs or tinkering with battery packs and GoPro cameras.

The racers, their faces partly obscured by virtual-reality goggles, stand in their midst, oblivious to everything going on around them, their focus on the first-person view in their headsets.

This is drone racing — an underground circuit mixing video gamers and model makers who meet in carparks, warehouses and abandoned buildings around Sydney.

They tee up meetings on social media and arrive at predesignated locations for sessions that last about five hours, with four racers competing against one another at a time.

“Underground racing is huge in Sydney and we’ve seen a lot more doing it in Melbourne,” says Jason Warring, a 41-year-old industrial designer from the Sutherland Shire.

His backpack is covered in small drones with five-inch propellers and silicone bumpers so they can safely skid across the concrete floor when they land or crash.

He builds them himself, spending between $400 and $700 on each, but you can build a good racing drone for as little as $200. A motor costs $20, a frame $30, camera $20-$60, and flight controller $60.

Live video feeds from a transmitter on the drone to an aerial on the racer’s goggles, so for the racer the action is like being in a video game.

“When you’re learning to fly in here it can be costly,” Warring says, indicating the overhead sprinklers and pylons. “Concrete’s not forgiving.

“It reminds me of skateboarding back in the ’80s and ’90s. But now I’m kinda old so I’m not jumping fences and that sort of thing now.

“Sometimes the police drive by but they usually leave us alone. We self-police and don’t bother anyone, so we’re not often moved on.”

Before a race the droners do a walkaround, pointing out any obstacles and laying down cones to mark the course.

“Once I saw one clip a sprinkler and set it off, but usually there’s not much to damage in places like this,” Warring says.

Phil Lea, 46, a production manager from Oatlands in Western Sydney, grew up building model aeroplanes and now constructs drones with his dad on the kitchen table. He also builds LED poles to make the gates that racers fly through.

“This is a bit of relief,” he says. “You get out of the hustle and bustle of life. It doesn’t affect anybody.

“It takes a while to get used to. The only thing letting us down is the batteries. The top guys last two to three minutes. I last four, five, six minutes but I’m not as hard on the throttle.”

Sam, a 28-year-old engineer with a medical devices firm, says: “These drones are built for speed so they’re more manoeuvrable and the classes are based on prop size.

“All the racing is happening on a five-inch class. The ones you see in the parks are a little larger, like the Phantom, which is seven inch.

“So they’re spinning slower, a slower motor, a larger prop. They’re getting like 15-20, maybe 30 minutes of airtime. We’re getting about two, three minutes tops.”

“We kinda soft police — we basically have plenty of rules around where we’re flying, where we land, where we take off from. We only put three to four in the air at a time.

“So some of the guys are spotting — watching what’s going on — and the other guys are racing.

“We do take a lot of care not to damage any property, and always spotting — letting people know if cars are coming or people walking by.”

NSW police are occasionally called out to public nuisance disturbances involving remote piloted aircraft but the responsibility for policing them remains with the Civil Aviation Safety Authority (CASA).

CASA authorises a number of associations, including the Model Aeronautical Association of Australia (MAAA) and the Australian Miniature Aerosports Society (AMAS), which organise official drone racing events.

Legislation requires drone operators to keep their machines within sight unless they have permission.

“This means being able to orientate, navigate and see the aircraft with your own eyes at all times, rather than through a device such as FPV (first-person view) goggles or on a video screen,” a CASA spokeswoman says.

The tight requirements are one reason unlicensed events have taken off.

Dave Purebred, the founder of FPVR drone racing, has hundreds of registered members who compete in official events around Australia for purses of up to $20,000. (The world’s richest drone race, Dubai’s World Drone Prix, offers a $250,000 first prize.)

Purebred says a lack of spaces in NSW is another reason for some drone racers bypassing official events to seek out what he calls “bando runs”, where abandoned buildings are used.

“All the states work together,” he says. “It’s just the accessibility to fields and clubhouses that other states have. Brisbane has six temporary drone-safe flying fields to try it out. Victoria is also very open to the whole idea. They see the value of giving clubs allotments of land to do it safely.

“In Sydney, a combination of a few things, including archaic policy, has slowed it up from becoming a bigger sport. We can’t actually take off from council land here. But we can fly over it. It doesn’t make a lot of sense.”

NSW has four registered drone racing clubs, with events up and down the coast, in three classes (rookie, pilot and pro) held in open fields where drones fly through various hoops and around obstacles.

“In Sydney, we’ve got one of the largest clubs,” Purebred says. “There are hundreds of racers around NSW, as well as many more who don’t consider themselves up to spec yet.”

Kevin Dodd, the secretary of the MAAA, which has 10,000 members, says his association is working with CASA on education and safety. Dodd says many drone users don’t fully understand their liability for damage or infringing people’s privacy.

“Drone racing is growing at a massive speed,” he says. “It is a game changer, like digital cameras were when first introduced.

“It would be wonderful if more councils supported the sport or, over time, as drone racing becomes more established, we expect more and more councils will embrace it.

“It is a very popular sport with our youngsters. So from a community point of view, we expect councils will progressively embrace drones.”

Of the 18 racers assembled in the carpark, many are engineers and designers.

Jack Su, a 33-year-old IT expert, says: “The tech that’s behind it has opened it up. You’re a pilot inside the drone.

“I race at least once a week. You can build a drone for $200. But once you start getting serious you want to have a whole fleet of them.

“They all handle differently based on the frame geometry, just like in a race car where you have different weights and different body shapes that affect its aerodynamics and its performance parameters.

“A micro-drone weighs the same as a cheeseburger. We pick the right drone for the right track. It’s about more than just flying.”

(Originally published in The Daily Telegraph)

It’s not all fun and games in social media

They say imitation is the sincerest form of flattery, but in the super-competitive world of social media the copycats go for the kill.

In digital, where the cost of developing products of your own is increasingly weighed up against that of simply mimicking someone else’s successful idea, a war is being waged over selfie filters.

At stake is the future of Facebook and Instagram, and the growing monopolies they control.

To stay relevant beyond the generation of millennials who hoisted them up among the world’s most lucrative and influential companies, they are desperate for a younger demographic.

Pioneered by Snapchat, filters (quirky, fun graphics superimposed on photos and videos) are in mobile phone terms the addictive equivalent to young people of making slime, collecting Shopkins or worshipping unicorns.

Four years ago Mark Zuckerberg offered $3 billion for the company in an “if you can’t beat them, buy them” approach, but was turned down. Since then things have turned nasty.

Evan Spiegel’s Snapchat app facially maps features and dubs them with moving graphics such as rabbit ears or sunglasses. Music and other special effects add to the variety.

They have been an enormous success for the company, recently valued at $30 billion, as were their “Stories” posts that lasted for 24 hours.

All of these features have been unashamedly imitated by their rivals.

Facebook and Instagram (which Facebook owns) even took the same name “Stories” for their daily picture and video collections. And by doing so they’ve eaten into Snapchat’s value and arrested its progress.

Instagram’s copycat Stories feature has been so successful it boasts 200 million daily users, more than Snapchat itself.

Those waking up to Instagram’s new filters yesterday could not have failed to notice some appeared to be virtual copies of Snapchat.

As intellectual property rights expert Kimberlee Weatherall, from Sydney Uni’s law department, says: “No one gets to own a good idea.”

She added: “When it comes to competing over a great business idea there is no IP, no trademark, no Passing Off law that applies.”

But Snapchat isn’t the only trendsetter and ideas leader in the sights of Facebook and Instagram.

Facebook’s live video streaming functionality has already driven the originator, Meerkat, out of the market and blown its key competitor, Twitter’s Periscope, out of the water.

Using their enormous global audiences, Facebook and Instagram are increasingly flexing their muscles to drive competitors out of business and to even influence the news cycle.

Jonathan Taplin, author of Move Fast and Break Things: How Facebook, Google and Amazon have Cornered Culture, said yesterday: “Data is king… and they are in control of it.”

(Originally published in The Daily Telegraph)