How many students use artificial intelligence programs to write texts?

by Time.news

Time.news – The Romanian historian Carla Ionescu said a few days ago that she had overheard a conversation between two of her students who were exchanging advice on which artificial intelligence program worked best for writing their papers. Are AI programs really that popular among university students already? The English newspaper The Guardian took up the issue, reporting the experiences of several academics who have grappled with this kind of algorithmically generated writing.

Back in 2012, computer theorist Ben Goertzel proposed what he himself called a “robot university test”, arguing that we should take notice of the moment when “an artificial intelligence is able to obtain a degree in the same way as a human being”.

Computer scientist Nassim Dehouche responded with an article demonstrating that GPT-3, the language model created by the OpenAI research lab, could “produce credible academic writing undetectable by usual anti-plagiarism software.”

The debate is open. Last month, S. Scott Graham, an associate professor at the University of Texas at Austin, described in an article for the education publication Inside Higher Ed how he had encouraged students to use the technology for their assignments, with wildly mixed results.

The best students, he found, could meet the minimum requirements, but little more, while weaker students struggled: feeding the system effective prompts requires writing skills of a level high enough to make the AI itself unnecessary. He concluded: “I strongly suspect that full robotic writing will remain just around the corner.” In other words: we will never get there.

Aki Peritz, a private sector researcher, argued the exact opposite: “With a little practice, a student can use AI to write his or her paper in a fraction of the time.” Who is right? Or rather: to what extent does the intrusion of AI constitute cheating?

The Guardian observes: “Universities cannot limit themselves to tackling essays or assignments generated entirely by algorithms: they must also judge a myriad of subtler problems. AI-powered word processors, for example, routinely suggest alternatives to poorly worded sentences. But if the software can algorithmically rewrite a student’s sentence, why shouldn’t it do the same with a paragraph, and why not with an entire page?”

In this regard, Prof. Phillip Dawson of Deakin University says: “I think we’re actually going to teach students how to use these tools. I don’t think we will necessarily ban them.”

The English newspaper comments: “The occupations for which universities prepare students will, after all, soon also rely on artificial intelligence, with the humanities particularly affected. Take journalism, for example…”.

Indeed. A 2019 survey of 71 news outlets in 32 countries found that AI was already a “significant part of journalism,” employed in news gathering, news production (from automated fact-checkers to the algorithmic transformation of financial reports into articles) and news distribution (customizing websites, managing subscriptions, finding new audiences). The Guardian concludes by asking: “Why should journalism professors penalize students for using a technology that could be central to their future careers?”
