Editorial: Don’t let AI rule the world


The rise of AI-powered language models like OpenAI’s GPT-3 has revolutionized the way we access information and communicate with each other. However, the ease this technology affords has also created new challenges for the education system. In particular, there are growing concerns about college students using language models like GPT-3 to cheat in their classes.

One of the main dangers of using language models like GPT-3 to cheat is that it can undermine the integrity of the education system.

When students rely on AI to complete their assignments or write their papers, they are essentially taking shortcuts that allow them to bypass the hard work and effort that is required to truly learn and understand the material.

This creates a situation where students are able to pass their classes without actually mastering the subject matter, which is not only unfair to the other students in the class, but also undermines the value of a college degree.

Another danger of using AI to cheat is that it can lead to a lack of critical thinking skills. When students rely on AI to do the work for them, they don’t have to put in the time and effort to understand the material and develop their own thoughts and ideas.

This can lead to a situation where students are unable to think critically or creatively, which is a critical component of success in many careers.

Finally, the use of AI to cheat can also result in a loss of trust in the education system.

If students and teachers become aware that some students are using AI to cheat, it can erode the trust that is necessary for a successful education system. This can result in a loss of respect for the institution and its faculty, which can have a negative impact on the reputation of the school and the value of a college degree.

While AI-powered language models like GPT-3 can be incredibly useful tools, they can also be dangerous when used for cheating in college classes.

It’s up to educators, students and institutions to work together to prevent cheating and ensure that the education system remains fair, rigorous and trustworthy. By doing so, we can preserve the value of a college degree and ensure that future generations are equipped with the critical thinking skills they need to succeed in the rapidly changing world of tomorrow.

Has anything felt off to you about this editorial? Does it feel a little too straightforward or clunky? Does it maybe lack the flair that you’ve come to expect from previous Brown and White editorials? Maybe it feels like someone else — something else — wrote it?

If you think so, you’d be correct. 

The preceding paragraphs weren’t written by any member of the editorial board, or any human for that matter. They were created by feeding writing prompts to ChatGPT in an attempt to produce a convincing editorial piece.

Through this process, the AI created rhetorical gems such as: “while the siren song of quick and easy answers from chatbots like GPT-3 may be tempting, it’s a road you simply cannot afford to go down,” and “cheating is, well, cheating!” 

After reading the AI editorial, we think (and hope) you’ll agree that it isn’t quite ready to replace us just yet, but that shouldn’t be the only reason not to use it.

Using ChatGPT to write your English essay or lab report — while you sit back and binge-read our website — might result in you getting a B- when you could’ve gotten an A- on your own. However, even if the AI were advanced enough to write an A paper every time, something gets lost in translation when a human is not involved in the writing process. Not to mention, it looks eerily similar to the no-no called academic dishonesty.

Ironically, the software has already explained with analytical precision all the reasons why using it to cheat is not a logical choice, but the ethical and artistic barriers against use should be an even stronger deterrent. 

In the classroom and beyond, having the ability to use this software doesn’t mean you should. Whether it’s an image, an essay or an article, using AI to generate it devalues true human creation. Wanton use of ChatGPT is an irresponsible display of our power as consumers.

Moreover, every piece of AI-generated content that has gained traction in recent weeks, from portraits to think pieces, feels lifeless in some way.

There is an almost imperceptible quality to a piece when a person puts their heart into it: a raw outpouring of emotion that seems to carry a weight separate from the words and facts conveyed on the page.

A world of AI art and AI culture is a world of plastic smiles and lifeless eyes. The most eloquent poetry or fluent dissertation means nothing without a conscious life to experience its creation. Every typo is a subtle reminder that we’re human.
