AI-driven cheating “widespread” even at elite schools like Princeton

Not thrilled about how this situation will affect the perception of higher education, especially when the "AI" bubble is finally allowed to burst and we all end up having to foot the bill for all this irresponsible nonsense. Turning out a generation (or more) whose Ivy League degrees signify only that the degree holder knows how to prompt seems bad.
 
Upvote
70 (72 / -2)

Rector

Ars Tribunus Militum
1,570
Subscriptor++
Since generative AIs are the future, I don't see it as cheating in just a couple years or less. I'm not sure how places of learning will deal with it other than embracing AI going forwards.
You understand that "embracing AI" in colleges means the end of learning, right?
There will be zero value in a college degree if it can be obtained by using AI to complete all of the coursework.
 
Upvote
145 (149 / -4)

Jackattak

Ars Tribunus Angusticlavius
6,982
Subscriptor++
Since generative AIs are the future, I don't see it as cheating in just a couple years or less. I'm not sure how places of learning will deal with it other than embracing AI going forwards.
ffs I'm glad you're not in a leadership position (you're not, are you???)
 
Upvote
79 (83 / -4)

forkspoon

Ars Scholae Palatinae
1,047
Subscriptor++
Hopefully a good deal of cheating will not only be seen by profs but also result in fewer reference letters for cheaters. Or perhaps even in letters that share profs' cheating concerns.

Good grades are one thing, but you won’t get into grad school without faculty support.

Anyway, it’s a huge mess regardless. And that honour code? Sadly it’s breaking down outside of Princeton too.
 
Upvote
13 (13 / 0)

Sarty

Ars Tribunus Angusticlavius
7,932
Many reports that do arrive to the Honor Committee are now anonymous because of another technological development of longer standing—social media—which has reportedly deterred students from reporting openly out of apprehension of doxxing or shaming among their peer groups.
We're in a bleak place when the person reporting cheaters worries about peer-group shaming, not the cheaters themselves.

Fuckin' thing sucks.
 
Upvote
90 (91 / -1)

Sarty

Ars Tribunus Angusticlavius
7,932
Hopefully a good deal of cheating will not only be seen by profs, but result in fewer reference letters for cheaters. Or perhaps even letters that share profs’ cheating concerns.
Reference letters that increasingly are read only by LLMs, if not also written by them :|
 
Upvote
41 (41 / 0)

dmsilev

Ars Tribunus Angusticlavius
7,352
Subscriptor
Stopping this sort of thing is a hard problem. One solution, which unfortunately doesn't scale very well, is to go truly old school and give oral exams. There's definitely a resurgence in hand-written exams as well; good time to own stock in the companies that sell blank blue books....
 
Upvote
33 (33 / 0)

CardinalChunder

Smack-Fu Master, in training
17
Subscriptor
People who bring up the old “they said that about calculators and everything was fine” trope know nothing about education.

Most education is practicing reasoning about a set of knowledge (facts, processes, procedures, etc…). It’s graded on whether students can demonstrate this.

LLMs completely destroy the evidence of reasoning used for grading.
 
Upvote
68 (72 / -4)

KrookedRooster

Ars Praetorian
510
Subscriptor
I dreaded having to do this rote-memorization type of stuff.

So my Prof had an open-book open-class test strategy in some cases. He was all about teamwork and working together because that's what you would be doing in a real job. So, we had the whole block to take the exam (again, open book, working together) but it was pass/fail. Anything wrong and we all lost out.

So, do you do it by yourself and get graded against each other, maybe passing but not with 100%? Or do you all go in together, lighten the load so that everyone passes, and hash out the truth as a group?
Then do a post-op on the results, of course.

This was 2007, of course. iPhones were just becoming the new hotness, and the internet was available on Windows CE, Palm Pilots, and BlackBerrys, so cheating was a bit more laborious.

But I do see other Arsians asking (mostly sarcastically) whether, in today's world, cheating or doing just the bare minimum is even considered "wrong" anymore. As long as it gets done on time, on or under budget, and with minimal resources, good job!

Unless it's "tradition" to do things some way, in which case that just keeps on keeping on.

I do like to take pride in my work but maybe I'm an outlier.
 
Upvote
15 (16 / -1)

Hoptimist

Ars Scholae Palatinae
719
Subscriptor++
Now I’m getting downvoted too, but AI isn’t going away, no matter your downvotes! Ahahahahaha
To use a tool well, it's best to understand the underlying physical processes. The obvious example is 3D physics simulations: anyone can input a geometry and run the program, but the point of learning physics is to understand the inputs and outputs, how to test them, and how to recognize a bad output, as well as what level of complexity (or context) is necessary to get a useful result.

If you went to Princeton to learn prompt engineering, you really wasted your money and your time. Nobody is saying LLMs are always bad; they are saying it is a tool best used if you have already mastered the subject matter, not as a substitute.
 
Upvote
43 (43 / 0)

Fatesrider

Ars Legatus Legionis
25,272
Subscriptor
The cat is out of the bag… small models are already smart enough to be useful in this context. The tech is here to stay
You mistake longevity for ubiquity.

It doesn't matter how smart the models are or how popular they are among those who lack the stuff to get by without them. What matters is if they can make money for the companies offering those models to customers.

They haven't passed that test so far, and there's no cheating your way around it once the VC cash runs out, the cash that keeps this spinning-plate bullshit from crashing to the ground for lack of enough paying customers to keep the plates spinning. OpenAI is still predicted to go bankrupt by summer next year. And that's now an optimistic prediction.

The small models won't survive if there's no company running them, since a good share of them rely on the larger companies for updates and even token processing.

It won't die off, at least not entirely. But what I expect it to do is lose most of the toxic shit, and leave behind the comparatively small amount of useful shit it can do.
 
Upvote
20 (23 / -3)

Sarty

Ars Tribunus Angusticlavius
7,932
Maybe off-topic but serious question: Why does education have to be competitive anyway?
Competition is broadly equivalent to evaluation. If you don't sort by competence in some way, you're going to be sending a shitload of chaff to Princeton, and you're going to have no idea whether somebody coming out the other side is worth a bucket of warm spit.

Just the sort of person I want designing the car I drive or the bridge I drive it over!
 
Upvote
2 (5 / -3)

enilc

Ars Praefectus
3,890
Subscriptor++
The article, and many of the comments, seem to give the impression that students are cheating "because AI."

Students who cheat are cheaters. It’s their personality. They choose to cheat. Tools to cheat, in many forms have always been available to cheaters.

None of them have to cheat. They choose to cheat. Whether it’s easy to cheat or difficult to cheat is not relevant. They are people who are prone to cheat.

Maybe the value of those diplomas should be cheapened accordingly.
 
Upvote
1 (9 / -8)
I teach at a university, and so far my only answer to this problem is to not accept any demonstration of knowledge that I cannot certify myself: oral examinations, or exams with smartphones face down on the desk and lab PCs disconnected from the internet.

But AI, in my opinion, has a second very problematic effect: many students have trouble assessing their own knowledge, and AI gives them a false sense of understanding the topics. Even well-meaning students think they understand something because they saw the AI solve it for them, until they face an examination and learn they don't. It is the exact same problem as before, when they copied from or passively received lessons from a more knowledgeable student, only now this "genius" student is available to them 24/7.
 
Upvote
60 (60 / 0)

enilc

Ars Praefectus
3,890
Subscriptor++
Honor codes are part of this problem. My school didn't have an honor code, but my understanding is that many honor codes discourage turning cheaters in.
Find one. Show me an accredited school’s honor code that discourages turning in cheaters. Just one. I’ll wait.

Just spouting garbage in the forum doesn’t make it true. Based on your comment, I 100% believe you went to a school without an honor code.
 
Upvote
27 (27 / 0)

ConflictedDude

Ars Centurion
288
Subscriptor
The end of degrees as credentials of learning, maybe. Nobody is obliged to stop learning.
Had a prof in college, his mantra was 'you go to college to learn how to learn'. I have to admit, I've forgotten a lot of what I learned in college, but I still know how to learn.
 
Upvote
50 (50 / 0)

Ipuxi

Ars Centurion
215
Subscriptor++
You are getting downvoted but I don’t think you are necessarily wrong.

Folks on these forums seem to think that we’re going to go back to a time before this existed. That isn’t going to happen, whether we like it or not.
The worms won't go back into the can again. Doesn't mean that decreeing the superiority of free range worms is the appropriate response.
 
Upvote
23 (25 / -2)