Most Western countries have somehow decided, over the last couple of decades, that small negative actions should mostly be free of negative consequences.
You can cheat on tests or shoplift from stores, and pretty much nothing will happen to you.
When teachers can't give failing grades or kick students out of class for blatantly breaking the rules, this is what happens.
Meanwhile, I took a language exam in Japan last weekend where a number of people were kicked out of the room, an instant fail, for using their phones during the break when that was expressly disallowed (we had to put them in sealed envelopes we couldn't open until the exam was over, break included). Given reports I've heard, I suspect a single-digit percentage of test takers failed the test this session simply for breaking this rule.
Among the test takers who were kicked out of the room and tried (unsuccessfully) to negotiate with the proctors, it was instantly obvious who came from cultures where rules carry real consequences and who didn't.
You know, back in the day, teachers used to try to convey the "why" behind things like writing essays and reading books. SparkNotes existed, but a good teacher could get across that there is a reason we are doing this thing: it has value beyond the note that says you completed the task.
> back in the day...
Teachers still do this today. It's just that kids are less disciplined and more prone to attention deficits. Not to mention that the punishment for failure has been dulled down to almost nothing. "No Child Left Behind" had noble intentions, but the way it was implemented leaves much to be desired.
To me, the fix is to restore consequences for cheating. If cheating on an exam is permitted (or the rules against it aren't enforced), then that is effectively an encouragement to cheat.
Bring back in-person, closed-room, no-calculator, no-phone exams, and let those scores determine your grades, rather than the teachers at the school.
Back in the day when a teacher's salary could support a family, I bet.
I'm looking forward to the day a student accidentally turns in a solution to P vs NP and nobody realizes it for months because nobody is doing work.
> To be clear, I’m not advocating for AI in real learning. AI is only useful right now as a stress test as it reveals how hollow adolescent work has become. If it pushes schools toward offering work with relevance, impact, and agency and away from hopeless busywork (“When will I ever use this?”), that is a win.
But how will they ever know that if they don't go through the process? I'm not saying the current way of teaching is perfect, but you can't tell what is and isn't bullshit without some experience at some point.
We had a mandatory home economics class that taught how to balance a checkbook, cook, do laundry, and even how taxes worked. Yet people still thought that class was bullshit and a waste of time. Many classes, such as health, gym, shop, A/V, and typing, had people blowing them off as useless stuff they would never need to know. ChatGPT turning every class into that is a nightmare future for the youth of the world. People will grow up entirely unable to think.
> how taxes worked
Given that I worked with people well before the advent of LLMs who had no idea how marginal tax rates worked, it seems like we should be more aggressively pursuing this as an educational goal.
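The misconception the commenter means is the belief that crossing into a higher bracket raises the rate on your *entire* income. A minimal sketch of how a progressive schedule actually works, using made-up brackets rather than any real tax code:

```python
def marginal_tax(income, brackets):
    """Tax owed under a progressive schedule: each rate applies only
    to the slice of income that falls inside its bracket.
    `brackets` is a list of (upper_bound, rate) pairs in ascending order."""
    tax = 0.0
    lower = 0.0
    for upper, rate in brackets:
        if income <= lower:
            break
        taxed_slice = min(income, upper) - lower
        tax += taxed_slice * rate
        lower = upper
    return tax

# Hypothetical brackets: 10% up to 10k, 20% up to 40k, 30% above.
brackets = [(10_000, 0.10), (40_000, 0.20), (float("inf"), 0.30)]

# Earning 50k does NOT mean paying 30% on all of it:
# 10k * 0.10 + 30k * 0.20 + 10k * 0.30 = 1,000 + 6,000 + 3,000
print(round(marginal_tax(50_000, brackets), 2))  # 10000.0
```

The effective rate here is 20%, well below the 30% top bracket, which is exactly the distinction people miss.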
First of all, the entire post reads like it was written by AI.
Secondly, the author / prompter misses the point entirely with this closing paragraph:
> The next time a teacher complains about AI cheating, ask: If a machine can do this assignment perfectly, why are you giving it to this student? And then we can replace it with education and work that actually matters.
You learn fundamentals because they are necessary for you to understand how the magic works, and because that’s how the human brain works.
Is it important for you to be able to write a binary search algorithm perfectly from scratch? Not especially, no. Is it important for you to be able to describe what it’s doing, and why? Yes, very much so, because otherwise you won’t know when to use it.
If your rebuttal to this is "we can feed the problem to AI and let it figure that out," then I don't want to live in that world, where curiosity and thought are cast aside in favor of faster results.
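To make the binary-search point concrete, here is the standard from-scratch version in Python. The mechanical part (the loop) is the least important thing about it; what matters is understanding that it only works on sorted input and why halving gives O(log n):

```python
def binary_search(items, target):
    """Return the index of `target` in the sorted list `items`, or -1.
    Halves the search range each step, so it requires sorted input
    and takes O(log n) comparisons."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            lo = mid + 1   # target can only be in the upper half
        else:
            hi = mid - 1   # target can only be in the lower half
    return -1

print(binary_search([2, 5, 8, 12, 16], 12))  # 3
print(binary_search([2, 5, 8, 12, 16], 7))   # -1
```

Knowing *that* description, not the memorized loop, is what tells you when reaching for it is appropriate at all.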
> If a machine can do this assignment perfectly, why are you giving it to this student?
By that logic now that text to speech has gotten quite good we should stop teaching kids to read.
From the article:
> The next time a teacher complains about AI cheating, ask: If a machine can do this assignment perfectly, why are you giving it to this student? And then we can replace it with education and work that actually matters.
While this might be more true of "factoid-based" classes (such as geography), it completely misses the point of subjects where students actively benefit from struggling through the act of the craft itself (writing, music, foreign languages, etc.).
> They’re copying essays from AI, running them through “humanizing” tools, and handing in work they’ve barely read. They’re having AI listen to lectures so they don’t have to. They’re sneaking AI via their mobile phones into tests.
The essay thing is real, but I am not mad at AI summarizing lectures. And anyone with access to a phone will cheat with Google if not AI.
And many teachers I know give written in-person tests now. They aren't as obsessed with perfect sentence construction, and they say they value students' individual quirks more (the quirks seem more pleasing against the backdrop of AI slop).
So yes, there are challenges, but teachers adapt.