- I teach a tech class to marketing students, and it definitely works very well. They are allowed to use ChatGPT and other tools, with one caveat: you remain responsible for the output. I hide white-text prompt injections in specs or longer task instructions (usually in PDFs, works well enough there with copy and paste), and sometimes place a phrase near the end of the text that prompts the LLM to append something like, "I submit this assignment without checking its output, and I accept point deductions as agreed."
I used to do this for a laugh and not deduct points, next year, I showed them this before class as an introduction to working with AI and kind of as a warning, I'll deduct points, expecting nobody falling for it, then they fell for it over and over again. Well.
- > and kind of as a warning, I'll deduct points, expecting nobody falling for it
At this point I knew how the sentence would end.
Well, let me repurpose the old meme:
Quote From Man Points Deducted: What are you gonna do, deduct points from me?
- Good. I wouldn't like cheaters to compete with honest students on the job market.
In my kid's school (American high school equivalent) being caught on using LLM in papers is a failed subject. Students must pass all the subjects to finish the school. Some of these subjects won't be taught the next year so effectively they lose year, two, three....
- This is the same argument as when teachers disallowed calculators in class, as 'You won't always have one in your pocket"
Failure to embrace in new technologies will not give you a 'step up on the competition', but will actively hamper your ability to compete at all.
That firm you're applying to doesn't care about your college book report.
- Expecting a kid to run a mile in Physical Education class rather than call Uber is not denying technical progress, nor is it hurting their ability to call Uber later when it is appropriate.
- Right, you're describing a curriculum clearly centered around a visible indication that the student is learning and performing. That's what I'm suggesting as well.
'AI Traps' will just forever be a game of cat-and-mouse. We need an education overhaul. School faculty should be less focused on catching LLM use, and more focused on teaching lessons that can't be easily bullshit by AI
- Yesterday someone shouted across the room at me “hey, what's 43 divided by 2?”
The point isn't that you won't have a calculator, the point is that you shouldn't need to pull out a calculator for every little operation. We drive everywhere, but that doesn't mean we shouldn't be able to walk a mile if necessary. Failing to develop basic mental arithmetic skills is not a flex.
- Right. We had to 'show our work' to prove we didn't use a calculator in school in situations where it was prohibited. This provided teachers proof that we understood the fundamentals.
Is there an equivalent to showing your work for writing? Seems like modern LLMs can already mock up a 'draft' and a 'outline' or whatever 'showing your work' would be for an essay
- Why wouldn't students be able to learn how to use LLMs afterwards? How does learning to use them via the completely unstructured process of getting output past an overworked teacher out of their depth develop critical skills?
- > How does learning to use them via the completely unstructured process of getting output past an overworked teacher out of their depth develop critical skills?
Nobody said it did. The point isn't to get it past a teacher. The point is to develop a curriculum that encourages growth with technology as opposed to demonizing it
- Every single actually good student learned information in the school and various skills outside of school. The tech is changing so much currently it would be a waste of time for a teacher to try and plan a year long course around them.
- I'm not suggesting to plan a course around using ChatGPT. I just think we're seeing the idea of 'essays and paragraph-based replies to generic questions' be defeated in real-time. There has to be a better way to get quantifiable results than what we currently have
- It really depends on the field you go into. If you're a writer and you want a job writing things, you go to college to become a better writer and prepare yourself for working in that industry. If the professors just let you turn in AI slop, how does that benefit anyone? You didn't write anything, why are you here paying tuition? And it demonstrates to the industry that if colleges are handing out degrees to writers for AI slop, why do they even need writers? Just cut the middleman out and they can make the slop themselves.
You go to school to learn. Turning in AI slop doesn't teach you anything. You didn't have to research the subject and commit time to crafting the work into something good. You just typed in a prompt (or copy and pasted it) and then turned in whatever the computer made. The point of learning isn't to turn in assignments, it's to learn and demonstrate your knowledge via assignments. If you want to get a job producing AI slop, don't bother going to school.
- > You go to school to learn.
This is not the mindset of very many people. They go to school because it's a requirement to get a job.
Talk to someone in college, or especially a trade school, and you'll see that the overwhelming majority are cheating, especially those from lower trust cultures. I work at a FAANG and, in casual conversation, many of my colleagues admitted to cheating with a dismissive "everyone does it".
- Writing specifically, I'll concede may need some oversight to prevent LLM use.
In general though, we should be looking at how to re-design assignments to demonstrate an understanding without being a large block of text, atleast imo.
We (Or atleast, my school) started teaching us how to use a calculator in things like Trig and Calc. It's not about 'can you divide correctly to arrive at the correct value of sin' but 'can you differentiate when to use sin vs cos', which I think was the more valuable lesson. But maybe LLMs are so powerful, or so 'do it all', that we just cannot compare it to the calculator (Not in their current iteration, but looking ahead...)
- In person proctored exams, with individually randomized questions from a large pool, along with written answers completed during the test, like required for state certifications, are probably the only answer.
The popular way to get around video chat proctoring is to physically attach notes to your screen, so when you sweep the room with the built in camera, it doesn't see anything.
- This works in theory, I wonder if it's too resource intensive to be actually feasible though. You can't proctor work done at home, and you can't trust the parents, so you'd need 'homework centers' which sounds like a nightmare, or only administer these during class hours?
- If you’re getting prompt injected, you have skipped right passed thinking critically about what you’re doing and into the same level of intellectual dishonesty as cheating, ie, not learning the thing and then attempting to still attain a grade for work not done.
- Agreed, but already in this same comment section there are people speculating on ways to defeat this, like a small model just to detect prompt injections. Students will catch on quick, and any novel trick you deploy will be killed by word-of-mouth once the first round of grades come back. I understand the need to do something, but it feels like a band-aid solution on a hemorrhaging gash. I don't think 'AI traps' are a viable solution moving forward for education
- it isn't though
it's more like having your friend write an essay for you, except the friend is an impressionable 5 year old with a PhD
- Except for a calculator didn't turn your brain off. Maybe you should turn yours back on.
- Go apply for a job and tell the interviewer that you refuse to use LLMs.
You'll be overlooked for someone who is 'current'
Atleast in both my last company and current one, brass was pushing to have copilot rewrite your emails...to the annoyance of most users
- Nobody cares about your college book report. It's there to prove, or teach you, that you can do the extremely basic task of synthesizing information. Same with math without a calculator. You should have a mental model of basic math. It helps avoid shooting your foot off later. You might never have to do math from first principles again, but you should get over the hump once.
- [dead]
- i wonder why the labs don't put a small model for detecting prompt injection in front of the main llm.
it's 20b at most and it can work quite well.
for now you can proxy http through llama guard. 'luxury' security if you can build and pay.
is there an architectural limitation?
- The limitation is efficiency and efficacy. If you have to add an additional layer of inference to any request you’re negatively impacting your bottom line so the companies, which are compute bound, have a strong incentive to squeeze everything into a single forward pass. It’s also not clear that a separate model that is smaller than the main model will perform better than just training the main model to detect prompt injection. They are both probabilistic models that have no structural way of distinguishing user input from malicious instructions.
- [dead]
- [dead]
- Prompt injecting homework assignments is a funny idea, but doesn't seem very productive.
Either the teacher needs to adjust how they are teaching new concepts or the student needs to ask themself why they are attending college in the first place.
- The student is attending college to get a job. Most students don't care about the course.
Probably around 50% of students in my year were only in it for the well paying jobs a prestigious degree like that could give them.
This has to be part of the threat model for cheating.
- I am in my final year of my bachelors in Software Engineering. I was (mostly still am) very interested in both SWE and CS in various angles - I studied a decent bit of PL theory, I tried to get into systems programming, I've built a bunch of "portfolio crud" software and had a short internship in a real company, with all of the above being roughly equally interesting to me. All this is to say I genuinely love the field so far.
However, the only benefit I've got from my local university is that it saves me from military while I study. Past year 2 (out of 4, country-specific quirks) there was roughly one subject actually worth paying attention to, so I also have switched to a "just get a decent grade at any cost" mode, as most of the material we're studying (and especially most of the assignments we've done) has negative value in real world.
Most of my peers consider me both more enthused and more knowledgeable than the average student, which mostly makes me realise that roughly 95% of my peers don't care about the contents of the courses.
All this is to say that, while grading is hard, the only thing that might get people to actually care is a proper course, no matter what threats you make.
- I know many people who were in the exact same situation as you while at uni. I hope you find value.
For me, my hobbies probably gave me 2x more experience, but uni forced me to learn things i would have never learned by myself. It made me believe self taught engineers were inherently flawed from only knowing what they themselves thought was important.
I'm sure you'll find value at the end, but I think you are valid in feeling you are wasting time.
- It's pointless. Just an arms race of gimmicks. There's really no option besides making homework all optional, and putting 100% of the grade into in-person exams. I basically don't trust that any new graduate has earned their degree, and won't until schools do what's necessary to crush cheaters.
- I agree with you in spirit, but the last meta pre-LLM was that exams were bad at measuring student skill and that students felt more fairly treated when their grade was the result of multiple assignments and projects. I think it's a shame we have move away from that
- > exams were bad at measuring student skill
They are. I have a friend who was significantly more smart and thorough in our studies but often get bad scores on exams not being able to concentrate under the pressure.
- Exams also rarely measured skill in the course. Often just a subset. We would often spend the last month of each semester cramming exams instead of studying the curse material because it wasn't that useful.
I rarely felt I got a lot out of courses, but I often felt I would if I got to study it properly
- I also struggled with exams, but that's because my understanding was often shallow, due to a lack of effort to study and understand the material. I'm very suspicious of people that say they're smart, but can't perform on exams. That said, there's plenty of ways to structure things to avoid this. Have weekly, easy, pass / fail exams that ensure you've read the material at a basic level, or understood some basic concepts. Lab work. Presentations with live grilling from the professor to ensure you understand the topic.
- I don’t think my friend would claim to be smart (and not I’m not talking about myself in third person to sound more convincingly, I have a real las in mind). I say they are. I saw them in a day to day work and they are both more knowledgeable and more productive than I am. It’s being put on the spot, with high stakes and limited time, they had a difficulty with.
> there's plenty of ways to structure things to avoid this
Sure, I was arguing specifically against GGP’s solution, i.e. betting everything on the finals.
- Isn’t that actually a valid way to test? IMHO Performing under pressure is a capability signal in itself.
- Well, that is a way to test students’ ability to perform under pressure, but I’m adamant it’s not a fair assessment of their skill in the subject at hand, nor how much they’d worked and improved during the course. On several occasions I have gotten higher marks than my friend because of their anxiety issues, despite me being a worse student and arguably a worse researcher (what we studied for).
- If you can’t concentrate under pressure then you will not go very far in employment….
- Huh? Not every job requires this trait, and even though some do, it’s not something nonlinear optics professor ought to evaluate.
Sure, it’s a nice quality to have and I find it useful at times: when it’s “suddenly” the last day to write a proposal, or when someone has to present at a conference. (However, these tasks many other skills besides just the ability to stay calm.) But I can’t agree that it is indispensable for a researcher.
- [flagged]
- I don’t want to be a CEO, mate.
Why would I give up my cushy place where I’m paid to do interesting stuff, for a stressful position full of management responsibilities? I swear, more people should learn the idea of lagom.
- Now that’s some “unemployed and living with my parents” level of comment.
- the course is now no longer cs/swe.
the course is now
"how to pass exams in cs/swe"
- Better than "how to get a passing grade in cs/swe"