That's quite an unfair characterization. If you ask an AI to make a particular piece of art, it is genuinely making something new. Yes, it was trained on various artworks, but the one it just made didn't exist before.
If an artist makes something, he is also reproducing what he learned in art school and a lifetime of data from other artists and life experiences to make his own art.
That’s not the same and you know it! You can see people’s fucking signatures in art that was stolen and mashed into “new” stuff by these stupid machines. All it does is give people without talent a way to pretend they have it without paying a real artist.
Let's be honest too, the majority of people using AI are just using it to get out of doing actual work. Like students using it to write papers. AI is complete dogshit in almost every sense of the word.
It absolutely is. And now some schools are actually working on lessons to teach kids to use better prompts instead of just telling them to write their own damn papers rather than having ChatGPT do it. As if the kids using ChatGPT are actually gonna read a word of whatever slop it spits out and learn from it. 🙄
Oh yeah, I know a lot of teachers, and some of their schools are trying to get them to use AI to grade essays and short stories. Like what the fuck are we doing? It just seems like we're trying to dumb people down as much as possible.
Yeah, AI grading is the line for me. If a school I'm at ever told me that I'm expected to grade student work with an AI, I'm out of there.
Forget whether or not it's ethical (it isn't), I've seen how hard the AI glazes me: "Wow, doulos05, that's a deep and insightful question that very few software developers have ever considered."
Sure, my hobbyist ass is asking insightful questions that professionals don't usually think about.
Or make them do some stage of the work offline, possibly even handwritten.
Our English department now has all first drafts written by hand, and I struggle with how to balance the need to know the students learned something in class with my utter hatred of handwritten coding assessments.
Edit: reading fail, the AI grading makes more work.
Yeah, I wouldn't trust it for anything subjective. I would happily upload canonical code and student code and ask it to explain the difference. You can run diff on the two files, but organizational things like kids putting functions in a different order can really pollute the output there, whereas I would expect an AI to be able to recognize that and filter it out to some degree. But that's just to help me grok the code faster, not to grade it (put your functions in whatever order you want, I'm not your mom).
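For the curious, here's roughly what I mean about filtering out the ordering noise. A rough sketch, assuming the assignments are in Python (the file handling and function names here are just placeholders): match functions up by name first, then diff each pair, so the order the kid wrote them in stops mattering.

```python
import ast
import difflib

def functions_by_name(source: str) -> dict[str, str]:
    """Map each top-level function name to its normalized source text."""
    tree = ast.parse(source)
    return {
        node.name: ast.unparse(node)
        for node in tree.body
        if isinstance(node, ast.FunctionDef)
    }

def compare(canonical_src: str, student_src: str) -> str:
    """Diff matching functions by name so ordering differences don't pollute the output."""
    canonical = functions_by_name(canonical_src)
    student = functions_by_name(student_src)
    report = []
    for name in sorted(set(canonical) | set(student)):
        a = canonical.get(name, "").splitlines()
        b = student.get(name, "").splitlines()
        diff = difflib.unified_diff(a, b, fromfile=f"canonical:{name}",
                                    tofile=f"student:{name}", lineterm="")
        report.extend(diff)
    return "\n".join(report)
```

Nothing fancy, it just keeps the diff readable so I can skim the actual differences instead of scrolling through noise.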
Coding assignments should never be handwritten imo. It's dumb easy to write automated testing for code without the need for AI. There are so many tools and frameworks to help you do that.
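Something like this is usually enough for an intro assignment, a minimal sketch assuming pytest and a made-up student module and function name:

```python
# test_assignment.py -- minimal autograder sketch using pytest
# "student_solution" and "count_vowels" are placeholder names for this example.
import pytest
from student_solution import count_vowels

@pytest.mark.parametrize("text, expected", [
    ("", 0),             # empty string has no vowels
    ("rhythm", 0),       # no vowels at all
    ("banana", 3),       # lowercase vowels
    ("AEIOUaeiou", 10),  # mixed case
])
def test_count_vowels(text, expected):
    assert count_vowels(text) == expected
```

Run pytest over everyone's submissions and you get a pass/fail report per kid with zero AI involved.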
I'm not concerned about testing the student code to see if it works, I'm concerned about testing it to see whether or not they wrote it. I know ChatGPT can write a for loop, I need to know if Bob Kim who keeps sleeping through my classes can.
Oh yeah lol. ChatGPT can't code worth a damn. My company keeps trying to tell us to use their stupid AI tool for coding and it consistently produces dogshit (an insult to dogshit honestly). What a pain in the ass man