
Survey finds majority of AI use for academic assignments undetected

  • Gatepost News Team
  • May 9
  • 12 min read

Maddison Behringer / THE GATEPOST

By Paul Harrington
Staff Writer

By Anita Loughlin
Staff Writer

The Gatepost conducted an unscientific survey regarding student use of Artificial Intelligence (AI), and according to survey respondents, 80.5% of students who used AI in an unauthorized way for academic assignments said their use was not detected.


The unscientific survey of 250 students was conducted between Nov. 19 and Dec. 5.


Students were surveyed on their understanding of AI and their use of it both in the classroom and in their daily lives. 


Students were asked to rate their understanding of AI on a scale of 1 to 5, with 1 indicating a very poor level of understanding, 2 indicating a poor level of understanding, 3 indicating a basic level of understanding, 4 indicating a good level of understanding, and 5 indicating a high level of understanding.


Of the survey respondents, 2% said they had a “very poor level of understanding,” 4% said they had a “poor level of understanding,” 58% said they had a “basic level of understanding,” 27.6% said they had a “good level of understanding,” and 8.4% said they had a “high level of understanding.”


Students were also asked how many of their class syllabi during the fall semester posted rules for authorized use of AI for academic assignments. 


Of the survey respondents, 14.8% said four classes, 16.8% said three, 15.2% said two, 13.2% said one, and 40% of respondents said none of their classes posted an AI policy in the syllabus.


When students were asked if they had been given assignments that allow for authorized use of AI, 42.8% said they had not, whereas 38% of respondents said they had. 


Additionally, 19.2% of respondents said they were “not sure” if they have had any assignments that authorized use of AI.


Of the survey respondents, 12% said they had used AI in an unauthorized manner for an academic assignment more than 10 times during the last year, 6.8% said they used it 5 to 10 times, 32.8% of respondents said they used it 1 to 5 times, and 48.4% of respondents said they have not used AI in an unauthorized way for an academic assignment.


When asked if they saw an improvement in their grade after using AI in an unauthorized way for an assignment, 40.7% of survey respondents said they saw an improvement, 22.8% of respondents said they had not seen an improvement, and 36.5% said they were unsure. 


Students who had used AI in an unauthorized way for an academic assignment were asked whether this was detected by their instructor.


Of the survey respondents, 80.5% said it had not been detected, 3.2% of respondents said it had been, and 16.3% said they were not sure if it had been detected.


Of the survey respondents, 52% indicated they use AI for purposes other than academic work, 42.4% said they do not, and 5.6% said they were unsure.


Students said AI policies vary from class to class.

Junior Brian Fintonis said, “Some of them specify that use is prohibited.” 

They added some professors allow it to be used as a tool, but not for final assignments.


Senior Sean Letarte said in his department, “No AI for any assignments.”


Junior Amy Bickford said, “Some of my classes require AI for more technical software such as ArcGIS.”


She added she uses AI to detect movement and analyze changes in data, not to replicate art or writing.


Freshman Aidan Lee said, “I know for my composition class, my professor says we can use ChatGPT for outlining our essays, but otherwise, we can't use it for the actual writing bit of it.


“I personally don’t see the point in using it. I can see how AI can be useful, but I feel like AI has no soul in it. It kind of takes away from what you're doing in that class,” he said.


Junior Sasha Charmant said, “I think AI is a great tool when it comes to information and learning, but if you’re copying something from AI, it will be noticeable.


“It’s not something you should depend on,” she added.


Sophomore Shiba Nankya said, “I have heard fellow students use it as a way to understand difficult papers.


“As a bio student, you get scientific papers that can be harder to understand, so it’s good to be able to assess what you’re reading.


“Our bio communications professor allows us to use AI to generate scientific questions to ask for our research studies,” said Nankya.


“AI is very confusing right now. I’m still skeptical. I feel like I can’t really trust it, especially because I’ve heard some of the sources used are not actual sources,” she added.


Junior Kenzy El Sayed said, “I’ve used ChatGPT for studying and I also use it for social media sometimes to help create content with idea generation.”


El Sayed said one of her communications arts professors lets students use ChatGPT to help structure papers, “but obviously, don’t plagiarize,” she added. 


“I think that’s very smart for them to allow us to use it because people are going to use it anyway, so having ground rules is important,” said El Sayed.


“AI is going to keep growing, but people have to use it smarter,” El Sayed added.


Students were not surprised when they were informed that the survey found that 80.5% of unauthorized use of AI for academic assignments was not detected.


Freshman Makayla Cupid said, “I’m not surprised that more people are getting away with using AI.”


She added she thinks “the world is changing” and technology is “advancing,” making AI harder to detect.


Junior Ashley LaCivita said she had mixed feelings about The Gatepost survey results.


“I think there are so many tricks and ways you can just keep reprocessing [AI technology] to the point where it’s impossible to detect,” LaCivita said.


She said she believes a lot of students know how to “outsmart the system” and change the way their unauthorized use of AI sounds in order to bypass AI-detection systems on Canvas and other sites so their professors won't notice.


“Kids are just sneaky, so I’m not that surprised,” LaCivita added.


Freshman Devin Shepherd said, “No, it doesn’t surprise me. I feel like most people use it. I think it’s a good source, but if you use it the wrong way, it can be cheating.”


Sophomore Natalie Reynolds said, “I feel like there’s a certain way to use it correctly so it’s not so much cheating but instead points you in the right direction.”


Junior Alicia Phillips said, “It’s not surprising to me. I’ve seen people go to obnoxious lengths to make sure that their stuff won’t be detected as usage of AI.”


Referring to The Gatepost survey results indicating a large number of students use AI in an unauthorized way for academic assignments, senior Eden Hudson said, “That is a very high percentage, but not too surprising.” 


Junior Dylan Santos said he is “not surprised.”


Senior Cyrus Bergeron thinks professors should make assignments less boring.

“AI just takes from preexisting human knowledge, so having assignments that are less of a recap and require more interaction should help,” said Bergeron.

Senior Fabian Williams-Glenn was “not surprised a single bit” by The Gatepost survey results.


“AI is an unregulated issue, especially in the educational field,” Williams-Glenn said.


Senior Jacob Measmer thought the results “made sense.


“I’m not surprised. The concept of AI is evolving by the day, and new uses for AI keep multiplying,” Measmer said.


“It’s like everything comes with AI nowadays,” Measmer said. 


Sophomore Adam Harrison was “shocked” by the results. 


“If AI can’t be easily detected, I feel like there’s going to be a huge influx of students who aren’t actually learning and taking advantage of the education they could be getting,” said Harrison. 


He added, “I feel like professors are going to have to start redesigning how they format assignments to force students to really dig deep into their critical thinking to complete the assignment.”


Junior Rileigh Kelley was “kind of surprised” by the survey results.


“I always figured AI detectors were better at picking up on AI work. 


“My professors always make it seem like they run through them with 100% accuracy, and that they’ll never get by the detectors,” said Kelley.


Referring to The Gatepost survey results, senior Michael Gardner said, “I think that’s wild because I can usually tell when something is done by AI. 


“AI writing usually follows the same basic format and doesn’t have much personal style or voice to it, so that’s really surprising,” said Gardner.


He added he thinks there needs to be better detection methods for AI. 


Sophomore Penny Pasto said she’s “not entirely surprised.


“Of course back when ChatGPT was first released, I used it a couple of times because I didn’t know how bad it was, and haven’t used it to do any assignments here, obviously. Detecting is hard for professors, as it is for students. 


“Even though I personally hate the use of AI, I know it’s almost impossible to catch it consistently,” said Pasto. 


Referring to how widespread AI use is by students, freshman Suzanne Morris said, “That’s really dangerous, especially for college students.


“That means that people aren't using their authentic selves and they’re using basically computers to get good grades,” she added.


“Later on in their careers, they’re going to have a hard time achieving their goals,” said Morris.


Junior Marycarmen Curley said, “I think AI is hard to catch, but I know students who turned it in as their own work, and it was caught for AI. They turned it in as if they did it all themselves, though.


“It’s becoming harder to detect that stuff and I think that's why so many people are using it,” she added.


Senior Jeslie Da Veiga said, “I think AI going undetected is not beneficial to students’ learning, and I know they know that.


“You’re here in college paying money to get the knowledge, so if you think it's OK to use AI for producing a paper or a project, then that's on you. You’re just not furthering your own knowledge,” she added.


Sophomore Elijah Hansen said he didn’t “like that all too much” when informed about the survey results.


“I think it kind of takes away from students being able to express themselves creatively and openly,” Hansen added.


“I’m all for using AI as a resource but not as a submission,” said Hansen.


Freshman Mya Secka said she thinks AI can be “super helpful” as a tool for studying or breaking down information. 


She added one of the drawbacks of AI is that “I know that people use it to cheat” on assignments.


Freshman Ria Padayachee said some of the benefits of AI are that it can check a student’s work “to make sure you're putting your best quality work out there,” but it can also “deteriorate your abilities as a student if you are blindly copying from it.”


Junior Christian Taylor said, “AI should mainly be used as a tool, and depending on the professor, it can be promoted as a tool. You can use it as an advantage, but obviously, it should not be abused.”


Freshman Janice Agyemang said she is against using generative AI because “It has a lot of potential for harm for creativity.”


English Professor and First-Year Writing Coordinator Patricia Lynne believes AI can be helpful in the classroom.


Lynne is teaching Composition I with studio this spring semester, and said she actively encourages students to use AI for particular assignments.


She said, “My policy will be that they can use it, but they have to include some kind of reflection on the results they got.”

She added she provides clear guidelines to her students for how to use AI. 


Lynne said her intent is if students are going to use AI, they should use it “thoughtfully.”


She said while she provides clear stipulations for AI use in all of her classes, it’s hard to know if students are using it in an upper-level class. 


“I know I can never be 100% sure of that, and I don't like policing my students, so I don’t,” she said.


Communication, Media, & Performance Department Chair Niall Stephens said he wants to work with students to explore how AI can be productive.


Stephens has used AI in his upper-level communications classes.


“Essentially, I ask students to tell me if they use AI and how they’ve used AI in doing their assignments,” Stephens said. 


He added he’s sure there are more uses for AI that nobody has discovered or heard of.


He noted the “obvious downsides” of AI he believes people are all too aware of. 


“It’s like if students are using AI to do their assignments, they’re not learning anything,” he said.


“They’re just having AI do the entire assignment for them,” he said. 


Stephens said he thinks students believe using AI might be considered cheating and asks himself whether “students are sort of not revealing as much as I'd like them to about how much they're using AI, or if they're using AI at all.” 


He added, “I imagine there are ways AI can really help people do outlines of papers they're composing and organize ideas, things like that. 


“But I haven’t really seen that happen in my classes yet,” he said. 


Stephens acknowledged the perspective of professors who treat using AI “essentially as plagiarism.”


“We should think of AI as something that we have to work with, not something we should work against,” he said. 


Susan Dargan, Dean of Education and Social and Behavioral Sciences, believes AI can be useful for gathering information when used correctly, but emphasized the importance of critical thinking and original work. 


“I think it has tremendous uses academically and personally,” said Dargan.


She stated she has used AI in personal ways such as helping to create lists, gather information, or create outlines for syllabi for certain courses she teaches. 


“There are some tedious things we do that we would not have to do if we learned how to use AI correctly,” she said.


Dargan said she has seen AI being used as a notetaker in meetings to categorize the notes while also removing comments that didn't seem appropriate. 


“I was stunned. I just couldn't believe how well it can organize language,” she said.


Dargan is a part-time instructor at Providence College and witnessed students abusing AI technology there.


“So I'll ask a question like, ‘Describe the use of symbolism in the Civil Rights Movement,’ and I'll get an answer that tells me all about the Civil Rights Movement. I didn't ask that. I can tell right away when students don’t use AI correctly,” she said. 


“I understand using it to gather some information, but then you have to do the rest of the work yourself,” she added. 


Some courses on the Canvas site use Turnitin, a software system that helps professors and students detect plagiarism in writing assignments.


Dargan said, “If something comes back as ‘95% AI generated,’ professors are going to send that back to you and say, ‘What is this?’”


“I have also seen this at Framingham State, [students] getting charged with plagiarism or using AI,” said Dargan.


“My strategy is to ask really very specific, pointed questions from the text that we use in class to try to get the students to still do the work and read the text,” Dargan added.


Dargan said FSU has an AI Team, led by T. Stores, dean of Arts & Humanities, developing policies for AI use in the classroom.


Stores said the committee has been working to “recommend policies, or to provide faculty members with the tools to help them choose what kind of policy around AI makes the most sense for their individual classes.


“So as a professor, I might say, ‘I want you to use AI for this particular task, and this is how I want you to use it. I don't want you to use it in this way - it will be considered cheating.’ In another class, I might want to say, ‘We won't use AI at all in this class,’” Stores added.


Stores said it is important to teach students how to engage with AI and learn how to question it. 


“We all know that it hallucinates. It just lies or makes up answers to things, so engaging in critical thinking around what AI is providing us is important,” they said. 


“It's similar to the ways we try to teach students to engage with news sources and with statements that politicians and other people make that may or may not be true,” said Stores.


According to Stores, the challenge for higher education is to actively engage students in questioning when AI is correct, when it's not correct, and “where it's fudging the truth. 


“One of the things that we have to be aware of is that no matter what, whether we like AI or not, it's a reality,” said Stores.


Stores asked whether using AI is ethical. “Is this tool good for our environment, for the climate? Is this tool something that is fair and equal?


“Humans have to retain control over the machine because AI is just a machine,” they said.


Stores said what is most important right now is to help students understand when it is appropriate to use AI and to not rely on it.


“We need to evolve in the ways that we teach, and I think we need to be really transparent about how we as educators use it as well.


“I can go to the library and borrow a book, but that book doesn't give me knowledge unless I open it up and struggle with the ideas that are in it and try to apply those ideas to whatever I'm doing,” said Stores.


Stores said if students don’t understand AI and how to use it as a tool instead of a shortcut, they will not gain any knowledge. 


Kristen Porter Utley, provost and vice president of Academic Affairs, said while she thinks AI is a useful tool, “It is a risk to student learning if it is used improperly. 


“It's no different than cutting and pasting from a research article or cutting and pasting directly without attribution from a website,” she said.


“I would be very clear about academic honesty and how those two things intersect with one another, but I would definitely use it and encourage students to use it,” Utley added.


According to Utley, it is important to understand exactly what AI can do.


“When you walk onto our University, it's like an all-you-can-eat buffet,” said Utley.


“We hope that students will approach their education like, ‘I'm going to eat as much as I can. I am going to pay this amount of money for my all-you-can-eat buffet, and I'm going to take full advantage of everything I can do to learn,’ ” she said.


“Don't let AI rob you of that. As intelligent people, that would be a shame,” she added.


“The value is incredible here at Framingham State. Don't let anything shortchange that for you. Listen to your faculty. Work with your faculty to understand the tool’s benefits and harms,” said Utley.


[ Editor’s Note: Assistant News Editor Bella Grimaldi contributed to this article. ]
