An engineering TA, fed up with students submitting nonsensical technical reports and tasks, took to Reddit to try to change the situation and convince learners to stop cheating their way through their studies, especially with ChatGPT.

Key Takeaways

  • AI tools like ChatGPT are often inadequate for technical reports and tasks. Professors and teaching assistants can easily distinguish between AI-generated content and student-written work due to AI’s tendency to be “verbosely vague,” lack depth, and contain inaccuracies in technical information.
  • Despite its limitations, ChatGPT can still be a useful tool under strict human supervision. Students find it helpful for improving word flow and structure, and for relieving writer’s block.
  • ChatGPT can be effectively used for specific tasks such as creating outlines, checking grammar, and writing introductions and conclusions. However, AI should not be relied upon for complete reports or tasks requiring in-depth research and analysis.

It’s no secret that students started using AI as soon as it came on the scene. Tools like ChatGPT seemed especially useful for those studying the humanities, since they could find the necessary information, analyze it, summarize long texts, and even help with writing essays. Yet as ChatGPT developed, its capabilities grew, and more learners decided to join the ‘working with AI’ trend, including those enrolled in more technical courses.

TAs, however, have noticed the change in how students approach their assignments. One teaching assistant, after receiving a batch of engineering reports from their class that made no sense, decided to try to discourage students from using AI assistance (ChatGPT in particular). Their main argument was that anyone in a technical field can tell human-written and AI-generated work apart, and that just makes everyone submitting unoriginal documents look stupid.

The Reddit post led to an active discussion about how professors can tell when ChatGPT has been used and which tasks it can reasonably handle in academic settings. So let’s take a look at what the community thinks about the place ChatGPT should have in technical fields of study.

Professors Agree: It Is Obvious When AI Wrote Your Work

Professors and students alike have raised concerns about how easy it is to detect AI-generated work, particularly in technical fields. Commonly described as ‘verbosely vague,’ AI-generated text often lacks the specific details and depth expected in academic writing. Repetitive rephrasing without substance and inaccuracies in technical information turned out to be the most telltale signs.

“I was using AI to brainstorm ideas and titles for a paper and noticed when I’d asked it to briefly summarize a source I was considering looking into. It would literally just reword the title but with extra nonsense words.
Of course, I kept forcing it to “make it longer” and it turned into like 3 paragraphs just rewording the title of a paper as if it was explaining it. “Verbosely Vague” is fu**ing perfect”

Additionally, the use of AI can result in odd grammar (yes, stranger than anything a student could have written), eerie similarities between different students’ work, and failure to follow specific formatting guidelines, making it evident to educators when AI has been used.

“A guy in my differentials class used it to copy and rewrite another student’s discussion post comment, resulting in “Even though I think you look stunning, I still want your feedback on the matter.” This guy’s post was just gibberish after changing random words from the source.”

Many students in the discussion mentioned that, with thorough proofreading, work written by ChatGPT can actually turn out good enough, though of course it can’t be submitted as-is. Here, the OP of the Reddit post weighed in, noting that anyone in class who used ChatGPT most probably lacks the knowledge and skills to edit the AI’s output.

“They don’t know enough to proofread what the AI is giving them. It’s a dangerous game to play in your undergrad when figuring out what you are supposed to be writing is more than half the battle. Once you’re in employment, then that balance shifts, and you know what you need to write most of the time, the work is just in doing it.”

It is true that ChatGPT has a very recognizable writing style. It tends to produce convoluted grammatical and lexical structures and overuses words like ‘delve’, ‘comprehensive’, ‘ensure’, ‘realm’, and so on. Not to mention, most of your teachers already know what your usual papers (and other completed assignments) look like. So, the next time you try to generate an assignment with its help, find the time to read it through. Maybe then you will understand how professors can tell whether the work was written by their students or not.

Redditors Share Why They Are Using ChatGPT and What For

This post also sparked a lively debate within the student community itself. Most found it foolish that someone would submit a WHOLE report written by ChatGPT, agreeing that the AI can’t handle such writing tasks very well. Nonetheless, they pushed back against abandoning the tool altogether. Instead, they shared the ways they personally use AI that don’t interfere with the learning process but rather make it more effective.

Here are the top 3 ChatGPT use cases as suggested by Redditors:

Improving Word Flow and Structure

A lot of students agreed: if ChatGPT is good for anything, it has to be improving writing. Many admitted that they use the platform to help them create an outline for a paper or make their sentences sound better (read: more coherent, logical, or professional).

“It is very good at rewriting your own words into a better flowing sentence/paragraph.”

“I’ve used it to write a skeleton of a cover letter, then revised it myself so it sounds more natural.”

“If anyone is using it for other reasons than to spit out intros, conclusions, or just put together a general outline for technical reports they deserve to fail.”

Relief From Writer’s Block

Some Redditors also shared a familiar pain: staring at a blank page, not knowing what to write. For those who experience this regularly, ChatGPT brought a lot of relief and allowed them to be more productive and consistent in completing written assignments.

“It can be good when I have ideas in my head but am stuck trying to get them down on paper. Once ChatGPT gets things started, it’s easy for me to alter it into something actually useful.”

Some also agreed that it helps optimize their workload, as it lets them concentrate on research or analysis while the AI handles the wording.

“It’s a lot less stress to give lots of information and your thoughts while you are working/trying to take notes and having it write it up properly for you.
I can write perfectly fine, I just prefer to use the mental capacity to get more work done and utilize the AI tool for its intended purpose.”

Analysis & Data Organization

A few Redditors also noted that, if used correctly, ChatGPT can greatly help with analyzing or structuring data you already have. Of course, everyone who felt that way clarified that, for the AI to do the job properly, well-structured prompts are an absolute necessity.

“It’s good for making a format or a table from common reference sources, it’s faster than doing it by hand.”

“It takes some common sense and proper prompts, but if you give a technical document and explain the basis for it, it’s pretty darn accurate. I’ve had 4th-year professors in EE demo how they use GPT4 for circuits, optics, and basic qol. Obviously if you just feed it some question and don’t even bother reading what it spit out, that’s on you.”

To sum up, we agree with the user who said that ChatGPT is a good tool only for those who already know the task or subject well and can therefore effectively proofread everything the AI produces and bring it into the format they need.

The Main Point

After going down the Reddit discussion rabbit hole, we now see that AI is not as good an assistant as some people make it out to be. The post only reinforced our view that ChatGPT should be used under close human supervision, with constant proofreading and tweaking of the generated content. Yes, it can be good, and it can be used in technical courses too. However, this AI won’t help anyone create a piece of work from scratch, especially one that requires research, analysis, or a strong understanding of the given data. So, if you still want to make ChatGPT work for you, follow the Redditors’ advice: have it check grammar, create outlines, and help you write introductions and conclusions, but always stay critical of the text it writes.
