Students submitted code they didn’t write themselves, contributing to widespread rule-breaking

  • streetfestival@lemmy.ca · 2 days ago (edited)

    As someone in their 30s who went back to university, I am seeing a lot of students use AI to summarize papers they were supposed to read for class or to write papers they were supposed to write for class. This is in addition to using the AI summary feature of a popular search engine as their default, if not only, means of looking up something they’re unsure about.

    Those who do it often talk about it with a coolness, as if they’re successfully skirting dumb rules. For one reason or another, it seems very reasonable to them. Maybe they see it as help with the onerous parts of school/academia. Maybe they see it as the future, and current protests against it as silly.

    More specifically, I’m seeing people use AI for things that are an area of weakness for them. By doing that, I think they’re missing opportunities to develop those skills, and they will continue to ‘miss milestones’, so to speak.

    I think, in general, people’s reading, writing, and critical thinking abilities will decline over the coming decades due to this behaviour. And that scares me. I think those skills are key to a rational electorate. E.g., lack of such skills = Trump

    • BCsven@lemmy.ca · 1 day ago

      Yeah, being able to read an article and have it trigger a thought process where you grok the concepts and paraphrase them is such a huge skill. Reading a summary is not the same, even if your words would end up written out the same in the end.

      Also, as a deep-dive user of a technical program, when I ask ChatGPT questions about this software, the output is often totally wrong, and even if I feed it some correct info, it says “oh, right” and changes it into more garbage. At a surface level, though, it looks very correct to a person who’s never used the software before.