Comment by waffletower a day ago

This is such a naive, simplistic, distrusting, and ultimately monastic perspective. An assumption here is that university students are uncritical and incapable of learning while using AI as an instrument of mind. A more prescient assessment would be that the presence of AI demands a transformation and evolution of university curricula and assessment -- and the author details early attempts at this, but declares them failures and uncritical acquiescence. AI is literally built from staggeringly large subsets of human knowledge; university cultures that refuse to critically participate and evolve with this development, and instead react by attempting to deny student access, do not deserve the title "university" -- perhaps "college", or the more fitting "monastery", would suffice. The obsession with "cheating" -- the fallacy that every individual needs to be assessed hermetically -- has denied the reality, for centuries, that we are a collective and, now more than ever, embody a rich mass mind. Successful students will grow and flourish with these developments, and institutions of higher learning ought to as well.

ragingregard a day ago

> This is such a naive, simplistic, distrusting and ultimately monastic perspective

This is a disingenuous take on the article; there's nothing naive or simplistic about it. It is full of critical thought, linking to further critical thought from other academic observers about what's happening at the educational level. The content of your reply implies you read at most the first 10% of the article.

The article flagged numerous issues with LLM application in the educational setting, including:

1) Critical thinking skills, brain connectivity, and memory recall are falling as usage rises; students are turning into operators and are not getting the cognitive development they would through self-learning.

2) Employment pressures have turned universities into credentialing institutions rather than learning institutions, and LLMs have significantly accelerated these pressures.

3) Cognitive development is being sacrificed, with long-term implications for students.

4) School admins are pushing LLM programs without consultation, as experiments rather than in partnership with faculty -- private-industry-style disruption.

The article does not oppose LLMs as learning assistants; it opposes making them the central tool of cognitive development, which is the opposite of what they accomplish. The author argues universities should exist primarily for cognitive development.

> Successful students will grow and flourish with these developments, and institutions of higher learning ought to as well.

Might as well work at OpenAI marketing with bold statements like that.

  • waffletower a day ago

The core premise is decidedly naive and simplistic: AI is used to cheat, and students can't be trusted with it. This thesis is carried through the entirety of the article.

    • ragingregard a day ago

That's not the core premise of this article; go read it to the end, and don't use your LLM to summarize it.

The core premise is that students' cognitive development is being impaired, with long-term implications for society, without any care or thought from university admins and corporate operators.

It's disturbing when people comment on things they haven't bothered to read -- literally aligning with the point the article is arguing: that critical thinking is decaying.

    • allturtles 21 hours ago

      So you believe students don't use AI to cheat, and you are calling the OP naive?

      • waffletower 20 hours ago

That's an utterly hilarious straw man, a spin worthy of politics -- one that someone else might label a tautological "cheat". Students "cheated" hundreds of years ago. Students "cheated" 25 years ago. They "cheat" now. You can argue that AI mechanizes "cheating" to such an extent that the impact is now catastrophic. I argue that the concern for "cheating", regardless of its scale, is far overblown and a fallacy to begin with. Graduation, or the measurement of student ability, is a game, a simulation that does not inherently test or foster cognitive development. Should universities become hermetic fortresses to buttress against these untold losses posed by AI? I think that is a deeply misguided approach. While I was a professor myself for eight years, and do somewhat value the ideal of the Liberal Arts Education, I think students are ultimately responsible for their own cognitive development. University students are primarily adults, not children and not prisoners. Credential provision, and graduation (in the literal sense) of student populations, are institutional practices to discard and evolve away from.

      • flag_fagger 21 hours ago

        ChatGPT told them otherwise.

Seriously, you’re arguing with people who have severe mental illness. One loon downthread genuinely thinks this will transform these students into “geniuses”.

    • waffletower 21 hours ago

You can straw man all you like; I haven't used an LLM in a few days -- definitely not to summarize this article -- and what you claim is the central idea is directly related to my claim. It's very easy to combine them: students' intellectual development is going to be impaired by AI because they can't be trusted to use it critically. I disagree.

      • gizmo 21 hours ago

When AI tools make it easy to cruise through coursework without learning anything, many students will just choose to do that. Intellectual development requires strenuous work, and if universities no longer make students strain, then most won’t. I don’t understand why you think otherwise.

      • ragingregard 21 hours ago

        > You can straw man all you like

No one is misrepresenting your argument; it's well understood, and it is being argued to be false.

        > students intellectual development is going to be impaired by AI because they can't be trusted to use it critically.

This debate is going nowhere, so I'll end here. Your core premise rests on trust and student autonomy, which is nonsense and not what the article tackles.

It argues that LLMs simply don't facilitate cognitive development and can actually impair it, regardless of how they are used, so it's malpractice for university admins to adopt them as a learning tool in a setting where the primary goal should be cognitive development.

Students are free to do as they please; it's their brain, money, and life. But I've never heard anyone argue that they were at their wisest in their teens and twenties as a student, so the argument that students should be left unguided is also nonsense.

        • waffletower 20 hours ago

          You said I didn't read the article. That is your weak and petty straw man. Very clearly.

      • awillowingmind 19 hours ago

I’m not sure how you lived through the last decade and came to the conclusion that people aged 17-25 make rational decisions about novel technologies that offer short-term gain and carry long-term (essentially hidden) negative side effects.

        • waffletower 19 hours ago

It seems that 10% of college students in the U.S. are younger than 18, or do not have adult status. The other 90% are adults who are trusted with voting and armed-services participation, and who enjoy most other rights that adults have (with several obvious and notable exceptions -- car rental, legal controlled-substance purchase, etc.). Are you saying these adults shouldn't be trusted to use AI? In the United States, and in much of the world, we have drawn the line at 18. Are you advocating that AI use shouldn't be allowed until a later cutoff in adulthood? It is not at all definitively established what these "essentially hidden" negative side effects you allude to are, or whether they actually exist.

turzmo 13 hours ago

No, you are wrong. Students use AI not to augment their minds, but to replace the use of them.

add-sub-mul-div a day ago

Even conceding that you, the person reading this comment, will only use AI the right way, with diligence and curiosity: it takes a significant amount of denial not to understand that the majority of people see AI as a shortcut to do their job with the least possible effort, or as a way to cheat. These are the people you will be interacting with for the coming decades of your life.

  • waffletower 18 hours ago

If a student is given a task that a machine can do, and there is some intrinsic value in the student performing that task manually and hermetically, this value ought to be explained to the student, and they can decide for themselves how to confront the challenge. I think LLMs pose an excellent challenge to educators: if they lazily ask students for regurgitation, they are likely to receive machine-aided regurgitation in 2025.