A COOL A$$ SKELOTEN FLICKEN OFF THERE COMPUTER, AND THERE COMPUTER IS RUNNEN A COOL PROGRAM (10 PRINT "dashare.zone" 20 GOTO 10) AND THERE TIRED OF HOW WE HAVE TO ARGUE OVER WHETHER OR NOT WE HAVE TO LET COMPUTER GENERATED SLOP PUSH EVERY THING ELSE OFF THE INTERNET, AND DA TEXT SAYS "THE ONLY GPT I CARE ABOUT IS GETTING HIGH AND PANICKING BECAUSE I GOT TOO HIGH" AND LETS BE REAL THE ONLY REASON WHY THEY MADE IT SO KIDS CAN PUT THERE HOMEWORK IN GPT AND GET FREE HOMEWORK IS BECAUSE THERE KNEE CAPPING THE NEXT GENERATIONS RESISTENCE TO ITS USE, IT WAS DONE ON PURPOSE TO MAKE SURE KIDS NOW WILL GROW INTO ADULTS WHO HAVE A FONDNESS FOR THE COMPUTER SLOP, THEN THEY MAKE SURE TO SAY "NOOOO DONT USE IT FOR HOMEWORK" TO GIVE ALL THE KIDS THE IDEA THAT THEY SHOULD USE THE GPT SHIT, THINK ABOUT IT, IM RIGHT - DASHARE.ZONE ADMIN
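For what it's worth, the two-line BASIC program on the skeleton's screen really is a complete program: an unconditional infinite print loop. A bounded Python sketch of the same idea (the `times` cap is my addition so the loop can actually terminate; the original has no such cap):

```python
# The program on the screen:
#   10 PRINT "dashare.zone"
#   20 GOTO 10
# is BASIC for "print this line forever".
def banner(times=None):
    """Yield the banner line forever, or `times` times if a cap is given."""
    count = 0
    while times is None or count < times:
        yield "dashare.zone"
        count += 1

for line in banner(3):
    print(line)  # prints "dashare.zone" three times
```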
> Another academic librarian here. Yeah, it's too late to stop this. Like it or not, college students are going to use AI. They already are.
>
> We could go outside and shake our fists at the heavens about it. Or we can try to figure out ways for students to use AI without sabotaging their own learning processes. That won't be easy, but I'd rather acknowledge reality than gripe and try to turn back the clock.

Yet another academic librarian here. While it's too late to stop this, it's not too late to develop a framework for talking to students about AI, and to work with university administration to make sure libraries and librarians have a voice in the conversation.
> I wouldn't trust Google on this as far as I could throw them. 10 years ago when I worked in campus IT we switched all the student email accounts onto Gmail because they gave us a really good deal. A couple years ago, they altered the deal, and suddenly it became much more expensive, and they had to scale back student storage. I don't trust Google not to alter any deal further.

Once they have your data, they have you by your data. Oh wait, it is no longer your data.
> As for who qualifies as a "student" in this promotion, Google isn't bothering with a particularly narrow definition. As long as you have a valid .edu email address, you can sign up for the offer. That's something that plenty of people who are not actively taking classes still have. You probably won't even be taking undue advantage of Google if you pretend to be a student—the company really, really wants people to use Gemini, and it's willing to lose money in the short term to make that happen.

This is not actually the case, at least across the board. My alumni .edu account is not eligible, so I would assume they have SOME basic controls in place.
Another academic librarian here. Yeah, it's too late to stop this. Like it or not, college students are going to use AI. They already are.
We could go outside and shake our fists at the heavens about it. Or we can try to figure out ways for students to use AI without sabotaging their own learning processes. That won't be easy, but I'd rather acknowledge reality than gripe and try to turn back the clock.
> Call me old-fashioned, but I think students should learn how to do research themselves first. You know, like going to the library.

I might not make them go to the library, because that's largely an enormous waste of energy.
> I might not make them go to the library, because that's largely an enormous waste of energy.
>
> Instead, I'd teach a course on finding and evaluating the veracity of online data/information, with a special emphasis on how unreliable AI answers are, and teach people some decent google-fu skills (with AND without Google).
>
> Libraries are limited, and usually woefully out of date. The Internet is unlimited, difficult to filter, and a hodgepodge of reality and illusion. But with training (the same amount needed to teach people how to look up shit in a library, if not the same kind), people can be taught where to look for real, current, and relevant facts online.
>
> As a useful skill, learning google-fu will serve people a HELL of a lot more going forward than teaching them the Dewey decimal system ever will.
>
> So, you have the right idea, just not the implementation I'd pick for the 21st century.

Gave you the upvote because you are mostly correct, mostly.
> Via Mastodon: "Using AI in education is like using a forklift in the gym. The weights do not actually need to be moved from place to place. That is not the work. The work is what happens within you." (source: The New Yorker)

In Arthur C. Clarke's Richter 10 (the book came out in 1996), they had "wrist watches with internet" and "bringing up the Q-fiber". The protagonist noted that kids had all the answers available to them, but didn't know what the questions were.
> This is not actually the case, at least across the board. My alumni .edu account is not eligible, so I would assume they have SOME basic controls in place.

Not working for staff accounts either.
> Call me old-fashioned, but I think students should learn how to do research themselves first. You know, like going to the library.

It's already been proven through various studies that using AI is detrimental to cognitive abilities. I'm too lazy to look for the studies, but I read about it recently. It was on the internet, so it must be true.
> Yet another academic librarian here. While it's too late to stop this, it's not too late to develop a framework for talking to students about AI, and to work with university administration to make sure libraries and librarians have a voice in the conversation.
>
> We've put together a research guide discussing the use of gen AI tools that I refer to when instructors ask me to talk about AI. I usually focus on a few things:
>
> - What gen AI is and is not (ChatGPT is a large language model)
> - What gen AI is good at (brainstorming, translation, grammar checking, formatting)
> - What it's not good at (research)
> - Ethical issues (black box model, privacy, copyright violation, bias, environmental impact)
> - How to properly cite gen AI tools
>
> I don't tell students not to use the tools, but I do try to make them aware of the issues so they can make an informed decision.
>
> I'm part of a group in our library that's testing the AI search tools being offered by our LMS vendor. It's been interesting; the thing I'm noticing most is that it will "massage" queries to search based on what it thinks the user is asking. Sometimes this makes sense. Other times it's wildly off-base and pulls sources that aren't actually related. I think it's not close to ready for prime time and it will be detrimental for students, especially undergrads who are new to academic research, but we'll see.

Hopefully something that is being taught is that AI is often completely inaccurate and you need to do proper research to validate what the AI is telling you.
Sorry, yes, this makes sense: if you aren't going to educate students in school, you should at least give them the ability to cheat when they get to university.
I have a .edu email. I signed up, and it explicitly said it is a 15-month free trial. I signed up today (4/17), and it tells me I will start being charged $20.86/month on 7/17/2026. Presumably if I had waited a day and signed up on 4/18, it would be good through 7/18/26.
They are careful to say they'll email me a reminder before they start charging me.
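That date math checks out: 15 calendar months after an assumed 2025-04-17 sign-up is 2026-07-17. A stdlib-only sketch to verify it (`add_months` is a hypothetical helper I wrote for illustration, not a `datetime` API):

```python
from datetime import date

def add_months(d, months):
    """Add calendar months to a date, clamping the day to the target month's length."""
    total = d.month - 1 + months
    year = d.year + total // 12
    month = total % 12 + 1
    # Last valid day of the target month (with leap-year handling for February).
    leap = year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)
    days_in_month = [31, 29 if leap else 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31][month - 1]
    return date(year, month, min(d.day, days_in_month))

signup = date(2025, 4, 17)            # assumed sign-up date from the comment
print(add_months(signup, 15))         # → 2026-07-17
```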
> If universities are serious about educating students, they will need to switch to oral exams.

Why though? Digital media is generally not permitted during proctored exams - at least it was not in any course that I've taken or supervised. The students were permitted hand-written summaries or - rarely - a collection of books. When you have TAs prowling the aisles you either use what you have and know, or waste your time trying to Google - as has been the case ever since smartphones became mainstream.
> I might not make them go to the library, because that's largely an enormous waste of energy.

That is a self-report.
> Instead, I'd teach a course on finding and evaluating the veracity of online data/information, with a special emphasis on how unreliable AI answers are, and teach people some decent google-fu skills (with AND without Google).
>
> Libraries are limited, and usually woefully out of date. The Internet is unlimited, difficult to filter and a hodgepodge of reality and illusion. But with training (the same amount needed to teach people how to look up shit in a library, if not the same kind), people can be taught where to look for real, current and relevant facts online.

I have no idea what, if any, library you've ever visited, but picking a subject and browsing the appropriate aisle is something that has been successfully taught to elementary school children. As for the limitations of a library: a limited and properly curated collection is far superior to an infinite steaming pile of s***, with a few gold nuggets buried inside.
> Yet another academic librarian here. While it's too late to stop this, it's not too late to develop a framework for talking to students about AI, and to work with university administration to make sure libraries and librarians have a voice in the conversation.
>
> We've put together a research guide discussing the use of gen AI tools that I refer to when instructors ask me to talk about AI. I usually focus on a few things:
>
> - What gen AI is and is not (ChatGPT is a large language model)
> - What gen AI is good at (brainstorming, translation, grammar checking, formatting)
> - What it's not good at (research)
> - Ethical issues (black box model, privacy, copyright violation, bias, environmental impact)
> - How to properly cite gen AI tools
>
> I don't tell students not to use the tools, but I do try to make them aware of the issues so they can make an informed decision.
>
> I'm part of a group in our library that's testing the AI search tools being offered by our LMS vendor. It's been interesting; the thing I'm noticing most is that it will "massage" queries to search based on what it thinks the user is asking. Sometimes this makes sense. Other times it's wildly off-base and pulls sources that aren't actually related. I think it's not close to ready for prime time and it will be detrimental for students, especially undergrads who are new to academic research, but we'll see.

Thanks for sharing your thoughtful and practical approach!
I'm excited about AI in education; learning indeed depends on the effort we put in. But tools like https://edubrain.ai/ can actually help us better understand complex subjects, offering step-by-step explanations that make difficult topics easier to grasp. It's not about skipping work; it's about getting the support you need to learn more effectively.