Of course, generative AI can go off the rails and provide incorrect information, which is a problem when you’re trying to learn things. However, Google says it has worked with education firms like The Princeton Review to ensure the AI-generated tests resemble what students will see in the real deal.
Quote:
"The best way to let young people use AI is to make their interactions with it resemble homework or studying. If a kid asks ChatGPT about a homework question, they need to answer a bunch of Socratic questions to test their understanding first."

It's also critically important that the AI doesn't give them direct answers, but points to usable resources that contain those answers.
Quote:
"Of course, generative AI can go off the rails and provide incorrect information, which is a problem when you’re trying to learn things. However, Google says it has worked with education firms like The Princeton Review to ensure the AI-generated tests resemble what students will see in the real deal."

So there's a risk of the information being incorrect, and their answer to that is that they've worked to make the incorrect information closely resemble the real test?
Ryan Whitwam said:
"The addition of this feature to Gemini for all users will likely accelerate declines in test prep and tutoring services."

Meanwhile, the AI companies tell us that AI will definitely create jobs.
The problem with all this is that these standardized admissions tests are coachable in the first place. It pretty much defeats the entire reason for their use.
Quote:
"So there's a risk of the information being incorrect, and their answer to that is that they've worked to make the incorrect information closely resemble the real test?"

Yeah, you sussed out the non-answer of their response, or rather, how they responded to the question "is it accurate?" with "it closely resembles the actual test" instead of "the answers it gives are correct."
That's actually worse. Thanks, I hate it.
Quote:
"The problem with all this is that these standardized admissions tests are coachable in the first place. It pretty much defeats the entire reason for their use."

Yes and no: for most purposes, people are going to be able to prepare themselves for intellectual work they know they're going to perform, and a good chunk of test coaching is meant to help with the 'unique' pressures of standardized testing that don't reflect real-world performance or conditions (such as heightened time pressure; only standardized tests are deliberately designed not to give everyone enough time to answer everything they can).
Quote:
"Despite all the AI hate on Ars, ChatGPT and Gemini allowed me to help my high-school student daughter pass college-level calc-based physics, from the same college I graduated from years ago (and barely remembered how to do)."

A lot of the AI hate on Ars derives from the ethical issues with slurping up training data without consent and the environmental effects of all these processes. That, plus whether this can actually be made to function as a business at the levels of investment it's getting, or if it'll crater and depress the economy. And in the meantime, it's causing lots of supply chain issues while the datacenters monopolize hardware.
Quote:
"the environmental effects of all these processes"

There really isn't anything in modern life that has less of an environmental impact than AI, yet the Ars commentariat is strangely silent on those other things, just because of anti-AI vibes.
Quote:
"They are coachable, but you can't coach someone from an 800 to a 1500 without them having actually learned what they are being tested on. Coaching works when the issue is mostly test-taking strategy, time management, and so on."

I can't comment on the SAT specifically since I didn't do my undergrad in the US, but I did have to write the GRE to apply for grad school. In that case, getting a good score requires knowing how the test works. The math questions in particular take too long for anyone to solve in the time allotted, so you have to quickly estimate and then pick the best answer. The test doesn't tell you that, and that's unlike any test I've ever written in school. Without paying for prep, you're not going to know that, and so you're going to do worse than people who do pay. It's predatory bullshit.
Quote:
"There really isn't anything in modern life that has less of an environmental impact than AI, yet the Ars commentariat is strangely silent on those other things, just because of anti-AI vibes."

No, it isn't. In general, they speak out on them when there's actually an article about them.
Quote:
"No, it isn't. In general, they speak out on them when there's actually an article about them."

I think you can see for yourself that is not the case. Go to any iPhone thread, check the comments, count up the comments that mention the GHG emissions embodied by a year of making iPhones. You will find none. Anti-AI environmental discourse is a hysteria that was 100% manufactured by the press.
That's how comment sections work, you know?
Quote:
"A lot of the AI hate on Ars derives from the ethical issues with slurping up training data without consent and the environmental effects of all these processes. That, plus whether this can actually be made to function as a business at the levels of investment it's getting or if it'll crater and depress the economy. And in the meantime, it's causing lots of supply chain issues while the datacenters monopolize hardware."

Don't forget it encourages people to commit suicide (even those who express misgivings) and reinforces delusions. Oh, it also allows for generating sexually explicit pics without consent, squeezes creatives out of their labor, and allows for mass production of fake news / propaganda (see the White House posting a genAI alteration of a person being arrested the other day). And it still thinks there are two 'r's in raspberry even after hundreds of billions in investment.
I'm glad you helped your daughter, but it doesn't really change that.
Quote:
"Just leaving this here: free SAT prep is already available from, for example, Khan Academy. These are officially supposed to be aligned with the exam and likely have human-examined content."

The Khan Academy questions are definitely well-aligned. I used to teach a College Prep course, which included both ACT and SAT prep as part of the material, and the Khan Academy prep materials have been the best freely available practice for quite a while now.
https://www.khanacademy.org/digital-sat
They are coachable, but you can't coach someone from an 800 to a 1500 without them having actually learned what they are being tested on. Coaching works when the issue is mostly test-taking strategy, time management, and so on.
Quote:
"I think you can see for yourself that is not the case. Go to any iPhone thread, check the comments, count up the comments that mention the GHG emissions embodied by a year of making iPhones. You will find none. Anti-AI environmental discourse is a hysteria that was 100% manufactured by the press."

Remind me, when was the last time they reopened coal plants so iPhones can be made?
Yes and no: for most purposes, people are going to be able to prepare themselves for intellectual work they know they're going to perform, and a good chunk of test coaching is meant to help with the 'unique' pressures of standardized testing that don't reflect real-world performance or conditions (such as heightened time pressure; only standardized tests are deliberately designed not to give everyone enough time to answer everything they can).
The problem is that not everyone is going to have equal access to test coaching. Not only is that unfair to students who can't afford better coaching, but since coaching is paid for personally, it's probably less equal than what you're going to have access to in college and in the workplace.
Quote:
"Remind me, when was the last time they reopened coal plants so iPhones can be made?"

Producing an iPhone takes about 200 kWh, which is equivalent to about 60,000 ChatGPT queries (assuming about 0.003 kWh per query).
You're reaching so hard.
Quote:
"Remind me, when was the last time they reopened coal plants so iPhones can be made?"

The fact that you are aware of that, but unaware of the carbon content of an iPhone due to coal power in Asia, is exactly the willful, curated ignorance I was referring to. I could not have made my own argument any better than you did.
Quote:
"Producing an iPhone takes about 200 kWh, which is equivalent to about 60,000 ChatGPT queries (assuming about 0.003 kWh per query)."

So, effectively a rounding error.
Quote:
"But we didn't go that route, and here we are, in serious trouble re: language skills. AI can do writing. Speaking is another way language engages the mind."

My wife is an English-as-a-second-language teacher (for adults). She now spends half her time giving her students F grades for cheating by using ChatGPT for their assignments. It's an insanely stupid thing to do.
Quote:
"Producing an iPhone takes about 200 kWh, which is equivalent to about 60,000 ChatGPT queries (assuming about 0.003 kWh per query)."

Except what about the power it takes to do the training?
Quote:
"My wife is an English-as-a-second-language teacher (for adults). She now spends half her time giving her students F grades for cheating by using ChatGPT for their assignments. It's an insanely stupid thing to do."

Meanwhile, in Korea:
Quote:
"Why bother learning when the LLM can do all of your coursework?"

Yup. It's beyond farcical in a blind, self-destructive way.
Quote:
"The fact that you are aware of that, but unaware of the carbon content of an iPhone due to coal power in Asia, is exactly the willful, curated ignorance I was referring to. I could not have made my own argument any better than you did."

Oh, I'm aware. I'm just capable of fairly looking at the difference. The iPhone, once produced, lasts for years while fulfilling a crucial need for a phone in a modern society. As has been pointed out above, the AI queries are equivalent to producing thousands more iPhones than were made in a year, and their output in general produces nothing that lasts.
Quote:
"So, effectively a rounding error."

There are over 200 million iPhones made each year. If each phone takes as much energy to make as 60,000 ChatGPT requests, that's over 12 trillion requests' worth of energy.
ChatGPT processes 2.5 billion requests a day.
edit: more calcs
This means that a year of iPhone production embodies roughly 4,800 days' worth (about 13 years) of ChatGPT requests at current volumes.
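For anyone who wants to check the arithmetic in this subthread, here is a quick sketch using the commenters' own rounded figures; every input below is an assumption lifted from the posts above, not measured data:

```python
# Sanity check of the energy figures traded in this subthread.
# Assumed inputs (from the comments above, not measurements):
#   ~60,000 ChatGPT queries' worth of energy per iPhone (200 kWh / ~0.003 kWh),
#   ~200 million iPhones made per year,
#   ~2.5 billion ChatGPT requests per day.
QUERIES_PER_IPHONE = 60_000
IPHONES_PER_YEAR = 200_000_000
QUERIES_PER_DAY = 2_500_000_000

# Energy of a year's iPhone production, expressed in query-equivalents.
iphone_year_in_queries = IPHONES_PER_YEAR * QUERIES_PER_IPHONE  # 1.2e13

# How long ChatGPT would take to use that much energy at current volume.
days = iphone_year_in_queries / QUERIES_PER_DAY  # 4,800 days
years = days / 365                               # ~13.2 years

print(f"{iphone_year_in_queries:.1e} query-equivalents")  # 1.2e+13
print(f"{days:,.0f} days, or about {years:.1f} years")    # 4,800 days, about 13.2 years
```

On these assumptions, a year of iPhone production embodies roughly thirteen years of ChatGPT requests at current request volumes; changing any input (especially the per-query estimate, and ignoring training energy entirely) moves the ratio accordingly.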
But I agree with your second paragraph. The inequity that results from access to quality test prep is the problem that results from the coachability of the tests.
Quote:
"Despite all the AI hate on Ars, ChatGPT and Gemini allowed me to help my high-school student daughter pass college-level calc-based physics, from the same college I graduated from years ago (and barely remembered how to do). Many a time I was unable to figure out the homework without the help. It may feel like cheating, dropping whole questions into the prompt, but if the goal is to learn and not just to cheat, it's invaluable. It never just spits out an answer - always the steps. By asking clarification and follow-up questions, we were able to actually understand the problems and how to attack/solve them. She pulled a B on the hand-written final, so it certainly works. Sure, it's easy to cheat, but if you can get the proper mindset and goals into the students' heads, it's great."

The other day I saw my kid studying for a (9th year Math) test in a similar way.