The threat Artificial Intelligence (AI) poses to white-collar professions seemed to escalate earlier this week as ChatGPT, a text-generating AI from OpenAI, a company co-founded by Elon Musk, expanded on the capabilities of the bot that two years ago wrote an op-ed for the Guardian. It is not just journalists under threat, however: academics recounted on Twitter how they fed exam questions into the chatbot, prompting it to produce answers that would merit decent marks and that would be difficult, if not impossible, to distinguish from work submitted by a student.
Having signed up to test the software myself, I would assess its output roughly at the level of Zoe Williams, or maybe at a push Owen Jones – it is not yet quite at the level of a Polly Toynbee. Yet if ChatGPT only threatened the jobs of Guardian op-ed writers and the increasingly prolific essay mills that produce undergraduate essays to order, there would be little to worry about. What is more troubling is what ChatGPT tells us about the degraded state of academic teaching and the orthodoxies that dominate its hallowed corridors.
Some researchers have already identified the latent political bias of ChatGPT, with its ‘opinions’ clearly lying on the progressive political Left, reflecting what a sanitised Internet policed by content moderators and hate-speech laws looks like. That bien-pensant opinion culled from the web should be indistinguishable from the work of undergraduates indicates not only the general lack of original or independent thinking in universities, but also something about the actual process of education itself.
Long before ChatGPT, academics were already happily collaborating in a management-led process of steadily automating education. We record and upload lectures, agree to provide hand-outs with every session, and supply lecture slides with gobbets of information delivered in endless bullet points.
Often, we are even required to provide individual items of reading alongside the reading list, presumably to save students the trouble of going to the (online) library to download the material themselves and risking, perhaps, stumbling across another relevant and interesting item to read. Here, the lecturer is already reduced to an increasingly redundant and generic adjunct to an automated system of online provision. And this is without even mentioning the ever more censorious and dogmatic atmosphere on many campuses.
That ChatGPT would blow apart university assessment reflects marking processes that are increasingly weighted toward coursework in place of more demanding and stressful exams. Many universities offer students several resubmission opportunities should they fail. That is to say nothing of the increasing reliance on multiple choice quizzes that are already marked automatically, or the mania in the humanities and social sciences for alternative forms of assessment.
They didn’t see it at the time, but academics’ enthusiastic embrace of lockdown was literally turkeys enthusiastically embracing Christmas.
Turkeys voting that the other turkeys take part in the Christmas dinner (while they get to lounge in some Spanish oak forest, eating acorns all day and taking dust baths).
They were amazingly self-serving and self-harming at once.
Blah blah blah. We heard all this around 2010, with the ludicrous notion that “MOOCs” (yuck) would do away with the vast majority of on-campus university courses within the decade. Don’t buy a used car from the cheerleaders who made that prediction – a large number of whom are now rubbing their hands together with glee at the thought of mass redundancies amongst HE staff due to GPT-3. I can let you into a little secret, as you obviously aren’t aware of it: students want the ‘on campus’ experience. There are more 17-25 year olds (in 2023) wanting to go to university in the West than there ever have been, a number of whom will be disappointed as places won’t be available. And ChatGPT? It’s a great artificial brainstorming friend – I used it for both a module descriptor and a proposal for a special journal issue. It was about 70% useful, so it needed tweaking and editing (which only someone with core knowledge could achieve), but it saved a lot of time. As for students using it, let me introduce you to the in-class timed essay: invigilated! And to the STEM bloke – ChatGPT writes excellent Python and passable R. So put that in your pipe and smoke it. You people…
“…Having signed up to test the software myself, I would assess its output roughly at the level of Zoe Williams, or maybe at a push Owen Jones – it is not yet quite at the level of a Polly Toynbee…”
Zinger! So, not even close to passing the Turing test – nothing to worry about for the next few decades, then.
UnHerd is loaded up with ex-Guardian writers; in fact, I spotted at least three bot-written essays here in the last month.
I think it’s worth acknowledging some hard truths about the university system, and I mean the system that existed before it became irredeemably woke.
Most academics are, at best, mediocre teachers. Many are utterly incompetent teachers. This is mainly because, unless they work at a small, liberal arts college (in the US system) where teaching is valued, they receive little professional recognition for teaching, and teaching does little to increase their chance of tenure. Also, few academics are ever taught how to teach. They’re supposed to absorb the skill by osmosis.
My background is in science and law. In science, a faculty member lives or dies on their grant funding. Most teaching is delegated to graduate students or to the most junior, tenure-track faculty. There is an increasing trend to use adjuncts (non-tenured, part-time faculty) for teaching. In law, the situation is slightly different. More recognition is given to teaching, but the professors usually try to use the so-called Socratic method, which is probably the most difficult way to teach effectively.
In my opinion, most science and engineering students could effectively learn on-line and in small groups where they come together to do problem sets. They will, of course, have to attend lab/practical sessions in person.
All this hand-wringing about academic teachers being displaced by AI seems to originate from the arts/social sciences. Those are also the parts of the higher education system that give rise to most of the progressive disease that’s destroying universities. I won’t air the usual grievances about the wokeification of academia, but I do believe a massive shakeup in higher ed is required to address this trend.
I note the author seemed mildly peeved that “We record and upload lectures, agree to provide hand-outs with every session, and provide multiple lecture slides with gobbets of information delivered by an endless list of bullet points.” What’s so wrong with that? Why not provide students with a written guide for the course material, especially if the instructor is not especially competent in the classroom?
This AI “controversy” is a storm in a very tepid teacup.
Some valid points.
Rather the opposite, J.
This AI controversy is more the canary lying on its back in the coal mine.
And, you being from science and law, I point out (haha, sorry) that there is this thing of humanity having a soul, which lies at the utter heart of humanity. AI teaching the young to become adult humans does not nurture that; I would guess it instead diminishes it.
Then there is the whole ‘Twitter’ scandal. If the head of a university can just hit a few keystrokes and shadow-ban all the AI direction he does not like, and bring to the fore what he does like, then he moulds the generation.
I know the wicked and pernicious philosophy of postmodernism has captured the liberal professorships, and so they are warping the minds of their students – but how much more dangerous if this can be directed entirely by the ‘State Dept. of Education’ – 1984. Once the education industry can set its AI teaching to whatever it wishes, it will exercise 100% control. At present, in the liberal arts, the twisted Left’s control is high, but not absolute in the way AI managed from the top would be.
I taught at a very large American public university and the teaching skills of our graduate students were actually very carefully fostered – not only because they taught many of our beginning language classes (from which our majors were recruited – not to mention being our own future as a department) but also with a view to their professional futures. But you are right that teaching was not valued as much as publishing a book when it came to pay rises.
J Bryant, there is nothing wrong with all the handouts, but couldn’t you discern that the writer’s point referred to the disappearing intellectually passionate student, hungrily seeking answers to the burning questions?
All the AI system is doing is writing what it calculates that we want to hear. Perhaps the best test for students would be that they learn to correct and critique the output of the AI system, and perhaps also the output of the lecturers themselves.
That would simply create feedback pipelines, where AI systems rapidly improve their output so it climbs ever higher up the quality chain – not that that isn’t going to happen anyway.
An excellent idea. Unfortunately, critiquing your lecturers is the preserve of a PhD, possibly a Master’s, but unlikely below that.
Biting the hand that feeds, or at least hands out the marks, might be fun, but it doesn’t get you that degree you’re paying for.
This appears to be another step in the demolition of the University, at least properly understood. It would make students mere tutees of machines, and that strikes me as a rather horrible thing. It ought to reduce university fees, of course, but at what cost?
In my recent experience as a faculty member at a large state university, I have seen more and more emphasis on fashionable social ideas. The emphasis is now expressed in invitations to voluntarily attend workshops on “pronouns” and other progressive fixations. I expect these workshops to become compulsory, followed by statements required of the faculty in support of whatever leftist fashion the administration has taken as the current received wisdom. Perhaps this sort of thing could be handled by both the university and faculty by AI. Maybe that would be the silver lining. But then, what would all of the Diversity Officers do with the time thus freed up? What would they come up with next?
They could start by renaming themselves Convergence Officers.
University faculty seem to think that they can abandon their core responsibilities of fostering critical thinking and open debate with impunity. Their standing with the public is rapidly declining and there will be a reckoning.
I’ve noticed this trend too. When we hold Zoom classes, half the students have their pronouns listed next to their names. When asked why I don’t have mine, I just respond with “I refuse to be an unwilling participant in someone’s transsexual fetish.” The pronoun thing is a form of coercion and those pushing it should be reminded of Title IX harassment policy.
“But then, what would all of the Diversity Officers do with the time thus freed up? What would they come up with next?”
I would imagine that answer could be found in something from Orwell’s writings.
I think the time has come to pull the plug on tax-payer backed student loans for courses that are not either STEM, professional qualifications or apprenticeships.
Not to worry: when professors are replaced in the classroom by AI, they can all learn to code.
Haha – “Biden tells coal miners to ‘learn to code’.”
Or – with the public getting interested in coal again as blackouts loom –
“Sunak tells professors: ‘Learn to dig coal.’”
There’s always Bitcoin mining…..
GPT-3 is a bigger threat to coding jobs than it is to universities. It can’t really think, but it seems to code quite well.
The debate is much wider than academia: there are bots that paint and write music… the fundamental question is the hoary old “what is art?”.
For a painting to be hailed as good art requires the painter to have a history; the work is viewed in the context of the artist’s body of work and reputation.
A bot would need to create its own persona and a body of work to create a modern masterpiece, at least in the eyes of the art world.
The same applies in other arts disciplines: we read a Polly Toynbee article because it’s by Polly Toynbee (or maybe we deliberately avoid a Polly Toynbee article because it’s by her).
At the moment bots can only create one-off works. True, an art bot can do you a painting in the style of a Picasso – it can do several different ones in the style of … – but it can’t create a distinct body of its own work, as it has no persona.
True art is distinguishable by something more subtle: it does require an artist/author/musician to create truly original works.
I do agree it’s sad that teaching has reached such a low point that it’s a sausage machine turning out high-priced degrees… and inevitable that both teachers and students become indistinguishable from computer-generated mediocrity. There’s little or no love for the job on either side of the fence.
About 50 years ago, universities became largely factories for validation of entitlement worth, making someone deserving of attention, consideration, approval as a person, and support – generally in the form of a degree that brought them a good job and high status. Original thinking became a disruption to the smooth working of a standardised system and something that was hard to measure in validation terms, so it was actively discouraged, with predictable results.
Some rare individuals do still manage to find their way to a deeper level of worth of person: that of greater-value worth. There, social status awards are less important than dedication to a domain of deep order.
Timothy Corwen (author, The Worth of a Person)
Marking students’ essays, if done properly, is demanding, time-consuming, ill-rewarded and not very satisfying work. Writing multiple choice questions, if done properly, is also demanding. But marking multiple choice questions is a doddle. An admin. assistant can do it under the hairdryer. No wonder multiple choice is popular.
We tested it over the last few days, writing software routines. It was not expert-level, but it was definitely better than expected.
I follow a blog called AI Weirdness, run by Janelle Shane, who wrote the AI book “You Look Like a Thing and I Love You”. In a recent post she mentions a website which attempts, using AI, to detect whether a piece of text is real or AI-generated. It’s easy to use. I’ve given it a few comments from this thread, and you’ll be glad to know that they are real. The worst one came out 26% fake, 74% real. I won’t say whose comment it was. Anyway, you can try it out for yourself here –
https://huggingface.co/openai-detector
The easiest way to use it is to copy and paste the text you want to test into the test window.
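For anyone who would rather script it than paste text into the web page: the demo appears to be a front end to a RoBERTa-based GPT-2 output detector, so a minimal sketch along the following lines should give similar real/fake scores. The model name, label wording and library usage here are my assumptions about what sits behind the site, not anything stated in the post above.

from transformers import pipeline

# Rough sketch: score a snippet with the GPT-2 output detector.
# Assumes: pip install transformers torch, and that the publicly available
# "roberta-base-openai-detector" checkpoint is the one behind the demo.
detector = pipeline("text-classification", model="roberta-base-openai-detector")

snippet = ("Marking students' essays, if done properly, is demanding, "
           "time-consuming, ill-rewarded and not very satisfying work.")

result = detector(snippet)[0]
# The label names come from the model's own config (typically "Real" / "Fake").
print(f"{result['label']}: {result['score']:.0%}")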
I have this fantasy that the universities become more commercial and more open.
Let anyone attempt their exams (for a fee) whether they attend that university or not.
Make students pay a fee to join an individual lecture or seminar (I’d love to collect fivers from my students as they join my inspirational mathematics classes).
Some universities would still make money from full time students but there would be opportunities for rival private tutors either teaching individuals or classes. Some students would try to learn from textbooks and the internet without any assistance.
I fear that too much of the university system is a self-serving cash cow for academics that provides little of value to young people (this exists, to a lesser extent, in the school system too). That is not to say nothing of value is created.
The trouble is that it’s a bit of a closed shop with young people being told that a university education is the only way into middle class financial security.
One question I’d like answered is: if 20 students are posed a particular essay question or topic and each feeds it into ChatGPT, won’t 20 identical answers be churned out? In which case, the AI system fails. Or is there a way around this?
Saul D suggests critiquing the AI output, but a clever student might also find a way of rearranging the output so that its AI origin goes relatively unnoticed.
The answers will depend on the questions you ask. So slightly differently posed questions will generate potentially very different (but equally valid) answers. All very reminiscent of the Isaac Asimov short story ‘Jokester’.
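To make that concrete, here is a minimal sketch using the standard openai Python client (the model name and temperature are purely illustrative, not anything the comment above specifies): at any non-zero sampling temperature the same prompt comes back worded differently on each call, before you even account for 20 students phrasing the question 20 different ways.

from openai import OpenAI

# Toy illustration: the same essay prompt, sampled three times.
# Assumes the openai package (>= 1.0) and an OPENAI_API_KEY in the environment.
client = OpenAI()

prompt = "Discuss the causes of the First World War in about 200 words."

for i in range(3):
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",   # illustrative model choice
        messages=[{"role": "user", "content": prompt}],
        temperature=0.9,         # non-zero temperature: different wording each run
    )
    print(f"--- draft {i + 1} ---")
    print(response.choices[0].message.content)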
In which case, the clever student will be the one who can ask AI the questions in such a way that the answer generated best hides its origins. At least that’d cause students to have to think!
Truly interesting and thought provoking, as are the comments.
As usual.
I have been lucky enough to have had teachers who were able to respond to my questions even when I lacked the exact words. Good luck getting that kind of interaction from an algorithmic extract from a conglomeration of previous word patterns. AI learning will drive the human thinking faculty into a groove, with the spark of invention left to die. It will suit those people who like things to line up nicely and hate the wild card that is the true power of the human spirit.
University education is a highly profitable service industry, in Australia at least, thanks to the proportion of international students seeking undergraduate qualifications. The way international students treat highly replaceable academic staff is a taboo subject, so few people outside academia know of staff’s struggle to preserve the value of university qualifications by actually providing an education. Online teaching – via AI or not – is a logical consequence of putting profit and progressive political ideologies before the quality of education.
Will ChatGPT try to engage in sexual congress with ‘promising’ students? Or will that role at least remain available exclusively to flesh and blood tutors in face to face settings?
If this technology can help improve the university system then I am all in favour of it. The prospect of AI lecturers should be a cause for excitement not fear.
That surely depends on whether you are a (human) programmer, as opposed to a (human) lecturer?
There is a problem in AI, and I believe Satan is lurking in there. This is a thing created by man, who is very sinful, not by God.
I have seen some AI art, and in it I always seem to notice something off – some unsettling quality. I listen to Yuval Noah Harari, and I think he and Gates and Schwab, and even Biden, are owned by the darkness.
In the old days Satan was called ‘Lord of the flies’, I worry he is now ‘Lord of the AI’.
I don’t see how AI lecturing would work. I’m far from being an expert on AI but it seems to me that they all work in a kind of passive way. You have to prompt them to do something such as “make a picture on this theme” or “translate this” or “write 1000 words on this subject”. Lecturing is the opposite of that. I can’t imagine what would happen if you said to an AI “Teach me maths”.
I can see a place for AI in learning assistance. A few years ago I did an Open University course on maths and physics. In addition to the usual tutorials and exercises they had a system of computerised assessment where you’d be asked to answer a series of questions. If you answered wrongly it gave a hint. You could ask it to give you a similar question and you could keep going round the loop as many times as you liked. It was pretty crude but brilliant at drumming in basic stuff such as how to solve a particular kind of maths problem or learning about the features of different kinds of galaxy. I can certainly see a role for AI in this respect. I’m not sure how it would work out with subjects like history or literature.
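That question-hint-retry loop is simple enough to sketch. A toy version follows; it is not the Open University’s actual system, just an illustration of the cycle described, with made-up arithmetic questions.

import random

# Toy drill loop in the spirit of the computerised assessment described above:
# ask a question, give a hint after a wrong answer, then offer a similar
# question so the student can go round the loop again.
def make_question():
    a, b = random.randint(2, 12), random.randint(2, 12)
    question = f"What is {a} x {b}?"
    answer = a * b
    hint = f"Hint: {a} x {b} is {a} added to itself {b} times."
    return question, answer, hint

def drill():
    while True:
        question, answer, hint = make_question()
        while True:                      # keep hinting until the answer is right
            reply = input(question + " ")
            if reply.strip() == str(answer):
                print("Correct.")
                break
            print(hint)
        if input("Another similar question? (y/n) ").strip().lower() != "y":
            break

if __name__ == "__main__":
    drill()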