alt.hn

3/1/2026 at 7:35:03 AM

10-202: Introduction to Modern AI (CMU)

https://modernaicourse.org

by vismit2000

3/1/2026 at 10:17:21 AM

> AI Policy for the AI Course

“Students are permitted to use AI assistants for all homework and programming assignments (especially as a reference for understanding any topics that seem confusing), but we strongly encourage you to complete your final submitted version of your assignment without AI. You cannot use any such assistants, or any external materials, during in-class evaluations (both the homework quizzes and the midterms and final).

The rationale behind this policy is a simple one: AI can be extremely helpful as a learning tool (and to be clear, as an actual implementation tool), but over-reliance on these systems can currently be a detriment to learning in many cases. You absolutely need to learn how to code and do other tasks using AI tools, but turning in AI-generated solutions for the relatively short assignments we give you can (at least in our current experience) ultimately lead to substantially less understanding of the material. The choice is yours on assignments, but we believe that you will ultimately perform much better on the in-class quizzes and exams if you do work through your final submitted homework solutions yourself.”

by aanet

3/1/2026 at 7:19:35 PM

It feels downstream of CMU's "reasonable person principle". They know that students are going to use AI on their homework, but they trust that those students want to learn and improve their skills -- and this is good advice for doing so.

I'm somewhat biased because I was involved in a previous, related course. The important takeaways aren't really about gritty debugging of (possibly) large homework assignments, but the high-level overview you get in the process. AI assistance means you could cover more content and build larger, more realistic systems.

An issue in the first iteration of Deep Learning Systems was that every homework built on the previous one, and errors could accumulate in subtle ways that we didn't anticipate. I spent a lot of time in office hours bisecting code to find these errors. It would have been just as educational to diagnose those errors with an LLM. Then students could spend more time implementing cool stuff in CUDA instead of hunting down a subtle bug in their 2d conv backwards pass under time pressure... But I think the breadth and depth of the course were phenomenal, and if courses can go further with AI assistance, then that's great.
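For what it's worth, the usual way to catch that kind of subtle backward-pass bug without bisecting is a numerical gradient check: compare the hand-written analytic gradient against central finite differences. A minimal sketch (the function names here are illustrative, not from the course):

```python
# Gradient-checking a hand-written 2d conv backward pass against
# central finite differences. Names are illustrative, not course code.
import numpy as np

def conv2d(x, w):
    """Valid-mode 2d cross-correlation (no padding, stride 1)."""
    H, W = x.shape
    kh, kw = w.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * w)
    return out

def conv2d_backward_w(x, w, grad_out):
    """Analytic gradient of the loss w.r.t. the kernel w."""
    grad_w = np.zeros_like(w)
    for i in range(grad_out.shape[0]):
        for j in range(grad_out.shape[1]):
            grad_w += grad_out[i, j] * x[i:i + w.shape[0], j:j + w.shape[1]]
    return grad_w

def numeric_grad(f, w, eps=1e-6):
    """Central finite differences, one kernel entry at a time."""
    g = np.zeros_like(w)
    for idx in np.ndindex(w.shape):
        w[idx] += eps
        fp = f(w)
        w[idx] -= 2 * eps
        fm = f(w)
        w[idx] += eps  # restore
        g[idx] = (fp - fm) / (2 * eps)
    return g

rng = np.random.default_rng(0)
x = rng.standard_normal((5, 5))
w = rng.standard_normal((3, 3))
loss = lambda w: conv2d(x, w).sum()          # sum-loss => grad_out is all ones
analytic = conv2d_backward_w(x, w, np.ones((3, 3)))
numeric = numeric_grad(loss, w)
assert np.allclose(analytic, numeric, atol=1e-4)
```

If the backward pass has an off-by-one in the slicing or a flipped kernel, the assertion fails on the first random input, which localizes the bug far faster than bisecting whole assignments.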

This new class looks really cool, and Zico is a great teacher.

by ashertrockman

3/1/2026 at 12:15:22 PM

My money is on extraordinarily poor final exam results and/or cheating.

by piker

3/1/2026 at 1:50:58 PM

In my day, professors said that you'd never have an AI in your pocket.

by khannn

3/1/2026 at 3:36:09 PM

but can I write 5318008 on my AI?

by _joel

3/1/2026 at 2:22:31 PM

True. Why even bother with school anyway?

by piker

3/1/2026 at 12:00:22 PM

This is the way it should be. AI to speed up the understanding process, and one final evaluation without any help to cement the understanding.

by linhns

3/1/2026 at 2:08:10 PM

I don't think the final evaluation is to "cement the understanding" so much as _verify_ that students have taken accountability for their own learning process.

by topherhunt

3/1/2026 at 2:28:00 PM

^ This

This is what a student who truly wants to learn, rather than simply complete a course or certification, would do... Use AI tools to explain and learn, but don't outsource the learning process itself to the tools.

by aanet

3/1/2026 at 6:46:04 PM

> AI to speed up the understanding process

What’s your hypothesis of how AI can accelerate how your brain understands something?

by andsoitis

3/1/2026 at 8:43:23 PM

I have some success with this method: I try to write an explanation of something, then ask the LLM to find problems with the explanation. Sometimes its response leads me to shore up my understanding. Other times its answer doesn’t make sense to me and we dig into why. Whether or not the LLM is correct, it helps me clarify my own learning. It’s basically rubber duck debugging for my brain.

by wrs

3/1/2026 at 6:50:42 PM

Quick, easy access to explanations and examples on complex topics.

In my case, learning enough trig and linear algebra to be useful in game engine programming / rendering has been made a lot easier / more efficient.

The same way Google or Wikipedia enables learning.

by allthetime

3/1/2026 at 5:39:05 PM

I disagree. I think we should treat AI tools like calculators for the exam.

by _345

3/1/2026 at 5:12:04 PM

I'm a little annoyed that 'modern AI' here refers only to LLMs; modern AI is much bigger than that.

Having said that, it's probably a good course, CMU courses are often great.

Given the title, I was expecting many more SOTA models across different fields.

If someone has that kind of resource, I would be extremely interested!

by somethingsome

3/1/2026 at 6:26:37 PM

I started doing the free version of the course a few days ago. The lessons are excellent, but even better are the homework tasks, which let me run my tests locally!

It's sometimes easy to just listen and understand, yet be unable to write the code myself - having these coding homework tasks has really helped me solidify the new knowledge.

10/10 would recommend

by neriymus

3/1/2026 at 10:20:16 AM

Do you think this is a good course? Or, what do you suggest as a structured course to learn how LLMs work?

by gabrieledarrigo

3/1/2026 at 3:37:07 PM

I hope the instructor publishes a textbook to support and accompany the course - I'd buy it in a heartbeat.

by teleforce

3/1/2026 at 12:10:18 PM

Can't wait for postmodern AI.

by mold_aid

3/1/2026 at 2:14:51 PM

How to flip burgers better than an AI robot!

by blackoil

3/1/2026 at 2:29:50 PM

:) Too true

But tbh, it'll more likely be repairing those burger flippin' robots

by aanet

3/1/2026 at 9:43:26 AM

Nothing on symbolic reasoning ?

by sim04ful

3/1/2026 at 9:54:38 AM

I believe that would be part of what's now called "classical AI".

by cultofmetatron

3/1/2026 at 2:23:12 PM

Nowadays it's called GOFAI, or not AI at all; the field is basically all machine learning.

by cubefox

3/1/2026 at 11:18:56 AM

that would be the exact opposite of modern

by xdavidliu

3/1/2026 at 11:29:31 AM

No. That will be covered by the Post-modern AI course in the fall semester.

by chvid

3/1/2026 at 3:44:59 PM

That's not AI.

by leonvoss

3/1/2026 at 7:46:36 PM

Why not? It was called AI at the time.

by DonaldFisk

3/1/2026 at 2:16:48 PM

thanks for sharing, these look great.

by frankdenbow

3/1/2026 at 10:38:37 AM

Nice to finally see the revival of Lisp and Prolog.

by aboardRat4

3/1/2026 at 7:54:00 PM

Sadly, that's not part of this course, though Lisp and Prolog are very useful for other things. C is fine for building neural networks from scratch, and for anything more complex you can glue the different subsystems together with Python.

by DonaldFisk

3/1/2026 at 5:02:24 PM

Lisp and Prolog never really "vived" nor were they ever really gone/dead. So they can't be revived. They've always been there, in the background, in their niche. As they always will.

by hearsathought

3/1/2026 at 1:50:22 PM

Prolog in another skin is called Erlang, you know.

by signa11

3/1/2026 at 8:19:13 AM

[flagged]

by emil-lp

3/1/2026 at 8:53:55 AM

Well, it's the dominant and most successful deployed AI. Would a comp sci course teach every failed computer architecture, or focus on the ones in wide use today?

by small_model

3/1/2026 at 12:52:36 PM

Your analogy to computer architectures doesn't make sense unless you're comparing GPT-like LLMs to different LLM architectures like Mamba or RWKV - and indeed it would make sense to teach about Mamba or RWKV in an introductory AI or LLM course.

AI is much broader than LLMs alone. Computer vision, RL, classical ML, recommender systems, speech recognition, ... are still part of AI, just not very visible to the average consumer.

by gield

3/1/2026 at 1:31:53 PM

> most successful implemented AI

According to what? Spent money? Number of users? Outcomes and if so which ones?

by utopiah

3/1/2026 at 1:49:49 PM

Probably according to marketing, and not limited to hallucination.

by boredemployee

3/1/2026 at 9:09:15 AM

I think comp sci courses focus on fundamentals rather than what's popular. Besides, other kinds of AI are not "failures"; they have plenty of uses.

by suddenlybananas

3/1/2026 at 9:14:59 AM

Don't trip over the wording. The course offers quite a range of knowledge that is applicable outside LLMs. It's an introduction.

by smokel

3/1/2026 at 8:36:57 AM

It really depends on the target audience, because a lot of people have no idea that what they are using is called an LLM, or that there are various types of generative AI.

by axseem

3/1/2026 at 8:47:59 AM

I think the problem is the underrepresentation of other branches of AI research: knowledge representation, automated reasoning, planning, etc.

These are important topics with significant industrial applications; their only downsides are that they aren't suitable for implementing friendly chatbots or for raising the stock prices of Silicon Valley companies.

by gignico

3/1/2026 at 9:56:45 AM

I doubt renowned US universities don't offer courses that cover those topics.

As someone who studied in a university system where the required courses were mostly set in stone (they're just starting to offer some electives now), I really value being able to choose what you study as much as possible.

The AI course I took was mostly symbolic methods and some classic ML at the end. Most students were not interested at all and would've probably been more engaged studying ML directly. Too bad that wasn't an option.

by Kaethar

3/1/2026 at 2:25:13 PM

This is a perfectly reasonable take. It's quite outrageous that this was flagged.

by cubefox

3/1/2026 at 9:24:10 AM

[dead]

by jccx70