CS188 Intro to AI – Course Materials (ai.berkeley.edu)
506 points by BucketSort on June 13, 2016 | 42 comments



Lecture videos for MIT's 6.034 (Artificial Intelligence) are also freely available online:

http://ocw.mit.edu/courses/electrical-engineering-and-comput...

6.034 emphasizes intuition over math, which might make it an easier alternative for those without a stats or calculus background. Also Prof. Winston is a phenomenal speaker.


Lecture 15 from that course is my favourite lecture I have ever watched online. Stick with it until the end. And I totally agree that Prof. Winston is an incredible speaker.

https://www.youtube.com/watch?v=sh3EPjhhd40


Applying Winston's Star method to itself. Meta-star, if you wish:

Symbol: the star is the symbol of this method; it makes the idea visual and memorable

Slogan: With these 5-star tips, you can 5x the impact of your good ideas

Surprise: You can achieve fame and impact by packaging your ideas better following these simple presentation tips

Salient: Having a good presentation is as important as having good ideas/work

Story: These presentation tips are told by a top MIT prof to his undergraduate AI class as secrets to career success


Lecture was phenomenal - thanks for suggesting it. Re: meta-star, I thought the surprise was providing a career/success framework in an AI course. That made it stick out. It was a nice "reveal".


It really blew my mind when I first watched it. A fair amount of the course went over my head, so I had been randomly skipping around the series, but I intend to go back through it chronologically.


Winston is, indeed, an amazing lecturer. And he has extremely interesting side comments.

In one of his lectures there he pointed out that research papers with a single good idea are more likely to succeed (be cited and discussed) than papers with multiple good ideas. After thinking about it for a while, I realized that this was the reason some of my personal projects failed to gain traction. It's a frustrating realization, but at least he has some concrete, easy-to-implement advice on how to overcome this issue.

His comments on the history of the AI field are also very interesting.


This is a popular undergrad course at Berkeley. To be clear, it's mostly focused on GOFAI (good old fashioned AI) - things like tree/graph search, constraint satisfaction, logic, some basic graphical models, and a bit of RL/ML at the end. It's useful to know about (because these things sometimes show up as components of ML systems), but also fairly disjoint from the core machine learning field that people are undoubtedly interested in.
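
For a flavour of the search portion, here is a minimal uniform-cost graph search sketch in Python (the language the course projects use); is_goal and successors are hypothetical problem-specific callbacks I made up for illustration, not anything from the course code:

    import heapq
    from itertools import count

    def uniform_cost_search(start, is_goal, successors):
        # successors(state) -> iterable of (next_state, action, step_cost)
        tie = count()                        # breaks cost ties so states never get compared
        frontier = [(0, next(tie), start, [])]
        explored = set()
        while frontier:
            cost, _, state, actions = heapq.heappop(frontier)
            if is_goal(state):
                return actions               # cheapest sequence of actions to a goal
            if state in explored:
                continue
            explored.add(state)
            for nxt, action, step in successors(state):
                if nxt not in explored:
                    heapq.heappush(frontier, (cost + step, next(tie), nxt, actions + [action]))
        return None                          # no path to a goal state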


That's also called Symbolic AI, I believe. And there are some modern developments in this branch. I personally enjoy this more than ML - too much statistics for me - I tend to prefer a and b over 0.00789 and 1.4500965 :)


A more recent iteration of this course from Spring 2015 with better quality all around and the entire playlist of lectures: https://www.youtube.com/playlist?list=PL-XXv-cvA_iA4YSaTMfF_...


This is an awesome resource, and exactly what I've been wanting!

I've been thinking about AI a lot lately, but I skipped college and went straight into startups, so all my knowledge on the subject is from reading less focused materials. Reminds me of Stanford's iOS dev resources from 4-5 years ago.

[EDIT] Oh god, math. Why can I program but not do math unless it's trigonometry or vectors/calculus? Visually applied math makes sense; otherwise I'm so lost.


Try to think of the math as weird notation for code.

For me, it really helped to be able to think of mathematical ideas in terms of code: as abstractions I could define and use in programs. I got this view by learning and using Haskell, and I do think Haskell is better suited for this than other languages, but it's applicable to anything: think about how you would phrase the relevant math in your code, using whatever abstraction facilities you're comfortable with.

I actually took this class a few years back. There is certainly some math, but it's all math that's transferable to code in a reasonably natural way. (In fact, that's roughly what the small projects/homework assignments entail!) Doing the assignments while paying attention to the abstractions you use in your code is going to be a great way to get over the math hurdle.
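
As a toy example of reading math as code: the expected value E[f(X)] over a discrete distribution is just a weighted sum, which you can write directly. The dist dict here is a made-up example, not course material:

    def expectation(dist, f=lambda x: x):
        # E[f(X)] = sum over outcomes x of P(x) * f(x)
        return sum(p * f(x) for x, p in dist.items())

    fair_die = {i: 1 / 6 for i in range(1, 7)}
    print(expectation(fair_die))                    # 3.5, i.e. E[X]
    print(expectation(fair_die, lambda x: x * x))   # ~15.17, i.e. E[X^2]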


> Reminds me of Stanford's iOS dev resources from 4-5 years ago.

Believe it or not, they still release a course every year (or semester, not sure). The latest one covers iOS 9 using Swift. You can find them all on iTunes U.


Paul Hegarty is such a fantastic instructor for that class... it is hands-down THE resource to get started with iOS dev.


You can do it if you put your mind to it. It just takes a bit of practice. Tackle the little bits you don't understand and eventually you will.


Thanks for the encouragement, you're right, just gotta have the grit.


Try this. Choose a lecture. As you go through it, create a bulleted list of things you don't know. A Google Doc works well.

Then go through the bulleted list in order. During this process, you will encounter more things you don't know. Add these to the top of the list and start from the top anew. As you work through bulleted items, mark them off or move them to a "complete" list.

What you are creating is a list of things you need to understand in the order you need to understand them. It guides your investigation and makes it all more manageable.

Bonus: Beneath the list, create an in-order set of notes pertaining to the items on your list. This is like a personalized set of lecture notes.


Such a great idea!


Very good idea.


Take a functional programming class. Martin Odersky now has a whole track on Coursera. It's Scala, if that matters. Functional programming is the blood-brain barrier between programming and math, with lots of activity bringing the two closer. It may not be directly useful for machine-learning-style math, but it's a strong foundation for parallel and multi-core programming thanks to the statelessness of pure functions. Good luck!


From personal experience, a functional programming style helps one reason about the math used in artificial intelligence. This might be why Lisp is considered one of the original AI languages. The power of being able to express networks purely as lists of numbers is amazing, in my opinion.
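
For illustration, here is a single dense layer written as a pure function over plain lists of numbers; the weights, biases, and sizes are made up for the example:

    import math

    def dense(weights, biases, inputs):
        # weights: one row of numbers per output unit; biases: one number per output unit
        return [math.tanh(sum(w * x for w, x in zip(row, inputs)) + b)
                for row, b in zip(weights, biases)]

    hidden = dense([[0.5, -0.2], [0.1, 0.8]], [0.0, 0.1], [1.0, 2.0])
    print(hidden)   # two activations, one per output unit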

Correct me if I'm wrong, but I actually think the reason Lisp was created was for AI.


A little nicer format - with edited videos (1st half), HW and programming assignments, on edx: https://courses.edx.org/courses/BerkeleyX/CS188x_1/1T2013/in...


Same course, but this link goes to the course overview page: https://www.edx.org/course/artificial-intelligence-uc-berkel...

Your link took me to an internal page.


After finishing CS188, if you want to "go deeper", check out Berkeley's CS 294: Deep Reinforcement Learning, Fall 2015:

http://rll.berkeley.edu/deeprlcourse/

And you should be well on your way to being able to read and comprehend the Google DeepMind papers posted on arXiv ;)


Any idea why only 2 out of (seemingly) 4 assignments are posted? Are the other assignments available anywhere?



The first half of this course was previously offered on edX during Fall 2012, Spring 2013, and most recently Spring 2015 (CMIIW). Back in 2013, when I had just graduated from high school and was waiting for uni to start, I enrolled in the Spring 2013 offering, and it was a great course. Just note that this is quite a heavy course to take (there are lectures, homework, and projects), but the projects are really fun: you get to make Pac-Man more intelligent in its environment :)

Using the knowledge gained from the MOOC, I later made this game: http://kenrick95.github.io/c4/demo/ in which I implemented a simple AI opponent, a minimax agent. It is not perfect, but it is good enough for me :)
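
For anyone curious what the bare skeleton of such a minimax agent can look like, here is a self-contained toy on a take-1-to-3-stones game (last stone wins), not the actual code of the game linked above:

    def minimax(stones, maximizing):
        # Terminal state: the player who took the last stone has already won.
        if stones == 0:
            return (-1, None) if maximizing else (1, None)
        best_value, best_move = (float('-inf'), None) if maximizing else (float('inf'), None)
        for take in (1, 2, 3):
            if take > stones:
                break
            value, _ = minimax(stones - take, not maximizing)
            if maximizing and value > best_value:
                best_value, best_move = value, take
            if not maximizing and value < best_value:
                best_value, best_move = value, take
        return best_value, best_move

    print(minimax(7, True))   # (1, 3): taking 3 leaves the opponent in a losing position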


I took a later version of this class and I couldn't get into it. I thought I'd love AI, as the idea is extremely appealing to me, but it seemed to be mostly probability and formulaic work. Which is fine, if that's the way AI is, but I would've preferred it if they'd explained why the chosen algorithms are what they are, how we got to them, and what other methods have been explored.

As it is, the class just skims over a variety of techniques and ways to implement them, lacking depth (such as what disadvantages and advantages there are in practice or where they have been used in the real world).


This sounds like exactly what it should be as an intro-to-* course. You are given the names and terminology, plus the ability (math! yes, you do need it) and freedom to explore deeper materials on your own.


Hmm, I always thought that this style of teaching was a good way to lose a lot of students' interest. A math professor I had was insistent on showing the thinking that led to a proof, which not only made it easier to remember, but also made the study vastly more interesting and engaging. From what I can gather, his technique has made him quite a popular professor amongst the students at the University of Waterloo, so I suspect this is quite an effective alternate style of "intro" course, although I recognize that it limits the breadth of material.


They can code AI, but can't run their sound through Audacity to clean it up? That's a bummer.


I did the first half of this as an edX course (CS188.1x) and it covered up to and including the reinforcement learning content. It was really fascinating and enjoyable - definitely one of my favourite courses that I've done.


This is one of my favorite AI courses. It basically takes you through a good chunk of Artificial Intelligence: A Modern Approach (warning: this is mostly the basics, not fancy machine learning) in a very practical way (implementing things for a Pac-Man game).

I taught a similar course for a while, and they are doing a much better job than I did.


I just took this course at Berkeley this past semester. The latest materials should be available on EdX for the public!


If possible, can you post a link to it? The material on edX is currently from the Fall 2013 offering. Not that it's an issue, but it would be nice to be more current.


If you watch the lectures, I highly recommend the projects - some of the best projects I've had in any CS class.


Related: how do people normally access edX on an iPad? Am I correct that there is no dedicated app?


I believe this is the app: https://itunes.apple.com/us/app/edx/id945480667?mt=8#

It requires iOS 8.0 or later.


Just signed up for the same thing on edX! Still going through the math review though.


Your thoughts on CS188 vs Udacity's Intro to AI?


The Berkeley one is a bit slower and it has a statistics/probability refresher. I found it much easier to follow at 1.5x speed. I'd recommend the Udacity one if your stats/probability knowledge is fresh or if you're re-learning the topics.


Haven't taken Udacity's offering. I took CS188 when it was offered on edX and loved it! I hope they bring the course back!


Amazing! If only I had more free time.



