Computation and Cognition: the Probabilistic Approach (Psych 204/CS 428, Fall 2018)
Overview
How can we understand intelligent behavior as computation? This course will introduce probabilistic modeling through probabilistic programs, and will explore the probabilistic approach to modeling human and artificial cognition. Examples will be drawn from areas including concept learning, causal reasoning, social cognition, and language understanding.

Instructor: Noah Goodman (ngoodman at stanford)

TAs: Robert Hawkins (rxdh at stanford) & Ben Peloquin (bpeloqui at stanford)

Meeting time: T, Th, 1:30-2:50pm

Meeting place: 380-380X

Office hours:
We will use Canvas to post announcements, collect assignments, and host discussion among students. We encourage students to post questions there rather than emailing the instructors directly; we also hope students will attempt to answer each other's questions (TAs will verify the answers). Trying to explain a concept to someone else is often the best way to check your own knowledge.
Assignments and grading
Students (both registered and auditing) will be expected to do assigned readings before class. Registered students will be graded based on:
 20% Class & Canvas participation.
 80% Homework.
or, for students taking the project option (see below):
 20% Class & Canvas participation.
 30% Homework.
 50% Final project (including proposal, presentation, and paper).
Assignments should be submitted to Canvas as PDFs; a fixed-width font is appreciated for code (e.g. using the editor at http://webppl.org). Homework assignments will be graded using letter grades:
 A: All solutions correct & reasoning clear
 B: Assignment complete, with a few errors
 C: Assignment incomplete, or pervasive errors throughout
 F: Assignment was not attempted
Readings
Readings for each week will be linked from the calendar below. Readings will be drawn from the web book Probabilistic Models of Cognition and from selected research papers. (In some cases the papers will require an SUNet ID to access. See the instructor in case of trouble.)
Prerequisites
There are no formal prerequisites for this class. However, this is a graduate-level course that will move quickly and has substantial technical content. Students should already be familiar with the basics of probability and programming.
Other resources
In addition to the assigned readings below, here are notes from a few related short courses that may prove useful:
 The Design and Implementation of Probabilistic Programming Languages
 PPAML Summer School 2016
 Bayesian Statistics for Psychologists
 Modeling Agents with Probabilistic Programs
 Probabilistic Language Understanding
Schedule
Week of September 25
Introduction. Simulation, computation, and generative models. Probability and belief.
Homework: Exercises on Generative Models and (optionally) JavaScript Basics.
Readings:
 JavaScript Basics
 Generative Models
 Optional: Concepts in a probabilistic language of thought. Goodman, Tenenbaum, Gerstenberg (2015).
 Optional: How to grow a mind: Statistics, structure, and abstraction. J. B. Tenenbaum, C. Kemp, T. L. Griffiths, and N. D. Goodman (2011).
 Optional: Simulation as an engine of physical scene understanding. Hamrick, Battaglia, Tenenbaum (2013).
 Optional: Sources of uncertainty in intuitive physics. Smith and Vul (2012).
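The week's core idea, that a generative model defines a distribution by describing how samples are produced, can be previewed in a few lines. The course itself works in WebPPL (webppl.org); the sketch below is a plain-JavaScript stand-in using Math.random, and the function names are illustrative, not course code:

```javascript
// A tiny generative model: flip two fair coins and return their sum.
// Running the model repeatedly approximates the distribution it defines.
function flip(p) {
  return Math.random() < p ? 1 : 0;
}

function model() {
  const a = flip(0.5);
  const b = flip(0.5);
  return a + b; // 0, 1, or 2
}

// Draw many samples to estimate the distribution over outcomes.
function estimate(n) {
  const counts = { 0: 0, 1: 0, 2: 0 };
  for (let i = 0; i < n; i++) counts[model()]++;
  return counts;
}
```

With enough samples, the counts approach the 1/4, 1/2, 1/4 distribution the model implicitly defines.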
Week of October 2
Conditioning and inference. Causal vs. statistical dependency. Patterns of inference.
Homework: Exercises on conditioning, dependence, conditional dependence.
Readings:
 Conditioning
 Causal and statistical dependence
 Conditional dependence
 Optimal predictions in everyday cognition. Griffiths and Tenenbaum (2006).
 Optional: Causal Reasoning Through Intervention. Hagmayer, Sloman, Lagnado, and Waldmann (2006).
 Optional: Children’s causal inferences from indirect evidence: Backwards blocking and Bayesian reasoning in preschoolers. Sobel, Tenenbaum, Gopnik (2004).
 Optional: Bayesian models of object perception. Kersten and Yuille (2003).
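Conditioning, the central operation this week, can be approximated by rejection sampling: run the generative model and keep only the runs consistent with the observation. Again a plain-JavaScript stand-in for WebPPL, with illustrative names:

```javascript
function flip(p) {
  return Math.random() < p ? 1 : 0;
}

// Generative model: two fair coins; we observe that at least one is heads.
// Rejection sampling: resample until the condition holds, then return
// the quantity of interest (the first coin).
function sampleConditioned() {
  while (true) {
    const a = flip(0.5);
    const b = flip(0.5);
    if (a + b >= 1) return a; // condition: at least one heads
  }
}

// Estimate P(a = 1 | a + b >= 1); the exact answer is 2/3.
function estimate(n) {
  let heads = 0;
  for (let i = 0; i < n; i++) heads += sampleConditioned();
  return heads / n;
}
```

Observing "at least one heads" shifts belief about the first coin from 1/2 to 2/3, a small example of the patterns of inference discussed in the readings.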
Week of October 9
Bayesian data analysis. Discussion on levels of analysis.
Homework: Exercises on Bayesian data analysis.
Readings:
 Bayesian data analysis
 Chapter 1 of “The adaptive character of thought.” Anderson (1990).
 Optional: Inferring Subjective Prior Knowledge: An Integrative Bayesian Approach. Tauber & Steyvers (2013).
 Optional: Descriptive vs. optimal bayesian modeling (blog post). Frank (2015)
 Optional: Chapter 1 of “Vision.” Marr (1982).
 Optional: Ten Years of Rational Analysis. Chater, Oaksford (1999).
 Optional: The Knowledge Level. Newell (1982).
 Optional: What Do We Actually Know About the Economy? (Wonkish) Paul Krugman, New York Times (2018).
Week of October 16
Inference algorithms.
Project proposals due on Sunday!
Readings:
 Algorithms for Inference
 PPAML Summer School 2016: Approximate Inference Algorithms.
 Optional: The Design and Implementation of Probabilistic Programming Languages.
Week of October 23
Process models. Learning as inference.
Readings:
 Rational process models
 Learning as Conditional Inference
 Rational use of cognitive resources: Levels of analysis between the computational and the algorithmic. Griffiths, Lieder, Goodman (2015).
 One and Done? Optimal Decisions From Very Few Samples. Vul, Goodman, Griffiths, Tenenbaum (2014).
 Optional: Rules and similarity in concept learning. Tenenbaum (2000).
 Optional: Burn-in, bias, and the rationality of anchoring. Lieder, Griffiths, and Goodman (2012).
 Optional: Perceptual multistability as Markov chain Monte Carlo inference. Gershman, Vul, Tenenbaum (2009).
 Optional: A more rational model of categorization. Sanborn, Griffiths, Navarro (2006).
 Optional: Theory acquisition as stochastic search. Ullman, Goodman, and Tenenbaum (2010).
 Optional: Exemplar models as a mechanism for performing Bayesian inference. Shi, Griffiths, Feldman, Sanborn (2010).
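"Learning as inference" treats learning as conditioning a hypothesis space on observed data. A minimal deterministic sketch, learning a coin's weight by grid-approximate Bayesian updating, in plain JavaScript rather than the course's WebPPL (the grid and helper names are assumptions for illustration):

```javascript
// Learning a coin's weight from data by grid-approximate Bayesian inference.
// Prior: uniform over a grid of candidate weights; multiply in the likelihood
// of each observed flip (1 = heads, 0 = tails) and renormalize.
function posterior(observations) {
  const grid = [];
  for (let i = 1; i < 100; i++) grid.push(i / 100);
  let weights = grid.map(() => 1 / grid.length); // uniform prior
  for (const obs of observations) {
    weights = weights.map((w, i) => w * (obs ? grid[i] : 1 - grid[i]));
    const z = weights.reduce((a, b) => a + b, 0);
    weights = weights.map(w => w / z); // renormalize
  }
  // Return the posterior mean of the coin weight.
  return grid.reduce((acc, p, i) => acc + p * weights[i], 0);
}
```

With no data the posterior mean stays at 0.5; after three heads it moves toward the Beta(4,1) mean of 0.8, illustrating how beliefs sharpen as evidence accumulates.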
Week of October 30
Learning compositional hypotheses. The language of thought. Hierarchical models.
Readings:
 Learning in a language of thought
 Hierarchical Models
 A rational analysis of rule-based concept learning. Goodman, Tenenbaum, Feldman, and Griffiths (2008).
 Optional: Learning Structured Generative Concepts. Stuhlmueller, Tenenbaum, and Goodman (2010).
 Optional: Bayesian modeling of human concept learning. Tenenbaum (1999).
 Optional: Word learning as Bayesian inference. Tenenbaum and Xu (2000).
 Optional: Word learning as Bayesian inference: Evidence from preschoolers. Xu and Tenenbaum (2005).
 Optional: Learning overhypotheses. Kemp, Perfors, and Tenenbaum (2006).
 Optional: Object name learning provides on-the-job training for attention. Smith, Jones, Landau, Gershkoff-Stowe, and Samuelson (2002).
Week of November 6
Occam’s razor. Mixture models. Unbounded mixture models.
Readings:
 Occam’s Razor
 Mixture Models
 Optional: Nonparametric models
 Structure and strength in causal induction. Griffiths and Tenenbaum (2005).
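A mixture model, this week's main modeling tool, is itself a small generative model: pick a component, then sample from it. A plain-JavaScript sketch of a two-component Gaussian mixture (the weights, means, and Box-Muller normal sampler are all assumptions for illustration, not course code):

```javascript
// Standard normal sample via the Box-Muller transform.
function randn() {
  const u = 1 - Math.random(); // avoid log(0)
  const v = Math.random();
  return Math.sqrt(-2 * Math.log(u)) * Math.cos(2 * Math.PI * v);
}

// Sample from a two-component Gaussian mixture:
// choose a component by its weight, then sample from that Gaussian.
function sampleMixture() {
  const useFirst = Math.random() < 0.3; // mixture weight of component 1
  const mu = useFirst ? -2 : 2;         // component means
  const sigma = 1;                      // shared standard deviation
  return mu + sigma * randn();
}
```

Inference in such models runs this process in reverse: given observed samples, infer which component produced each one, which is exactly the clustering problem the readings formalize.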
Week of November 13
Learning continuous functions. Deep probabilistic models.
Readings:
 Learning (deep) continuous functions
 Optional: Derivation of the f-GAN objective
 Optional: Tutorial on Variational Autoencoders
Week of November 20
Thanksgiving – no class!
Week of November 27
Social cognition. Natural language pragmatics and semantics.
Readings:
Week of December 4
Catch up. Project presentations!
Project option
Some students, especially graduate students, may prefer to get in-depth experience applying the techniques we've discussed in class to a research question. To facilitate this, we will allow selected students to do a project instead of homework for the second half of the course. We encourage (but don't require) projects to be done in small groups of two or three people. Project proposals will be evaluated for plausibility and scientific value; about six projects will be selected to proceed.
Project proposal
Your proposal should be no more than one page long (single spaced). Make sure that you cover the background, key question, and methods of your project. The background should include the topic and the context of your project, including other research in this area. The specific question you are planning to ask through your project should be clearly stated. You should briefly describe the methods you plan to use (your experimental design, your modeling approach, your data analysis, and so on).
Upload your proposal to Canvas as a PDF file by midnight on TBA.
Project presentation
The presentations should describe your question, methods, and results at a high level.
Project writeup
Your final project should be described in the format of a conference paper, following the guidelines for paper submissions to the Cognitive Science Society conference. The easiest way is to download the preformatted template.
In particular, your paper should be no more than six pages long. Your paper should cover the background behind your project, the questions you are asking, your methods and results, and your interpretation of these results.