Pedagogy

Neural Computation

Rice University, NEUR 416 / CAAM 416 / ELEC 489
Instructors: Xaq Pitkow, Krešimir Josić
Spring 2017, TTh 2:30–3:45, Herzstein 211. 3 credits.

How does the brain work? Understanding the brain requires sophisticated theories to make sense of the collective actions of billions of neurons and trillions of synapses. Word theories are not enough; we need mathematical theories. The goal of this course is to provide an introduction to the mathematical theories of learning and computation by neural systems. These theories use concepts from dynamical systems (attractors, chaos) and from statistics (information, uncertainty, inference) to relate the dynamics and function of neural networks. We will apply these theories to sensory computation, learning and memory, and motor control. Our learning objectives are for you to formalize and mathematically answer questions about neural computation, including “what does a network compute?”, “how does it compute?”, and “why does it compute that way?”

Prerequisites: linear algebra, probability and statistics, calculus.
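
To give a flavor of the attractor dynamics mentioned above, here is a minimal sketch (illustrative only, not course material) of a Hopfield-style network, in which stored patterns become fixed points of the dynamics; the network size, number of patterns, and noise level are assumptions chosen for the example.

    # Minimal Hopfield-style attractor network (illustrative sketch;
    # sizes and patterns are arbitrary assumptions, not course code).
    import numpy as np

    rng = np.random.default_rng(0)
    n_neurons, n_patterns = 100, 3

    # Store random +/-1 patterns with the Hebbian outer-product rule.
    patterns = rng.choice([-1, 1], size=(n_patterns, n_neurons))
    W = (patterns.T @ patterns) / n_neurons
    np.fill_diagonal(W, 0)

    # Start from a corrupted copy of pattern 0 and let the dynamics run.
    state = patterns[0] * rng.choice([1, -1], size=n_neurons, p=[0.8, 0.2])
    for _ in range(10):
        state = np.sign(W @ state)   # synchronous threshold update
        state[state == 0] = 1        # break ties deterministically

    print((state @ patterns[0]) / n_neurons)  # overlap near 1.0 = recall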

Theoretical Neuroscience: Networks and Learning

Rice University, NEUR 416 / CAAM 416 / ELEC 489
Instructors: Xaq Pitkow, Harel Shouval
Spring 2016, TTh 2:30–3:45. 3 credits.

Computational Systems Neuroscience Laboratory

Rice University, NEUR 332
Instructors: Xaq Pitkow, Harel Shouval
Spring 2016, TTh 4–5:15. 1 credit.

A project-oriented, hands-on computational companion to NEUR 380: Fundamental Neuroscience Systems. Students will build and test mathematical models of the neural systems that subserve perception, learning, and behavior. Lecture each Tuesday, followed by lab each Thursday.

Prerequisites: calculus, some programming experience.
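
As a flavor of the kind of model built in lab, here is a minimal sketch of a leaky integrate-and-fire neuron driven by constant input; all parameter values are illustrative assumptions, not taken from the course.

    # Leaky integrate-and-fire neuron, Euler-integrated (illustrative
    # sketch; all parameter values are assumptions, not course values).
    dt, T = 1e-4, 0.5                 # time step and duration (s)
    tau = 20e-3                       # membrane time constant (s)
    v_rest, v_thresh, v_reset = -70e-3, -54e-3, -70e-3   # (V)
    drive = 20e-3                     # steady-state input R*I (V)

    v, spikes = v_rest, []
    for step in range(int(T / dt)):
        # Euler step of tau * dv/dt = -(v - v_rest) + R*I
        v += (dt / tau) * (-(v - v_rest) + drive)
        if v >= v_thresh:             # threshold crossing: spike, then reset
            spikes.append(step * dt)
            v = v_reset

    print(f"firing rate: {len(spikes) / T:.1f} Hz")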

Deep Learning

Rice University, ELEC 681
Instructors: Rich Baraniuk, Ankit Patel, Xaq Pitkow
Spring 2015, F 1–3:30. 3 credits.

This course will explore deep learning, a class of multistage machine learning methods that learn representations of complex data. Over the past several years, thanks to the development of new training rules, massive computing capabilities, and enormous training data sets, deep learning systems have redefined the state of the art in object identification, face recognition, and speech recognition. Examples of modern systems include Facebook’s DeepFace and Google’s DeepMind.
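
To illustrate the representation-learning idea at toy scale, here is a minimal sketch (not course material) of a two-layer network trained by gradient descent to learn XOR; the architecture, initialization, and learning rate are illustrative assumptions.

    # Tiny two-layer network trained on XOR (illustrative sketch; the
    # architecture and hyperparameters are assumptions for the example).
    import numpy as np

    rng = np.random.default_rng(1)
    X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
    y = np.array([[0.], [1.], [1.], [0.]])

    W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)
    W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)

    for _ in range(5000):
        h = np.tanh(X @ W1 + b1)              # learned hidden representation
        p = 1 / (1 + np.exp(-(h @ W2 + b2)))  # sigmoid output
        dp = p - y                            # cross-entropy gradient
        dh = (dp @ W2.T) * (1 - h**2)         # backpropagate through tanh
        W2 -= 0.1 * (h.T @ dp); b2 -= 0.1 * dp.sum(0)
        W1 -= 0.1 * (X.T @ dh); b1 -= 0.1 * dh.sum(0)

    print(p.round(2).ravel())                 # approaches [0, 1, 1, 0]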