Polars and split-apply-combine exercises
Brrrrr. It’s time to do some Polars exercises!
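Before diving in, here is a minimal sketch of the split-apply-combine pattern with Polars. The data frame below, with frog IDs and impact forces, is invented purely for illustration; each exercise provides its own data set.

import polars as pl

# Hypothetical data frame, made up for illustration only
df = pl.DataFrame(
    {
        "ID": ["I", "I", "II", "II", "III"],
        "impact force (mN)": [1205.0, 2527.0, 573.0, 472.0, 1182.0],
    }
)

# Split the rows by frog ID, apply the mean to each group,
# and combine the results into a new data frame
df.group_by("ID").agg(
    pl.col("impact force (mN)").mean().alias("mean impact force (mN)")
)

Running this sketch returns one row per frog ID with the mean impact force for that frog, which is the kind of grouped summary the exercises below ask you to construct.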