Commit

Added template notebooks
Christoph Schmidl committed Jul 29, 2018
1 parent 20ae96d commit 57a8a2b
Showing 7 changed files with 495 additions and 0 deletions.
69 changes: 69 additions & 0 deletions notebooks/Chapter_01-Introduction.ipynb
Original file line number Diff line number Diff line change
@@ -0,0 +1,69 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Chapter 1 - Introduction\n",
"\n",
"\n",
"## 1.1 - Example: Polynomial Curve Fitting\n",
"\n",
"## 1.2 - Probability Theory\n",
"\n",
"### 1.2.1 - Probability densities\n",
"### 1.2.2 - Expectations and covariances\n",
"### 1.2.3 - Bayesian probabilities\n",
"### 1.2.4 - The Gaussian distribution\n",
"### 1.2.5 - Curve fitting re-visited\n",
"### 1.2.6 - Bayesian curve fitting\n",
"\n",
"## 1.3 - Model Selection\n",
"## 1.4 - The Curse of Dimensionality\n",
"## 1.5 - Decision Theory\n",
"\n",
"### 1.5.1 - Minimizing the misclassification rate\n",
"### 1.5.2 - Minimizing the expected loss\n",
"### 1.5.3 - The reject option\n",
"### 1.5.4 - Inference and decision\n",
"### 1.5.5 - Loss functions for regression\n",
"\n",
"## 1.6 - Information Theory\n",
"\n",
"### 1.6.1 - Relative entropy an mutual information"
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"# TODO: Explanations and code examples"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "py35",
"language": "python",
"name": "py35"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.5.4"
}
},
"nbformat": 4,
"nbformat_minor": 2
}
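The Chapter 1 notebook added here contains only a TODO cell, so the snippet below is a sketch of the kind of content it could eventually hold: polynomial curve fitting in the spirit of Section 1.1, using plain NumPy. The sample size, noise level, and polynomial degree are illustrative choices, not values taken from this commit.

import numpy as np

rng = np.random.RandomState(0)
x = np.linspace(0, 1, 10)                                         # training inputs
t = np.sin(2 * np.pi * x) + rng.normal(scale=0.3, size=x.shape)   # noisy targets

coeffs = np.polyfit(x, t, deg=3)        # maximum-likelihood (least-squares) polynomial fit
x_grid = np.linspace(0, 1, 100)
y_grid = np.polyval(coeffs, x_grid)     # fitted curve evaluated on a dense grid

rmse = np.sqrt(np.mean((np.polyval(coeffs, x) - t) ** 2))
print("degree-3 coefficients:", coeffs)
print("training RMSE:", rmse)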
74 changes: 74 additions & 0 deletions notebooks/Chapter_02-Probability_Distributions.ipynb
Original file line number Diff line number Diff line change
@@ -0,0 +1,74 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Chapter 2 - Probability Distributions\n",
"\n",
"## 2.1 - Binary Variables\n",
"\n",
"### 2.1.1 - The beta distribution\n",
"\n",
"## 2.2 - Multinominal Variables\n",
"\n",
"### 2.2.1 - The Dirichlet distribution\n",
"\n",
"## 2.3 - The Gaussian Distribution\n",
"\n",
"### 2.3.1 - Conditional Gaussian distributions\n",
"### 2.3.2 - Marginal Gaussian distributions\n",
"### 2.3.3 - Bayes' theorem for Gaussian variables\n",
"### 2.3.4 - Maximum likelihood for the Gaussian\n",
"### 2.3.5 - Sequential estimation\n",
"### 2.3.6 - Bayesian inference for the Gaussian\n",
"### 2.3.7 - Student's t-distribution\n",
"### 2.3.8 - Periodic variables\n",
"### 2.3.9 - Mixtures of Gaussians\n",
"\n",
"## 2.4 - The Exponential Family\n",
"\n",
"### 2.4.1 - Maximum likelihood and sufficient statistics\n",
"### 2.4.2 - Conjugate priors\n",
"### 2.4.3 - Noninformative priors\n",
"\n",
"## 2.5 - Nonparametric Methods\n",
"\n",
"### 2.5.1 - Kernel density estimators\n",
"### 2.5.2 - Nearest-neighbour methods"
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"# TODO: Explanations and code examples"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "py35",
"language": "python",
"name": "py35"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.5.4"
}
},
"nbformat": 4,
"nbformat_minor": 2
}
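As a sketch of what the Chapter 2 placeholder cell could contain, the snippet below works through conjugate Bayesian inference for a binary variable with a beta prior (Sections 2.1 and 2.1.1). The prior hyperparameters and the observed coin flips are invented for illustration, and SciPy is assumed to be available in the py35 environment.

import numpy as np
from scipy import stats

a0, b0 = 2.0, 2.0                              # beta prior hyperparameters (assumed)
flips = np.array([1, 0, 1, 1, 0, 1, 1, 1])     # hypothetical Bernoulli observations (1 = heads)

heads = int(flips.sum())
tails = flips.size - heads
a_n, b_n = a0 + heads, b0 + tails              # conjugacy: the posterior is Beta(a_n, b_n)

print("posterior mean of mu:", a_n / (a_n + b_n))
print("95% posterior interval:", stats.beta.interval(0.95, a_n, b_n))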
67 changes: 67 additions & 0 deletions notebooks/Chapter_03-Linear_Models_for_Regression.ipynb
Original file line number Diff line number Diff line change
@@ -0,0 +1,67 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Chapter 3 - Linear Models for Regression\n",
"\n",
"## 3.1 - Linear Basis Function Models\n",
"\n",
"### 3.1.1 - Maximum likelihood and least squares\n",
"### 3.1.2 - Geometry of least squares\n",
"### 3.1.3 - Sequential learning\n",
"### 3.1.4 - Regularized least squares\n",
"### 3.1.5 - Multiple outputs\n",
"\n",
"## 3.2 - The Bias-Variance Decomposition\n",
"## 3.3 - Bayesian Linear Regression\n",
"\n",
"### 3.3.1 - Parameter distribution\n",
"### 3.3.2 - Predictive distribution\n",
"### 3.3.3 - Equivalent kernel\n",
"\n",
"## 3.4 - Bayesian Model Comparison\n",
"## 3.5 - The Evidence Approximation\n",
"\n",
"### 3.5.1 - Evaluation of the evidence function\n",
"### 3.5.2 - Maximizing the evidence function\n",
"### 3.5.3 - Effective number of parameters\n",
"\n",
"## 3.6 - Limitations of Fixed Basis Functions"
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"# TODO: Explanations and code examples"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "py35",
"language": "python",
"name": "py35"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.5.4"
}
},
"nbformat": 4,
"nbformat_minor": 2
}
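One possible filling for the Chapter 3 placeholder is closed-form Bayesian linear regression with a fixed polynomial basis (Sections 3.1 and 3.3). The values of alpha (prior precision), beta (noise precision), and the basis degree below are arbitrary illustrative settings, not part of the commit.

import numpy as np

rng = np.random.RandomState(1)
x = np.linspace(0, 1, 20)
t = np.sin(2 * np.pi * x) + rng.normal(scale=0.2, size=x.shape)

def design_matrix(x, degree=5):
    # polynomial basis functions phi_j(x) = x**j, j = 0..degree
    return np.vander(x, degree + 1, increasing=True)

alpha, beta = 2.0, 25.0                  # prior precision and noise precision (assumed)
Phi = design_matrix(x)
S_N_inv = alpha * np.eye(Phi.shape[1]) + beta * Phi.T @ Phi   # posterior precision
S_N = np.linalg.inv(S_N_inv)
m_N = beta * S_N @ Phi.T @ t                                  # posterior mean of the weights

x_new = np.array([0.25, 0.75])
Phi_new = design_matrix(x_new)
pred_mean = Phi_new @ m_N                                         # predictive mean
pred_var = 1.0 / beta + np.sum(Phi_new @ S_N * Phi_new, axis=1)   # predictive variance
print("predictive mean:", pred_mean)
print("predictive std: ", np.sqrt(pred_var))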
78 changes: 78 additions & 0 deletions notebooks/Chapter_04-Linear_Models_for_Classification.ipynb
Original file line number Diff line number Diff line change
@@ -0,0 +1,78 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Chapter 4 - Linear Models for Classification\n",
"\n",
"## 4.1 - Discriminant Functions\n",
"\n",
"### 4.1.1 - Two classes\n",
"### 4.1.2 - Multiple classes\n",
"### 4.1.3 - Least squares for classification\n",
"### 4.1.4 - Fisher's linear discriminant\n",
"### 4.1.5 - Relation to least squares\n",
"### 4.1.6 - Fisher's dicriminant for multiple classes\n",
"### 4.1.7 - The perceptron algorithm\n",
"\n",
"## 4.2 - Probabilistic Generative Models\n",
"\n",
"### 4.2.1 - Continuous inputs\n",
"### 4.2.2 - Maximum likelihood solution\n",
"### 4.2.3 - Discrete features\n",
"### 4.2.4 - Exponential family\n",
"\n",
"## 4.3 - Probabilistic Discriminative Models\n",
"\n",
"### 4.3.1 - Fixed basis functions\n",
"### 4.3.2 - Logistic regression\n",
"### 4.3.3 - Iterative reweighted least squares\n",
"### 4.3.4 - Multiclass logistic regression\n",
"### 4.3.5 - Probit regression\n",
"### 4.3.6 - Canonical link functions\n",
"\n",
"## 4.4 - The Laplace Approximation\n",
"\n",
"### 4.4.1 - Model comparison and BIC\n",
"\n",
"## 4.5 - Bayesian Logistic Regression\n",
"\n",
"### 4.5.1 - Laplace approximation\n",
"### 4.5.2 - Predictive distribution"
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"# TODO: Explanations and code examples"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "py35",
"language": "python",
"name": "py35"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.5.4"
}
},
"nbformat": 4,
"nbformat_minor": 2
}
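A sketch for the Chapter 4 placeholder: two-class logistic regression fitted by iterative reweighted least squares (Section 4.3.3). The synthetic two-class data, the number of Newton steps, and the small ridge added to the Hessian for numerical stability are all illustrative assumptions.

import numpy as np

rng = np.random.RandomState(2)
X = np.vstack([rng.normal(loc=[-1.0, -1.0], scale=1.0, size=(50, 2)),
               rng.normal(loc=[1.0, 1.0], scale=1.0, size=(50, 2))])
t = np.concatenate([np.zeros(50), np.ones(50)])   # class labels

Phi = np.hstack([np.ones((X.shape[0], 1)), X])    # bias basis function plus raw inputs
w = np.zeros(Phi.shape[1])

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

for _ in range(10):                               # a handful of Newton-Raphson steps
    y = sigmoid(Phi @ w)
    R = y * (1.0 - y)                             # diagonal of the IRLS weighting matrix
    grad = Phi.T @ (y - t)
    H = Phi.T @ (Phi * R[:, None]) + 1e-8 * np.eye(Phi.shape[1])  # small ridge for stability
    w = w - np.linalg.solve(H, grad)              # Newton update

print("fitted weights:", w)
print("training accuracy:", np.mean((sigmoid(Phi @ w) > 0.5) == t))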
87 changes: 87 additions & 0 deletions notebooks/Chapter_05-Neural_Networks.ipynb
Original file line number Diff line number Diff line change
@@ -0,0 +1,87 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Chapter 5 - Neural Networks\n",
"\n",
"## 5.1 - Feed-forward Network Functions\n",
"\n",
"### 5.1.1 - Weight-space symmetries\n",
"\n",
"## 5.2 - Network Training\n",
"\n",
"### 5.2.1 - Parameter optimization\n",
"### 5.2.2 - Local quadratic approximation\n",
"### 5.2.3 - Use of gradient information\n",
"### 5.2.4 - Gradient descent optimization\n",
"\n",
"## 5.3 - Error Backpropagation\n",
"\n",
"### 5.3.1 - Evaluation of error-function derivatives\n",
"### 5.3.2 - A simple example\n",
"### 5.3.3 - Efficiency of backpropagation\n",
"### 5.3.4 - The Jacobian matrix\n",
"\n",
"## 5.4 - The Hessian Matrix\n",
"\n",
"### 5.4.1 - Diagonal approximation\n",
"### 5.4.2 - Outer product approximation\n",
"### 5.4.3 - Inverse Hessian\n",
"### 5.4.4 - Finite differences\n",
"### 5.4.5 - Exact evaluation of the Hessian\n",
"### 5.4.6 - Fast multiplication by the Hessian\n",
"\n",
"## 5.5 - Regularization in Neural Networks\n",
"\n",
"### 5.5.1 - Consistent Gaussian priors\n",
"### 5.5.2 - Early stopping\n",
"### 5.5.3 - Invariances\n",
"### 5.5.4 - Tangent propagation\n",
"### 5.5.5 - Training with transformed data\n",
"### 5.5.6 - Convolutional networks\n",
"### 5.5.7 - Soft weight sharing\n",
"\n",
"## 5.6 - Mixture Density Networks\n",
"## 5.7 - Bayesian Neural Networks\n",
"\n",
"### 5.7.1 - Posterior parameter distribution\n",
"### 5.7.2 - Hyperparameter optimization\n",
"### 5.7.3 - Bayesian neural networks for classification"
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"# TODO: Explanations and code examples"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "py35",
"language": "python",
"name": "py35"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.5.4"
}
},
"nbformat": 4,
"nbformat_minor": 2
}
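Finally, a sketch for the Chapter 5 placeholder: a one-hidden-layer network with tanh units and a linear output, trained by error backpropagation and batch gradient descent on a 1-D regression task (Sections 5.1 to 5.3). The layer width, learning rate, and epoch count are arbitrary choices made for this sketch.

import numpy as np

rng = np.random.RandomState(3)
x = np.linspace(-1, 1, 50).reshape(-1, 1)
t = np.sin(np.pi * x) + rng.normal(scale=0.1, size=x.shape)

n_hidden, lr, n_epochs = 8, 0.05, 5000            # arbitrary illustrative settings
W1 = rng.normal(scale=0.5, size=(1, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.5, size=(n_hidden, 1)); b2 = np.zeros(1)

for _ in range(n_epochs):
    # forward pass
    z1 = np.tanh(x @ W1 + b1)                     # hidden-unit outputs
    y = z1 @ W2 + b2                              # linear output unit

    # backward pass: propagate errors from the output layer to the hidden layer
    delta2 = y - t                                # output errors (sum-of-squares error)
    delta1 = (delta2 @ W2.T) * (1.0 - z1 ** 2)    # hidden errors via the tanh derivative

    # batch gradient-descent updates
    W2 -= lr * z1.T @ delta2 / len(x); b2 -= lr * delta2.mean(axis=0)
    W1 -= lr * x.T @ delta1 / len(x);  b1 -= lr * delta1.mean(axis=0)

y_final = np.tanh(x @ W1 + b1) @ W2 + b2
print("final sum-of-squares error:", 0.5 * np.sum((y_final - t) ** 2))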
(Diffs for the remaining 2 changed files were not expanded in this view.)