Introduction
We should be able to answer the following by the end of this course:
- How convolution works
- How to create a convolution layer for a neural network
- What considerations apply when working with one-dimensional data
- How to modify a convolutional neural network to get good performance
As well as some more general questions about neural networks:
- How to use a neural network to perform classification tasks
- How a softmax layer works and how to implement it (a minimal sketch follows this list)
- How a batch normalization layer works and how to implement it
- How to create a neural network out of any differentiable function you like
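To give a flavour of the implementation work ahead, here is a minimal numpy sketch of a softmax function. The function name and the choice of the last axis are my own for illustration, not anything fixed by the course: the idea is simply to exponentiate the inputs and normalise them so each row sums to one, subtracting the row maximum first so the exponentials cannot overflow.

import numpy as np

def softmax(x):
    # Shift by the row maximum for numerical stability: exp() of large
    # values would overflow, and the shift does not change the result.
    shifted = x - np.max(x, axis=-1, keepdims=True)
    exp = np.exp(shifted)
    # Normalise so the values along the last axis sum to one.
    return exp / np.sum(exp, axis=-1, keepdims=True)

For example, softmax(np.array([1.0, 2.0, 3.0])) gives roughly [0.09, 0.24, 0.67], which sums to one.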
# convolution in one dimension using numba
import time

from numba import njit
import numpy as np

@njit
def convolve(signal, kernel):
    """Valid-mode convolution of a 1-D signal with a 1-D kernel."""
    signal_len = len(signal)
    kernel_len = len(kernel)
    output_len = signal_len - kernel_len + 1
    # Convolution flips the kernel; copy the reversed view so it is
    # contiguous, which lets numba's np.dot handle it efficiently.
    reversed_kernel = kernel[::-1].copy()
    result = np.zeros(output_len)
    for i in range(output_len):
        # Each output sample is the dot product of the flipped kernel
        # with the length-kernel_len window of the signal starting at i.
        result[i] = np.dot(signal[i:i + kernel_len], reversed_kernel)
    return result
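A quick sanity check, using numpy's own np.convolve as the reference (the array sizes below are arbitrary choices for illustration): in "valid" mode np.convolve computes the same sliding flipped-kernel dot product, so the two results should agree to floating-point precision.

rng = np.random.default_rng(0)
signal = rng.standard_normal(10_000)
kernel = rng.standard_normal(31)

ours = convolve(signal, kernel)                       # first call also triggers JIT compilation
reference = np.convolve(signal, kernel, mode="valid")
print(np.allclose(ours, reference))                   # expected: True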