
Keras script to merge the batch norm transform into the preceding convolutional layer


yaysummeriscoming/Merge_Batch_Norm


This project contains a simple script showing how to merge the Batch Normalisation transform into each preceding convolutional layer.

Reference: Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift (Ioffe & Szegedy, 2015)

Note: This script currently only supports a linear stack of layers & 2D convolutions.

Background:

The batch normalisation transform has now become one of the base ingredients when training CNNs. At inference time the transform can be described as:

    y = gamma * (x - mean) / sqrt(variance + epsilon) + beta

where the mean and variance are the running statistics accumulated during training, epsilon is a small constant for numerical stability, and the gamma and beta parameters are learned during training.
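As a quick illustration, the inference-time transform amounts to a couple of lines of NumPy. This is just a sketch of the formula above, not code from the repository's script; Keras' BatchNormalization layer performs the equivalent per channel using its stored moving statistics.

```python
import numpy as np

def batch_norm_inference(x, gamma, beta, moving_mean, moving_var, eps=1e-3):
    # Normalise with the stored moving statistics, then scale and shift.
    x_hat = (x - moving_mean) / np.sqrt(moving_var + eps)
    return gamma * x_hat + beta
```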

This transform uses next to no computation compared to convolutional layers. In practice, though, the extra steps in the computational graph can incur quite a bit of overhead, especially on GPUs. Additionally, it is one more layer type that needs to be included in the deployed model.

Fortunately, the Batch Normalisation transform is just a linear transform and can therefore be combined with the base convolution weights & bias.

Substituting the convolution output W*x + b into the batch norm transform gives an equivalent convolution with updated weights and bias:

    W' = gamma * W / sqrt(variance + epsilon)
    b' = gamma * (b - mean) / sqrt(variance + epsilon) + beta

so the batch norm layer can be removed once the preceding convolution's weights have been updated.
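A minimal sketch of that folding step in Keras is shown below. It assumes a channels-last Conv2D kernel and a BatchNormalization layer with its default scale/centre parameters; the function name is illustrative and not necessarily the one used in the repository's script.

```python
import numpy as np

def fold_batch_norm(conv_layer, bn_layer):
    """Return folded (kernel, bias) for a Conv2D followed by BatchNormalization."""
    weights = conv_layer.get_weights()
    kernel = weights[0]                                    # shape: (kh, kw, in_channels, out_channels)
    bias = weights[1] if conv_layer.use_bias else np.zeros(kernel.shape[-1])

    # BatchNormalization weights (with default scale=True, center=True):
    gamma, beta, moving_mean, moving_var = bn_layer.get_weights()
    scale = gamma / np.sqrt(moving_var + bn_layer.epsilon)  # one factor per output channel

    new_kernel = kernel * scale                # broadcasts over the output-channel axis
    new_bias = (bias - moving_mean) * scale + beta
    return new_kernel, new_bias
```

The folded weights are then loaded into an equivalent model built without the batch norm layers (and with use_bias=True on the convolutions), e.g. via conv_layer.set_weights([new_kernel, new_bias]).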

NOTE: This code is Python 3 compatible only!
