We propose a novel network module called Feature Aware Normalization that can be incorporated into existing architectures. The module combines advantages of LSTM units and Batch Normalization: scale and shift parameters are computed on the fly by a pre-trained (and possibly fine-tuned) deep network. Adapting the algorithm to a new dataset requires retraining as few as 30,000 parameters.
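The scale-and-shift mechanism can be sketched as follows. This is a minimal NumPy illustration under our own assumptions: a per-image feature vector (as would come from the pre-trained network) is mapped through hypothetical weight matrices `W_gamma` and `W_beta` to per-channel scale and shift, with LSTM-style sigmoid and tanh gating applied after a Batch-Normalization-style standardization. It is not the paper's exact formulation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def feature_aware_normalization(x, features, W_gamma, W_beta, eps=1e-5):
    """Sketch of Feature Aware Normalization.

    x        : input tensor of shape (C, H, W)
    features : per-image feature vector of shape (F,), e.g. from a
               pre-trained deep network (hypothetical interface)
    W_gamma  : (F, C) weights predicting the multiplicative gate
    W_beta   : (F, C) weights predicting the additive gate
    """
    # Batch-Normalization-style standardization per channel.
    mu = x.mean(axis=(1, 2), keepdims=True)
    var = x.var(axis=(1, 2), keepdims=True)
    x_hat = (x - mu) / np.sqrt(var + eps)

    # Scale and shift are computed on the fly from the features,
    # gated LSTM-style: sigmoid in (0, 1), tanh in (-1, 1).
    gamma = sigmoid(features @ W_gamma)  # shape (C,)
    beta = np.tanh(features @ W_beta)    # shape (C,)

    # Broadcast the per-channel gates over the spatial dimensions.
    return gamma[:, None, None] * x_hat + beta[:, None, None]
```

Only the small gating weights (`W_gamma`, `W_beta` here) would need retraining for a new dataset; the feature extractor stays fixed.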
Feature Aware Normalization is fast, improves significantly over previous approaches, and can easily be adapted to new problems, especially when deep networks are used for further processing.
To benchmark normalization algorithms, we provide the full validation dataset used in our experiments. The dataset consists of five different blocks of tissue, each separated into nine subsequent cuts with different color characteristics. For each combination, we provide five images of size 2000x2000 pixels, yielding a total of 225 images.