Fuzzy Tiling Activations (FTA)


This is a PyTorch implementation/tutorial of the paper [Fuzzy Tiling Activations: A Simple Approach to Learning Sparse Representations Online](https://arxiv.org/abs/1911.08068).

Fuzzy tiling activations are a form of sparse activations based on binning.

Binning is the classification of a scalar value into a bin based on intervals. One problem with binning is that it gives zero gradients for most values (except at the boundaries of the bins). The other is that binning loses precision if the bin intervals are large.
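Here is a minimal sketch of hard binning to make the two problems concrete. The helper `hard_bin` and the chosen bin edges are illustrative assumptions, not part of the FTA module below.

```python
import torch
import torch.nn.functional as F


def hard_bin(z: torch.Tensor, lower: float, upper: float, delta: float):
    # Interior bin edges: lower + delta, lower + 2 * delta, ..., upper - delta
    edges = torch.arange(lower + delta, upper, delta)
    # Index of the bin each value falls into
    indices = torch.bucketize(z, edges)
    # One-hot encode the bin index; the position within a bin is lost
    return F.one_hot(indices, num_classes=int((upper - lower) / delta)).float()


# 1.1 and 1.9 land in the same bin and produce identical one-hot vectors,
# so the output is piecewise constant and gives zero gradients almost everywhere
print(hard_bin(torch.tensor([1.1, 1.9]), -10., 10., 2.))
```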

FTA overcomes these disadvantages. Instead of hard boundaries like in tiling activations, FTA uses soft boundaries between bins. This gives non-zero gradients for all or a wide range of values, and it doesn't lose precision since the position within a bin is captured in partial activation values.

Tiling Activations

$$\mathbf{c} = (l, l + \delta, l + 2 \delta, \dots, u - 2 \delta, u - \delta)$$

is the tiling vector, where $[l, u]$ is the input range, $\delta$ is the bin size, and $u - l$ is divisible by $\delta$.

Tiling activation is,

$$\phi(z) = 1 - I_+ \big( \max(\mathbf{c} - z, 0) + \max(z - \delta - \mathbf{c}, 0) \big)$$

where $I_+(\cdot)$ is the indicator function, which gives $1$ if the input is positive and $0$ otherwise.

Note that tiling activation gives zero gradients because it has hard boundaries.
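To see this concretely, here is a standalone sketch of the tiling activation above. The function name `tiling_activation` and the use of `torch.sign` to implement $I_+$ (valid here because the argument is always non-negative) are illustrative choices, not the paper's code.

```python
import torch


def tiling_activation(z: torch.Tensor, c: torch.Tensor, delta: float):
    x = torch.clip(c - z, min=0.) + torch.clip(z - delta - c, min=0.)
    # x is non-negative, so torch.sign(x) equals the indicator I_+(x);
    # its gradient is zero everywhere, which blocks backpropagation
    return 1. - torch.sign(x)


c = torch.arange(-10., 10., 2.)
z = torch.tensor(1.1, requires_grad=True)
out = tiling_activation(z, c, 2.)
out.sum().backward()
print(out)     # one-hot: only the bin containing 1.1 is active
print(z.grad)  # tensor(0.): hard boundaries give zero gradients
```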

Fuzzy Tiling Activations

The fuzzy indicator function,

$$I_{\eta,+}(x) = I_+(\eta - x) \, x + I_+(x - \eta),$$

increases linearly from $0$ to $1$ when $0 \le x \le \eta$, and is equal to $1$ for $x > \eta$. $\eta$ is a hyper-parameter.

FTA uses this to create soft boundaries between bins,

$$\phi_\eta(z) = 1 - I_{\eta,+} \big( \max(\mathbf{c} - z, 0) + \max(z - \delta - \mathbf{c}, 0) \big)$$
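As a quick sanity check of the fuzzy indicator's shape, here is a standalone sketch (with $\eta = 0.5$ assumed; the class below implements the same function as a method):

```python
import torch

eta = 0.5


def fuzzy_i_plus(x: torch.Tensor):
    # Linear in [0, eta], saturates at 1 beyond eta
    return (x <= eta) * x + (x > eta)


x = torch.tensor([0., 0.25, 0.5, 0.75, 1.])
print(fuzzy_i_plus(x))  # tensor([0.0000, 0.2500, 0.5000, 1.0000, 1.0000])
```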

Here's a simple experiment that uses FTA in a transformer.

```python
import torch
from torch import nn
```

Fuzzy Tiling Activations (FTA)

```python
class FTA(nn.Module):
```

- `lower_limit` is the lower limit $l$
- `upper_limit` is the upper limit $u$
- `delta` is the bin size $\delta$
- `eta` is the parameter $\eta$ that determines the softness of the boundaries.

```python
    def __init__(self, lower_limit: float, upper_limit: float, delta: float, eta: float):
        super().__init__()
```

Initialize the tiling vector $\mathbf{c}$

```python
        self.c = nn.Parameter(torch.arange(lower_limit, upper_limit, delta), requires_grad=False)
```

The input vector expands by a factor equal to the number of bins $\frac{u - l}{\delta}$

```python
        self.expansion_factor = len(self.c)
        # The bin size delta
        self.delta = delta
        # The softness parameter eta
        self.eta = eta
```

Fuzzy indicator function $I_{\eta,+}(x)$

```python
    def fuzzy_i_plus(self, x: torch.Tensor):
        return (x <= self.eta) * x + (x > self.eta)

    def forward(self, z: torch.Tensor):
```

Add another dimension of size $1$. We will expand this into $\frac{u - l}{\delta}$ bins.

```python
        z = z.view(*z.shape, 1)
```

```python
        # phi_eta(z) = 1 - I_{eta,+}(max(c - z, 0) + max(z - delta - c, 0))
        z = 1. - self.fuzzy_i_plus(torch.clip(self.c - z, min=0.) + torch.clip(z - self.delta - self.c, min=0.))
```

Reshape back to the original number of dimensions. The last dimension size gets expanded by the number of bins, $\frac{u - l}{\delta}$.

```python
        return z.view(*z.shape[:-2], -1)
```
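Since the last dimension expands by `expansion_factor`, any layer consuming the output of FTA must be sized accordingly. Here is a minimal sketch of this in a feed-forward block; the dimensions and parameter values are illustrative assumptions, not from the paper.

```python
import torch
from torch import nn

# Hypothetical feed-forward block that uses FTA as its activation
d_model, d_hidden = 64, 128
fta = FTA(lower_limit=-2., upper_limit=2., delta=0.2, eta=0.05)
ffn = nn.Sequential(
    nn.Linear(d_model, d_hidden),
    fta,
    # The layer after FTA takes the expanded width as its input
    nn.Linear(d_hidden * fta.expansion_factor, d_model),
)

x = torch.randn(8, d_model)
print(ffn(x).shape)  # torch.Size([8, 64])
```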

Code to test the FTA module

```python
def _test():
    from labml.logger import inspect
```

Initialize

```python
    a = FTA(-10, 10, 2., 0.5)
```

Print the tiling vector $\mathbf{c}$

```python
    inspect(a.c)
```

Print number of bins

```python
    inspect(a.expansion_factor)
```

Input

```python
    z = torch.tensor([1.1, 2.2, 3.3, 4.4, 5.5, 6.6, 7.7, 8.8, 9., 10., 11.])
```

Print the input

```python
    inspect(z)
```

Print the output of FTA

```python
    inspect(a(z))


if __name__ == '__main__':
    _test()
```
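For contrast with the hard tiling activation earlier, here is a quick check (assuming the `FTA` module defined above, with the same parameters as in `_test`) that gradients flow through the soft boundaries:

```python
import torch

a = FTA(-10, 10, 2., 0.5)
# 1.8 is within eta = 0.5 of the bin boundary at 2, so the fuzzy
# indicator is in its linear region there and passes a gradient through
z = torch.tensor([1.8], requires_grad=True)
a(z).sum().backward()
print(z.grad)  # tensor([1.]): non-zero, unlike the hard tiling activation
```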