snnmetrics.synops#

Module Contents#

Classes#

SynOps

A metric that calculates the number of synaptic operations, both for every neuron in the layer and for the sum over all neurons in the layer.

class snnmetrics.synops.SynOps(fanout: Union[float, torch.Tensor], sample_time: Optional[float] = None)[source]#

Bases: torchmetrics.metric.Metric

A metric that calculates the number of synaptic operations, both for every neuron in the layer and for the sum over all neurons in the layer. The number of synaptic operations is defined as the number of spikes multiplied by the fanout, i.e. the number of connections each neuron has to the next layer. Whereas the fanout for fully-connected connectivity is simply the number of neurons (or features) in the next layer, the situation for convolutional layers is more complex: parameters such as stride, kernel size and grouping all influence the convolutional fanout. Consider a convolutional kernel that is applied to every receptive field: neurons at the edge of the input are covered by fewer receptive fields (given zero padding) than neurons in the middle. The convolutional fanout can therefore only be approximated, and the approximation is accurate when the spatial input size is large enough.
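The definition above (synaptic operations = spike count × fanout) can be sketched directly with tensors, independent of the metric class. The spike values below are made up for illustration; the fanout of 10 assumes a fully-connected next layer with 10 features:

```python
import torch

# Hypothetical spike counts from a layer of 5 neurons (batch size 1).
spikes = torch.tensor([[2.0, 0.0, 1.0, 3.0, 0.0]])

# Fully-connected fanout: each neuron projects to all 10 neurons
# of the next layer, so the fanout is a scalar.
fanout = 10.0

# Synaptic operations per neuron, and summed over the layer.
synops_per_neuron = spikes * fanout     # shape (1, 5)
total_synops = synops_per_neuron.sum()  # scalar

print(total_synops.item())  # 60.0
```

This mirrors the two quantities the metric reports: the per-neuron values and their sum over the layer.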

Parameters:
  • fanout (Union[float, torch.Tensor]) – Can either be a float or a tensor of shape (C,H,W).

  • sample_time (Optional[float]) –
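As an illustration of the convolutional fanout approximation discussed above: for a spatial input large enough that edge effects are negligible, each input neuron falls inside roughly `kernel_size**2 / stride**2` receptive fields per output channel. The helper below is a hypothetical sketch of that estimate (it is not part of `snnmetrics`, and assumes square kernels and strides):

```python
def approx_conv_fanout(out_channels: int, kernel_size: int,
                       stride: int = 1, groups: int = 1) -> float:
    """Approximate fanout of one input neuron of a conv layer.

    Hypothetical helper: assumes a square kernel and stride, and a
    spatial input large enough that border neurons (which are covered
    by fewer receptive fields) are negligible.
    """
    return (out_channels / groups) * kernel_size ** 2 / stride ** 2

# A 3x3 kernel with stride 1 and 16 output channels: each input
# neuron feeds ~9 positions in each of the 16 output maps.
print(approx_conv_fanout(out_channels=16, kernel_size=3))  # 144.0
```

A tensor of shape (C, H, W) can instead be passed as `fanout` when the per-neuron values matter, e.g. to account for edge neurons exactly.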

is_differentiable: bool = True#
higher_is_better: Optional[bool]#
full_state_update: bool = False#
update(output: torch.Tensor)[source]#

Override this method to update the state variables of your metric class.

Parameters:

output (torch.Tensor) –

compute()[source]#

Override this method to compute the final metric value from state variables synchronized across the distributed backend.