From YouTube: Numenta Research Paper Review: Scalable training of artificial neural networks

Description

Review of the paper Scalable training of artificial neural networks (https://www.nature.com/articles/s41467-018-04316-3) and how it relates to our ongoing research on applying sparsity to neural networks.

Additional papers reviewed to set the background for the discussion:

1) Rethinking the Value of Network Pruning (https://arxiv.org/abs/1810.05270): structured pruning with several different approaches; the remaining weights are reinitialized to random values.

2) The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks (https://arxiv.org/abs/1803.03635): unstructured pruning based on the magnitude of the final weights; the remaining weights are reset to their initial values (see the sketch after this list).

3) Deconstructing Lottery Tickets: Zeros, Signs, and the Supermask (https://arxiv.org/abs/1905.01067): unstructured pruning based on the magnitude of the final weights or on how much a weight's magnitude increased during training; the remaining weights are set to a constant with the same sign as their initial values.
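
The two lottery-ticket papers share the same masking step and differ only in what they do with the surviving weights. Below is a minimal NumPy sketch of that weight manipulation; the shapes, the 80% sparsity level, the stand-in "trained" weights, and the constant alpha are illustrative assumptions, not values from the papers.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for a real training run: a random initialization
# and the weights the same layer reached after training.
w_init = rng.normal(size=(256, 128)).astype(np.float32)
w_final = w_init + 0.1 * rng.normal(size=w_init.shape).astype(np.float32)

def magnitude_mask(w, sparsity=0.8):
    """Unstructured mask that keeps the largest-magnitude weights."""
    k = int(sparsity * w.size)
    threshold = np.partition(np.abs(w).ravel(), k)[k]
    return (np.abs(w) > threshold).astype(w.dtype)

mask = magnitude_mask(w_final)

# Lottery Ticket Hypothesis: keep the mask, rewind survivors to init.
w_ticket = w_init * mask

# Deconstructing Lottery Tickets: replace survivors with a constant that
# keeps the sign of the initial weight (alpha is an illustrative choice).
alpha = np.abs(w_init).mean()
w_constant = alpha * np.sign(w_init) * mask
```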

Structured pruning usually refers to changing the network architecture, for example removing an entire filter or layer.
Unstructured pruning is "sparsifying": killing individual connections by setting their weights to zero and freezing them so they stay at zero during further training.
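
To make the "set to zero and freeze" step concrete, here is a minimal PyTorch sketch of unstructured magnitude pruning on a single toy layer; the layer size, 80% sparsity, and gradient-hook freezing mechanism are assumptions for illustration, not code from any of the papers reviewed.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical toy layer standing in for any dense layer in a network.
layer = nn.Linear(64, 32)

# Sparsify: zero out the 80% of weights with the smallest magnitude.
with torch.no_grad():
    k = int(0.8 * layer.weight.numel())
    threshold = layer.weight.abs().flatten().kthvalue(k).values
    mask = (layer.weight.abs() > threshold).float()
    layer.weight.mul_(mask)

# Freeze: zero the gradient on pruned connections so plain SGD
# leaves them at exactly zero during further training.
layer.weight.register_hook(lambda grad: grad * mask)

opt = torch.optim.SGD(layer.parameters(), lr=0.1)
loss = layer(torch.randn(8, 64)).pow(2).mean()
loss.backward()
opt.step()
assert (layer.weight[mask == 0] == 0).all()
```

With momentum or weight decay, the optimizer itself can move pruned weights off zero, so in practice the mask is usually re-applied after each step as well.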


Broadcasted live on Twitch -- Watch live at https://www.twitch.tv/rhyolight_