Numenta / Numenta On Intelligence Podcast



These are all the meetings we have in "Numenta On Intellige…" (part of the organization "Numenta"). Click into individual meeting pages to watch the recording and search or read the transcript.

3 Dec 2019

In this episode, host Matt Taylor chats with Numenta Visiting Research Scientist Florian Fiebig. Florian is a recent graduate of the KTH Royal Institute of Technology in Stockholm, Sweden, with a PhD in computational neuroscience. His PhD thesis focuses on Hebbian learning networks, and he regularly presents his work at Numenta research meetings[1]. Florian’s thesis is titled “Active Memory Processing on Multiple Timescales in Simulated Cortical Networks with Hebbian Plasticity.”[2]

Show Notes
• 1:05 Intro to Florian
• 2:41 Florian’s background and what led him to Numenta
• 3:06 Continuous learning
• 9:30 Does deep learning have anything similar to Hebbian learning?
• 11:36 Different types of plasticity in Hebbian learning
• 11:55 Long-term Potentiation (LTP)
• 14:38 “So it turns out: Short-term potentiation is not always short-term potentiation”
• 15:47 Two fast forms of plasticity: facilitation and augmentation
• 17:57 Homeostatic mechanisms: the Bobcat example
• 19:41 Let’s talk about working memory
• 21:21 Associative nature of memory
• 26:46 The brain as a massive filter
• 28:16 Episodic memory vs. semantic memory
• 30:38 Non-declarative memories
• 32:47 How does an initially acquired memory get transferred into something longer lasting?
• 35:05 The keys to remembering: repetition and relevance
• 37:28 Attractors and dynamical systems
• 44:15 The cortical attractor theory of neocortex or neocortical memory
• 45:26 The binding problem
• 55:00 Closing
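The plasticity topics above all build on the basic Hebbian rule ("cells that fire together wire together"). As a rough illustration only, not Florian's actual model, the core update can be sketched in a few lines of numpy; the learning rate and activity vectors here are invented for the example:

```python
import numpy as np

# Illustrative Hebbian weight update: co-active pre- and
# postsynaptic units have their connection strengthened.
eta = 0.1                          # learning rate (arbitrary)
pre = np.array([1.0, 0.0, 1.0])    # presynaptic activity
post = np.array([0.0, 1.0])        # postsynaptic activity
w = np.zeros((2, 3))               # weights, shape (post, pre)

# Outer product gives delta_w[i, j] = eta * post[i] * pre[j]
w += eta * np.outer(post, pre)
```

After one update, only the synapses between co-active pairs (post unit 1 with pre units 0 and 2) are strengthened; all others stay at zero.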

Download the full transcript of the podcast here.[3]

Subscribe to Numenta On Intelligence: iTunes, Stitcher, Google Play, Spotify, RSS

[1] https://numenta.com/blog/2019/07/31/the-livestream-experiment-update
[2] http://www.diva-portal.org/smash/record.jsf?pid=diva2%3A1263428&dswid=-4633
[3] http://numenta.flywheelstaging.com/assets/pdf/numenta-on-intelligence-podcast/NOI-episode-14-conversation-with-florian-fiebig.pdf
  • 2 participants
  • 58 minutes

28 Oct 2019

After a summer hiatus, we are back with a new episode of the Numenta On Intelligence Podcast.

Host Matt Taylor talks to Numenta VP of Research Subutai Ahmad about the effort he has been leading in applying Numenta research and HTM principles to deep learning systems.

Paper link: https://arxiv.org/abs/1903.11257

Show Notes
• 0:51 Why is Numenta looking at deep learning?
• 2:43 What are the inherent problems in deep learning and how can neuroscience help?
• 3:06 Continuous learning
• 3:48 Catastrophic forgetting – “If you have really sparse representations and a better neuron model and a predictive learning system you can learn continuously without forgetting the old stuff.”
• 5:11 What does high dimensionality mean in deep learning and neural networks?
• 6:34 Why does sparsity help?
• 11:23 Other types of sparsity: dendrites are tiny, sparse computing devices
• 14:47 Another type of sparsity: neurons are independent sparse computing devices
• 15:34 Numenta’s paper on sparsity: How Can We Be So Dense? The Benefits of Using Highly Sparse Representations
• 19:34 The surprising benefit of sparse activations AND sparse weights, a rarity in the machine learning community
• 20:52 Benchmarks that we’re working with: MNIST, CIFAR10, VGG19
• 24:22 What does the future of research look like at Numenta?
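One way to picture the sparse activations discussed in this episode is a k-winner-take-all step: keep only the k strongest units in a layer and zero out the rest. The sketch below is a minimal illustration of that idea, not Numenta's actual implementation; the function name, k value, and input vector are invented for the example:

```python
import numpy as np

def k_winners(x, k):
    """Keep only the k largest activations in x; zero the rest.

    Illustrative k-winner-take-all sparsity, in the spirit of
    the sparse representations discussed in the episode.
    """
    out = np.zeros_like(x)
    top = np.argsort(x)[-k:]   # indices of the k largest values
    out[top] = x[top]
    return out

dense = np.array([0.1, 0.9, -0.3, 0.7, 0.2, 0.05])
sparse = k_winners(dense, k=2)  # only the two strongest units survive
```

Because most units are silenced, two different inputs rarely overlap in which units they activate, which is one intuition for why sparse representations interfere less and resist catastrophic forgetting.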
  • 4 participants
  • 29 minutes

22 May 2019

From the Numenta On Intelligence Podcast series.
  • 5 participants
  • 47 minutes