Description
Paper review: https://arxiv.org/abs/1804.02464 "Differentiable plasticity: training plastic neural networks with backpropagation"
It aims to connect the work we are doing on structural plasticity through Hebbian learning with continual learning.
Will possibly review a 2nd paper: "Backpropamine: training self-modifying neural networks with differentiable neuromodulated plasticity"
https://openreview.net/forum?id=r1lrAiA5Ym
Time permitting, Subutai will go over "Oscillatory responses in cat visual cortex exhibit inter-columnar synchronization which reflects global stimulus properties"
https://www.nature.com/articles/338334a0
So I think I'm live, but the meeting's not quite ready to start yet; we're having a small change of plans.
This is a research meeting, and I try not to disrupt research meetings. Why are we going to a different camera? Okay, I try not to disrupt research meetings; I just want to let it flow and live stream whatever happens there, so it's hard to plan.
I'm not sure how well this is going to work, but I'm just going to put this up for a minute while they're setting up. I wanted to get everything properly set up, so I'll be right with you. I should put up the... I know there's a "starting soon" screen. That's it! No, I used to have something that said "starting soon."
And you don't have to pay much attention to that; that's my agenda. I'm probably not going to stream all of it. I might stream the thing with Jaden later on Twitch. So for those just joining: we're going to start as soon as they're ready, but we might start with a different paper than the ones we were talking about. This is a live research meeting, so I just kind of go with the flow here. And here we go; we're getting something in just a moment.
That's pretty far, so these would definitely be in different cortical columns. The basic thing that he finds is that you can look at the receptive fields, and when they're nearby and the stimulus is consistent with their preferred orientation, you usually get some synchrony.
So here, we would say it's like the same object. I thought that was really interesting, because we've talked about voting across cortical columns and so on, but we never really talked about synchrony, and to me this is synchrony. In our models this was really interesting, because a lot of our stuff happens through active dendrites and NMDA excitations. If you have a neuron with these active dendritic segments, then in order for a segment to invoke an NMDA spike, or to fire, the inputs have to arrive within less than five milliseconds.
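A toy sketch of that coincidence requirement, assuming a segment fires only when enough input spikes land inside a roughly 5 ms window (the function name, threshold, and representation are hypothetical, not from the discussion):

```python
# Sketch of dendritic coincidence detection: a segment produces an
# NMDA-spike-like event only when enough of its synaptic inputs arrive
# within a narrow time window (~5 ms, as mentioned above).
# Threshold and window values are illustrative assumptions.

def segment_fires(spike_times_ms, threshold=4, window_ms=5.0):
    """Return True if `threshold` or more input spikes fall inside
    any sliding window of `window_ms` milliseconds."""
    times = sorted(spike_times_ms)
    for i in range(len(times)):
        # count spikes within [times[i], times[i] + window_ms]
        j = i
        while j < len(times) and times[j] - times[i] <= window_ms:
            j += 1
        if j - i >= threshold:
            return True
    return False
```

Spread the same four spikes out over tens of milliseconds and the segment stays silent, which is the point of the coincidence window.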
So I thought that was kind of interesting. All right, so my question to him was: this is great, but these are just lines; what about actual objects? I wanted to find out about things like: what if you have this, but it's actually the same object? So, you know, something like this, let's say a rectangle, and maybe one column is looking here and another is looking there.
Just the two patches, but they could be from the same movie or from different movies, so they could be coherent or not, kind of similar to that. And of course, again, I got this paper literally twenty minutes ago, so I don't recall all of it, but they make a bunch of different measurements. The conclusion, and I haven't really gone into more detail, is that when the patches are from the same movie and they're coherent, you get a lot more synchrony in the firing activity.
No, thank you. This is kind of an artificial stimulus, in the sense that you're only presenting these patches? It's like those images they show you: there are some circles and some lines, and when the lines move, you realize there's an object behind the circles. Yeah, it's kind of like that.
And this is an LSTM; you may not recognize it because of some modifications. I just took the memory out of the cell. A normal network has a problem when learning several time steps in a row: you get a loss at the end, and then you have a problem backpropagating it through the repeated steps. The way LSTMs fix this:
It adds a memory, and this memory, along with the input, defines the output at the current time step. That memory is modified: we have three gates. One of them defines how much of that memory you're going to take into account, so it's like an input gate, and the other two gates define how much of this memory you're going to forget and how much of what is learned here you can add to that memory.
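As a rough sketch of the cell being described (variable names, dimensions, and weight layout are my own assumptions, not the paper's), a single LSTM step might look like:

```python
import numpy as np

# Minimal LSTM cell sketch matching the description above: a memory
# (cell state) plus gates that control how much new content to write,
# how much of the memory to keep, and how much of it to expose as output.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W):
    """One time step. x: input, h: previous output, c: previous memory.
    W: dict of weight matrices and bias vectors for the gates."""
    z = np.concatenate([x, h])
    i = sigmoid(W["Wi"] @ z + W["bi"])   # input gate: how much new content to write
    f = sigmoid(W["Wf"] @ z + W["bf"])   # forget gate: how much memory to keep
    o = sigmoid(W["Wo"] @ z + W["bo"])   # output gate: how much memory to expose
    g = np.tanh(W["Wg"] @ z + W["bg"])   # candidate memory content
    c_new = f * c + i * g                # update the memory
    h_new = o * np.tanh(c_new)           # output at the current time step
    return h_new, c_new
```

Because the cell state `c` is carried forward additively, gradients can flow across many time steps without vanishing, which is the fix for the backpropagation problem described above.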
It's like a temporal trace. Imagine grammar, where something now can affect something, some concept, later on. It's very difficult to keep this trace around for ten steps. So what these things will do is: some part of it will recognize, "hold on, I need to remember this," and it will artificially keep it alive until the system says okay.
A memory component, and the memory component here is what he calls a Hebbian trace. The way it's calculated: the Hebbian trace is the product of the pre-synaptic activations and the post-synaptic activations, the inputs and the outputs in the paper's words, and the product of those composes this Hebbian trace. This Hebbian trace is updated from step to step, so from time step 0 to time step 1 it flows here, and it's updated with this correlation at the current time step.
So you can see it's kind of similar to the LSTM. The output is also calculated from the Hebbian trace: you have the input times a weight, plus the Hebbian trace times an alpha, which is also a set of weights that define how relevant the Hebbian trace is, how much it is taken into account, in the calculation of the output. So, similar to the LSTM.
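The plastic layer being described can be sketched roughly as follows. The structure (fixed weight plus alpha-scaled Hebbian trace, decaying trace update) follows the description above; the tanh nonlinearity, shapes, and the value of the decay rate eta are my assumptions:

```python
import numpy as np

# Sketch of a differentiable-plasticity layer: the effective weight is a
# fixed part w plus a plastic part alpha * Hebb, and the Hebbian trace is
# updated each step from the product of pre- and post-synaptic activations.

def plastic_step(x_pre, w, alpha, hebb, eta=0.1):
    """x_pre: pre-synaptic activations (n,). w, alpha, hebb: (n, m) arrays.
    Returns post-synaptic activations and the updated Hebbian trace."""
    # effective weights combine the fixed and plastic components
    y_post = np.tanh(x_pre @ (w + alpha * hebb))
    # decaying Hebbian trace: outer product of pre- and post-activity,
    # mixed into the old trace with learning rate eta
    hebb_new = eta * np.outer(x_pre, y_post) + (1.0 - eta) * hebb
    return y_post, hebb_new
```

Both `w` and `alpha` would be trained by backpropagation, while `hebb` changes within a single episode; that separation is the core idea of the first paper.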
This small learning rate in these equations is replaced by an actual gate. Okay, so this is the neuromodulation? Exactly. The neuromodulation is that this gate is going to tell you how much you can update at each time step. Before, you only had this global learning rate; now, in the second paper, you have a specific gate. So this, again, is the update rule.
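A hedged sketch of that neuromodulated variant: the fixed learning rate is replaced by a signal M(t) computed from the network's own activity, which gates the Hebbian update at each step. The gate's parameterization and the clipping range here are illustrative assumptions:

```python
import numpy as np

# Sketch of neuromodulated plasticity: a network-computed modulatory
# signal m replaces the fixed learning rate eta in the trace update.

def neuromodulated_step(x_pre, w, alpha, hebb, w_mod, b_mod):
    """x_pre: (n,). w, alpha, hebb: (n, m). w_mod: (m,), b_mod: scalar."""
    y_post = np.tanh(x_pre @ (w + alpha * hebb))
    # modulatory gate computed from the current output activity;
    # this scalar decides how much the trace updates at this time step
    m_gate = np.tanh(y_post @ w_mod + b_mod)
    # gated Hebbian update, clipped to keep the trace bounded
    hebb_new = np.clip(hebb + m_gate * np.outer(x_pre, y_post), -1.0, 1.0)
    return y_post, hebb_new, m_gate
```

When `m_gate` is near zero the trace is effectively frozen; when it is large the current pre/post correlation is written in strongly, which is the "specific gate" replacing the global learning rate.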
The differentiable update for what to remember is not always the same; instead, it cleverly figures out what things should be kept in the trace. I mean, this is how we normally have this context propagated throughout, and we do this with separate representations at every point in time. Here they did something equivalent, but just by saying: this is the thing you need, this is the context you need to keep track of to know what to put in next. So it was clever, and different from typical backpropagation.
I'm more asking for intuition about why: having the Hebbian piece, I was trying to understand how the relevance of that would decay over time. I think the LSTM does it, but it's kind of clever; I don't remember it. Okay, so that's essentially my curiosity, or a confusion about it, because they're not really decaying as time passes, which is odd.
It ties in with the notion of permanence: the more co-activation you have, the stronger the permanence would be, so the less likely it's going to go from one to zero. But if you're sort of in between, you're going to be right near the threshold, and you could easily go from one to zero. Now, sort of related to here: if the correlation between two units is very strong, the weights are going to be very strong. At last, my second-to-last one, so here.
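The permanence mechanism being referenced might be sketched as follows; the increment, decrement, and threshold values are illustrative assumptions, not values from any specific model:

```python
# Sketch of synaptic permanence: a scalar per synapse that co-activation
# nudges up and lack of co-activation nudges down, with a threshold
# deciding whether the (binary) connection exists at all.

def update_permanence(p, co_active, inc=0.05, dec=0.02, lo=0.0, hi=1.0):
    """Return the permanence after one step of (non-)co-activation."""
    p = p + inc if co_active else p - dec
    return max(lo, min(hi, p))

def connected(p, threshold=0.5):
    """A synapse counts as connected once permanence crosses the threshold."""
    return p >= threshold
```

A synapse with permanence far above the threshold survives occasional missed co-activations, while one hovering near the threshold can flip between connected and disconnected, which matches the "in between" case described above.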
Building a rough map from place cells: we're going to map this world and then use that map to get someplace. Or it could be like a bacterium. A bacterium has a strategy: it goes forward and then it turns, and it turns more often when it doesn't find what it's looking for, and this is more like the strategy a bacterium might have.
You just don't need that feature if you have this base that works there. But then I still want to have something: if I'm being fed something different, I want to have some way of adapting to it, and this is a mechanism to at least partially do that. And so, you know, it's clearly more powerful than what was there before.