From YouTube: HTM Hackers' Hangout - Mar 4, 2016
B
Okay, so we're live. Hi, hello, and this is our HTM Hackers' Hangout. It's sort of our newest format, and our second or third one of them. So the goal of this meeting is really just to kind of sync up, to exchange ideas, and to see if anybody needs any help or is experiencing any problems with new things. I have a couple of things that I would like to talk about, but other than that I would just like to open the floor.
B
Let's see if anybody wants to discuss anything in particular. David Ray has been out sick for a while now; he's been really under the weather, so I don't know if he's going to end up joining us. He's the HTM.Java lead engineer; he works at Cortical.io, so I'm not sure he's going to join in. But other than that, we do have Subutai and the team in the office. There's Marcus Lewis on the left there and Austin Marshall on the right, and it looks like Felix has joined us as well.
B
We have a couple of newcomers we haven't seen before, so thanks for joining us; we appreciate you being here. So I'm just going to start with the topics I wanted to talk about, and maybe after we get through those we can talk about whatever anyone else wants to discuss. The first thing I wanted to mention was in NuPIC, when you're creating model parameters: the temporal implementation setting inside the model params.
B
Historically, that has always been "cpp". What that means is, not the spatial pooling part of it, but what we used to call temporal pooling, or the temporal sequence processing portion of it. It's basically specifying what algorithm should be used for that part of the HTM setup.
B
So,
like
I
said
historically,
it
has
been
CPP,
which
is
the
old
temporal
pooling,
10x
c++
implementation,
and
that
is
currently
what
we
would
call
the
production
version
of
the
algorithm.
We
have
been
working
for
a
long
time
on.
You
know
the
temporal
memory
algorithm,
which
is
like
the
new
hotness,
but
it's
not
as
performant
I,
believe
as
as
the
old
one
and
it's
not
as
rigorously
tested
I
would
say
as
the
as
a
new
temporal
memory
one.
B
So there are options now, instead of "cpp", for the temporal implementation in model parameters. You can use "tm_py", which is the Python implementation of temporal memory, or "tm_cpp", which is the C++ implementation of temporal memory, and which I believe is pretty much done. Correct me if I'm wrong; I know there were a couple of dangling issues about whether that was really ready or not. But both of those temporal memory implementations are still kind of new.
B
We wouldn't say that they're production ready; we haven't made them the default in any of our examples, and swarming doesn't try them at all. It always uses the older temporal pooling algorithm, which I guess is the second or third generation, depending on how Subutai counts it. So that was one thing, and I don't know if you guys in the office want to comment on that at all. It's up to you.
A
I think that's a pretty good description, what you just said. The new tm_py and tm_cpp are close to, but not quite, production ready yet. But people who are experimenting could use them, and that algorithm corresponds much more closely to what's in the white paper, so it's much easier to understand and follow along, right?
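As a concrete illustration of the setting being discussed, here is a minimal sketch. This is not from the video, and the surrounding keys are abbreviated placeholders rather than a complete NuPIC model-params file; only the "cpp", "tm_py", and "tm_cpp" values come from the conversation above:

```python
# Hypothetical, abbreviated model-params dict; a real NuPIC params
# file carries many more sections (sensorParams, spParams, and so on).
MODEL_PARAMS = {
    "modelParams": {
        # "cpp" selects the old production temporal pooler;
        # "tm_py" / "tm_cpp" select the newer temporal memory
        # implementations discussed here.
        "tmImplementation": "tm_cpp",
    },
}

def temporal_implementation(params):
    """Read back which temporal implementation a params dict selects."""
    return params["modelParams"]["tmImplementation"]
```

Switching between the three implementations is then just a one-line change in the params dict.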
B
That could be another entry in the NAB, hopefully; it's very similar. Okay, so that was the one topic I wanted to talk about. The other is data aggregation, just because this is something I've been doing a lot of lately, and it didn't quite make sense to me at first. So maybe for some of you out there it doesn't quite make sense either, and I'd like to explain at least how I understand it at this point.
B
When there are no power users on the outlet, the sensor itself gives me an update every 10 seconds or so, somewhere between 10 seconds and a minute, a relatively slow rate, and it basically says: no power usage, no power usage, no power usage. The same goes if there's a very constant power usage, like a lamp or something that's not constantly changing.
B
It
also
is
a
slower
rate
of
update,
but
if
it's
hooked
up
to
his
workstation
and
his
computer's
plugged
into
it,
he's
got
a
screen
on
nothing's
going
up
and
down
at
it
and
I'm
getting
constant
updates.
You
know
many
many
updates
per
second
and
I
sort
of
expected,
naively
that
I
could
just
take
that
stream
and
because
you
know,
if
you
look
at
that
data
over
time,
there's
a
daily
pattern.
There's
a
weekly
pattern
very
similar
to
like
the
hot
gym
example.
B
But this sensor's data is a lot more irregular. There are times when it's not sending many updates at all, and times when it sends a ton of updates, so that sort of has to be made into some type of regular flow, a regular stream. So I did put in place an aggregation strategy, you could say: instead of just allowing the sensor to decide how often it's going to stream data into the HTM, I'm going to do a mean aggregation over certain time periods. So maybe every 10 seconds I'll compute a mean.
B
I,
deviation
of
all
the
values
that
since
racine
and
send
that
end.
So
every
10
seconds
to
get
something
and
I
was
much
more
successful
using
it
that
way,
because
then,
since
the
HTM
has
a
something
regular
coming
in,
it
knows
that
what
scale
to
to
identify
the
patterns
am
I,
saying
that
right.
Does
that
make
sense
to
you
guys:
okay,
yeah.
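The aggregation strategy described above can be sketched in a few lines. This is a generic illustration, not the speaker's actual code, and the function name is made up:

```python
from collections import defaultdict

def aggregate_mean(samples, window_secs=10):
    """Bucket irregular (timestamp, value) samples into fixed windows
    and emit one (window_start, mean) row per non-empty window, so the
    model downstream sees a regular stream instead of the sensor's
    bursty update rate."""
    buckets = defaultdict(list)
    for ts, value in samples:
        window_start = int(ts // window_secs) * window_secs
        buckets[window_start].append(value)
    return [(start, sum(vals) / len(vals))
            for start, vals in sorted(buckets.items())]
```

For example, three samples at t = 0, 3, and 12 seconds collapse into two 10-second rows, one per window that actually saw data.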
A
There, it's being moved over.
B
I'd like to, and I thought about it, but one of the problems I'm having is that I'm using this HTM-in-the-cloud project that Jonathan Mackenzie worked on, and it's not very scalable. So I'd hit the wall pretty quickly with it: I'll create a few models, pass a few hundred thousand rows of data into them, and everything blows up. So I'm working through some scalability issues, and I'm trying to keep my footprint very small at the moment.
D
No, I'm working with a company on the East Coast this time, so the time difference is a bit easier. As for the Windows stuff, I've been keeping an eye on it since Christmas, and it's nice to see it going and to see people using it. I've been doing some support on the mailing list to make sure that people can use it properly, and it's nice to see Vitaly trying to tackle the 32-bit side.
D
Not just Windows, but also the other operating systems. But as far as the way we left it at Christmas: it's all running out there, all the integration tests and the swarming I put into a PR just before Christmas for NuPIC, so that runs on Windows. The 64-bit version people are using works.
E
Cool. And Matt, the other thing, it has kind of flown in under the radar. It's not exactly Windows related, but with our work on getting Bamboo set up, we actually have some pipelines running that are building a distribution of Python that has NuPIC built into it, for OS X. So technically, anyone could download that version of Python with NuPIC literally bundled into it, with no installation necessary. I wonder if there is some value in bringing some more attention to that.
B
I think there is. Specifically, there have been some people on Windows on the mailing list asking questions about how to get it running there. We clearly don't really have much Windows development experience, so this might be a good avenue for them, and all we really need is a wiki page with some instructions on how to go about it. Yeah.
B
Yeah, so I'll put that on my to-do list: to try and get someone to work on that wiki page. That would be really useful. Yeah, I've been surprised how many people have been trying to get NuPIC running on Windows, to the point where I've explicitly sent an email saying, by the way, we don't quite support NuPIC on Windows yet, because I feel like people are expecting it to work on Windows.
B
Richard
did
a
ton
of
work
and
thank
you
very
much
again
for
doing
that,
but
we're
not
quite
to
the
pip
install
new
pic
face
on
windows.
At
this
point
and
then
also
you
know,
Richard
you've
been
really
helpful
in
the
mailing
lists
to
get
these
people
up
and
running
him,
and
he
these
guys
have
gotten
up
and
running
with
new
pic
on
windows
for
the
most
part,
which
is
amazing,
awesome
with.
D
Because at the moment on Windows you've got the "pip install nupic" that picks everything up, if you're using the right Python. We've already seen somebody trying to use it with a 32-bit Python, where it can't find the right distributions for the bindings file. I think they tried the 64-bit and got it working; they just mailed back tonight that they got that working.
C
Exactly, yeah. So one little project I've been doing is nupic.core encoders, writing the encoders in C++. That by itself is pretty easy, just not that hard of a thing to do, but the more difficult, ambiguous part was: how do you fit that into a NuPIC network? So that's kind of a work in progress; probably today I'll have a pull request that actually does that.
C
So far I've only done the scalar encoder, but the rest of them are now basically low-hanging fruit, or will be in the next week or so, and then that will be out there. It's not that we can just pull those in everywhere, but if people get really interested in doing that kind of thing, it gives them a place they could dive in. So yeah, I guess once...
C
It'll be more that you'll see the pull request; you'll see how we have these C++ encoders and we also have these Python ones, and we use both. And so the way you'd do it in another language would be to kind of mirror that strategy. It wouldn't be to inherit from anything; it would be to mirror the strategy we use with Python.
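As a rough picture of what a scalar encoder does, here is a from-scratch sketch for illustration only; it is not NuPIC's actual ScalarEncoder class, and the parameter names are made up:

```python
def scalar_encode(value, min_val, max_val, n=100, w=11):
    """Map a number in [min_val, max_val] onto a binary vector of
    length n with a contiguous block of w active bits; nearby values
    get overlapping blocks, which is the property HTMs rely on."""
    value = max(min_val, min(max_val, value))      # clip into range
    span = n - w                                   # usable start positions
    start = int(round(span * (value - min_val) / (max_val - min_val)))
    bits = [0] * n
    for i in range(start, start + w):
        bits[i] = 1
    return bits
```

Mirroring this in C++ or another language, as described above, is mostly a matter of reproducing the same mapping rather than inheriting from anything.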
B
Okay, any other topics? Have you got anything on your mind, Felix?
F
Sure. So I've been reading a lot of papers, I guess, trying to figure out what to do next to best effect, and I've actually started playing around with two data sets to try to work out how the features being learned by HTM behave.
F
The big goal, I think, should be to learn features that are reusable and generic in some way, so that they can be recombined for novel inputs, and I'm not convinced that HTM is doing that at the moment. So there are topics like decorrelation that come up in the literature, and homeostatic mechanisms. Yeah, I'm still not quite sure what the right strategy is, but I'm looking at it.
F
The MNIST data set is one, and there's a data set from Josh Tenenbaum: basically a sort of cognitive data set on attributes of animals. It's like 80 animals, each with 60 features, something like that. It's sort of in the spirit of the "fox eats" data, but at a larger scale, and you can start to look at whether you can learn some generic features of types of animals, the stuff that it uses for feature induction.
F
And the associated synaptic fields: they can trace back the receptive fields. It's just what people do when, for example, reporting on MNIST data. You can report what the receptive fields of different neurons are, and they'll be representing some sort of small curve in a part of the field, say, and that can be reused for different types of characters. But you don't want to be learning, by rote, the whole shape of the character; you want to be learning the reusable bits.
F
I still have one more question; I don't know if it's the right context, but you sent an email about hackathons and possibly joining an external hackathon. Does that mean you're thinking of not holding a hackathon internally?
B
We haven't made that decision yet; I don't know yet. I'm going to be back in the office in a couple of weeks and we're going to have some major discussions about it. I am reaching out right now, with Christy, our director of marketing, to see how feasible it would be for us to attach some type of Numenta technology sponsorship to existing hackathons around the globe over the next year or two, maybe. It really depends on the response we get.
B
If
we
get
a
lukewarm
response
on
that,
then
I
want
to
have
a
Agatha.
If,
if
we
get
a
bunch
of
interested
parties
saying
yes,
you
can,
you
can
play
a
role
in
our
hackathon.
Come
bring.
Your
technology
show
it
to
everybody.
Maybe
we
won't,
it
depends.
I've
come
on
so
oh
hi,
chandan
and-
and
we
also
have
a
Dmitry.
If
you
have
any
things
that
Dmitry,
you
feel
free
to
unmute
yourself
and
join
them.
Oh
yes,
hello.
G
Okay, I'm working with electroencephalogram and near-infrared spectroscopy data. I analyze it with a fast Fourier transform and then send it to the learning algorithm, so Fourier analysis of the electroencephalography and spectroscopy. Then the fast-Fourier-transformed data is sent into a sensory substitution device, or another device in a smart home, Internet of Things setup, and these devices do some work toward one target condition, a given situation, through sensory substitution.
B
So I think this has been, this is, an unsolved problem, I think: taking any type of spectrum analysis data, or data from a spectrum like an electroencephalogram, and reducing it into a series of sparse distributed representations. We talked about this at the last challenge.
B
Last fall, one of our challenges was trying to analyze heartbeats, and we kicked around the idea internally of doing some type of frequency encoder, but I don't think we got very far on that. So there is no answer to this question yet: how do you encode that type of data? I'm not sure.
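For concreteness, one simple way to get from a raw signal window to the frequency-bin energies you would then have to encode is a plain discrete Fourier transform. This is a naive illustrative sketch, not a proposed encoder:

```python
import cmath

def power_spectrum(signal):
    """Naive DFT power spectrum of a real-valued window: returns the
    energy in each of the first len(signal)//2 frequency bins. (An FFT
    computes the same thing faster; this form just shows the math.)"""
    n = len(signal)
    spectrum = []
    for k in range(n // 2):
        coeff = sum(signal[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n))
        spectrum.append(abs(coeff) ** 2 / n)
    return spectrum
```

The open question raised here is the next step: turning those bin energies into a sparse distributed representation.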
H
Yeah, you mean me? Yes, okay. So, let's see: we have right now about 52 people that have signed up, and I guess the location allows for even more people. I sent out a note this morning, and we have another 10 days to go, and we generally see some RSVPs happen in the last week, so it should be an interesting get-together. This one has signs of being much bigger than the last one.
H
And Jeff, and a few people who have signed up for small demos, so it will be interesting to see all of those; I'm looking forward to it. We also have a slot there for Christy to talk about what's new and what the community should be aware of, so I think she is putting together some content for that.
B
This is a sign of growth in our technology and our community: companies are starting to come to Numenta asking if we can consult for them, you know, create an HTM network or put together a NuPIC app or something. That's not really our business strategy; we don't do consulting. So I went to the community and asked the mailing list if anyone is interested in doing HTM consulting, and put together a list, and I'll be keeping that list up to date.
B
If you have done HTM work in the past with NuPIC or Comportex, if you have experience putting together HTM networks, encoding data, and creating applications, let me know. You can email me at matt at numenta dot org and send me your basic information: bio, location, any related experience and such, and I will add it to the list.
B
This isn't really an endorsement from Numenta; I'm just collecting people. But I am keeping this list to pass on to businesses that are looking for HTM consultants, just to let you guys know. One other thing that I think might be interesting for you, Subutai, to mention: you recently went to COSYNE?
A
Yeah, so we're in the process of writing up a detailed blog post on it, so we'll be more detailed there, obviously. But we went to a conference called COSYNE, which is the Computational and Systems Neuroscience conference; it's the big one every year. We presented a poster, Jeff had a talk there, and we also got to see what everyone else is working on. It's kind of a mixture of theoreticians and experimentalists from the neuroscience world, and it's of moderate size, seven hundred to a thousand people.
A
So
when
algorithms,
you
know
detail,
scientific
thing
was
fairly
big
from
the
point
of
view
of
a
big
commercial.
He
wasn't
commercial
at
all
this
small
from
that
standpoint,
but
it
was.
It
was
really
interesting.
We
saw
a
lot
of
different
experiments
and
models
and
people
trying
a
lot
of
people
trying
sensorimotor
experiments
and
trying
to
understand
sensory
motor
behavior,
which
is
ties
in
nicely
to
stuff
that
were
twisted
in
some
people.
A
For those more familiar with it: as the task is learned, the representations in the brain become much sparser, which is exactly what you'd predict from temporal memory, and so a lot of people were trying to understand that. Why is it becoming sparser? And so there were a bunch of examples of that particular phenomenon. So, just stuff like that.
F
Sorry, there was this interesting discussion on the mailing list about metabotropic receptors that Jeff joined in on. The last thing he said, I guess a couple of weeks ago, was that you'd had a meeting and come to a really good understanding, and he didn't have time to go into it, but he did give back an outline, which is really interesting. I was just wondering if he's planning, or someone's planning, to expand on that.
A
I don't think he's had much time to really put his thoughts down. I know he's kind of rethinking a bunch of things right now; it's too early to really talk about it, but he is thinking about hierarchy and sensorimotor behavior and so on. I don't know if he's made much progress with metabotropic receptors as such, but you never know with Jeff. I would say just ask again on the mailing list, or I can.
A
We did see at COSYNE, from someone, Professor Berry from Princeton: he tried to teach an animal to recognize random sequences while recording from neurons, and he did see that the responses got sparse and that sequences were learned really fast, like within a few seconds. You got to see the really sparse responses, but he also saw some evidence of some sort of a pooled response, a neuron that would become active throughout this whole random sequence.
A
So we were trying to figure out, okay, how could it learn that fast? Are there maybe metabotropic receptors involved in there? We don't know. But there was definitely evidence of really fast sequence learning in V1, even very early visual cortex, and evidence of a pooled representation of a completely new random sequence. That was pretty cool.
B
So much we don't know. Okay, so thanks, all you guys, for joining. On an ending note, I just want to say: if anybody's watching and knows of a hackathon they think Numenta should be at, showing these hackers our technology and trying to get people to use it for their hacks, let me know personally at matt at numenta dot org. I would appreciate it. Thank you very much. Take care, see you on the mailing lists. See ya.