Description
Carin Meier talked about the Clojure API for MXNet: the process of writing it, its relation to the bigger Apache MXNet project, future challenges, and more. This was followed by a broader discussion of the Clojure data science landscape and community challenges.
This was the second meeting of the Scicloj community, on May 15th, 2019, 10pm UTC -- an open online gathering of the Clojure data science community.
More details here:
https://twitter.com/scicloj/status/1125890426752389120
Meeting's text chat:
https://docs.google.com/document/d/1TYN9AtCEjMuXm0obvMo6qxpOaoqV4VtPFtU6wlGuWgo
A: Hello everyone, we're really pleased — we are at our second gathering now, with the great Carin Meier, as you can see. Of course, she is going to talk about MXNet and Clojure next. Also, we're recording this, because we're going to release it on YouTube in the next few days. So if you have a problem with that, let us know right now; otherwise it's okay, and of course you can then share it if someone wants to see it and it wasn't possible for them to join the meeting right now. So, without further ado — Carin.
Carin: So, let's go — can everyone hear me? Yeah. I have the chat over in my other window, so I'll look over at that occasionally too. So if you want to put questions in there as I go along, I'm happy to take questions as we go. This isn't a big, formal presentation or anything like that; it's more of just explaining things, showing where we are in the community and some challenges, and then we can open up a larger discussion.
Carin: So MXNet is an Apache project, and it's incubating. I'm just gonna bring up the website here — it explains it better. So it's incubating: it's not quite graduated yet. It's getting close to graduation, but it's still in the incubation phase of being an Apache project, and we can talk more later about what that means. But it's a flexible and efficient library for deep learning.
Carin: Let me see — there are lots of different language bindings, the higher-level languages: there's Python, and look, we're there — Clojure! — Java, Julia, Perl, R, Scala. It also uses TVM; we had the presentation last time on TVM and a lot of that stack. So, the core: this is actually an important distinction in my mind from other deep learning frameworks like TensorFlow. The whole core of the framework is in C/C++, and the higher-level languages wrap around that API and build on it.
Carin: This is as opposed to, say, TensorFlow — as I understand it; I could be wrong about this, but when I've talked to people, this is one of the challenges they've had in integrating with it: only part of the core is in C, and the rest of the core is actually in Python, so you have to go through Python to interact with the C stuff.
B
So
what
can
you
with
them
accent?
You
can
do
all
sorts
of
deep
learning
tasks
and
one
of
the
classic
ones
that
you'll
see
a
lot
or
image
recognition.
Image
classification
you
can
tell
whether
things
are
cats
or
dogs
or
especially
recognize
human
faces.
That's
like
an
important
thing
now
and
for
us
for,
like
industry
stuff,
we
can
do
a
lot
of
stuff,
a
text,
so
tech
sentiment,
whether
it's
a
positive
or
a
negative
sentiment
or
what
they're.
What
this
article
is
about.
Carin: Summarization; classification — is this a legal document or something else; question answering — you can do that with it now, and I'll actually show a demo of that; text extraction — you get a document and you want to extract some sort of text from it, and not in a brittle, rule-based fashion. That's also core to it. And there's tons more that I haven't touched on — all sorts of things: there's reinforcement learning, there's GANs — the generative stuff — there's just a whole rich field of things you can do with deep learning, and we've only scratched the surface.
Carin: The real talk is that this space is dominated by a pretty small number of major deep learning frameworks that people use, and you'll notice that there are some names next to them: we all know Google with TensorFlow, PyTorch is backed by Facebook, and MXNet is backed by AWS. Why is this important? Because this field is moving fast and furious, and to keep up you really need a company backing it, putting time and resources and effort into it — and these are the frameworks that these companies are supporting.
B
It's
really
hard
for
a
small
player
to
come
on
the
field
and
compete
with
these
people
he's
father,
deep
learning
frameworks.
If
they're
like
ok,
we
got
like
three
developers.
We're
gonna
make
a
deep
learning
framework.
Well,
you
can,
but
you
know
it's.
This
is
a
reality.
This
is
a
reality
of
a
successful,
deep
learning
framework
project
and
actually,
if
you're
talking
about
like
open
source
in
general,
it
is
the
trend
to
have
a
really
big,
successful,
open
source
project
you're
going
to
have
backing
by
a
company
that's
putting
full-time
developers
into
it.
B
The
other
kind
of
reality
check
sort
of
thing
that
I've
noticed
a
lot
as
I've
gotten
into
deep
learning
and
I've
attended
conferences.
I've
attended,
Leichner,
apps
I've,
attended
a
CLR
I've
read
like
all
the
blogs
and
I
I.
Have
it.
I
have
semi
good,
at
least
for
me.
I
think
I
have
a
heartbeat
of
you
know
what
is
going
on
in
the
steep
learning
community,
and
this
is
actually
there
was
another
talk
and
it
was
video
broadcast
by
I.
Think
Aria
I
think
you
mentioned
this
too.
Carin: Performance-wise, there are all sorts of struggles with putting that sort of code into production when it's Python-based, and the same with PyTorch — I mean, they're dominating the research field right now, just kind of dominating it. But then you look at MXNet and AWS: industry, and putting stuff in production, is AWS's bread and butter, and I think that really marries well with MXNet's strengths — its performance, and also the multiple languages it can run from. You can run it on the JVM.
B
You
know
it's
not
it's
not
limited
to
just
the
Python
ecosystem,
so
I
think
that
MX
net
and
AWS
they're
they're
really
poised
for
success
for
leveraging
deep
learning
in
a
production
industry
setting
in
the
next
years.
So
they're
like
in
third
place
right
now,
I
would
say
for
the
deep
learning
frameworks,
but
I
expect
them
to
really
tip
up
very
shortly.
But
again,
this
is
my
personal
personal
opinion
on
this.
Carin: So this is a totally valid response, because this isn't the first time — when I first got into it, I was like: oh, I don't want to use Scala, I want to just keep it pure Clojure; what can I do? So I get a lot of reactions where people are like: why isn't it pure? I don't want to have to deal with Scala.
B
So
let's
have
some
real
talk
about
purity,
so
I
love
closure.
You
know
we
all
hear
a
left
closure.
This
is
why
we're
here,
but
you
know,
closure
is
pragmatic.
I,
remember
people
saying
about
closure:
oh
I,
really
like
closure,
but
it's
on
the
JVM
and
I.
Don't
like
the
DVM
I.
Okay,
we
just
have
just
plain
closure,
but
closure
is
pragmatic.
You
know
there's
a
lot
of
advantages
that
we
get
pragmatic
advantages
using
libraries
on
the
JVM
and
the
reality
is
right.
Carin: And the reality is, right now, as we stand, the Clojure world is really small compared to what's out there for Python and the rest of the deep learning world — I mean really small. I recently went to ICLR, another big deep learning event, and there was exactly one other Clojurist there — someone from my company who went with me. We were the only ones out of, I think, about 3,000 or 4,000 people. So it's really small once you get out of your own sphere and your own ecosystem.
Carin: However, this is open-source software, so if somebody really feels passionate about this and wants to put in the time to build from the base bindings, you can definitely do that, and we can take that and run with it. But this is kind of a tree structure of the project: there's a lot of work that's gone into the Scala package, just building things out, that we can take advantage of — the stuff marked with stars here.
B
So
the
generated
and
the
array
and
simple
stuff
that
comes
from
the
base
C
stuff
and
like
some
of
this
MD
array
and
some
of
this
symbol,
this
is
what
you
could
get
if
you
just
kind
of
do
some
direct
C,
J&I,
stuff
or
building
out.
However,
you
want
to
just
directly
interface
to
the
C
package.
You
would
have
to
build
out
all
this
other
stuff
as
well,
so
we
are
not
only
getting
you
know
future
and
current
development.
B
Okay,
so
let's
talk
the
state
of
the
closure
MX
net
package,
so
I
think
it
was
July
when
the
PR
last
July,
when
the
PR
got
accepted,
that
it
was
actually
part
of
the
larger
apache
MX
net
project.
So
we've
been
part
of
the
main
package
for
nearly
a
year,
we're
still
kind
of
under
this
contributor,
but
we're
nearing
graduation
out
of
the
contrib
I.
B
You
know,
I
think
what
we'll
do
is,
you
know,
take
a
inventory
of
where
we
are
and
how
far
we've
come
in
a
couple
months
and
put
out
a
vote
and
discussion
to
the
group
to
have
us
graduate
out
so
I
think
we're
pretty
near
that
point:
we're
gaining
contributors
and
committers,
which
is
wonderful.
This
is
this
is
what
we
we
want
to
have
happen
and
I
get
more
people
involved
and
grow
the
package.
We
have
one
pp,
PMC
member,
which
is
me
right
now,
but
again
hope
to
get
more
people
up
there.
B
What
what
is
this?
This
is
the
modeling
project
management
committee,
because
we're
still
incubating
it's
called
a
pod
Ling.
But
the
the
point
of
this
project
management
committee
is
against
voting
rights.
We
look
at
the
long
term,
health
and
growth
of
the
MXM
project
and
help
vote
on
bringing
new
members
in.
B
So
it's
very
good,
so
it's
actually
I'm
a
committer
as
well
as
if
he
used
to
be
a
member.
So
that's
like
to
committers
and
there's
more
to
come.
So
the
other
thing
that
I
like
about
this
is
that
each
each
committer
that
is
admitted
into
the
larger
group
of
commit
rights.
This
is
not
just
a
closure
person.
Voting
on
you
know
who
would
be
a
good
person
that
have
in
this
committer.
This
is
people.
B
You
know
across
a
scholar
package
across
you
know
python.
I
mean
it's,
it's
a
larger,
larger
group,
so
it's
really
community
and
so
being
admitted
into
a
committer.
I
think
is
a
really
big
big
deal
because
that's
like
in
a
larger,
a
larger
community
setting
of
just
bigger
than
closure
ist's.
So
I
think
that's
really
great.
B
Over
a
hundred
and
fifteen
closure,
pr's
have
been
opened
and
merged
across
this
nearly
a
year,
and
this
is
cross
language
too.
So
we've
had
enclosure,
people
are
doing,
scholars
spell
out
people
to
enclosure,
stuff
java
stuff,
and
you
know
it's
really
good
to
see
the
collaboration
I
like,
and
we
talked
about
nearing
graduation
napkin
trip.
So
I
think
it's
a
really
really
great
project
to
be
part
of
and
to
build
awareness
of
closure
in
the
larger
deep
learning
community.
Carin: So there's state-of-the-art work with text, and — it just came out in the last couple of days — state-of-the-art facial recognition. I'll show you this: this is "RetinaFace: Single-stage Dense Face Localisation in the Wild". It just came out a few days ago, and this is the cool thing — I haven't had time to try it yet, but you could recognize all these faces, so many faces.
Carin: Oh sorry, I wasn't looking at the chat. "Can MXNet take advantage of GPU tensor units?" It can — it does do GPU. There is a CPU version and a GPU version on Linux; on Mac there's only CPU, so some work needs to be done to get the GPU working there. And I don't know what RTX 20-series GPUs are, so maybe you can explain that, Alan, or I could look it up later.
Carin: Okay, so if you go into, say, the computational complexity theory part of this question-answering data set, you'll have a paragraph about complexity theory a little above, and then some questions that are based on it, and the task is actually getting an answer out of it — you know, "What is the term for a task that generally lends itself to..." — and there's the answer. They have all sorts of areas you can ask about; there's one about the rain forest.
B
Now
how
many
Scarecrow
millimeters
are
covered
in
the
rain
forest.
Now,
how
many
nations
control
those
region?
Just
it's
really
comprehensive
and
stuff
that
I
look
at
and
went.
You
know
there's
no
way
a
computer
could
do
this,
but
we're
there
we're
at
that
point.
So
this
takes
some
Bert,
which
is
a
Bert
model
that
has
been
fine-tuned
on
this
squad.
Carin: The Scala bits are hidden there underneath, so you shouldn't be dealing with the Scala bits unless you're really developing the package. I'll show some code for this so you can see what it looks like — I'll show you the results first. So this is taken kind of directly from the SQuAD data set.
B
You
know
in
a
1960s
series
discovering
the
most
important
information
of
the
seafloor
spreading
blah
blah
blah
blah,
but
the
input
question
that
was
put
into
this
was
what
was
the
most
important
question,
and
this
was
the
ground
truth
answer
and
it
worked.
It
got
the
predicta
answer
and
then
I,
even
though
it
wasn't,
you
know
trained
on
this
I
just
put
in
my
own
questions
and
answers
just
for
fun.
So
you
know
here's
my
my
answer.
B
You
know
Susan
had
a
cat
named
Sammy
when
she
lived
in
the
greenhouse
and
my
input
question
was
what
was
Susan's
coming
and
it
said
Sammy.
So
that
was
just
totally
inference.
It
wasn't
trained
on
that
at
all.
So
this
is
another
one
that
I
just
made
up.
Well,
actually,
I
didn't
make
it
up
because
I
think
it
took
it,
maybe
from
the
Wikipedia
or
something
like
that
about
the
creator
of
the
closure
language,
so
Mike
white
asked
it
like
who
created
closure
and
it
predicted
rich,
so
very
cool,
cool
stuff,
so
yeah.
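The span-extraction mechanics behind that demo can be sketched independently of MXNet. Below is a minimal Python illustration (not the actual Clojure code; the tokens and scores are made up) of how a BERT-style QA head picks an answer: the model assigns every token a start score and an end score, and you take the best-scoring valid span.

```python
# Minimal sketch of extractive question answering (not the MXNet code):
# given per-token start/end scores from a model, pick the best span
# with start <= end and return the corresponding words.

def best_span(tokens, start_scores, end_scores, max_len=10):
    best = (float("-inf"), 0, 0)
    for i, s in enumerate(start_scores):
        for j in range(i, min(i + max_len, len(tokens))):
            score = s + end_scores[j]
            if score > best[0]:
                best = (score, i, j)
    _, i, j = best
    return " ".join(tokens[i : j + 1])

tokens = "Susan had a cat named Sammy".split()
# Hypothetical scores a trained model might produce; "Sammy" wins.
start = [0.1, 0.0, 0.0, 0.2, 0.1, 3.0]
end   = [0.0, 0.1, 0.0, 0.1, 0.2, 2.5]
print(best_span(tokens, start, end))  # -> Sammy
```

In the real demo the scores come from a BERT model fine-tuned on SQuAD; here they are fabricated so the snippet runs standalone.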
Carin: Okay — so, like I said, even though underneath we are using the Scala stuff, you won't see that in your normal day-to-day use. If you're doing new development, where you're building new interop and new functionality, you will have to deal with it, and I can show you exactly what that entails. But this looks pretty much like your regular Clojure code: you have some pre-processing, you've got your infer step...
Carin: So, like, when creating a new module you're gonna have to do some data-structure interop going in and out — that's just what it is; not too bad, I think. Okay, next example. The next example I wanted to show is fine-tuning, and this was super exciting to me, because we were able to take this GluonNLP tutorial.
B
We
support
the
module
api,
there's
a
module
api
and
then
there's
a
gluon
api.
So
the
gluon
api
is
just
another
another
way
to
do
the
deep
learning,
that's
a
little
bit
more
friendly
and
the
python
has
it.
So
we
need
to
build
that
out
too,
but
this
is
another
project
called
gluon
and
NLP
that
specializes
in
LP,
and
they
have
this
nice
tutorial.
That
was
like
fine
sentence.
B
B
B
So
here
is
the
fine
tune.
Burped
and
you'll
see
that
the
kernel
starting
and
it's
line
closure.
So
the
line
closure
plug-in
this
kind
of
got
me
as
well.
But
if,
if
you
just
use
a
regular
closure,
Pro
that
play
in
and
won't
use
your
line
again
dependencies,
but
if
you
use
billion
closure,
it
does
pick
up
the
dependencies,
which
is
what
you
want.
So
the
kernel
is
ready.
Now
can.
B
B
So
we're
going
to
load
up
the
pre-trained
Burton
model,
so
you'd
have
to
I've
already
downloaded
these
scripts
I
can
make
you
wait
for
that.
So
baby
move
you
in
get
Bert
data.
The
the
pre-loaded
weights
will
be
loaded
up
for
you
or
will
be
exported
for
you,
so
you
can
just
load
the
preteen
model,
so
I
like
this
picture,
and
you
can
look
at
this
all
a
little
bit
later
at
your
convenience.
So
it
kind
of
shows
you
the
Burton
article
model
architecture
for
the
sentence,
pair
classification.
Carin: What we're going to do here is create a fine-tuned model from it: it's going to take the pre-trained model in, add a dropout and a fully connected layer, and create an output layer — a softmax — for classification at the end. So we're going to have our two classes for the classification that we're going to do, and we'll show that in a little more detail so it makes more sense. Then there's pre-processing. Pre-processing is always kind of huge in these sorts of things.
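The classification head described here — dropout, a fully connected layer, then a softmax over the two classes — is a standard construction. A rough numeric sketch, independent of MXNet (the feature values and weights are invented; dropout is omitted since it only matters during training):

```python
import math

def softmax(xs):
    # Subtract the max for numerical stability before exponentiating.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def dense(features, weights, bias):
    # One fully connected layer: logits[k] = sum_i w[k][i] * x[i] + b[k]
    return [sum(w * x for w, x in zip(row, features)) + b
            for row, b in zip(weights, bias)]

# Hypothetical pooled BERT output (3 features) and a 2-class head.
features = [0.5, -1.0, 0.25]
weights = [[1.0, 0.0, 2.0],   # class 0: "not equivalent"
           [0.0, 1.0, 0.0]]   # class 1: "equivalent"
probs = softmax(dense(features, weights, [0.0, 0.0]))
assert abs(sum(probs) - 1.0) < 1e-9   # softmax outputs a distribution
```

The real fine-tuned model's head does the same thing, just over the pooled BERT representation and with learned weights.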
B
B
B
So
the
ones
that
we're
interested
in
is
we're
interested
in
sentence
a
which
is,
you
know
he
said
the
foodservice
pie.
Business
doesn't
fit
the
company's
long-term
growth
strategy
sentence,
B,
which
is
the
food
pie.
Business
does
not
fit
our
long-term
growth
strategy
and
whether
it's
equivalent
or
not,
are
the
two
sentences
equivalent
or
not.
So
if
they
are
equivalent,
then
it's
going
to
be
one
and
if
it's
not
equivalent,
it's
going
to
be
a
zero.
B
So
there's
a
little
bit
about
like
toque,
embeddings
and
position
and
beddings
with
bert,
which
I'm
not
going
to
go
into.
So
you
can
kind
of
go
through
a
walk
through
this
and
read
it.
But
then
you
do
pre-processing
to
get
all
this
text
in
the
right,
padding
and
everything
to
go
into
the
input
model.
The
the
notebook
is
a
the
notebook,
so
if
you're
looking
at
it-
and
you
go
I'll
put
this
in
the
link,
this
is
actually
right
here
and
examples.
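The padding step mentioned above — getting every tokenized input to a fixed length before it goes into the model — looks roughly the same in any language. Here is a plain-Python sketch (the token ids are made up, and real BERT pre-processing also inserts special tokens and segment ids):

```python
def pad_or_truncate(token_ids, seq_len, pad_id=0):
    # Fixed-length input: truncate long sequences, pad short ones,
    # and return a mask marking which positions hold real tokens.
    ids = token_ids[:seq_len]
    mask = [1] * len(ids) + [0] * (seq_len - len(ids))
    ids = ids + [pad_id] * (seq_len - len(ids))
    return ids, mask

ids, mask = pad_or_truncate([101, 2054, 2003, 102], 6)
print(ids)   # [101, 2054, 2003, 102, 0, 0]
print(mask)  # [1, 1, 1, 1, 0, 0]
```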
B
B
B
B
Then
we
kind
of
construct
it
all
and
we
make
this
iterator
out
of
it.
This
mxi
o
ND,
ed
Rader.
So
at
the
end
you
have
an
ND
rate
iterator,
so
the
NDA
rate
at
array.
Let's
talk
about
a
little
bit
with
it
that
does
this
iterator
helps
as
its
training.
It
helps
cycle
through
things
batch
it
and
you
know
shuffle
it
if
it's
appropriate
and
just
you
know,
manage
your
data
in
a
performant
way.
So
that's
what
the
NVRA
and
Rader
does.
Okay.
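What that iterator does — batching, and optionally shuffling, the data as training cycles through it — can be sketched in a few lines of plain Python, just to show the idea (a real iterator also pads the last partial batch and keeps the data in NDArrays):

```python
import random

def batch_iter(examples, batch_size, shuffle=True, seed=42):
    # Yield fixed-size batches, visiting examples in a shuffled order
    # when asked; the last partial batch is dropped in this sketch.
    order = list(range(len(examples)))
    if shuffle:
        random.Random(seed).shuffle(order)
    for start in range(0, len(order) - batch_size + 1, batch_size):
        yield [examples[i] for i in order[start:start + batch_size]]

data = list(range(10))
batches = list(batch_iter(data, 3))
assert all(len(b) == 3 for b in batches)
assert len(batches) == 3  # 10 examples, batch size 3 -> 3 full batches
```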
B
So
now
we're
the
point
that
week
with
it
all
together,
I'm
only
gonna
run
it
for
one
epic,
because
you
know
we
have
other
things.
We
want
to
do
her
time,
but
it
puts
it
all
together,
you
make
a
fine-tuned,
mod
module
and
you
can
give
the
context
I'm
running
about
CPU,
which
I
think
is
pretty
amazing.
Then
I
can
run
this
on
my
CPU.
You
can
run
that
on
GPU
come
around
multiple
GPUs.
B
If
you
wanted
to,
and
you
can
just
run
fit
on
it
and
it
takes
the
the
pre-trained
arc
Rams
and
ox
prams
and
here's
your
optimizer
and
it
uses
a
callback.
The
callback
you
can
adjust
the
callback
size
I'm,
just
putting
it
to
one
because
I
like
to
see
its
progress,
because
I
am
running
out.
My
CPU
and
it's
kind
of
slow
I
mean
just
generally.
It's
amazing
that
I
can
run
it
at
all.
Just
I
guess
my
point,
but.
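The shape of that fit call — epochs, a per-batch update, and a progress callback invoked at an adjustable period — can be sketched like this (plain Python; `step_fn` is a stand-in for the real gradient update, and all names here are invented for illustration):

```python
def fit(batches, step_fn, epochs=1, callback=None, period=1):
    # Minimal training-loop shape: run step_fn on each batch and
    # invoke a progress callback every `period` batches.
    for epoch in range(epochs):
        for i, batch in enumerate(batches, 1):
            loss = step_fn(batch)
            if callback and i % period == 0:
                callback(epoch, i, loss)

log = []
fit([[1, 2], [3, 4], [5, 6]],
    step_fn=lambda b: sum(b) * 0.1,      # stand-in for a real update
    callback=lambda e, i, l: log.append((e, i, round(l, 1))),
    period=1)
print(log)  # [(0, 1, 0.3), (0, 2, 0.7), (0, 3, 1.1)]
```

Setting `period` to one, as in the talk, reports after every batch — useful on a slow CPU run where you want to see progress.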
Carin: MXNet has performance — it's the most performant framework out there, I think. Also, depending on what you value, I think having it be Apache open source is important, because that means that if you want to get involved, if you want to drive where it goes and make sure it meets your needs, you can have a say and help steer it in the direction that you need. With other packages that might not be the case, because they are not Apache.
A: Yeah, thanks. When I started to use it, it actually felt like there were fewer bells and whistles than TensorFlow, let's say — I'm not very experienced, so I can't say much about that — but it felt much leaner and smoother than TensorFlow. That might also be because I came to it lately from TensorFlow, so I might be biased, I don't know. And I see that there are already a few questions in the chat. Okay — there's a question about the huge new language models.
Carin: You know, OpenAI's GPT-2 — the amount of resources is just incredible. I think if you're going to do any real task, starting from scratch and trying to train on all the data out there on the internet is not going to be a very good use of your resources. I would say the best use of your resources is taking something that's out there, that's state of the art, and then fine-tuning it to your data set.
B
Right
so
we
we
are
working
on
putting
it
into
production,
but
it's
not
quite
in
production,
yet
so
I
don't
know
how
much
I
can
speak
of
it
about
this,
of
maybe
other
people,
but
I've
heard
lots
of
people
talk
about
the
challenges
getting
pipelines
together
and
you
know
making
sure
that
your
accuracy,
you
know,
is
measured
over
time,
so
I
think
there
I
think
again.
This
is
an
area
where
you
know
you
have
the
research
and
then
you
have
industry
I.
B
Think
industry
is
really
going
to
start
picking
up
the
slack
and
learning
how
the
best
practices
of
building
these
pipelines
and
production
izing.
This
and
I
think
that
having
the
JVM
and
having
MMX
net
are
some
good
tools
to
do
this.
So
I
don't
have
the
I.
Don't
have
the
answer
on
this,
but
I
think
it's
evolving.
B
Right,
do
you
do
most
problems
you
run
across
with
musi,
and
the
Scala
lair
I
would
think
that
it's
mainly
figuring
out
your
models
so
but
saying
that
I
mean
you
do
have.
This
is
one
disadvantage
of
running
this
high
up
on
the
stack
right
because
there
are
three
different
layers
of
debugging,
but
you
know
that
you
have
the
error
that
could
occur
in
the
closure
layer.
It
could
occur
in
the
sky
layer
and
it
could
occur
in
the
sea
layer
layer.
So
there
are
three
three
possible
areas
that
it
could
occur
in
in
practice.
B
You
know
I
see
I
see
most
of
my
debugging
time
is
either
you
know
in
the
normal
closure
errors
we
make
or
trying
to
get
the
models
right
in
in
the
layers,
and
when
you
have
like
a
layer
mismatch
it
will.
It
will
complain
at
you
at
the
sea
level.
You'll
get
a
get
a
stack
and
just
ectric
some
Java,
but
it's
from
the
sea
level.
Interop
that'll
say
you
have
a
layer
mask
match.
You
had
like
31,
you
know
layers
coming
in
here
and
it
was
expecting
103
and
you're
like.
Carin: I think I kind of went into that — the Scala interop. I didn't know any Scala at first, going into this project, and I got pretty proficient at going back and forth, so now I can translate a Scala example to a Clojure example, and it really isn't that bad. And, like I said, you don't have to do that.
Chris: I have a question — this is Chris. I was curious: there was some discussion somewhere else about somebody using different language models for different languages, and I've researched BERT and Gluon a little bit, but not much. Carin, for instance, if you had a bunch of documents in Italian, could you run that same fine-tuning?
B
I
think
there
is
a
different
language
model
for
it,
so
I
think
you'd
do
a
different
pre-trained
language.
I'd
have
to
do
some
quick
research
on
that,
but
I
think
it
depends
on
the
corpus
that
is
strained
on
as
well
mm-hmm.
So
this
is
trained
on.
Like
you
know,
English
based
MX
net
is
actually
very
big
in
China,
it's
like
huge
in
China.
B
A
B
B
Contributive
I
can
give
you
the
link
later,
but
there's
there's
like
a
wiki
for
the
Apache
and
any
anybody
that
has
proposals
that
you
want
to
do
like
anything.
You
just
kind
of
post
there
and
people
give
you
feedback
on
it.
You
know
before
you
fight
it's
basically
a
place
to
give
your
design
documents.
So
one
of
the
things
is
in
progress
is
a
new
API
for
ndra
and
I
can
show
you
a
little
bit
of
it.
It's
so
much.
Carin: Here's an example — here's power. You can do power, and it takes in a Scala scalar and an NDArray, or it can take in two NDArrays. So you kind of need to know what you're doing here — and now I'll show you the new API, which is much nicer.
B
So,
look
at
this
check
that
out
and
look
at
like
the
doc
strings
and
everything,
so
these
are
all
taken
from
the
underlying
C
API
and
generating
the
functions
from
that
so
you're
going
to
get
a
lot
of
dock
strings.
To
tell
you
what
it's
doing,
you're
gonna
get
keys
that
tell
you
what
the
options
are.
So
it's
really
it's
really
just
a
fantastic,
so
we're
still
using
the
Scala
package
and
like
the
J&I
functions,
but
we're
talking
directly
to
the
J&I
functions
at
this
point
and
being
able
to
generate
the
functions.
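The code-generation idea here — reading operator metadata from the underlying C API and emitting one documented function per operator — can be sketched as follows. The registry below is invented for illustration; the real generator introspects MXNet's operator list rather than a hand-written table:

```python
# Hypothetical operator registry, standing in for metadata queried
# from the C API; each entry carries a docstring and an implementation.
OPS = {
    "power": ("Element-wise base ** exponent.",
              lambda a, b: [x ** y for x, y in zip(a, b)]),
    "add":   ("Element-wise addition.",
              lambda a, b: [x + y for x, y in zip(a, b)]),
}

def make_op(name, doc, impl):
    # Wrap one registry entry as a named, documented function,
    # the way the generated API attaches docstrings per operator.
    def op(a, b):
        return impl(a, b)
    op.__name__ = name
    op.__doc__ = doc
    return op

generated = {name: make_op(name, doc, impl)
             for name, (doc, impl) in OPS.items()}
assert generated["power"]([2, 3], [3, 2]) == [8, 9]
assert "Element-wise" in generated["add"].__doc__
```

The payoff is the one described in the talk: every generated function carries the C API's own documentation, instead of being an undocumented pass-through.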
Carin: So that's one initiative. Another initiative is building out the Gluon API in Clojure, and that's a bigger project, but it has some interesting points, so I'm gonna bring it up. I don't know if everybody is really excited about this, but I know at least one person I was talking to is: there's autograd. You could potentially do autograd with just a given function, because it's got a C API, so maybe that's something we can leverage. It's also got models — ooh.
B
Basically
do
a
version
of
this
book,
but
with
all
the
closure
examples
which
would
be
super
cool,
so
that
is
one
thing
that
we're
looking
into
this.
Oh
that's
a
bigger
project
as
well,
so
if
anybody's
interested
in
like
building
out
another
API-
and
you
know
maybe
eventually
getting
this
stuff
together-
I'll
see
you.
B
So
that's
not
quite
I
was
looking
for
us.
I
can
see
balu
and
model
zoom
anyway.
They
change
this
around.
So
I.
Don't
want
to
look
for
it
on
here,
but
anyway,
there's
a
whole
there's
a
whole
pre-trained
like
model
zoo
that
you
can
get
to
for
the
gluon
stuff.
That
makes
it
really
easy.
So
I
think
you
know,
building
out
the
gluon
API
to
making
it
a
lot
easier
for
users
to
use
build
parity
with
the
Python
package
and
be
able
to
leverage
all
the
documentation
that
they
built
in
books.
B
There's
I
think
another
area
that
matches
up
with
our
strengths
as
industry
and
closures
would
be
to
also
focus
on
the
text
aspects
of
it
because
in
most
businesses,
you've
got
tons
of
text
or
documents
or
something
sitting
around
so
I.
Think
the
opportunity
for
using
deep
learning
in
an
industry
setting
using
text
is
much
higher
than
I'm
using
images.
So.
Carin: So Scala is the main player for the JNI interop, and they've built out a partial Java API on top of the Scala one, to use directly from Java. They have done benchmarks on that, and the time for the conversion of the data structures is insignificant compared to the actual core time spent doing the work at the C level. So I would expect it to be the same for us — the overhead of translating between the data structures is insignificant compared to the real work time.
B
Right,
okay,
so
that's
a
good
question.
Okay!
So
yes,
the
skeleton
drop
is
needed
for
the
ndnd
raise
the
the
j'ni
code
is
expecting
those
scholar,
data
structures
to
translate
the
and
the
array
code.
So
yes,
you
do
need
the
Scala
Interop.
Would
it
be
possible
to
extend
the
core
matrix
protocols
to
support
the
nd
arrays
as
a
core
matrix
implementation?
I,
don't
know.
Maybe
I
haven't
really
looked
at
extending
core
matrix
protocols,
but
you
know
maybe
a
you
could
try
it
I,
don't
know.
A: I have another one, actually. Let's say there's a beginner who would like to learn about deep learning, and we would like to point them towards Clojure, of course. What do you think or feel is missing — or is nothing missing — for people to actually be able to start learning about deep learning and Clojure at the same time, in the most effortless way possible?
B
Would
say,
there's
kind
of
two
ways
you
can
go
about
it.
You
could
go
about
it
with
using
like
a
virtual
machine
or
you
could
try
installing
it
locally
and
running
it
on
your
machine.
So
it
does
take
some
native
dependencies
so
like
open,
C,
be
kind
of
like
me
that
installed
and
other
things
so
I
think
when
I've
talked
to
be
a
little
more
they've
run
into
roadblocks
on
on
trying
to
run
it
locally
they've
run
into
some
trouble
with
the
native
dependencies.
B
So
once
you
get
past
that
I
think
it's
it's
pretty
good.
I'll
show
you
what
you
need
to
do
to
get
going
on
it,
so
what
I
would
do
is
kind
of
clone
clone
the
MX
sent
repo.
The
jars
are
out
there,
so
you
can
grab
an
older
release
if
you
want
to
from,
or
one
of
the
releases
from
maven
they're
out
there
as
well,
but
I
find
it
convenient
to
just
clone
the
whole
repo.
And
then
you
have
like
all
the
examples
right
there.
So.
Carin: If you're running on Ubuntu or a standard Linux — there was one, I don't even remember, Arch Linux I think it was, where there were some rough bumps trying to get it to run — but if you're running on Ubuntu, it seems pretty good. So you'd go in here, and I usually find the easiest thing to do is just pull down the tutorial.
Carin: The tutorial goes through NDArrays, symbols, and module, so it'll walk you through the MNIST example and how to construct layers, and kind of point you to how to do everything. So I think that's a useful way to start, along with just looking at the other examples that are out there: there's a BERT one, there's one on fine-tuning, CNN text classification, CAPTCHAs — there's a whole bunch out there that you can play with.
Carin: There's also a great getting-started path — actually a really good one, because you could get started on AWS, and then you don't have to worry about all your local setup, or you can try to run it locally. And it goes from NDArray, symbol, module, the visualization of a symbol, using OpenCV, and then some pre-trained models. Those are great series to go through; I'll put that in here as well.
Chris: So there is a bug — last time I checked, there was an issue filed for the Scala bindings where they were actually considering moving to JNA, or at least some dynamic system, which would eliminate quite a bit of their build steps and some of their dependencies. For instance, OpenCV is something lots of people have, in lots of different versions, and I don't want to commit anybody to the version that the Scala bindings decide to use that month, or whatever. So — this isn't exactly a question...
C
What's
the
best
way
to
start
working
with
the
Scala
team
to
raise
some
of
these
issues
like
for
right
now,
I
can't
easily
use
T
V
in
with
the
max
net,
because
I
have
to
guess
which
shared
library
is
actually
unpacked
on
the
system,
because
the
the
C
bindings
aren't
actually
exposed
through
the
J&I
layer
for
TBM,
even
though
they
are
all
there,
because
MX
net
does
use
all
of
t
vm
the
compiler
facilities
and
everything.
So
there's
like
significant
functionality
exposed
in
the
that's,
not
shared
library.
Carin: Generally, they're much more receptive to something if you're gonna put in the time and work to make it happen, right. If you say, "I have this really great idea: I want you to spend three months and re-architect your system," they're gonna say, "Well, that's very interesting." But it goes much better if you're willing to work with them and put in some time and code to make it better.
Chris: The images, for instance, going into the models — which it's conceivable you'd need to do, especially if you're training and you want fancy augmentations of those images for some reason or another. I just did enough as a proof of concept to say: oh, this is a little hairy — and there's an open issue whose fix would actually solve my problem, but I had enough other things going on that I didn't want to get into it.
C
So
I
TVM
is
put
very
briefly:
TV
ins,
a
very
advanced
math
and
in
dimensional
compilation,
substrate,
that
you
can
use
to
program
for
CPUs,
GPUs
or
Asics,
potentially
in
dimensional
type
problems
such
as
a
convolution
layer
of
a
neural
net
or
a
lot
of
image
based
processing.
I.
Don't
really
want
to
go
into
it
because
I've
already
given
a
talk
on
that
and
you
can
find
that
on
the
Internet
and
I
can
find
a
link
to
it.
C
But
it's
TBM
clj
is
a
project
that
I
built
because
I
thought
TVN
was
such
a
promising
direction
for
the
closure
community
and
for
the
jvm
in
general,
because
it
gives
you
really
good,
fast,
correct,
arithmetic
for
unsigned
numbers
and
for
GPUs
and
things
like
that
you
can
use.
However,
you
want.
A
C
I've used MXNet, not from Clojure, to do facial recognition work, and I intend to use MXNet in the very near future to start writing some NLP examples that show different frameworks, that kind of thing. — That's great, yeah.
B
If you run into any problems, like using MXNet, or just generally want some help, just let me know. In general, Cognitect is nice enough that I have 20% time — they let me use my Fridays to devote to MXNet and open-source work. So if you grab me on a Friday, it's always a good time for me. — Awesome.
A
So it's nice also for them, and I think this is very important work. What I think is missing, probably, is the part before and the part after MXNet: how do I get stuff in, how do I take stuff out and then serve it to someone or something else — it pretty much depends what I'm doing with it.
A
B
F
D
B
You — this is gonna be boring; maybe it won't be so boring. So anyway, here's like a get-image. This is a low-level kind of get-image: it's using OpenCV to get the matrix from the URL and convert it, then predicting on it and getting stuff out. Okay, I won't show you all of this, but there's a higher-level way of doing it now, with this infer package.
B
Resize an image, apply borders — there are some nice, easy, high-level ways of doing this: convert-image, ndarray-to-image — to a real image — draw bounding boxes, that sort of thing. So you can use the higher-level functions as well: you can use OpenCV directly, or use the higher-level stuff, or the infer package, which loads an image from a path and shoves it into the inference.
A
B
C
Okay — I've wondered why, perhaps, Clojupyter doesn't use pods or something like that, so that the things actually running your code aren't interacting with the Clojupyter ecosystem very much. That's potentially way more in-depth, but we have a couple of techniques that we use to isolate running JVM sub-instances, and right now I don't think they're using those. Maybe that's a good discussion for them.
A
B
I looked at using that, and, honestly, I would probably have preferred to use it, but the lein plugin for this makes it so nice that you can use a one-command thing — like "install kernel". With the other stuff, the tooling isn't there: you need to do the git clone, you do a lot of work to install the kernel before you can actually really get going on it. So that's why I kind of chose to go this path.
G
Is it better now? — Yeah, thank you. So, in the last couple of weeks, what I've been doing under my employer is playing with R and Clojure together. I think several of us have had some experience in connecting R to Clojure, mainly when we needed, from Clojure, to consume statistical functions and visualization functions. But now there was another idea that I got from R people: I talked with my CEO, Asaf, who was a Clojurian and now likes—
G
—R. And I also talked with Alan Dipert, who is a Clojurian and is working at RStudio nowadays, and they both said that it could be interesting to consume Clojure from R. And it is interesting for several reasons. One is that it could be interesting to R people, who are really sometimes open-minded towards functional programming and these ideas, and they could love it. And another reason is that then you can use Clojure from RStudio, which would probably help several people.
G
B
H
A
Sorry — go on. No? No — okay. I think that's really, really nice, because, as we were discussing a few days ago, I was thinking that getting R programmers to Clojure is going to be much easier than Python programmers, because R is much more Lispy than Python, actually. So this is really, really cool, in my opinion. And when I started programming with R — to me, RStudio is a really nice IDE, and it would be really great, because you might have plotting on one side of this tooling.
G
Okay — so, if I may suggest: we have a little bit more time till the end of the meeting, so it is a good opportunity for people to talk about what they have been doing, or what they have been interested in. I guess many people are studying something, or have the beginning of a project, so it's a nice time that we could talk about it, if anyone wants to.
I
Yeah, I can talk a little bit about what I've been working on. This is Chris Small here, and I've been working on Oz, which is a data visualization library wrapping Vega and Vega-Lite. I'm sure I've chatted about this with a bunch of you — and Jon Anthony, who's here as well, is working on a sort of similar project. The thing lately that I've been working on, which is kind of new, is some static-site-generating features.
I
Say, you know, widgets or interactive panels or whatever — but kind of thinking more about how we utilize data visualizations and scientific content from documents, and sort of stitch those things together. And so, as I move sort of further out, I've realized that this is kind of the direction things are heading — that having static site generation features would be a really helpful thing.
I
You know, we have a couple of static site generation toolkits in Clojure, which I think do a good job of what they do, but having something that was kind of pre-baked with the ability to layer in Vega and Vega-Lite visualizations — as well as, you know, this kind of scientific-content logic, whatever — would sort of branch out in this direction of not just taking results and exploring them and doing things with them.
I
Sort of like a Figwheel-like experience for data science — like a Figwheel-like notebook kind of thing. I went ahead and took a stab at this — it was an interesting sort of problem — and built this live-code-reloading functionality, which looks at your Clojure code and evaluates only those forms which have changed since the last time.
I
What I actually want to do is write that code in my editor, where I have all my editor features set up — but then I want to run it in a REPL. And where things get tricky is where you have steps in your REPL flow that take a long time to run — which obviously happens a lot if you're doing data-science sort of stuff, even if it's just some basic data processing, maybe downloading some data or whatever.
I
Sometimes that can be something that you sort of have to start working around, and you start doing things like: okay, well, maybe I'll put that in a defonce or whatever — but then you need to update it, because something before that updated, and now it's like, "oh gosh, I have to go..." There's this whole sort of state problem, right? And so I'm just simplifying that problem.
I
If you know Surge — it's a great little command-line utility for publishing static sites — but also, as you change files, it's popping up the latest thing that you added. So you start adding one file, and as soon as you save it — boom — you see those changes; and you start adding another file, and — boom — you see those changes. So — this is something I just kind of finished—
I
—in, like, the last couple of days, so it's kind of fun to talk about it here. I'm excited to push it up and see what folks think about it, because so far — again, you know — it's a little bit rough, but it's sort of an exciting direction, and, yeah, I'm interested in seeing what folks think of it.
D
I
You mean sort of the incremental — or, what do I call it — the live code reloading? Yeah, yeah, exactly. So that's where this kind of custom execution model comes in. What it does is it uses the Clojure reader to read in the code as data — or, let's see how I actually use it... maybe I'm just using the edn reader? I forget now. No — I must be using the Clojure reader, because you need it for literals. So, so, yeah, right.
I
We just use the reader to read in the code as data, and then, effectively, each time the file changes, we sort of cache — okay, what's the current... — again, not counting whitespace changes or comments or anything, but just, once we read in the code as data: where is the first change? And we only actually run from that first change.
I
If I go and update just something in the visualization, that's the only stuff — from there on after — that is going to need to be recomputed; everything before that is just going to stay there. And again, it's in a live-coding sort of environment, where you just save the file and — boom — things update.
I
Yeah — one quick thing I'll add is in relation to R Markdown. I didn't even realize this, actually, initially getting into it, till I kind of started deciding how I was going to do the markdown functionality they have — which is, right, like: you can take a markdown document, and in one of the code blocks you can add a class that says "this is Vega", and that will then tell it: I want you to not just interpret this — not just render this.
I
So it turns out that that strategy for embedding stuff in markdown is very similar to what R Notebook does — and I think it's actually pretty similar to what Nextjournal is doing, if I'm not mistaken. So, yeah, in any case, there's this kind of interesting convergence there, and this pattern of, you know, sort of extending markdown documents with extra functionality — which, I mean, yeah, I haven't even really fully plumbed yet. Right now it's just for visualizing these Vega and Vega-Lite visualizations, but I think the next step is going to be tools to say, like, sure: if most of what you're doing is just laying out a markdown document, in which you want to include a little bit of code that actually dynamically generates a data visualization or whatever, then you can include that in there as well. So — yes — still sort of plumbing the model, but it's promising, I think.
I
C
So I'm gonna say: I've also been working on rebuilding a lot of the base infrastructure behind the tech.ml subsystem. I did a lot of research into APL and numeric programming languages like APL and J, and basically built out a graduated pathway from using persistent vectors to doing APL-style n-dimensional arithmetic on kind of anything you want. It basically works from the concept of a reader, which is — given an index—
C
—as if you only had sequentially accessible memory. Anyway, I built out this pathway to remove the tech.compute dependency from the datatype library and from tech.ml, and to make it a lot easier to bind sophisticated, kind of functional, math capabilities into other things. So when you say one vector plus another vector, the result it gives you back is a lazy, functional data structure: it doesn't evaluate that operation and produce a new persistent vector or a new Java array or anything like that.
C
It produces a new reader that, when it reads from index i, does the actual operation inline. And so this allows me to build long chains of these operations, but not have anything execute — you know, in a lazy way, in a Haskell-lazy way — and then I can force execution by copying into the final thing that I actually do care about, which may be an image, or it may be an NDArray; and then, with MXNet, it could be a lot of different things.
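The reader idea — an elementwise operation returns a lazy indexed view, and nothing executes until you copy into a concrete container — is easy to model. A toy Python analogue, mirroring the concept rather than tech.datatype's actual API:

```python
class Reader:
    """A lazy indexed view: reading index i computes the value on the fly."""
    def __init__(self, length, read_fn):
        self.length = length
        self.read_fn = read_fn

    def __len__(self):
        return self.length

    def __getitem__(self, i):
        return self.read_fn(i)

    def map2(self, op, other):
        # Build a new reader that applies `op` elementwise at read time;
        # nothing is computed until someone indexes into the result.
        return Reader(min(len(self), len(other)),
                      lambda i: op(self[i], other[i]))

    def realize(self):
        # Force execution by copying into the concrete thing you care about.
        return [self[i] for i in range(len(self))]

def from_list(xs):
    return Reader(len(xs), xs.__getitem__)
```

Chaining `map2` calls builds up a pipeline of readers; only `realize` (or indexing) actually runs the arithmetic.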
C
I do have an in-depth technical walkthrough on tech.dataset — or tech.datatype, sorry — that starts from the very beginning, with sequences, and ends with n-dimensional tensors, basically making an argument that this is how you can build, like, arbitrary abstract algebra in Clojure. So there is some documentation on that, and there's a cheatsheet, but I agree that talking about it more would be a very good idea.
C
I've been really focused on working through NLP — what are all the NLP libraries out there — and I guess I feel like, because tech.ml is at such a high level of abstraction above the datatype library, getting a solid NLP example would enable more people to do machine learning at their jobs than me explaining too much more about datatype, which is kind of plumbing.
B
C
And, yeah, you've made a really good point, Carin. Actually, I think most people who do Cloj— I mean, most of the tech.ml system is meant for people who just do Clojure. If you do Clojure and you have access to a database, you can do machine learning on it, for sure — because you can always get a sequence of maps with keys in them, and you can do machine learning on a sequence of maps. That's the point behind tech.ml. And I agree.
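The claim here — a sequence of maps is all you need to start doing machine learning — mostly comes down to a mechanical reshaping into the matrix form ML libraries expect. A trivial Python sketch of that idea (tech.ml does this, plus categorical encoding and much more, behind its own API):

```python
def maps_to_matrix(rows, feature_keys, target_key):
    """Turn a sequence of maps (e.g. rows pulled from a database) into a
    feature matrix X and a target vector y — the shape most ML libraries
    want as input."""
    X = [[row[k] for k in feature_keys] for row in rows]
    y = [row[target_key] for row in rows]
    return X, y
```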
B
Yeah — one thing I want to get a hold of — and this is kind of cool too, because it's more cross-pollination — I actually did a PR on the Python GluonNLP library to get the base BERT model exported so we could use it, and I want to get a hold of GPT. If people know what GPT is — this is the OpenAI model that they haven't even fully released.
B
They've released some of the pre-trained weights, but they don't want to release the full pre-trained weights, because it's too dangerous — because it's just too good; it could generate text at, like, unicorn levels. So they released the first cut, and then, just a few days ago, they released the medium-sized pre-trained weights that they deem not too dangerous. I think they still haven't released the really big one. But the steps to that would be—
B
—you know, what I would be looking at would be making another PR — or helping along getting it in on the GluonNLP side — so we can get the pre-trained weights, export them, and bring them into MXNet proper. So it's a lot of cross-team work, but it's cool stuff that would open up a lot of doors for us — and hopefully not be too dangerous.
G
Thank you — thanks, Alan. So I'll just mention something that we were talking about the other day. It began in a discussion with Ethan, who is here, and Ethan told me that one problem we have with working with data in Clojure is that many of us do not know about other people's sessions — we do not know what the workflow actually looks like for many of the people here in this discussion.
G
H
I
You mean like a compute graph? Yeah, yeah — so it's a sort of trivial compute graph, in that, right, it's sort of sequential. So it's not doing anything intelligent as far as figuring out dependencies between forms. That would be really cool — it's something I've thought about, and maybe will do; you could probably... I sort of thought through this a little bit.
I
I think you could implement a sort of dummy version of it, which would probably get things wrong in certain circumstances, and, you know, maybe you could find ways of patching that up here and there. But, yeah — without more sort of assumptions about state and purity, it's a little hard to know, given any chunk of code, which chunks of code following it might need to get rerun — right? — except under some sort of simple assumptions, like, you know, that you're mostly writing pure code.
I
You could imagine, like: okay, well, I'll trace my defined forms and figure out which symbols I'm using where, and sort of assume that, you know, if I see one of these things update, then everything in between — until the next time I see that thing update — can maybe just stay the same, etc. I think that'd be really cool. But, yeah.
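Under the pure-code assumption, the def/use tracing sketched here reduces to: when a form changes, mark as stale every later form that uses a symbol defined (transitively) by a stale form. A toy Python model, where each form's defined and used symbols are given explicitly — real code analysis would have to extract these from the forms themselves, macros permitting:

```python
def stale_forms(forms, changed):
    """forms: ordered list of (name, defines: set, uses: set).
    Returns the names of forms that must re-run when `changed` re-runs,
    propagating staleness through def/use chains in evaluation order."""
    dirty = {changed}
    dirty_syms = set()
    for name, defines, uses in forms:
        if name in dirty or uses & dirty_syms:
            dirty.add(name)
            dirty_syms |= defines   # anything this form defines is now suspect
    return dirty
```

A form that neither is the changed one nor uses a suspect symbol keeps its cached result — which is exactly the win over re-running the whole linear prefix.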
A
I can talk about what I'm doing right now. Actually, I'm working on a system that takes some files from a cloud folder, and I then have to read these files, which happen to be Excel files. So I'm working on a library to read arbitrary Excel files in Clojure — it's pretty early to talk about it, actually, but I hope it's going to be released in the next few months. Let's see — so, you can actually read and—
A
—write Excel files. I know that there's already something like that, but I wanted it to be as efficient as possible and, at the same time, have as much control as possible. So, if you want a lot of control, you can have it — you can use full-blown Apache POI — or, if you want to just read or write very huge Excel files, you can do that too; but, of course, you're going to lose some information — mostly formatting, basically. And then, after I take these files—
A
—I have to call an API — actually more than one — get data in, process this data, do a lot of financial calculations with this data, and then spit Excel files out. Of course, in the meantime, I want to actually persist all this information somewhere, and then I'm going to train some machine learning models on this stuff — actually pretty basic stuff, mostly classification; binary classification, actually. And all of this I'm trying to do—
A
—everything in Clojure, also because, for the deployment side, we already have a Kubernetes cluster that can take any arbitrary jar and just run it inside a Docker container. So it's pretty nice — I can do whatever I want, mostly. But what I'm finding pretty difficult is having all the parts: in Clojure we have most of the parts to do all of this work, but there's a lot of glue code to take them and put them together. So—
A
G
Alan — oh, I just want to say that what you told us about the Excel files sounds so useful, because, in some places, Excel is such a common format — it is so accessible to everybody, sometimes more than HTML — in the sense that it gives power to the one who gets it. It can be a wonderful way to share your report or anything that you create.
G
A
B
D
A
Okay — thank you very much. And, of course, if some of you want to stay a little bit longer and keep discussing some things, we can do that; but, in the meantime, this is a wrap. So if you have to go, you can go — and I would like to remind you that this is being recorded and it's going to be put on YouTube, so you won't lose anything. Okay.
I
G
I guess it could be interesting to think about what you mentioned — the computational side of your workflow — because usually we think about these two things separately. We have our sequential flow of writing in a notebook, because that is how notebooks look; and then sometimes we need to do computation, and sometimes we need to think about the computation as a graph, where the parameters flow and everything is influenced by the parameters — that is, something more general than a linear flow. But, actually, why the separation? Right.
A
Maybe when everything is done — because it's very likely that I did something stupid in the meantime; I might have opened the wrong file or something else — taking only the important parts, getting them out into a data structure which is mostly static and, like that, under version control; and then I can have some kind of service that takes these data structures and just runs them whenever you want. That would be really interesting.
A
G
And then the question is how the system would figure out the graph of what you do, right? So one way could be code analysis — but it may be a bit tricky, with macros and all that — and another way would be to wrap your values in a certain type that is aware of when it is dereferenced, right, and then — like they do in the UI world, in ClojureScript...
G
A
J
The kind of working interaction is much more like what you do at the REPL — but the neat thing is you can actually mix different languages together as well. In their particular case you can mix JavaScript and Python — Anthony actually managed to get that going. The weird thing about what they're doing is that they do this by basically compiling everything into WebAssembly: they literally take the Python interpreter — the C code — and compile it into WebAssembly, and then directly use it. So you actually—
J
—can have Python 3.7 or whatever running in the browser. And they did the same thing with NumPy and scikit-learn, again, and a couple of other things, so that you can then load data sets. You can also load data sets using — I think there's another language, too... oh, Julia — I think they have Julia working there, yeah. So I kind of liked the basic flow of that, because you don't have to have that enforced linear thing that you get in typical notebooks.
J
—client-only, so it's just strictly ClojureScript; so it's all hosted — or mixed with Clojure on the server — and I have an entry point or anything where you can throw Python into the thing as well. So, all in the editor pane, you can do this, and then you can add markdown, a LaTeX tag, and all that kind of stuff. It's true, it was strange — it was a similar idea to this, to what the Mozilla Iodide people were doing.
A
J
I guess they're open to the idea of including other languages and stuff as well — mostly, what they're thinking of as other languages is, like, I don't know, Julia, or who knows what else, or maybe Go — who knows — where they're literally going to take the source code for those things, the compilers and whatever interpreters, and just compile it all in with WebAssembly, and then literally have everything running in the browser. It's kind of—
J
That part strikes me as maybe not the greatest idea that ever was. I mean, I sort of understand their rationale for it, but I'm not really convinced. But certainly, since JavaScript just works directly for them, and since ClojureScript just compiles to JavaScript, I would tend to think that that should be a very simple inclusion.
K
You know, you have these code blocks in there, and then, when you go to effectively compile your Word document — or render it — you get all the code interaction. It's more along the lines of the polyglot stuff, where you can kind of splice things in — and then some people have taken it and done some pretty, pretty kooky stuff with it. Not to the level that it looks like Iodide does — just looking at Iodide, it's got some pretty impressive stuff in there.
K
But, again, it's all browser — it's browser-centric and web-based — so it's pretty cool what you can do with it. I just throw it out there as another point, this org-babel stuff. Yeah — just real quick, aside from all that, I just wanted to circle back on the stuff you were talking about, Alan, where you're munging Excel. So, in my domain, that's like — that used to be the coin of the realm, and, to a certain extent, it's still the baseline—
K
—for a lot of folks, and so I interact with that a lot. I don't know what you're doing exactly, if you've already built something — I've got a fork of some of the Docjure stuff, the Excel Apache POI library interop, that I forked off for performance reasons, and I patched some stuff up years ago; I just never pushed it back to mainline. But I'm—
K
One of the things that we ran into early on was generating — and also consuming — this stuff: significantly, doing a lot of large, large workbooks, or doing it fast. And some of the typing and coercion stuff that was in there kind of ran off the rails, like with dates and stuff. So, just curious, if you've already built something, I'd be curious what you've done — if it ties in with the Docjure stuff, or if it's just Apache-POI-based, or whatever — if you can expand on that. Yeah.
A
Thank you. No, actually, I was building it from POI directly — I didn't know you had something for it; I'll surely try it — and I was using fastexcel reader and writer as well, which opens a stream, and then you can consume that stream while it parses for you, rather than parsing everything at once — which, of course, is going to be much slower. Does that make sense to you? Yeah.
K
I'm tracking, man — that makes sense. The other thing I was going to mention — because we're dealing with, like I said, a lot of Microsoft-based stuff — PowerPoint is another natural one. We ended up having to generate lots and lots of PowerPoint: we were generating visualizations programmatically and then building up these presentations with hundreds of slides of just all sorts of visuals already put in there. So that ends up being non-trivial. The POI libraries are there—
K
—but you've got to do a lot of wrapping. We've done some of that too. I need to clean it up, but it exists, in one form or another, in a publicly accessible way — I just need to clean it up and point to it, if that's of use; I just throw it out there. I don't know — the last time I looked at Docjure, it wasn't doing a whole lot of that. There's some basic support, but in terms of programmatically building all that crap — that's a big, big help, to be able to just spew it out.
G
G
A
Even PowerPoint, if one wants to, or PDFs, or whatever — because I'm working in a bank, actually, so Excel files are everywhere, and the only way to actually free the data — the buried data — is to go and open up all these files, and then collect them and just put them somewhere where you can clean them up and use this data in, we hope, a smarter way.
K
Yeah, man — we're fighting that: there's a lot of institutional knowledge that's stored in frigging slides, and we're looking at possibly scraping that, and, more, looking at throwing that into a search index. But some of this Apache POI stuff sounds like it might be useful — a business case for extracting legacy information from that stuff. Yeah — it's non-trivial once you start playing with it; it's kind of one of those dirty problems that Paul Graham talks about.
K
It's not very sexy, but it ends up being somewhat important. So — but, yeah, man, I was going to ask, just out of curiosity, because it popped up on my end: is anybody messing with mathematical programming — linear programming, that type of stuff? Because I'm starting to wrap a library — I just hit that for some operations-research stuff I'm doing for scheduling — but, if that's of interest or use, I can publicize that, although I'm just starting to play with it.
K
Man — I must have hit the mute, yeah; I've got this extra thing on my headset. As in operations research: optimization is a pretty big bread-and-butter tool; we use it a lot. We do a lot of mathematical programming — specifically linear programming and mixed integer programming, if that sounds familiar — and there are a couple of libraries out there. You've got the industrial-strength solvers, like CPLEX and Gurobi and MOSEK — all these, like, planet-scale optimization problem solvers — and we do some of that.
K
But then, a lot of the time, you just want something to solve, say, a scheduling problem — say you've got a job-scheduling thing you want to do, with side constraints — and maybe it's just easier to not spend thousands of dollars to get a CPLEX backend or something like that. And so there are a couple of Java libraries out there: Apache Commons Math has a simplex solver in it, which someone wrapped a long time ago—
K
—in Clojure, called "pearl-in" or something — that was just a library; I played with that a little bit. And there's a newer set of libraries called ojAlgo — which has, like, a whole lot; it's, like, you know, a whole suite of math libraries as well — and they have tons and tons of different routines and data structures, and then they also have several solvers; in this case, they have some high-end linear programming and nonlinear programming solvers.
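What all these solvers consume is just "optimize a linear objective subject to linear constraints". Purely as a formulation illustration — this is not ojAlgo or Commons Math, and it is not how a real simplex implementation works — here is a brute-force two-variable solve that leans on the fact that a bounded LP's optimum sits at a vertex of the feasible region:

```python
from itertools import combinations

def solve_lp_2d(c, constraints, eps=1e-9):
    """Maximize c[0]*x + c[1]*y subject to a*x + b*y <= d for each
    (a, b, d) in constraints. For a bounded 2-variable LP the optimum
    lies at a vertex, so enumerate intersections of constraint boundaries."""
    def feasible(x, y):
        return all(a * x + b * y <= d + eps for a, b, d in constraints)

    best = None
    for (a1, b1, d1), (a2, b2, d2) in combinations(constraints, 2):
        det = a1 * b2 - a2 * b1
        if abs(det) < eps:
            continue                       # parallel boundaries: no vertex
        x = (d1 * b2 - d2 * b1) / det      # Cramer's rule
        y = (a1 * d2 - a2 * d1) / det
        if feasible(x, y):
            val = c[0] * x + c[1] * y
            if best is None or val > best[0]:
                best = (val, x, y)
    return best   # (objective, x, y), or None if infeasible
```

A real solver scales this idea to thousands of variables by walking adjacent vertices (simplex) instead of enumerating them all — which is exactly the decades of engineering the industrial packages sell.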
K
So I'm starting to — I was looking at that today, and, like I said, because it's for work, I can do it on work time: I can start the project open source and then contribute to it during work. So, if that's of use — I'm just kind of networking and throwing it out there. Yeah — ojAlgo, that's it. I mean, I'd heard about that particular library, ojAlgo, before, but I hadn't really used it until I started messing with it today, and, man—
K
—they have, like — it's like one of those big, big uber math/numerics libraries that's just out there, that kind of went under the radar for me. I don't know if that was in our original survey, when we were talking about this stuff, but it probably should be, because there's a lot of stuff in there. There might be a lot of redundant stuff too, but there's a lot of stuff in there.
A
Yeah, thanks — no, I don't think we actually had it in the survey; we should add it. And, please, if you can actually open-source that, that would be really great, because I think that most of the problems that we're solving with machine learning right now, in many companies, can actually be solved with solvers, or some search — smart search algorithms, actually. So, yeah.
K
I agree with that. I think there's a lot of stuff where it sounds sexier when you have a neural network or something — but, in reality, the freaking decades of research and hardware — sorry, software engineering — that have gone into implementing some of these solvers... they're crushing it, man. They will give you a provably optimal solution — I mean, some of the stuff that's out there... So I'm less impressed when some people are trying to apply neural networks—
K
—to something that's kind of — even then, like, with some of these backends, man: Gurobi, in particular, is kind of cutting-edge; they're solving planet-scale optimization problems fast. So I'm just not as impressed in that domain — but there's definitely use for a lot of these things. Like, I used to use PuLP in Python — the Python linear programming package, which is like its own little DSL — and I'm trying to look at something like that for wrapping this ojAlgo solver, because their interface is kind of weird.
K
It's not exactly — it's more like an object-oriented type of thing, where you define your variables and you define these expressions, but it's not like a standard — it's a little bit hard to formulate the mathematical form that I'm used to with formal mathematical modeling. So what I'm trying to do is write a little DSL — or at least a nice API in Clojure — to wrap it, and then farm that out. And then they also allow you to tap into the different backends—
K
—if you actually have one of those industrial-strength solvers and you don't want to use theirs. Julia's got a really good optimization package as well — something like that. I think that's an area that we don't hear about in Clojure a lot, but I know I dealt with that, and I'd like — I'd love — to formulate my stuff in Clojure and solve it; like, if I can solve it locally, instead of the client having to have their own, you know, like I said, ten-thousand-—
K
—fifteen-thousand-dollar license for something, and I can just solve it with a Java library — that'd be pretty badass with Clojure. And, on top of that, there's also a frigging JavaScript optimization library that they've got — somebody's written an LP solver for that, and I think it does mixed integer programming too — so I was thinking about taking it further and actually having a little ClojureScript wrapper, so you could do optimization in the browser. That doesn't sound like a great idea, but at least it could be somewhat portable.
A
Nice, yeah. I was looking for... last year I was at Traxxas in Vancouver, and they showed something, actually. It was really interesting, but I can't find it right now. As soon as I find it, I will share it with all of you. It's basically a solver, but it was built for doing recommendation systems. It was really, really interesting.
K
Yeah, I think you'd be surprised by the confluence there, where you can mix and match these methods. You know, and there's probably some meta stuff there, you know, optimizing the optimization, or optimizing a neural network's tuning. I don't know, there might be some interesting stuff there that hasn't been done yet. But one other thing I wanted to mention, I mean...
K
You've probably seen it on Zulip, but yeah, I continue to kind of dig into some of the stuff that Tomas is working on with cljplot, and what I'd offer out is, if anybody's got any... what I'm trying to get to is... he's got kind of an interesting implementation.
K
It's inspired, quote-unquote inspired, by, you know, d3 and ggplot, and lattice, which I guess is the rendering package in R. And I'm kind of going at it from the opposite direction: I'm spelunking through the code and learning it, tweaking his examples and dissecting them, trying to understand what the underlying formalism is, what the formal structures are underneath, and then trying to map that to other places like Vega and potentially other libraries.
K
If there's anybody who's got any expertise in terms of... like, I know I can get around in Vega a bit, and I've looked through libraries like John's and Chris's, but I don't know all the friggin options and all the properties and all the different fields and all that stuff, you know. I can read the docs, but there's no spec, in terms of a clojure.spec type thing, to define the grammar, that I'm familiar with outside of the papers, and I'm just curious.
K
If there are any domains of expertise in that, or in ggplot, I'm very curious as to whether there are some kind of formal specifications for these different plotting constructs, because I'd like to see if there's a way to do some sort of compilation, or some sort of transform, between the different specifications, instead of doing a lot of customized boutique stuff. Then I can take, you know, this narrow subset and kind of coerce it to look like, to go from Vega to ggplot, or something like that. Because I think that might actually buy even more portability, bridging, if you will, if we can define ways to adapt that. Because I think some of Tomas's stuff is pretty cool, and there might be some things in there that you can't yet do in terms of defining your own custom plot types, actually defining your custom...
K
Yeah, it's come up a couple of times, like, Chris Nuernberger was asking about it, and it makes sense. They've already got a pretty... you know, you've got ten years, or however long Vega has been around, and ggplot's been around a lot longer than that. So you've got a lot of domain expertise. It's kind of this tried-and-true way of expressing visualizations. So, yeah, the only thing I don't like about it is it's in the friggin browser, man, and that's a non-starter in some...
K
In
some
cases
you
can
work
around
it
or
or
you
can
as
Tomas
did,
and
if
you
haven't
used
this
stuff
or
played
with
it,
it's
actually
really
really
nice.
How
will
it
integrates
with
the
closure
functional
paradigm?
That's
the
other
thing.
I
really
like
about
it.
I
think
John's
stuff
does
as
well
with
how
he
does
the
map
composition,
overloading
it's
a
it's
very
nice
to
be
able
to
just
use
your
closure
stuff
to
just
express
this
stuff.
It's
really
really
nice.
K
That's
that
happen
to
like
live
within
this
kind
of
weird
JavaScript,
but
not
quite
JavaScript
grammar.
You
know
or
our
arc,
but
not
quite
you
know
this
this,
this
foreign
thing
I'm
not
familiar
with
it's
it's
very
nice.
So
what
I
like
to
do
is
leverage
what
Tomas
has
and
then,
if
there's
a
way
to
what
I'm
lacking
right
now
is
under
standing
of
how
to
bridge
formally
define
a
bridge
between
these
kind
of
external
specifications
like
here,
Vega
has
a
pretty
well-defined
grammar
and
it
gets
down
to
the
weeds.
K
If I know the rules, if I can coerce it to what Tomas has, then we can define, ideally, mechanical transformations. I'm thinking that just speccing it out, and using clojure.spec to kind of help inform that, almost from a compiler, like a transpiler, level, I guess, source-code transformation, something like that. But I'm a little ambitious at this point. At the very least, I think it would enable a wide swath of... you could take your Vega examples directly off the web and use them with Tomas's stuff. That'd be pretty cool. Or ggplot, you know, something similar from ggplot; at least you can kind of squint at it and it looks close enough to ggplot. But that's long-term. For right now, I'm speccing out his stuff and trying to understand it and formalize it. And then the idea is, if there's some companion formalization, then that would enable the bridging that we talk about all the time, you know, building these bridges, so you get cljplot.
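As a rough illustration of what spec-driven grammar checking could look like, here is a Python sketch with a toy, invented mini-grammar (in Clojure, clojure.spec's conform/explain would be the natural tool; none of this matches the real Vega grammar):

```python
# A toy plot-spec grammar, invented for illustration only.
GRAMMAR = {
    "required": {"mark", "encoding"},          # keys every spec must have
    "mark": {"point", "line", "bar", "area"},  # allowed mark types
    "channels": {"x", "y", "color", "size"},   # allowed encoding channels
}

def explain(spec):
    """Return a list of problems, spec-style; an empty list means valid."""
    problems = []
    missing = GRAMMAR["required"] - spec.keys()
    if missing:
        problems.append(f"missing keys: {sorted(missing)}")
    if spec.get("mark") not in GRAMMAR["mark"]:
        problems.append(f"unknown mark: {spec.get('mark')!r}")
    for channel in spec.get("encoding", {}):
        if channel not in GRAMMAR["channels"]:
            problems.append(f"unknown channel: {channel!r}")
    return problems

ok = {"mark": "point", "encoding": {"x": {"field": "a"}, "y": {"field": "b"}}}
bad = {"mark": "sparkle", "encoding": {"z": {"field": "a"}}}
print(explain(ok), explain(bad))
```

Once a grammar is declared as data like this, the same declaration can drive validation, documentation, and, in principle, the spec-to-spec transforms discussed above.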
K
You can then animate by, you know, re-rendering the image over time and just changing what you're drawing. So it's a lot more functional and less framework-y. It doesn't feel like you're living in it, like in Processing and Quil, where there's always this ambient argument that's kind of hidden from you, which is the current sketch. So everything's very imperative, and it's defined against that hidden state, that global state, if you will. It's not that way in clojure2d. So it's actually... if you look at his examples in clojure2d, he's got gobs of interactive stuff, straight out of all these generative programming and art books, and a lot of Processing. You can tell his background, he came from Processing, so there's a lot of interactive stuff in there. So I think you could probably do more interactive stuff, even custom interactions that you can't do with the prescribed stuff they have in Vega and Vega-Lite, the interactive stuff, in my opinion... sorry, go ahead, yeah.
G
And about ggplot, I think it's maybe a good idea, because ggplot is a very well-loved syntax that many people are used to, and now we do have it working on the JVM, because one of the R ports to the JVM does have ggplot working. It doesn't render correctly, but you do have it working in the sense that you can apply the syntax and get the data structure that it produces, and then you...
K
I didn't even think about that, because I remember trying to use Renjin, and trying to do some ggplot stuff, right, and they're not there yet. You can do some things, but I didn't think about getting the data structure out, because then you could just compile that, if you know what the mapping is, in theory. You know, I didn't even think about this, yeah.
G
The thing that they do in the R world, because they do use the ggplot data structure, is render it into two different formats. One format is static; another format is a Plotly JavaScript interactive plot, and they are both different renderings of the same data structure, the ggplot data structure. So I guess it is really well documented, and we could invent a new way to render it, and I guess the benefit would be at least being able to do it on the JVM. Yeah, let's try.
K
Yeah, guys, it sounds like you're running a successful program here, so I think this is a good format. One thing I'd mention, since we talked about community interaction, typically at the end of these things, right: you guys are leading that pretty well, I think. I would say the thought occurred to me at some point that it may not be a bad idea, depending on interest.
K
We talked about meeting up at a conference or something. Depending... if we have a consistent group of people that keep coming to these, at least the online ones, and there are some people actually putting their money where their mouth is in terms of producing and pumping out libraries, or products, or tutorials, it might not be a bad idea to, you know, float the idea of a separate event. You know, one of the things that Carin brought up was there's nobody...
K
You know, there were only two people there that were doing Clojure at the conference: her and the other person from Cognitect, I think. So if we've done a good job of actually getting a network together of people that are interested in this, that are able to go, well, why don't we go do our own conference, or our own little... you know, it doesn't have to be a formal big thing.
K
You
know
like
the
guys
up
in
Canada
did
the
closure
north
thing
and
they
just
kind
of
put
that
together.
So
you
know
there's
something
to
think
about,
or
at
least
we've
talked
about
doing,
like
working
groups,
maybe
at
another
con
the
closure
con
just
happening
and
and
and
we're
also
international.
So
it's
not
like
it's.
K
It's not trivial for people to fly out here and there, but it may be worth it if it's for the data... if there's an actual concerted effort to get people together. So I'm just floating that out. Again, if this pans out... so far it has. We've had a couple of these, and the group is growing wider, but we also have pretty consistent participation. So if that continues, then it might be a good idea to look at some kind of physical meetup.
K
Yeah. You know, like at the last conference there were people, people with, you know, actual money behind them, and they were interested in getting the data science stuff, the community, working together, and it just seemed like there was no leadership. And what I see now is there's leadership. So all it takes is people pushing that: hey, let's have a conference. Is anybody interested in having a conference, or interested in having a workshop? You don't have to call it a conference, right?
K
It doesn't have to be a big formal thing, but if you want to come demo your thing and then participate in some of these talks, we already have several topics lined up that you could come do, and then throw it up on YouTube or whatever. There's some cost associated, and look, the problem with that is running the conference, or running a workshop, I mean, getting together... it's not a trivial effort. I don't have any illusions about that, but it might be worth it, man. I mean, there might be...
K
...somebody who wants to step up and do that, too. You just don't know until you start asking. But yeah, I think it's a good sign, man. So far this has been productive, and it kind of makes me a little more optimistic. So if we continue on this track... you talked about the next, you said the next meetup was gonna be in, like, June?