From YouTube: Julia
Description
Part of Data Day 2022, October 26-27, 2022.
Please see https://www.nersc.gov/users/training/data-day/data-day-2022/ for the training agenda and presentation slides.
Dueling languages? I don't want to create the impression that it's dueling languages, because I think Python and Julia exist in their own niches and can coexist in interesting ways. So I want to give an overview of our support at NERSC for the Julia language. This is a little open-ended, in the sense that I have some examples and demos prepared.
So please ask questions during my talk. I can't see the chat, so if someone sees questions in the chat, please just yell at me so that I actually answer them. Basically, don't wait until the end to ask questions.
All right. The reason I find Julia tempting and very compelling at centers like NERSC is that complex workflows in particular really require a high-productivity, high-performance language. It really comes down to high-performance glue code.
Let's see: if we look at workflows that require high-productivity languages like Python, you usually see a pattern where you have Python, or some other high-productivity glue code that tends to be low performance, gluing together multiple kernels written in a high-performance, lower-productivity language like C or Fortran. The advantage of this approach is that in many cases the right language is used to address the right kind of algorithm.
In that case we really maintain this idea of kernels. However, as we introduce more languages, we start to introduce more complexity into our software stack. It's a bit more difficult to maintain, and we also have to be careful that context switching between different languages doesn't introduce performance overhead. Here's an example that sort of illustrates this (I hope everyone can see my mouse).
If you look at the following function calls and compare the pybind11 time (the time just to make the function call; I'm comparing functions that do nothing, so it's one round trip), you can see that there's roughly a 50x speedup from using Julia.
Actually, it's sort of a cheat, because these aren't really Julia functions. The way Julia works is that your Julia program gets ingested by the Julia just-in-time compiler; the Julia JIT compiler produces LLVM IR and then has LLVM compile that to native code. So when you do something like ccall, you are natively calling that function, just like C calling C.
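In code, that looks roughly like this; a minimal sketch calling libc's `clock`, not the benchmark from the slides:

```julia
# ccall goes straight to the C function, with no wrapper layer in between,
# so the cost is essentially that of a native C-to-C call.
t = ccall(:clock, Int32, ())   # libc's clock(3)
println(t)
```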
So it's really just pointer resolution and that sort of thing that causes the overhead here. All right, I'm using this as a sort of motivation. I think when you're representing an up-and-coming language, you also have to motivate it in other ways.
One way is to point out that NERSC has been interested in supporting Julia since well before I started at NERSC. And when you conduct surveys of the Julia user community, you find that even though (for example, in this survey here, conducted in 2020) about three quarters of users say they're not using Julia, when you ask how many users plan to use Julia in the future, it's about 50 percent: basically half of the folks that are currently not using Julia.
If this is a representative survey, half of the folks currently not using Julia plan to use Julia in the future, if they're given the right support. So the objective at NERSC has been to make using Julia easy, and the model we're using is that we need to enable users to roll their own Julia install.
The idea is that we have multiple levels of Julia support. The easiest way for users to use Julia is just to use our module, and that introduces a whole bunch of pre-compiled packages that have been pre-configured for NERSC systems. That would be this tier here. And then you might want to start to mess with your own version of MPI.
So what we do is provide both compatibility interfaces, like MPItrampoline, and we also use the Preferences.jl mechanism to make sure that the correct settings are picked up when a package is installed. Maybe a little dig at Python: if I went and ran pip install mpi4py without any additional settings, it would do the wrong thing.
The nice thing about Julia is that you can create what are called global preferences. For example, you would tell MPIPreferences to use the Cray compiler wrappers, and then when users add the MPI package, it will say: aha, I can't just use gcc or mpicc or something like that, I have to use the compiler wrappers. Of course users can still disable this, but they have to actively disable it; they can't just accidentally stumble into the wrong build.
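From the user's side this is a one-time configuration step; a sketch using the documented MPIPreferences entry point (the exact wrapper and library names on our systems are a site detail):

```julia
# Writes a LocalPreferences.toml telling MPI.jl to link the system MPI
# (for example Cray MPICH) instead of a bundled build.
using MPIPreferences
MPIPreferences.use_system_binary()
```

After this, adding or rebuilding the MPI package picks up the recorded preference instead of silently choosing a bundled library.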
Here is an example of module load statistics, assembled by Annette Greiner just two days ago (thank you very much). What you can see is that the number of monthly Julia module users had been steadily growing until roughly here, which somewhat coincides with Perlmutter coming online. I hope the slowdown is because people are moving over to Perlmutter; we're currently not monitoring module load statistics on Perlmutter, so I'm kind of curious to see what happens when we do.
We're coming for you! Okay, no, it's not adversarial, come on. I guess what I want to say is that it's not an insignificant number of people who are already using Julia at NERSC, so please let me keep my job. Another very quick aside is that Julia users tend to like new versions. When you look at the module loads for the different package versions, you can see in green, that is 1.4.2.
So there are still some legacy users at the moment. Blue is 1.6, and basically the moment I got around to actually enabling some newer versions, they all ticked up immediately; there's a very strong tendency to use the latest stable version. But then, in this brown here, you have beta users as well. So Julia users like really new versions, which is kind of important to know. And then, for the future:
I'm basically looking at the work that was done around Python in terms of monitoring package usage, and I have some prototypes that aren't quite production ready, but they pretty much use the same mechanism: you can place atexit hooks in startup.jl and then use that to monitor which packages Julia users had actually used in their workflows. All right.
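The mechanism itself is tiny; a sketch of such a hook (the actual log destination at NERSC is an implementation detail, so this version just prints):

```julia
# Candidate snippet for ~/.julia/config/startup.jl: when the session
# exits, record which packages/modules were actually loaded.
atexit() do
    used = sort!([string(m) for m in values(Base.loaded_modules)])
    println(stderr, "modules used this session: ", join(used, ", "))
end
```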
Before I jump into some example code, I also want to give a brief overview of the package ecosystem. Unlike Python, I can't show lots of pretty pictures; the reason is that we don't have huge companies behind us with graphic designers who make wonderful logos. Instead it's just people with Inkscape, so unfortunately these slides will look a little bit drab compared to Daniel's slides.
I'm actually going to transfer some of this to our documentation, so that folks have a nice list of references available in case they want to look at them later. But essentially, these are the organizations to follow. By the way, all Julia packages are managed via GitHub, so you can pretty much just search for "packagename.jl GitHub" and you'll almost always find the correct repo. On GitHub they're organized into the following organizations.
JuliaIO and JuliaData are really helpful in case you're looking for things to access and manipulate data. JuliaParallel covers sort of traditional HPC workflows, plus multiprocessing and fun stuff like that. And of course JuliaGPU, which is currently where a lot of the activity is happening, is for GPU programming.
Now let's look at some interesting I/O packages. I want to point out that Pidfile.jl isn't really an I/O package, but it's really interesting when you start to do a lot of parallelism, and Julia lends itself to doing a lot of parallelism out of the box, without having to make any changes to the code.
Anyway, you'll find that you might have multiple ranks that want to read or manipulate the same file, and you can start to get race conditions really quickly; Pidfile is the standard package for that.
It provides the Linux/Unix pidfile mechanism: mutexes that belong to files, held in the file system. So it's not like a threading mutex; it's shared via the file system. Anyway, HDF5 and Zarr are pretty common packages for high-performance array-like or table-like data in binary formats.
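A sketch of the pidfile idea, assuming the Pidfile.jl package; the lock lives in the file system rather than in a single process, so it works across ranks that share the file system:

```julia
using Pidfile   # assumes the Pidfile.jl package is installed

# Only one process at a time gets past mkpidlock for this path.
lock = mkpidlock("results.csv.pid")
try
    open("results.csv", "a") do io
        println(io, "one rank's results")
    end
finally
    close(lock)   # releases the lock and removes the pidfile
end
```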
JLD and JLD2 are serialization formats. They're a bit like pickle, basically, only they don't break all the time; actually, you want to use JLD2 if you don't want stuff to break. That's why JLD2 was invented: I think someone was annoyed that pickle and JLD kept breaking. Tables, DataFrames and CSV.jl are your common tabular data packages. And here's an interesting thing that is kind of helpful.
A
If
you
are
building
web
applications,
you
don't
always
have
to
use
Python.
For
example,
I
built
all
my
web
applications
on
spin
using
Julia,
so
the
the
packages
here
really
fall
into
two
categories.
For example, Oxygen.jl is the Julia version of FastAPI, and if you want to build a fully fledged web application, I recommend Genie. Genie is a bit massive, so it does have a big learning curve, but once you've got Genie, you can build pretty sophisticated websites. All right: traditional HPC support is provided via MPI.jl, and if you want to go a little bit beyond the normal MPI application constraints, I recommend you take a look at ClusterManagers.jl.
These allow you to schedule resources on the fly. For example, the Slurm cluster manager allows you to submit more work into Slurm from Julia, and the MPI cluster managers create a cluster-manager-style environment built on top of MPI. ImplicitGlobalGrid and MPIArrays then live on top of that and allow you to do the communication entirely via array-like objects, so they sort of emulate a global address space.
There's nothing urgent in the Zoom chat, I think. All right. Now, Julia's native parallelism is a little bit like multiprocessing, in that it natively spins up threads or processes and then allows you to distribute work over them.
But a rich ecosystem has been built on top of that to perform task-based (basically producer/consumer-style) parallel processing. For that I recommend you take a look at Distributed.jl and especially Dagger.jl. They are very similar to Dask and Ray, except that they're not as mature and therefore crash all the time; but I have got them to work on Perlmutter, and they're actually taking shape now.
So if you start experimenting with them in your applications now, I'm pretty sure that within a few months we can get something stable on our systems. On top of that we have DTables and DistributedArrays, which once again create an array interface or table interface across tasks.
For GPUs there's CUDA.jl for NVIDIA GPUs, AMDGPU.jl if you have AMD GPUs, or oneAPI.jl if you've got Intel GPUs. On top of those packages I can recommend KernelAbstractions.jl; it's a bit like Kokkos and essentially allows you to write abstract algorithms that then generate portable code. All right, demo time. I'm going to basically show... good grief.
Let me close this. Okay, I'm going to show some slides from a larger set of examples, shown here just so that folks get a gentle introduction, and then we're going to work ourselves towards an example where we use Flux.jl to analyze some data and do a time-series prediction. All right, I want to start by giving just an overview of the structure of a Julia program.
It's a little different from Python, in that object orientation is handled via functions which have methods (I'll get to that in a second) and namespaces are isolated using modules. Let's have a look at it top-down. Modules are introduced by the module keyword. Julia does its bracketing with a statement that opens a block and, if it's a multi-line statement, an end keyword that closes it.
So: no indentation unless you want it, and no curly braces. A common way to structure the source code is that you define a module which includes some source files; a source file will define several functions, helper functions and so on, and then in the module you expose only those functions that you want to be public. You can still get to a helper function by explicitly referencing it, but the export statement controls which functions enter the namespace when you use the using ModuleName keyword; that's what export is for.
So here's an example: it has a myhelper and a myfunc; myfunc calls myhelper, and myfunc is exposed publicly. And here's an example of how that works: if you have module A and module B, and both of those modules provide a function called hello, then I can call each individual function using the modulename.functionname syntax. For example, I can call ModuleA.hello or ModuleB.hello.
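Put together, the namespacing rules just described look like this (module and function names are the toy ones from the slide):

```julia
module ModuleA
export hello
hello() = "hello from A"
myhelper() = "internal"        # not exported
end

module ModuleB
export hello
hello() = "hello from B"
end

# Both modules define hello, so we call them fully qualified:
ModuleA.hello()      # "hello from A"
ModuleB.hello()      # "hello from B"
ModuleA.myhelper()   # non-exported, but reachable when referenced explicitly
```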
So that's really all you need to know about modules: they're used to encapsulate namespaces. Next, we need a brief foray into flow control.
Flow control once again uses if, while, for, try and so on, and these blocks always finish with end. The do statement is not a loop; it's something slightly different, and we'll get to that in a moment.
First, let's start with compound expressions. These are expressions that use begin and end, and the idea is that the result of the last expression is returned; so in this example, z is equal to x + y. It doesn't cause any namespace isolation, so x and y are still available afterwards; if I just try to show y, I'd get it. if, elseif and else are pretty obvious here, so I'm going to skip across that.
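A minimal sketch of the compound expression just described:

```julia
# begin/end returns the value of its last expression and
# introduces no new scope, so x and y stay visible afterwards.
z = begin
    x = 1
    y = 2
    x + y
end
# z == 3, and y is still 2 here
```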
But of course we also have the ternary operator: a ? b : c returns b if a is true, otherwise c. And we have short-circuit evaluation, controlled using the && and || operators for "and" and "or". For example, x was less than y in this example, so the second expression is evaluated. This is a very common design pattern in Julia for one-line conditional execution.
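For example:

```julia
x, y = 1, 2
label = x < y ? "smaller" : "not smaller"   # ternary: cond ? a : b
x < y && println("x is smaller")    # right side runs only if left is true
x > y || println("x is not bigger") # right side runs only if left is false
```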
A
All
right,
Loops
are
pretty
standard
as
well.
You've
got
four,
and
while
do
is
not
a
loop
and
Loops
are
different,
though
from
Python
and
from
C
style
syntax
in
that
they
are
inclusive.
So,
for
example,
here
you've
got
a
range
expression
that
goes
from
one
to
ten,
including
one
and
including
ten,
and
so,
if
I
then
printed
that
I
would
get
you
know
the
sequence
of
numbers.
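A quick sketch of the inclusive ranges:

```julia
# 1:10 includes both endpoints, unlike Python's range(1, 10).
function inclusive_sum(n)
    total = 0
    for i in 1:n
        total += i
    end
    total
end

inclusive_sum(10)   # 55
collect(1:5)        # [1, 2, 3, 4, 5]
```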
While you could emulate a C-style while loop this way, you never really do. try/catch is also a really common pattern; it is pretty much the same as you would expect in Python.
You catch every error, and then you can use isa to check what kind of error it is; so it's not like C++-style catch statements that only catch specific errors. If you want to continue the error escalation when it's not the error you expected, you use the rethrow command, which I've forgotten on this slide. And finally, you can use finally: a finally block always executes, regardless of the state of the catch.
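A sketch of that pattern, including the rethrow that was missing on the slide:

```julia
function parse_or_nothing(s)
    try
        parse(Int, s)                        # value of the try block is returned
    catch err
        err isa ArgumentError || rethrow()   # unexpected errors escalate
        nothing                              # expected failure: return nothing
    finally
        # runs regardless of success or failure (cleanup would go here)
    end
end

parse_or_nothing("42")    # 42
parse_or_nothing("oops")  # nothing
```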
A
Okay.
I've
said
that
asynchronous
programming
is
will
basically
come
later,
but
that's
part
of
the
flow
control
of
Julia
as
well
right
functions.
Functions
are
using
the
function.
Keyword.
The
only
thing
that's
maybe
slightly
different
is
that
you
can
use
return,
but
you
can
also
just
return
the
last
line
of
the
function.
Yeah, and you can do single-line functions; I really like those for brief functions. In expressions like these, the multiplication is implicit: this is 2 times x, so if I ran that, I'd get six. Functions can also be treated like function pointers, so I can pass a function to another function and execute it within the function body. For example, if I have a function that's x + 1, then this other function will just call it. It's a little example, but I think you get the point.
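In code, the two points just made:

```julia
f(x) = 2x                # single-line definition; 2x is implicit 2 * x
applyf(g, v) = g(v)      # functions are values: pass one to another
plusone(x) = x + 1

f(3)                 # 6
applyf(plusone, 3)   # 4
```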
You can create anonymous functions using the arrow notation; this is like a lambda expression in C++ or in Python. And the nice thing, and this is where do comes in, is that do...end is a multi-line lambda.
For example, this do block creates the same lambda as we saw before. The way this syntax works is that whatever you put at the head of the do statement has to be a function that accepts a function as its first argument, because that argument receives the anonymous function. So here my apply function (which, remember, takes another function and just executes it) ingests this do block. All right.
So now you have had the three-minute introduction to the Julia language: the syntax and the program structure. Now let's have a look at something that is particularly relevant to Julia itself, and that is data types, methods and introspection. We have about five minutes, so maybe what I'm going to do is say: take a look at this later.
I do want to point out one thing, and that is that a method is an implementation of a function. You can use data type annotations to give a function multiple methods; that's how we get multiple dispatch, a.k.a. function overloading. For example, say I want to build a function double that doubles the integer part of a number: if I give it an integer, it just returns two times x; if I give it a float, it will only double the integer part.
The way we would implement that in Julia is by writing a method for integer inputs: you've got the function double, and if x is an integer, it returns 2 times x. If double receives an AbstractFloat, so anything that derives from the float data type, then we find the remainder, double the integer part and add the remainder back. You can see it in action, and you can also use the methods command to list the methods that any given function has.
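A sketch of that double function (using `trunc` for the integer part; the slide's exact arithmetic may differ slightly):

```julia
double(x::Integer) = 2x                # method for integer inputs

function double(x::AbstractFloat)      # method for any float type
    r = x - trunc(x)                   # fractional remainder
    2 * trunc(Int, x) + r              # double only the integer part
end

double(3)          # 6
double(2.5)        # 4.5: twice the integer part, plus the 0.5 remainder
methods(double)    # lists both methods
```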
There's way more to this, but one thing I do want to mention is that you can ask Julia to show you the LLVM code and the native code. code_lowered just shows the resolved Julia code: here it tells you, okay, this is an Int, so I'm going to return the code that corresponds to 2 times x; and if I put in 2.1, you see that it picks up the code for finding the remainder, doubling the integer part and adding the remainder. Now, if you ask how LLVM treats this, you can just run code_llvm; and finally, this is the double path, which sort of goes down the screen.
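In a script (as opposed to the REPL, where it is pre-loaded) these introspection macros live in the InteractiveUtils standard library; a sketch:

```julia
using InteractiveUtils   # provides @code_lowered, @code_llvm, @code_native

dbl(x::Integer) = 2x
dbl(x::AbstractFloat) = 2 * trunc(Int, x) + (x - trunc(x))

@code_lowered dbl(3)     # resolved Julia IR for the Integer method
@code_llvm dbl(2.1)      # LLVM IR for the AbstractFloat method
# @code_native dbl(2.1)  # native assembly, one level further down
```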
All right, I wanted to point that out because it means that you don't need third-party tools to figure out what your code is doing. I am skipping all of that; as you can see, there are lots more things to explore in this presentation, which is available on GitHub. We're going to switch to the example now.
We're not going to go into depth with this example, but it's helpful to look at it anyway, so that you see all of these things in action.
Anyway, what we want to do is this: we've got Apple stock prices and we want to do a time-series prediction on them. First, I have some data in a CSV file, so I'm going to use CSV to read that data as a DataFrame; that's what that line does.
Then (and there are several things happening at once here) I'm going to take the closing price out of my data set and cast it to Float32; I want to do an element-wise operation, so that's why I do it this way. And then I construct views into my data set that correspond to the training data: today's price is my input.
Tomorrow's price is going to be my target for the neural network to learn. We're going to use Flux, and Flux, very quickly, requires the inputs to have a slightly different shape for recurrent networks, so we use an expression like this to create a vector of vectors.
Think of it this way: the recurrent neural network expects a fragment of a time series, in this case a fragment of size one. Later in these slides you can actually see what happens if I take a window of maybe four or five elements. I take a snapshot and I want to pass that into my recurrent neural network as a vector, and that's why I do this. By the way, there's no penalty for doing this in Julia, because it gets compiled into native code anyway.
So you don't have to worry about things not always being plain vectors. And then I'm going to construct a model.
Oops, sorry, I'm skipping a little fast here, but the point is: I'm going to import Flux and create a model that just chains together a recurrent neural network and a dense layer. Then I'm almost ready to start training; I'm going to define a loss function.
This is me being just a bit crazy, but you can take a look at it: it's basically a loss function that reacts differently to getting a single snapshot in time versus a time series, and I'm using the data type annotations to pick which kind of loss function to use based on what the inputs are. I'll explain that offline. And then I'm going to use Flux.train!, and that really takes us to the end.
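Since we're skipping fast, here is a compressed sketch of the demo's shape, assuming Flux.jl's 2022-era (implicit-parameters) API; the layer sizes, optimizer and toy data are illustrative, not the values from the slides:

```julia
using Flux

# An RNN chained into a dense readout, as in the demo.
model = Chain(RNN(1 => 32), Dense(32 => 1))

# Toy stand-in for the scaled closing prices: x[t] today, y[t] tomorrow.
prices = Float32.(sin.(0.1f0 .* (1:200)))
xs = [[p] for p in prices[1:end-1]]    # vector of length-1 vectors
ys = [[p] for p in prices[2:end]]

loss(x, y) = Flux.mse(model(x), y)

opt = ADAM()
for epoch in 1:5
    Flux.reset!(model)                 # reset the hidden state each pass
    Flux.train!(loss, Flux.params(model), zip(xs, ys), opt)
end
```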
Basically, all the Flux train function does is compute the gradient of the loss function and update the model; that's what train! does. If I ran that, I would get this kind of loss curve, which flattens out; we get diminishing returns from training at some point. And then, if we look at our trained model, you can see the result.
The red line is the trained model: it does terribly at the beginning, and then it really picks up on the Apple stock price. With this I'm basically ending the presentation, but I want to leave you with one thought, because it's kind of neat.
If I want to use GPUs, all I need to do is take my model and send it to the GPU using CUDA.jl, then send my data to the GPU, and run exactly the same code. Because the data type is now different, CUDA.jl will realize: I have a different data type, I have to compile new code. So it looks up what to do if it's a CUDA array, and CUDA.jl is basically telling Julia how to generate LLVM code that drives a GPU rather than a CPU. If I run this, you can see that my GPU activity picks up and it starts to run.
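A sketch of that GPU hand-off, assuming CUDA.jl and Flux on a GPU node, with the same toy data as before (Flux's `gpu` falls back to a no-op when no device is usable):

```julia
using Flux, CUDA

model = Chain(RNN(1 => 32), Dense(32 => 1)) |> gpu   # parameters to the GPU

prices = Float32.(sin.(0.1f0 .* (1:200)))
xs = [[p] |> gpu for p in prices[1:end-1]]           # data to the GPU too
ys = [[p] |> gpu for p in prices[2:end]]

# Exactly the same training code as on the CPU: dispatch on the CuArray
# type selects the CUDA code paths.
loss(x, y) = Flux.mse(model(x), y)
Flux.train!(loss, Flux.params(model), zip(xs, ys), ADAM())
```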
This is on Cori GPU (I can't plot, so I had to fail over to Cori GPU; things are configured badly here), but now it's training on a GPU. All right. And with that, I'd also like to point out that we have a paper coming out, so keep an eye out for it. With that, I'd like to thank you for your attention.