Description
Speakers: Stephen Belanger
Async iterators make stream processing much easier than using Node.js core streams. Let's investigate a few common use cases and analyze the performance profile.
I'm going to talk about async iterators, and how they can change streaming systems in the future. So, about me: I'm Stephen, hello. I'm on GitHub and Twitter, and we're going to talk about some things.
Some of you might have seen the intro already. Matteo talked earlier, so you've maybe already seen a bit of an intro to async iterators, so I won't spend too much time on that one. I'm also going to show off some use cases.
So, what is an async generator? It merges the async function syntax and the generator function syntax into one thing. It has a special signature: you have the async keyword and the asterisk. You can await stuff and you can yield stuff, and it just returns an iterator that you can for-await loop over; internally it's also an interface that's built on promises.
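A minimal sketch of that shape (the function names here are my own, just for illustration):

```javascript
// An async generator: the async keyword plus the asterisk.
// You can await inside it, you can yield from it, and calling it
// returns an async iterator you can for-await over.
async function* numbers() {
  for (let i = 1; i <= 3; i++) {
    // await works here, e.g. deferring a tick between values
    await new Promise(resolve => setImmediate(resolve));
    yield i;
  }
}

async function collect(iterable) {
  const seen = [];
  for await (const value of iterable) {
    seen.push(value);
  }
  return seen;
}
```

Calling `collect(numbers())` resolves to `[1, 2, 3]`.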
So, streams are actually async iterators: you can just take any createReadStream and for-await over it. This already works right now, so just go and do it. With that you can do a simple concat-stream type thing, too. There are a bunch of modules in userland right now to do things like stream concat that are five times the length of this to do the same thing, so it makes things easier.
You can also do stream transformations. I'm going to be working on a pull request soon: in Node core there's a function called pipeline, which you might see later in the presentation, that replaces stream pipe. You just call pipeline, it looks basically the same as this, and you give it a bunch of streams.
This is my own userland thing, async-iterator-pipe, which takes any stream, the same as pipeline does, and transform streams, but it can also take async generator functions in place of a stream, which just looks like that. So it's just a function that receives an iterable and produces a new one. That iterable can be an existing stream or another one of these, and you can just chain all of these together.
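A sketch of chaining such transforms by hand (the transforms here are hypothetical examples, not the module's API):

```javascript
// Each transform is just a function that receives an iterable
// and produces a new one.
async function* upperCase(source) {
  for await (const chunk of source) {
    yield String(chunk).toUpperCase();
  }
}

async function* exclaim(source) {
  for await (const chunk of source) {
    yield chunk + '!';
  }
}

async function* words() {
  yield 'hello';
  yield 'world';
}

async function runChain() {
  const out = [];
  // The source can be a stream or another generator; here two are chained.
  for await (const item of exclaim(upperCase(words()))) {
    out.push(item);
  }
  return out;
}
```

`runChain()` resolves to `['HELLO!', 'WORLD!']`.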
As long as there is a newline in here, it will slice up to the point of that newline, yield that chunk, and then move the pointer, resetting that buffer. And so it will just keep looping: until there's a newline, emit that chunk; loop until there's another newline, emit that; and just keep doing that. Then eventually you might have no newline terminator at the end of the file, so you just yield whatever's left, if there's anything left.
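That line-splitting logic can be sketched as an async generator like this (my own reconstruction, working on strings rather than Buffers):

```javascript
// Buffer incoming chunks, slice at each newline, yield complete lines,
// and yield any trailing data when the source ends.
async function* splitLines(source) {
  let buffer = '';
  for await (const chunk of source) {
    buffer += chunk;
    let index;
    // As long as there's a newline, slice up to it and yield that line.
    while ((index = buffer.indexOf('\n')) >= 0) {
      yield buffer.slice(0, index);
      buffer = buffer.slice(index + 1);
    }
  }
  // No trailing newline terminator? Yield whatever is left.
  if (buffer.length > 0) yield buffer;
}
```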
Now that we've split it into lines, we need to do something with those lines, which is to translate that as CSV. The first line in a CSV is just the keys, so we can loop over each line, split the first line into an array of the keys, and then use those keys for every subsequent line.
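A sketch of that step (naive on purpose: no quoting or escape handling):

```javascript
// CSV lines to objects: the first line holds the keys,
// and every subsequent line becomes an object using those keys.
async function* csvToObjects(lines) {
  let keys = null;
  for await (const line of lines) {
    const cells = line.split(',');
    if (keys === null) {
      keys = cells; // header row: an array of the keys
      continue;
    }
    const record = {};
    keys.forEach((key, i) => { record[key] = cells[i]; });
    yield record;
  }
}
```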
It has this for handling streams. As I've said before, a stream is just an iterable, so you can just iterate over things, and this handles a stream a lot like: if it's targeting a stream, then you can just iterate over the iterable source, whether it's a stream or not, write into the stream, and end it when the loop is done.
And then, if it's a function, you can just hand it the source directly and create this nested-call thing. Then all we have to do is detect whether something is a stream, pick which side of the logic we want, and then you can just do a reduce over all the targets, so you have the whole pipeline.
So typically you'll have an HTTP app where you have http.createServer with a handler to receive each request that comes in. But instead you can use http-iterator and just do createServer without giving it a callback. You could actually do that normally by attaching a 'request' event listener instead; that's actually what it's doing internally. This module will do that internally and convert it into an async iterator.
So you can just for-await over requests, and there are some kind of neat things you can do with this sort of pattern, like pushing all of these into a queue and then awaiting. If you have too many concurrent connections or something, you might want to throttle, or things like that, so you can push stuff into a queue and handle it elsewhere.
Handling it that way, instead of all in this one callback, gives you a little bit of extra control. And this is the entirety of the http-iterator module. It uses a module called channel-surfer, which provides Go-like channels: you create a channel, listen, as I said, to the server's 'request' event, pass the request and response into the channel, and then close the channel when the server closes. That's it. Now we'll get into a more complex example.
So, a lot of APIs will have a paginated API, and they might have their pages nested and things, and it can be kind of awkward if you're trying to deal with something like that. If you want to interact with, say, every record in a system, then you have to build a whole bunch of logic on top of logic on top of other logic, and that just makes it complicated to interact with.
But let's just consider this loadPage function; just pretend it's fetching from the network or something. It gets a page by its page number. We can convert that page-number-getting function into an async iterator pretty easily, just by having this: as long as loadPage returns a truthy value, it will keep going in the while loop and yielding pages, so whenever it hits an empty page, it stops.
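A sketch of that conversion (the loadPage stub and its data are my own; the talk's version is just similar in shape):

```javascript
// Pretend this fetches a page from the network by page number.
const DATA = [['a', 'b'], ['c', 'd'], ['e']];

async function loadPage(pageNumber) {
  const items = DATA[pageNumber];
  // A missing page is falsy, which ends the loop below.
  return items ? { page: pageNumber, items } : null;
}

// Convert the page-getting function into an async generator:
// keep yielding pages while loadPage returns something truthy.
async function* pages() {
  let pageNumber = 0;
  let page;
  while ((page = await loadPage(pageNumber++))) {
    yield page;
  }
}
```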
But each of these pages looks like this: it doesn't give us the items, it gives us the page. What we want is actually a list of items, so we can just convert one iterator into another iterator by wrapping the pages iterator in one that loops over each page it receives and emits each item from each of those pages.
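That wrapper can be sketched like this (assuming each page object carries an `items` array):

```javascript
// Flatten an iterator of pages into an iterator of items by wrapping
// the pages iterator and yielding each item of each page.
async function* items(pages) {
  for await (const page of pages) {
    for (const item of page.items) {
      yield item;
    }
  }
}
```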
You might have a giant sequence of things you want to break up into groups, like processing ten items at a time or something like that. So here we have an infinite sequence of numbers: it will just keep looping forever, and every ten milliseconds it'll give you the next number.
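An infinite sequence like that can be sketched as (names and the delay parameter are my own):

```javascript
// Loop forever: every `delay` milliseconds, yield the next number.
async function* counter(delay = 10) {
  let n = 0;
  while (true) {
    await new Promise(resolve => setTimeout(resolve, delay));
    yield n++;
  }
}
```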
So, for batching, we want to be able to take a certain number of items out of the iterator and process each of those within that batch. There's some interesting stuff in here: we're using the iterator's next function instead of a for-await loop, and then breaking after n iterations. The reason for that is that for-await-of will fully consume an iterator, even if you try to break out of it.
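A sketch of that take logic (my own reconstruction of the slide):

```javascript
// take(iterator, n): pull n items using low-level next() calls rather
// than for await...of, so the underlying iterator stays usable
// afterwards (for await would close it when you break out).
async function* take(iterator, n) {
  while (n-- > 0) {
    const { done, value } = await iterator.next();
    // Check the done flag and yield the value, not the wrapper object.
    if (done) return;
    yield value;
  }
}
```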
That's what the while loop and the await of iterator.next() are doing. And when you're interacting with the underlying iterator interface, you have to check the done flag and yield the value, not the whole result object. This might look a little bit familiar, because it's effectively the same thing as the stream concat that I showed you earlier; it just doesn't do the Buffer.concat. This is interesting in that you can actually abstract away the whole stream.
So now that we have those pieces in place, we can make the batch function. It just takes an iterator and a page size, and it will take a page of that size out of the iterator, convert that to an array, and then, as long as there are items to be yielded, it'll just yield that page. So we can do something like this with an infinite sequence of numbers.
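The batch function can be sketched like this (my own reconstruction; a small take() helper is included so the sketch is self-contained):

```javascript
// Pull n items via low-level next() calls, leaving the iterator usable.
async function* take(iterator, n) {
  while (n-- > 0) {
    const { done, value } = await iterator.next();
    if (done) return;
    yield value;
  }
}

async function toArray(iterable) {
  const out = [];
  for await (const item of iterable) out.push(item);
  return out;
}

// batch(iterator, size): repeatedly take `size` items, convert that
// page to an array, and yield it as long as items remain.
async function* batch(iterator, size) {
  while (true) {
    const page = await toArray(take(iterator, size));
    if (page.length === 0) return;
    yield page;
  }
}
```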
We can take batches of that and sum the batches, or things like that. But there are some interesting pitfalls to consider with async iterators. I've already mentioned this one, which is that a for-await-of loop fully consumes an iterator; there are some other interesting ones as well.
A really weird one to wrap your brain around is that microtasks are higher priority than everything else in the event loop.
What this means in practice is that you can do some things like this: here we have a function that we want to just infinitely loop, awaiting some task, and we have this timeout that just sets a flag to tell it, when it reaches the timeout, to stop running the loop. So, do you think this works?
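A bounded version of that pattern shows the problem (bounded so it actually terminates; in the talk's version the loop simply never stops):

```javascript
// The flag-and-timeout pattern: awaiting a plain (non-promise) value
// only defers to the microtask queue, so the timer callback never
// gets a chance to set the flag.
async function runStarved(maxIterations) {
  let stop = false;
  const timer = setTimeout(() => { stop = true; }, 0);
  let iterations = 0;
  while (!stop && iterations < maxIterations) {
    await iterations; // non-promise await: resolves as a microtask
    iterations++;
  }
  clearTimeout(timer);
  return { stop, iterations };
}
```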
No, it does not, because there's a weird thing in async/await: you can actually await a non-promise, and it will just yield the value. You can do `await 1` and it'll give you 1, and it won't actually defer. Well, it will defer to the microtask queue, but it will be immediately resolved, and that particular property is a problem, given how the microtask queue works.
Basically, the queues in Node will step down to the next tier when the current tier has drained. So if the microtask queue, every time it has a tick, adds another task to the microtask queue, it'll just keep cycling through the microtask queue forever and will never actually reach anything else. So the setTimeouts will actually never get reached, because that's a lower-priority queue.
So it's a little bit of an await issue; just realize that microtasks are higher priority, and you might have to debug that. And this is kind of connected to that: if any of you remember the Zalgo issue, and the dezalgo module that came out of that, in the early days of Node.
Lots of people would write APIs that would call the callback immediately, but would sometimes call it in the same tick instead of the next tick. So, logically, your code would call this async thing, and you might have something below it that, on the first run, would happen first, and then the callback would happen.
But then that would switch around, because sometimes it was calling the callback in the same tick, and that just makes your code super confusing and non-deterministic. So the dezalgo module was created to ensure that things are always async, and this is basically the equivalent in async iterators; we have to do this again, apparently. You just await another promise that defers to setImmediate, which is at the bottom of the tiers of queue priorities, so this says: once everything else is done, then jump back to this again.
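That dezalgo-style fix can be sketched like this (my own version of the idea, reusing the bounded flag-and-timeout shape from before):

```javascript
// Await a promise that defers to setImmediate, at the bottom of the
// queue priority tiers, so timers and I/O get a chance to run
// between iterations instead of being starved by microtasks.
const defer = () => new Promise(resolve => setImmediate(resolve));

async function runDeferred(maxIterations) {
  let stop = false;
  setTimeout(() => { stop = true; }, 0);
  let iterations = 0;
  while (!stop && iterations < maxIterations) {
    await defer(); // everything else runs, then we jump back here
    iterations++;
  }
  return { stop, iterations };
}
```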
Oh, I also have a bit of a performance demo. A lot of people have kind of avoided async iterators, and promises along with them, because of the myth that promises are slow, which at one point had some truth to it. But there's been a huge amount of effort that's gone into optimizing promises over the last several years, and they're actually quite performant at this point. So I'm going to show a little demo.
Yeah, it has some helpers to handle converting the keys and stuff. Well, I was kind of hoping to show the demo, but it doesn't run right now, so instead: I did some performance testing, basically. When running the uppercasing demo, it's fairly naive code, so it's not going to have any significant performance difference. But the CSV-to-JSON one has a little more meat to it; there's a little more stuff there.
It's a more real-world kind of demo, and the async iterator code is actually about 15% faster than streams are, with about 40% less memory usage. So you can actually use async iterators now and get as-good-or-better performance than streams, if you know how to use them properly.