From YouTube: Stream into the Future - Matteo Collina, NearForm
Description
There was a time when Node.js streams were all the rage, but over time the Node.js Core Streams codebase became extremely complex and hard to understand. Worse still, WHATWG introduced an API for browser Streams. The two Streams APIs are incompatible with each other, and both are complex and leaky abstractions. In this talk, a Node.js Core Streams maintainer presents a stream-less future by demonstrating how to use pure JavaScript: Async Iterators and Generators can give us everything Streams can while being completely cross-platform and highly performant.
Hi, I am Matteo Collina, @matteocollina on Twitter, so please follow me. Just so that you know, I'm trying to reach 10,000 followers by the end of the year, so please help me get to that.
I'm here today to talk about Node.js streams and the future of Node streams: how you can help, and maybe what you can provide feedback on.
So, a couple of things first. I work for a company called NearForm. We have a nice booth in the expo area.
We are also doing a raffle with some smartwatches and AirPods, and we donate some money to charity after the conference, so please come by, swing by, it would be interesting.
We are a professional services company that does all things JavaScript. So if you have some problems with your JavaScript and your team needs some help, please pass by.
So: a stream is like an array, but over time. Essentially, instead of having one big chunk of data in memory, you receive data along the way and you process it over time. This is fantastic because it enables us to crunch an insanely high amount of data with a limited amount of memory, which is fantastic, right? That's why we are using them. We use them for file processing and a lot of other things, so they are a key part of Node.
The problem is that Node streams are really complicated, and most people don't really understand streams at all. Even some Node core contributors struggle a little bit when it comes to reviewing those PRs. It's a really complicated codebase that evolved a lot over time and, in fact, it emits a number of events. Node streams are based on EventEmitter, and they emit a lot of events. There is the 'data' event, which you have probably used.
There is the 'readable' event, which you probably haven't used much, because it's not very popular. There is 'close'. There is 'end', which has some other meaning, different from 'close', by the way. There is 'finish'. You know, 'close', 'end' and 'finish' kind of sound like they mean the same thing, right? No: they have three completely different meanings. And then there is destroy(), to tear the whole thing down.
And I have not mentioned 'error', because you probably know that emitting 'error' in Node is a really big deal, so there is also that to take into consideration. Just to recap: 'data' and 'readable' are for reading data, and 'end', 'close' and 'finish' are emitted when streams are ending, to some extent. The meaning of 'end' is on the readable side; the meaning of 'finish' is on the writable side. They fire when a readable stream is done, or when a writable stream is done, successfully.
Only successfully: if there is an error or an abrupt close, they are not emitted. 'close' is emitted when the underlying resource is torn down, and it will also be emitted in the case of failures. So, once upon a time, the only way to interact with a stream was by using the 'data' event. You will find a lot of these examples everywhere.
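Something like this minimal sketch (the file name is a placeholder):

```js
const fs = require('fs')

const stream = fs.createReadStream('big.file')

// The source pushes chunks as fast as it can produce them.
stream.on('data', (chunk) => {
  console.log('received', chunk.length, 'bytes')
})
stream.on('end', () => {
  console.log('done')
})
stream.on('error', (err) => {
  console.error(err)
})
```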
Okay. I call this the wild horse example, because essentially you are receiving as much data, as fast as the source can provide it, which turns out to be a very bad idea, because it's really hard to pause the source, to tell it: please slow down, this is too much data, I don't know what to do with it. Also take into account what happens if you make the handler asynchronous. Where is it? Okay:
if you put an async function here, you are creating a lot of problems, so don't put an async function there. James has a talk tomorrow about broken promises; go to his Broken Promises talk, because it's a really good one. And handling backpressure with the on('data') model is really hard, because how do you stop? There is a pause() method on streams, but pause is only advisory. So at some point a new API was introduced,
the 'readable' event, to actually provide a pull-based model. Instead of having this firehose of data, the idea is: notify me when there is data available, and then let me read that data from you. So instead of a firehose of information, it's more of a pull model. Okay, so streams have a push model and a pull model implemented in the same codebase. How does that sound?
That's what we have to maintain, okay. But this is actually probably the best way to interact with a stream, because you can pull data from it, so you use the internal buffer in the most optimal way. Still, again, this is not really readable, to be honest. It is more performant, though, because this way you only read the data that you need, so it's much more gentle on the buffering of the streams.
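A sketch of that pull-based 'readable' API (file name is again a placeholder):

```js
const fs = require('fs')

const stream = fs.createReadStream('big.file')

stream.on('readable', () => {
  // Pull from the internal buffer until it is empty;
  // read() returns null when nothing is currently available.
  let chunk
  while ((chunk = stream.read()) !== null) {
    console.log('read', chunk.length, 'bytes')
  }
})
stream.on('end', () => {
  console.log('done')
})
```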
Now, don't do this, okay? How many of you have done this? A few people. Don't do this at all. This is common, and it will create a memory leak immediately. Can you see why it creates a memory leak? Can't you see it? It's obvious, right? No: it's such a simple API, what could possibly go wrong?
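The slide shows something along these lines (a reconstruction; the served file is a placeholder):

```js
const fs = require('fs')
const http = require('http')

http.createServer((req, res) => {
  // If res errors or the client disconnects, nothing destroys the
  // file stream: it stays alive, holding its file descriptor.
  fs.createReadStream(__filename).pipe(res)
}).listen(3000)
```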
If you look at this, you see this .pipe(res). If the response errors, or the other side closes the connection, the read stream is not closed or torn down or destroyed: it stays alive, and it stays alive forever, because there is nobody to consume that stream. And that's a big problem. So essentially, if your code is using pipe without some special error-handling magic, you probably have a memory leak somewhere in your application. So don't use .pipe().
What you need to use is pipeline, which is a new utility that we added in Node core, in Node 10, I think. Anyway, it's also available as an open source module under the name pump, and it's released inside the readable-stream module as well. So there is pipeline, and pipeline is actually really cool, because it tears down the streams: if one of them errors, it closes all the others. Basically, it makes sure that there are no memory leaks, and no leaks of any sort.
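The same server with stream.pipeline, a minimal sketch:

```js
const fs = require('fs')
const http = require('http')
const { pipeline } = require('stream')

http.createServer((req, res) => {
  // If either side errors or closes early, pipeline destroys both
  // streams and reports the error in the callback.
  pipeline(fs.createReadStream(__filename), res, (err) => {
    if (err) console.error(err)
  })
}).listen(3000)
```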
So please use pipeline, okay? Don't use pipe. You might want to ask: why can't you change the behavior of pipe to actually do the right thing? Well, backward compatibility. We cannot break everybody. There's a lot of code out there that relies on that behavior, so we can't really change it. That's bad. You can also use a framework, by the way, to serve files. Probably the best option is actually to use a framework, because there are a bunch of things to be done, so don't serve files manually: just use a framework.
We can probably have a beer afterwards, right? Anyway: everybody has moved to async/await. The biggest problem with streams and async/await is that the two do not really mix well together, and if you have tried, you have probably seen some `new Promise` plus some shenanigans to actually get the thing to do what you want it to do.
I hope I'm hitting some nerves here, but how to do it is not clear, not easy and not straightforward. So essentially we, the Node core contributors (and by the way, I'm part of the team that maintains Node streams), are not doing a good job of serving you folks on this.
Anyway, we cannot really change the things that are already there without breaking everybody, because everybody depends on these things. For example, we cannot really change how streams work, because then it would break whatever web framework you are using that uses streams, or it would break some of the file processing that you do. We can do some surgical changes, but we can't really do much.
So then the question comes: how can we improve the usability of streams, considering the fact that people love async/await? The number one mistake that people make with streams is on('data') with an async function. Don't do that, okay? Right now I'm working on a fix for this; I did a PR to help with that, to actually provide some safety net.
How many of you know about async iterators? Hey, fantastic! They are the best feature that has been put inside one of the recent versions of Node: they came into Node in Node 10. By the way, Node 8 goes out of maintenance on the 31st of December this year, so stop using Node 8. Since January 1st, then, all supported versions of Node.js will have async iterators in, which is fantastic, so we can use them everywhere, whichever Node version we are using.
What does this look like? Well, there are a bunch of things to cover. First up, you can have an async generator. This is a very specific way to write a function that can spit out things, things that you can iterate on with for await...of, which hands you an iterator.
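For instance, a minimal async generator consumed with for await...of (the values are arbitrary):

```js
async function* generate () {
  yield 'hello'
  yield ' '
  yield 'world'
}

async function main () {
  // for await...of drives the async generator to completion.
  for await (const chunk of generate()) {
    process.stdout.write(chunk)
  }
}

main()
```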
What you probably don't know is that there is a full protocol underneath that powers this. You can write your own async iterators without using the function syntax, more or less like you can write your own ordinary iterators for other objects: if you want to use for...of on any object, you can provide that capability, because there's a protocol. Your object needs to implement certain methods.
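A sketch of that protocol: a plain object that implements Symbol.asyncIterator (and a next() method) by hand, with no generator syntax:

```js
const countdown = {
  [Symbol.asyncIterator] () {
    let n = 3
    return {
      // next() must return a promise of { value, done }.
      next () {
        return n > 0
          ? Promise.resolve({ value: n--, done: false })
          : Promise.resolve({ value: undefined, done: true })
      }
    }
  }
}

async function main () {
  for await (const n of countdown) {
    console.log(n) // 3, 2, 1
  }
}

main()
```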
So this is pretty interesting. Can we use this with streams? Well, when this was being standardized, we were working with TC39 to validate that we could actually use them with streams, and we could. So things are compatible right now. You can use this code today, you don't need to turn on any flag or anything: you can create your stream and then for await it, iterating over that stream.
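That is, on Node 10+ (stable in Node 12) you can async-iterate a stream directly:

```js
const fs = require('fs')

async function main () {
  // Readable streams implement Symbol.asyncIterator out of the box.
  for await (const chunk of fs.createReadStream(__filename)) {
    console.log('chunk of', chunk.length, 'bytes')
  }
}

main()
```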
Pretty cool, right? You can just process those chunks and that's it; when the for loop ends, it ends. Hey, I don't need to use any crazy events, I don't need to do any shenanigans, I don't need to do anything crazy: I can just do my thing and be done with it. And note that if I use break in my for loop, it just tears down the stream automatically; again, I don't have to do anything about it. It just handles the basic 80% of the use cases that you need.
The other API landed in Node 12, but it has now shipped on Node 10 as well (it was backported recently), so it's available in all supported versions of Node. In one of the latest releases we added Readable.from.
You can pass an iterator or an async iterator to Readable.from, and it will automatically convert it into a stream, which is pretty fantastic. You don't have to do anything: I can just use this to convert a function, an object or an array into a stream. It simplifies a lot of testing, for example, because you need to test your streams, right? I hope all of you write unit tests.
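For example, turning an array into a stream for a test, a minimal sketch:

```js
const { Readable } = require('stream')

async function main () {
  // Readable.from accepts any iterable or async iterable.
  const stream = Readable.from([1, 2, 3])
  for await (const value of stream) {
    console.log(value)
  }
}

main()
```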
Now you can also go and write some crazy, full-Inception kind of thing: you can start with an async generator function, convert it into a stream, and then iterate over the stream. There are some little side effects, though. One of the key concepts of streams is data buffering: in order to be performant, a stream will buffer data, so your async generator will get called a little bit more, up to filling the stream's buffer. So it has a slightly different behavior
than if you just iterate over the async generator directly. Still, it's doable, and you can use these things to level the playing field. It's pretty nice, because then you can just accept a parameter, check that it's an async iterable, and async-iterate over it; the caller can pass a stream, an async generator, or any other object that implements that protocol, and your code will be exactly the same. So it's pretty powerful.
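A sketch of what "your code will be exactly the same" means: one consumer, several kinds of sources (the helper name is made up for illustration):

```js
const { Readable } = require('stream')

// Works for any async iterable: a stream, an async generator, or a
// hand-rolled object implementing the protocol.
async function count (source) {
  let n = 0
  for await (const item of source) {
    n++
  }
  return n
}

async function* generate () {
  yield 'a'
  yield 'b'
}

async function main () {
  console.log(await count(Readable.from([1, 2, 3]))) // 3
  console.log(await count(generate())) // 2
}

main()
```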
Okay. So, it's demo time.
Hopefully the demo gods will be with me, because that's an important piece. I have seven examples; I'm not sure I can go through all of them, we'll see. So this is the first example, and this is old-school streams: you subclass Readable and implement the _read method, and within _read you iterate over a big array, push all those values in, and then push null.
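Reconstructed from the description, the first demo looks roughly like this (the 1024-element array matches the output he mentions later):

```js
const { Readable } = require('stream')

const data = [...Array(1024).keys()]

class ArrayStream extends Readable {
  constructor () {
    super({ objectMode: true })
    this.index = 0
  }

  _read () {
    while (this.index < data.length) {
      // push() returning false means the internal buffer is full:
      // stop until _read() is called again.
      if (!this.push(data[this.index++])) return
    }
    this.push(null) // end of stream
  }
}

new ArrayStream().on('data', (value) => console.log(value))
```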
You run this, and it just spits out all the values. This is how you would do this. Note that we changed this in Node, and this will not run on Node 6; on Node 6 this won't work, by the way, we have fixed some stuff since. But of course Node 6 is long gone, so we don't have to worry about it anymore.
Now let's look at another version of the same code, this time just using Readable.from. Note that we still use on('data') to process the data; however, in this case we just pass the array through, and we get a stream back. This is really powerful for testing, for example, because this way we can stay compatible without breaking a sweat. We can run this code, and it still works exactly the same.
What you can also do is go into full Inception mode, to some extent, and pass in a generator function. Note that this one is all synchronous, okay? A generator is synchronous; an async generator is asynchronous. Anyway, you can do this and you can still run it, and you still get the same output as before, but now you're generating it from a function instead. This is pretty useful, again.
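A sketch of that variant: a synchronous generator function handed to Readable.from:

```js
const { Readable } = require('stream')

// A synchronous generator works too: Readable.from accepts plain
// iterables as well as async iterables.
function* generate () {
  for (let i = 0; i < 1024; i++) {
    yield i
  }
}

Readable.from(generate()).on('data', (value) => console.log(value))
```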
We can of course also go from an iterator to an async iterator using this API: we take our generator, we pass it to Readable.from, and then we async-iterate over it. Why? Because it's all pretty much generic. So we can still run the code, still spitting out one thousand and twenty-four numbers; I have a lot of imagination.
It will still do the exact same thing; it still works exactly the same. As I said, there is a little bit of difference related to how much data is fetched from the async iterator function, so there is that to take into consideration, but that's it. And given that these are streams, you can also call all the full stream APIs on them.
By the way, always use util.promisify if you want to get a promise-returning version of anything; don't do it manually. The chance that you make a mistake doing it manually, or that some colleague of yours will make a mistake six months after you have done it correctly, is very high. So do use promisify: it is there for a reason, and it's also really fast.
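For example (setTimeout ships its own custom promisified form in Node 8+, so this works as-is):

```js
const { promisify } = require('util')

const sleep = promisify(setTimeout)

async function main () {
  await sleep(100)
  console.log('100ms later')
}

main()
```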
So in the demo we combine generators with a promisified setTimeout; there are a lot of tricks you can play with that.
Note that when you're using async/await on the server, you always need to put a catch handler in there. Make sure of that: if you don't put a catch handler, you're going to create a memory leak. There is also a module called make-promises-safe, and the name is exactly what it implies: you crash on unhandled rejections. This is a big conversation, though, and not the focus of this talk.
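A minimal sketch of that catch handler on a server (the handler body is a placeholder):

```js
const http = require('http')

async function handle (req, res) {
  // ...async work that might throw...
  res.end('ok')
}

http.createServer((req, res) => {
  // Without this catch, a rejection becomes an unhandled rejection
  // and the request (and its streams) can leak.
  handle(req, res).catch((err) => {
    console.error(err)
    res.destroy(err)
  })
}).listen(3000)
```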
So what do we do here? We just count the chunks and count the total length, and then we just use the request module to pipe things through.
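Reconstructed from the description, the server side of that demo is roughly:

```js
const http = require('http')

http.createServer((req, res) => {
  async function run () {
    let chunks = 0
    let length = 0
    // Consume the request body with for await instead of events.
    for await (const chunk of req) {
      chunks++
      length += chunk.length
    }
    res.end(`${chunks} chunks, ${length} bytes\n`)
  }
  run().catch((err) => res.destroy(err))
}).listen(3000)
```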
So I'm just running this, and you see it's all working fine, and it's actually really fast. Note that using async iterators is actually really fast, roughly as fast as any other way of processing streams in Node, so use them: there isn't a big performance overhead.
We have optimized async iterators down to the bone; the code in there is really nasty because of that, but it makes things fast.
So let's talk a little about what is missing. The first thing that is missing is how to write a transform stream using async iterators, something that you could use to actually process the data along the way. This is actually a very nice idea, because we can have a transform function and you just pass in an async iterator.
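The pattern he's describing, sketched: a transform is just an async generator that takes a source async iterable and yields processed chunks (the upper-casing is an arbitrary example):

```js
const fs = require('fs')

// Takes any async iterable, yields transformed chunks.
async function* toUpper (source) {
  for await (const chunk of source) {
    yield chunk.toString().toUpperCase()
  }
}

async function main () {
  for await (const chunk of toUpper(fs.createReadStream(__filename))) {
    process.stdout.write(chunk)
  }
}

main()
```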
So it's a pretty nice pattern for writing a transform, a way to transform data. Pretty cool, right? I love this. And then, if you convert this into a transform stream, you can use it with pipeline and interact with all the stream ecosystem modules. Note that this is currently being worked on, so hopefully this PR will land: it got stuck at the beginning of October, and I'm hoping to unstick it.
This weekend, in two days, we have the collaborator summit; hopefully I can get that unstuck there and get it landed eventually.
I would also love to be able to avoid creating a transform completely, so that you can just pass in a single async generator function that essentially implements that pattern for you. That way you avoid creating another stream: you can just compose functions as much as you like. It's pretty interesting. This is not implemented
yet; this is fantasy code, I'm living in Wonderland right now. Hopefully this will hit your screen maybe in Node 14, and hopefully we will backport it.
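The shape he's hoping for: pipeline accepting an async generator function directly, no Transform stream in sight. At the time of the talk this is aspirational (this shape did eventually land in later Node releases):

```js
const fs = require('fs')
const { pipeline } = require('stream')

pipeline(
  fs.createReadStream('input.txt'),
  // The middle stage is just a function composing over the source.
  async function* (source) {
    for await (const chunk of source) {
      yield chunk.toString().toUpperCase()
    }
  },
  fs.createWriteStream('output.txt'),
  (err) => {
    if (err) console.error(err)
  }
)
```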
So: I've talked about readable, I've talked about transform, and now I'm talking about writable. I haven't talked about writable at all up to this point. Writable... well, I don't know what to do with writable, that's the honest story. If you want to write something, you end up consuming an async generator, right?
However, what you want is a write operation that actually returns a freaking promise. The problem is that our current write() does not return a promise, and it's very hard to create that promise-returning behavior. Why? Well, I would like to be able to write this code. Look at this: we have a writable, and then we `await write(chunk)`. Now, this code is correct,
and that's part of the problem. This is why using promises with streams is really hard: one of them errors straight away, the other one errors only when you look at it. Those two things are really, really hard to mix and match, and you can see that the code that makes this work does some shenanigans to make it happen: essentially, we register an 'error' event and cache the error.
So how does it all work? Well, almost everything is implemented in JavaScript, so you can use all of this stuff in your own code: learn the pattern and use it for other things. So, the async iterator. As I said, I'm running a little bit out of time, so I hope you don't kick me out. This is an extract from Node core.
Essentially, we implement the Symbol.asyncIterator primitive: when it is called, we invoke our builder and get back an async iterator. So essentially, in order to make something async-iterable, you just need to provide that. Note that you should only use this kind of pattern if you don't want to use an async generator function, which is nice and really cool to use.
But if you want to do some of the hard stuff that we're doing, or provide the mapping that we do in Node core, what you need to do is create an object which has the prototype of the async generator and implement certain methods, like next and return: next is called to get the next element, and return is called when you exit. That's it. There are a few others, but they are less important; these are the key ones. So, internally in Node core,
we wrap the 'readable' event. This has some consequences, and one of them is that it's really performant, which is the good side. It also handles buffering and all of those things well. There is a long link, sorry, somewhat small, to the source of the implementation.
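A much-condensed sketch of that idea, wrapping the 'readable' event behind next()/return() (listener cleanup and buffering details are elided here; the real source linked on the slide handles those):

```js
function toAsyncIterator (stream) {
  return {
    [Symbol.asyncIterator] () { return this },

    next () {
      const chunk = stream.read()
      if (chunk !== null) {
        return Promise.resolve({ value: chunk, done: false })
      }
      // Nothing buffered right now: wait for the next 'readable'.
      return new Promise((resolve, reject) => {
        stream.once('readable', () => resolve(this.next()))
        stream.once('end', () => resolve({ value: undefined, done: true }))
        stream.once('error', reject)
      })
    },

    // Called when the consumer exits early (e.g. break): tear down.
    return () {
      stream.destroy()
      return Promise.resolve({ value: undefined, done: true })
    }
  }
}
```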
Now I'm going to open it up for you, okay.
Okay, so: there is no performance penalty in using async iterators over the other methods. That's the key part. When you talk about a promise-based API, saying there is no performance penalty is pretty big, and I'm not underselling it: typically, promise-based stuff is slower than callback-based stuff, and this thing is on par, just so that you know. One last thing before I finish: there are WHATWG streams now.
What are WHATWG streams? They are what you use with fetch. Okay, sorry, not in Node: what you use with fetch in the browser. Node does not have WHATWG streams right now, so we have a bit of a compatibility problem between the two things.
This is what WHATWG streams look like, with this pipeTo method: very complex, again. Okay, the key part here is that WHATWG-streams-based APIs are really complex. If you want to implement one, you need to implement this thing, which has a controller and a lot of other stuff, which is really complicated, again.
So, not really easy, and I just want to skip through this. We would like to have those in Node core, to make things more consistent (you have probably seen my talk on this), so it makes some sense to have them in Node core. However, we want to be compatible with the Node ecosystem, where we use Node streams. How much time do I have? Three minutes? Okay, sorry, I'll be quick.
However, WHATWG streams are going to be async iterable, so it's cool that they implement the same pattern. I hope one day to be able to do this type of thing: you start from a WHATWG readable, and then you consume it using an async generator function passed to a Node writable, and it will all work fine, with all the interaction being managed via the async iterator protocol, which would be pretty fantastic. Do you want to get involved? Please reach out.