From YouTube: ASP.NET Community Standup - July 31, 2018 - SignalR Streaming with Dylan the SignalR Intern
Description
Community links for this week: https://www.one-tab.com/page/m338zhUZQrKUo5zwoY9cpw
D
Sort of, like at school. Our school does partnerships with different companies, so I've done a part-time project through the year with Microsoft Dynamics. It wasn't SignalR or ASP.NET development, but it was C# and it was in Visual Studio, so that helped me understand a lot of the fundamentals coming in.
A
You know, I think that's one of the features of .NET that is pretty cool: you learn .NET, and then you can write a game, you can write a web server, you can write a desktop app. It's pretty portable, which is cool. Yeah, nice. Alright! Well, let me jump in and share the community links, and then we can get to the fun demo stuff and you can show us what you've been working on.
A
So, let's start with this, and it will lock to my screen. I'll start with something Damien tweeted, I think just earlier today; I thought this was kind of neat. There was a little bit of a mention of what's going on with Dispatcher, what we're adding in Dispatcher. This is neat: he linked over to the benchmarks here. You can see they're kind of tiny, a little hard to see; fortunately, when we go over here, they have a...
A
All right, and Sebastian is saying in the chat that it's their new routing code base. Cool, good stuff. Okay, on to the next. This was something else Damien tweeted; this is the retweet Damien showed in a video this week. So this is the early access downloads page. I like that we've been doing this; these are daily builds, and there are caveats here about, you know, watch out, this is for advanced people, like you are if you're watching this show.
A
So this is cool, though: this is where you can try out the 2.1.3 release early access. I would think this would be especially useful if you're building a library, the kind of thing where you've got dependencies and you want to be ready when it comes out, you want to identify problems early on, that kind of stuff. Yeah.
B
This goes back to the idea that in the early days of ASP.NET Core, when we were just packages, we shipped nightly builds onto MyGet, and you could download literally yesterday's build and it would just work. With the shared framework and the SDK, things get a little more complicated, and we're trying to make it easier to get these early access downloads so that you can test your libraries and components with them and make sure they'll still work. Cool.
A
All right, this is a really great post, I think, from Phillip. He goes through and talks about centralized exception handling and some things that I did not even know were a thing. He talks about this RFC for Problem Details, an HTTP problem details kind of specification, and there's already a POCO that's set up for that.
A
So these are all the kinds of things where you can package something up as a ProblemDetails, and it has all the things you would probably want to say, and what's cool is that this is all following that RFC. You've got things like title, details, status, etc. Then he goes through two instances: talking about how to do exception handling using those problem details, and then he also talks about validation as well, going through and doing validation and packaging.
A
Those take advantage, again, of the problem details for your validation messages, so I thought that was pretty cool in an application. I did not even know, number one, that there's an RFC, and number two, that there's built-in support for it. So yeah, it's kind of a spec for transmitting error details.
A
Right, this is cool. This is from Red Hat, and it's a post about using a Linux-specific transport. It talks about the transport abstraction, and obviously there are going to be things that differ between operating systems if you want to give people the best transport. So, kind of cutting to the chase here, there's talk about what exactly is going on with libuv and how they've optimized for it.
A
There are some benchmarks here showing this off using the Linux transport, and they're saying there's a 20% improvement for JSON. And all that's required here is, in the web host builder, to call UseLinuxTransport, and he points out that it's a no-op if you use it on Windows; it's not going to break anything. So it's just pulling in the RedHat.AspNetCore.Server.Kestrel.Transport.Linux package and then calling UseLinuxTransport.
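As described, a minimal sketch of wiring this up, assuming the package and extension method named in the Red Hat post:

```csharp
// Program.cs sketch; UseLinuxTransport comes from the
// RedHat.AspNetCore.Server.Kestrel.Transport.Linux package and,
// as the post points out, is a no-op on non-Linux platforms.
public static IWebHost BuildWebHost(string[] args) =>
    WebHost.CreateDefaultBuilder(args)
        .UseLinuxTransport()
        .UseStartup<Startup>()
        .Build();
```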
A
This is kind of cool, too. We've been calling out things from the PeachPie team; they're very active and building a lot of cool stuff. They had one thing they tweeted about recently, rendering Razor partials in a PHP view, which is pretty neat, but here they're talking about ASP.NET Core response caching. Usually with things like WordPress, it's very common to use some sort of response caching; there are all kinds of WordPress plugins for response caching.
A
Here they're showing that you can use the response caching middleware and set that up directly, and they talk about WordPress-specific behavior. What's interesting is I was immediately thinking "show me the benchmarks," and what they point out, which kind of makes sense when I thought about it, is that there's no point in benchmarking this, because the entire point is that it's cached, right? But the nice thing is that this is a simple implementation. It's very easy, all across your app; you can just turn it on using this response caching middleware, and from when I've used WordPress before, it's been a little bit complicated to set some of those cache things up.
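The built-in middleware they're describing takes two calls in a standard Startup class; a minimal sketch:

```csharp
// Minimal Startup sketch for ASP.NET Core's response caching middleware.
public void ConfigureServices(IServiceCollection services)
{
    services.AddResponseCaching();
}

public void Configure(IApplicationBuilder app)
{
    // Must run before the middleware that produces the cacheable responses.
    app.UseResponseCaching();
    app.UseMvc();
}
```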
Alright, I'll try and speed up; there's a lot of good stuff today. This is neat: an expense manager using EF Core, Highcharts, and ASP.NET Core as well. I like this one because it goes beyond the kind of hello-world to show something where you're actually building out expenses.
A
Your
your
it's
a
real-world
kind
of
business
application
and
the
charts
are
cool
too.
So
this
is
all
like
all
the
code
here.
As
far
as
building
out
the
model
implement
implementing
hooking
up,
you
know
and
and
all
the
kind
of
front-end
stuff
to
stuff
there
graph
QL
using.net
boxed.
So
we
talked
about
dotnet
boxed
in
the
past,
which
is
a
cool
open
source
project
that
has
kind
of
a
lot
of
stuff,
already
kind
of
precooked.
A
For
you
and
I
like
this,
this
is
Eric,
taking
a
look
at
that
and
showing
off
how
easy
it
is
to
do
graph
QL
using
net
box.
So
you
know
here
going
through
or
showing
the
schema
and
query
and
and
just
kind
of
showing
the
implementation
of
it.
So
this
is
a
very
kind
of
quick
walkthrough
and
it's
nice
to
show
how
simple
that
makes
it.
A
Okay,
this
one's,
interesting,
I,
don't
know
Andrew.
You
may
have
an
opinion
on
this.
This
is
about
I,
don't
know
if
you
saw
this
one.
This
is
so
there's
already
support.
Of
course,
in
Razer
you
can
just
do
you
know
a
tiff,
and
so
what
what
Andrew
Locke
pointed
out
here
was
he
wanted
array
a
tag,
helper
version
you
like
the
way
it
look
better.
He
it
worked
better
in
with
his
HTML
parser
or
you
know
it
may
work
better
in
different
IDs
and
that
kind
of
stuff.
A
Cool, all right. Rick Strahl, or as we call him, "rickshaw." He writes beefy posts with a lot of information, some good opinions. Here he's talking about WebAssembly and Blazor, and we've done a lot of stuff about Blazor lately; we had the team on and everything. But I loved the way that he kind of tears everything down, looks at and shows you what everything's built on and how it runs. There's a ton of information in here.
A
This is one I'm calling out because I was gone last week and we didn't do the community links, but the Blazor 0.5.0 release is out. Some of the big things in here were already shown off in the recent community standup: server-side Blazor, some things like rendering of raw HTML. Some good stuff there; Blazor powered by SignalR, yeah.
A
Here's an interesting one. This is from, I'm going to pronounce his name wrong so I've actually got it handy, "rocket on G," I believe. He's been doing a lot of cool stuff, and this one is: he previously did a clock for Blazor, but it was using canvas, and in this case he's using SVG. So it's kind of fun to see, because these are all web standards things; they're possible to integrate with a lot of different things. Yep.
A
Okay, almost done. Nate McMaster, talking about plugins. He introduces what plugins are and talks about why: there's Assembly.LoadFrom, but there are a bunch of problems, mostly with dependency management, loading, locating mismatches, race conditions, etc. He talks about AssemblyLoadContext, which can solve those problems, but it's low-level and you have to do some work. So then he talks about this plugin loader that he wrote, and what's kind of cool is there's a demo app.
A
It's ASP.NET Core and it's loading some plugins. I've talked over the years to tons of people who want to do runtime loading of plugins, and this looks like a pretty neat implementation. And the last thing, which I'm contractually obligated to point out every time I talk to anyone, including my neighbors, the mailman, etc., is .NET Conf. .NET Conf is coming up soon.
A
We were just reviewing community submissions for the presentations, all kinds of good stuff. So this is coming up in 42 days; there's online stuff, and it's going to be followed by local events. So if you would like to do a local event, you can see we're already getting a bunch of them submitted; please let us know, so we can send you swag and get you listed here and everything.
B
Yeah, so Dylan's been working with us for just about three months now, and he's been working on making some changes to SignalR, some new features that we've been talking about doing for a while. He's going to talk a bit about what he's done, and we've got some demos we can run through, and then we can answer some questions about how it works and what our plans are for it.
D
Yes, so most of what I've done has been kind of streaming parameters. What it lets you do: when you're on a client in SignalR and you want to call a function that's on the server, before, you could only pass in stuff that you can JSONify, so you can pass in your numbers and your strings. What my feature lets you do is, it lets...
B
So this is using the new System.Threading.Channels API that was added in 2.1, I think. It's very familiar if you know Go; Go has a construct in the language for channels. It's basically just a pipe: objects go in one end with the writer, and objects come out the other end from the reader, and that lets you have one side with a writer and one side with a reader, and that all works in-process with just the channels API.
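The writer/reader pipe he's describing can be sketched in plain .NET, with no SignalR involved, using System.Threading.Channels directly:

```csharp
using System.Collections.Generic;
using System.Threading.Channels;
using System.Threading.Tasks;

class ChannelDemo
{
    // Push a few items in one end of a channel and collect what
    // comes out the other end.
    public static async Task<List<int>> RoundTrip(IEnumerable<int> items)
    {
        var channel = Channel.CreateUnbounded<int>();

        // Writer side: objects go in one end...
        foreach (var item in items)
            await channel.Writer.WriteAsync(item);
        channel.Writer.Complete(); // no more items coming

        // Reader side: ...and come out the other, in order.
        var results = new List<int>();
        while (await channel.Reader.WaitToReadAsync())
            while (channel.Reader.TryRead(out var item))
                results.Add(item);
        return results;
    }
}
```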
B
You would have to manually implement some kind of chunking system and track that yourself. But now we've got a system we're working on where you can actually handle that streaming, and we'll handle sending the chunks automatically, and the routing of chunks to the right streams.
B
Right, and I noticed there's a question in the chat about whether video streaming would be possible. In theory, yes, this is the kind of thing that could allow that. Whether you should use SignalR for video streaming, I think, is a different question, because there are a lot of really good video streaming protocols out there already. But it is that kind of thing, sending chunks of data across the wire, and it's not just for really big data.
B
Yeah, like measurement data. It's about data that needs to come over time in separate segments. It could be one big piece of data, or it could be a bunch of little pieces of data that happen over time, and some of our examples can kind of show what that looks like. So I will start sharing our screen... whoops, I haven't done that yet, I hit the wrong button.
B
That's the right button, share the screen. All right, so we should have something up. We'll just go straight into launching the demo and let Dylan explain some of the code. We'll start with some of the JavaScript code on how you would invoke a method using this streaming. Are you seeing the screen, John? I am.
D
Today, this is an upload stream object, which is a new concept, so you just make one from the connection. And then we want to send over the word "soup," so what we do is, we invoke the UploadWord method with this new stream, and then, to send over the word, we send each character one at a time. What the method on the server does, and we'll look at it in a little bit...
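The same upload can be sketched from the .NET client. This is a hedged sketch assuming the shape this API later took (passing a ChannelReader as an invocation argument); "UploadWord" is the demo's hub method, and "connection" is assumed to be an open HubConnection:

```csharp
// Client-to-server streaming from the .NET SignalR client: the
// ChannelReader argument becomes the stream, and everything written
// to the matching writer is sent to the server one item at a time.
var channel = Channel.CreateUnbounded<string>();
var invocation = connection.InvokeAsync<string>("UploadWord", channel.Reader);

foreach (var letter in "soup")
{
    await channel.Writer.WriteAsync(letter.ToString()); // one character at a time
    await Task.Delay(2000);                             // the demo's two-second gap
}
channel.Writer.Complete();   // "we're done sending, we want the result back"

var word = await invocation; // the server returns the assembled word
```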
D
...is it takes all the incoming letters out of the channel, appends them together, and then just sends you back your word. So we send each letter with a bit of a delay, and then we say, all right, we're done sending, we want the result back now, and then you await the invocation promise, and the server gets back here and says, here's your word, and then we're done. So if we go over to the browser here... actually, yeah, just open it up there.
B
Yes, yeah. Here we've got a page; let's open up the network tab, so folks can see a little bit. So if we go to the network tab and refresh the page, we connect, all right. So now, SignalR uses WebSockets, although it has a bunch of other transports. We can see the WebSocket connected here; right now you've just got handshaking messages, and we're sending pings every once in a while.
B
See, yeah, you see it's sending here; you'll see the items coming in one at a time. I think the code had a two-second delay between sends, so we're sending each individual letter as a separate message. And if we jump now over to VS, which is flashing... oh, here.
B
We hit the breakpoint, so we can show the server code a little bit. The server code is pretty simple: we've got just a standard hub method. Let me scroll this a little bit so you can see the whole thing. You can see the server method is taking in this ChannelReader of string, and we have a loop.
B
It's basically just saying: read items out of this channel. This sort of double while loop pattern is a common pattern with ChannelReader, because it lets you read synchronously as much as you can and then block to wait for new items. Okay, it's stopped on this breakpoint, and if we look at "item" it'll say "z," because that's the first item.
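A hedged sketch of what that server-side hub method looks like, reconstructed from the description (the method and variable names are assumptions, not the actual demo code):

```csharp
using System.Text;
using System.Threading.Channels;
using System.Threading.Tasks;
using Microsoft.AspNetCore.SignalR;

public class UploadHub : Hub
{
    // Receives a client-to-server stream as a ChannelReader parameter.
    public async Task<string> UploadWord(ChannelReader<string> source)
    {
        var word = new StringBuilder();

        // The double while loop: drain synchronously while items are
        // buffered, then await until the client sends more or completes.
        while (await source.WaitToReadAsync())
            while (source.TryRead(out var letter))
                word.Append(letter);

        // Runs once the client completes the stream; the assembled
        // word goes back as the invocation result.
        return word.ToString();
    }
}
```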
B
Now we see that it sent those messages, one every two seconds, and we eventually received the result from the server, which is everything concatenated together. And that stream that you have, that you can call write on: any data you can send through SignalR today, complex objects, binary data, you can send through those streams.
B
Yeah, so let's look at the code. We're now going to show, using this from the .NET client, uploading a file. So we've got a sample here. Right now this only works with the channel type, which, again, is a new type in .NET Core 2.1. We are looking at, when we actually ship this feature, probably also supporting things like sending a Stream, because a stream is really just a series of bytes anyway.
B
And this is another example of where you can pass in both: you can pass in a channel but also standard parameters like a string, etc. All right, so let's just run this one and listen in a bit on the next one. So here the sample is going to run, and if all goes well it's got to compile everything, but it's going to...
B
I'm sorry. So, all right, it's finished uploading; you see it's writing all those five-KB chunks to the server. That was uploading our SignalR architecture diagram image that this image tag was referencing; because it wasn't there yet, the link was broken. But now, if we refresh it, it should be there. Yeah.
B
Yeah, and a couple of questions I note from the chat, like: can the client be sent anything from the server while it's in the middle of sending data to the server? Yes. That's one of the really nice features of this: if you were sending a huge file in a single message, then while that message is being sent to the server, you can't send anything else; the connection is taken up by that.
D
So here's the score tracker example. The idea is that there are two players playing some sort of game, and here's the game logic: each round, the score is a random number between negative two and ten, so this is a game that requires a lot of strategy. Up here we make two channels, one for player one and one for player two, and then we just launch the game loop on a background thread.
B
And one of the nice things here is, on the server we're spinning up these two loops, one for each player, and we're using await Task.WhenAny, which means that as soon as one of the players hits 25, that task will complete, and we'll know that the task we get back is the winning player, because it was the first one to reach 25.
D
And the interesting thing is that, if you look here in the client code, the only exit condition for the game is the cancellation token, and this happens after the server returns. So what this is basically saying is: start up this loop on the client, just send data, send data, send data, and then, on another thread, the server will say, hey, that's enough data, I'm good, and then you know to stop sending it. Yeah.
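The Task.WhenAny idea can be sketched with plain channels. This is a hedged reconstruction of the pattern, not the actual demo code; the names are illustrative:

```csharp
using System.Threading.Channels;
using System.Threading.Tasks;

static class ScoreTracker
{
    // One read loop per player; Task.WhenAny completes as soon as either
    // loop sees its running total reach the target, so the first finisher
    // is the winner.
    public static async Task<string> RunGame(
        ChannelReader<int> player1, ChannelReader<int> player2, int target = 25)
    {
        var p1 = ReadUntil(player1, target);
        var p2 = ReadUntil(player2, target);
        var first = await Task.WhenAny(p1, p2); // first task to finish wins
        return first == p1 ? "player 1" : "player 2";
    }

    static async Task ReadUntil(ChannelReader<int> scores, int target)
    {
        var total = 0;
        while (await scores.WaitToReadAsync())
            while (scores.TryRead(out var score))
                if ((total += score) >= target)
                    return; // this player reached the target
    }
}
```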
B
And I actually noticed there's a question about exactly this in the chat, and I was kind of waiting for this example. This is an example where the server is able to say: stop, I'm done, I don't need any more data, whereas in the previous example, the file upload, the client was the one saying the stream is now complete. Here, the server is doing that by simply returning from its invocation; it says, I don't need any more data.
B
We'll swap that over there, and we'll do... oh, I just shut down the server, I hit the wrong thing. Let's start that. Oh, and I meant to clear this out, so that we don't have extra text in the way; too many screens, yeah. And so we'll see both the server and the client write out as they're getting data. The build in the background is fine, that's not a problem.
B
You
see
the
connection
coming
in
and
then
you
see
these
scores
coming
in
and
eventually
one
of
them
will
hit
25
I
hope.
If
we
end
up
in
a
loop
here.
Oh
there
very
good,
and
you
see
on
the
server
side,
you
can
see
the
scores
as
they
built
up
and
then
player
one
got
226
and
it
returned,
and
if
we
were
not
again
because
player
two
kept.
B
So that's kind of what we had for demos. The idea of this feature is, again, to allow you to stream data: any data that comes over time, or is too large to go in one chunk, you can stream into method invocations. I mean, you know, Pipelines is the hot new thing right now, and channels are kind of like pipelines, but instead of bytes they're for objects, right? So you have a write end that you write to...
B
Not 30 minutes, but today or tomorrow, yeah, we're hoping to merge it in. So yeah, we're currently working towards this in the branch for ASP.NET Core 3.0, mostly just because 2.2 is already kind of planned out, and this was kind of a side project idea that we had, and we're still kind of playing around with it. I'll have to look into what we can do as far as, like, a package, maybe, that people could use.
B
We want to be extra careful, because it's going to be a prototype; we want to make sure it's very clear that it's a prototype. But I don't see a huge problem with us trying to find a way to at least give you some idea of how you can try it out. At the very least you can clone the repo, you can build it, and you can reference it, but yeah, we want to try and get this out.
A
Okay, so is this something where, feature-wise, it does a pretty set thing and you don't really need any kind of input, like "I wish it did this or that," or is it something where you're interested in feedback? Are there some decisions that the community could weigh in on, or whatever?
B
We definitely need some feedback, I think, on this, and there are also some major limitations, like Dylan was just mentioning. Right now, those channel readers that you send have to be the parameter; they can't be put inside a list, you can't have a channel reader inside an object or a channel reader inside an array. And it's things like trying to figure out what kinds of objects you want to be able to send this way; we know channel is the primitive we're going to use.
B
One of the things coming is IAsyncEnumerable, and we're probably going to want to support that as well. And also, we just want to see how people use it. A lot of what we do is: we come up with an idea, we think it's useful, and we need to know if people actually want to use it and how they want to use it.
D
...it out. So, like, if everyone ends up using this for file uploads, then we'll make a nice upload-this-file function; if everyone uses it for, like, tracking scores in multiplayer games... Oh, another cool thing you can do is that, because you can get the channels onto the server, once the channel is on the server, in the function, you can take that channel and bring it anywhere else on the server.
D
So you can have, like, a ConnectPlayer function in a game or something, and that player will pass a channel to the server, and the server can take that channel and move it anywhere else: put it in a lookup dictionary, or put it in some singleton object, and then return, like, "yeah, we're good, we stored your channel," and then the client, anytime it writes to that channel...
B
The channels API is something that they designed in corefx, like, last release cycle, and there are basically two core parts to it. On the read side, you saw with these two loops that you can wait for items to be available, and that's an async thing: you await the task, and that task completes when new items are available, and then, while items are available, you just read from it synchronously, right?
B
There are two loops; I think the screen should be sharing again. There's an outer loop here, and the outer loop is saying WaitToReadAsync, and so this is where the asynchronous logic is. If there are no items available, your code will be blocked at this line 66, waiting for items to become available. Once items are available, we then say...
B
...TryRead, get that item out, and this will run as fast as it can, grabbing all the items that are currently available until it runs out of items. So if the client sends ten items and then goes to sleep, and then sends ten more items, then your inner loop is going to run ten times, then pause and...
B
...go back to waiting, and then your inner loop is going to run ten more times. There's also a slightly simpler API: if you really want the simpler code, you can just call ReadAsync, and that'll give you an item if one's available, and if there isn't one, it'll just block. But this pattern kind of lets you avoid going async until you actually know there's nothing available.
D
This is where most of the routing happens. How this works under the covers is that when you put a channel in as a top-level parameter on the client, what the client ends up doing is replacing that parameter with a JSON object that says, like, "hi, I'm a stream placeholder and my ID is 14." Yeah, so you see here on the screen it says: item Z, stream ID 1.
D
So when it sent the invocation, it said "hi, I'm a new stream, my ID is 1," and then, when you receive a stream data message, it'll say "hi, my stream ID is 1," and then the stream tracker, which is here, is basically just a glorified dictionary: here's the dictionary, and it maps the IDs to the channels. It's pretty simple, really.
B
What
them
says
if
we
just
sort
of
we
assign
each
stream
an
ID
as
it
comes
in
and
the
IDS
are
used
to
route
things
to
the
right
place.
So
the
kind
of
the
routing
that
you
control
is
that
you
can
call
a
met
you.
You
can
call
different
methods
with
different
streams
like
if
you
want
to
have
a
stream
for
a
certain
kind
of
event.
Then
you
like
it.
Let's
say
you
want
to
have
a
stream.
B
Let's
say
you're
writing
like
an
IOT
device,
and
you
want
to
have
a
stream
of
temperature
readings
and
a
stream
of
pressure
readings
right.
You
could
have
a
single
method
called
you
know,
stream
input
and
it
can
take
two
parameters.
It
could
take
a
string
which
is
the
type
of
reading
you
want
you're
sending
and
then
the
channel,
which
is
the
actual
data.
B
So
you
could
call
it
once
with
temperature
and
then
a
stream
or
you
could
call
it
another
time
with
pressure
and
then
a
stream
and
we'll
keep
track
of
which
stream
was
passed,
to
which
method.
And
so
in
your
method,
on
the
server
you'll,
be
able
to
look
at
the
parameters
and
so
see
what
which
data
is
coming
in
from
which
invocation
and
there's
a
deeper
connection.
Yeah.
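The IoT idea above, sketched as a hub method; the hub name, method name, and reading type are illustrative assumptions, not code from the show:

```csharp
using System;
using System.Threading.Channels;
using System.Threading.Tasks;
using Microsoft.AspNetCore.SignalR;

public class TelemetryHub : Hub
{
    // One method handles every reading type: the string parameter says
    // which stream this is ("temperature", "pressure", ...), and the
    // channel carries the actual readings for that invocation.
    public async Task StreamInput(string readingType, ChannelReader<double> readings)
    {
        while (await readings.WaitToReadAsync())
            while (readings.TryRead(out var value))
                Console.WriteLine($"{readingType}: {value}");
    }
}
```

Because each invocation carries its own stream, calling StreamInput twice (once per reading type) gives the server two independent channels it can route however it likes.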
B
There's a question about reconnection: how does that work? So, ASP.NET Core SignalR does not support automatic reconnection. One thing ASP.NET (non-Core) SignalR would do is, it could reconnect, and when it reconnected, it looked like the same connection to the server. ASP.NET Core SignalR doesn't do that: when you reconnect, you're a completely brand-new connection. So if you have a stream going and then you get disconnected, the stream ends. However, this is something you can kind of build.
A
Cool. I don't see any other questions, and I don't have any offhand. Is there more? We talked about how it's going to be merged soon, so people can watch the SignalR repo and the 3.0 branch if they want to start playing with this, yeah. And then, as far as gathering feedback, should they just open an issue, or...
A
Got one more question. Earlier on there was a question about video, and I will confess that over time, whenever we've talked about it, like, you know, there's the standup, and, right, why don't we just hook something up to send some video frames over SignalR, how hard could it be? But I know video is difficult on the web. What are your thoughts? I see Dylan laughing every time video is mentioned, okay.
B
You know that nothing ever goes wrong when David decides to write something random... as someone who's had to clean a few of those up. But I guess I would say it doesn't feel like SignalR can add a lot that other protocols can't do. I mean, over a WebSocket it would be pretty quick; I don't know if I'd want to send video over, like, long polling. That would get pretty high latency and pretty laggy. But, you know...
D
I'd say, from a technical perspective, working on something with so much architecture and so many layers... because the stuff I've worked on before has all been, like, only maybe two or three layers, and you're like, all right, I can do stuff in here and move stuff around, and then you go into SignalR and there are, like, seven or eight layers.