From YouTube: Weekly Sync 2022-02-18 Part 1
Description
Meeting Minutes: https://docs.google.com/document/d/1vKYEPtqKiwsFwhVKPmPub5ebMqN9HteBcbdFAuTXalM/edit#heading=h.bucu7l213uqd
Which is: we're gonna look at the innersource use case, and how do I make a new session.
So we're trying to think about, you know, how do we create automated tooling around the governance structures of, you know, a multi-project organization. Of course, this maps very nicely onto our second party plugin use case, which is where we have the main package, dffml, and then, you know, all the other plugins live in their own repos, and that's the second party stuff that we've been talking about forever. So this would be: okay, you've put them in their own repos; how do you make sure that they're doing all the right stuff?
How do you make sure that they didn't stop doing linting, right? Because we still want them to do linting. Well, how do you make sure of that? How do you go verify, you know, maybe as we get further into this, that they are running the tests the way they're supposed to be running them? Because that's one of the things that we'll split out with the second party stuff. This is what we're working on right now: it's basically the auditing.
After we transition to the second party setup. You might say, well, why are you doing that first? And I tell you: because there's overlap with something else, all right? Okay, so let's do it... nope, we're not here, are we? Okay, wait, now... see, operations?
I think I was trying to... where were we here? Oh, and all that, that's the remote ssh based remote execution; that was just random notes. All right, here we are.
So, okay, I created the package, I think. Yeah, is that all I did? I just created the package. All right, let's just clean this up real quick; it's a dangerous thing to do, that deletes all the untracked files. All right, so here's our package, and we made the package by...
So we made the package by following the development service create command, right. So we wanted operations. So if you do help on this, it'll tell you which kinds; it's every kind of plugin, right. We should have a skeleton project to facilitate the building of a new plugin for every type of plugin we have, right. So if we have sources, we need a source skeleton, and all those exist under skel, and that is documented, I believe; there's a "working on skel" section.
Okay, basically out of the notes... working on skel, okay. So this is documented here, and it talks about how there are some symlinks that are needed, which, I don't know how that will work on Windows. But this is just for developing, not for usage; you don't have to make the symlinks. All right, so I generated the package... reload to allow offline, go away... so I created the... edit the...
It's gonna try to open Teams on me. Don't do it, don't do it, Windows. All right, fantastic, okay.
All right, so we are talking about checking. We want to ensure that CI/CD practices, and other things, maybe documentation and test coverage, are being kept up to org standard levels.
And so the reason... so this will tie into our tier model, with the tiered support levels. And just as a recap, the tiered support levels... so this is it right here, the support model, which can be found in the ADR for the second party plugins, which is...
Possible support levels, okay; main package, all right. So support level zero is the main package.
We're not gonna break that; if we break that, we lose... it's a program right now, but, you know, not the release version. So, okay, support level one: second party, required to pass release, okay. So this is... it's been a long time since I looked at this. So "second party required to pass": this would be all the plugins which we consider to be part of the core set, to make the project functional enough for end users; like, sort of the out-of-the-box set.
So if you install all these, these are the ones where we're going to guarantee the documentation works. So we don't ship unless the documentation works; that would be support level one. All right, support level two is basically everything else, you know, that exists, probably within the main documentation website.
We talked about, and I think we have documented here, and we're not going to go into, the various ways of documenting the matrices of what version combinations different things work with. So, for example, if you were to go to... let me pick a plugin that involves a tutorial that involves multiple plugins, right. So I think this one...
Not those... using NLP operations, there we go, all right. Well, whatever, all right. This one only has two, but it'll suffice. So model tensorflow is probably one of our tier one support level plugins, right. The NLP operations, you know, in a perfect world they would be part of our support level one plugins, but the only person who really knows a lot about NLP is Himanshu, and so we need to get some more people who know about NLP.
Well, oh, we have Sahil, who is also doing NLP. So we need to beef up the number of maintainers we have in this area, because, yeah, that kind of is what makes these metrics drop: when we don't have enough people maintaining things. So in this case, we would have something that, you know... this would be a support level...
Two, perhaps, in this instance: the NLP operations, right. And if, for whatever reason, they stop working with model tensorflow one day, and we need to do a release that day, then we're going to just do the release and note on this page that the last known versions of model tensorflow and dffml that dffml operations nlp worked with were this set, and then these commands will be == that set, so that we can ensure the tutorials always work, even though we document that they don't work on the latest version. And so that gets more complicated.
But this is just the general idea. And then third party is just, you know, everybody's plugins. So we want to make sure, as a part of making sure that these... so this is "the tests are passing", right. Now, the stuff we're doing today is: let's track things that are going to help us make sure that we have tests and the tests are passing, right. So this is, like, you know, are we running the tests in CI, right? So, even before running the tests...
Have we, or do we have, something that is set up to run the tests at all? All right, so let's check this out. I think what we're doing here... yeah, so I was just parsing the workflow, all right. So the approach that we're going to take with this one is test operations.
Okay, all right, so I went off in a little bit of a direction here, so we're gonna change this up a little bit.
Test one, all right. And this is just watching for file changes; that's what nodemon is, it's a JavaScript utility, nothing else. It doesn't use inotify, right, for some reason. And then we basically say we're going to clear the screen and rerun the test every time; you guys have seen this, most of you.
All right, okay, perfect. So this is loading; you're seeing an assertion error: 42. So we said, you know, check that when we run this context, right. So this says "analyze dffml": when we run the dffml context, the answer should be 42. Obviously that doesn't make any sense; we're in the middle of writing the test right here.
So, all right, okay. So, and I started out with this parse github workflow, but then I was given feedback that it doesn't really make any sense to go straight into trying to analyze the deep internals of the workflow...
...until, you know, until you just check for the presence of it or not, right. So start small. You know, make sure that we know that it does exist, right, and then we can sort of grade it upwards based on the presence; the presence is the first tick upwards, and then, you know, the contents: what are we testing, what are we doing?
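The "start small, then grade upwards" idea can be sketched as a tiny scoring function. The function name and point values here are made up for illustration; this is not DFFML API.

```python
from pathlib import Path


def score_ci_health(repo_root: str) -> int:
    """Grade a repo's CI setup: presence first, contents later.

    Hypothetical scoring: 0 = no workflow directory at all,
    1 = the directory is present, 2 = at least one workflow
    file in it has contents worth analyzing.
    """
    workflows = Path(repo_root) / ".github" / "workflows"
    if not workflows.is_dir():
        return 0
    # First tick upwards: the directory is present at all.
    score = 1
    # Next tick: some workflow file actually has contents.
    if any(p.stat().st_size > 0 for p in workflows.iterdir() if p.is_file()):
        score = 2
    return score
```

Deeper checks (what the workflows test, whether they run on pull requests) would become further ticks on the same scale.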
That would be the next thing. All right, so we're just gonna feed it a directory, probably, yeah. I think we're gonna feed it... we should probably go look at the...
Let me just make the inputs here. So we're going to look at the git... so, surprise, surprise.
Feature operations, right. So, surprise, surprise, we're going to use the git repo operations, right, because this is probably all going to come from git repos. So let's grab one... git repo... release... git repository. I think we just do a git repository; we don't need a git repository checked out. I mean, technically we could do this on a directory, right.
Blah blah blah, all right. We want... what is it... yeah, repo.directory? Okay, all right. So now we're just basically gonna say, you know, presence: github workflow present.
All right, we look for the workflows directory; if it's there, we're good. All right... github workflow present... oh my god.
Operations, import, okay, repository. I turned on this autocomplete stuff recently.
All right, we're just going to assume... so we're assuming that, since git can only track directories with files in them: if there's any files in the github workflows directory... then, if the directory exists at all, then there's files in it; if there's files in it, then they should be workflow files. So, all right, so contents, all right, okay. So here's what we're going to do: we're going to just take this root directory... where's this guy... inputs.
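The assumption above (git can't track empty directories, so if the workflows directory exists in a checked-out repo it contains files, and those should be workflow files) reduces the presence check to a single `pathlib` test. A plain-Python sketch, not the operation as written in the repo:

```python
from pathlib import Path


def github_workflow_present(directory: str) -> bool:
    """True if the repo checked out at `directory` has workflows.

    Git only tracks directories that contain files, so the
    .github/workflows directory existing at all implies there
    are workflow files inside it.
    """
    return (Path(directory) / ".github" / "workflows").is_dir()
```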
Was this dffml... right... basically read the file, right. This is the parse... the parse action just took the raw... well, it took the string representation of the file, right. So, it being pathlib, we can... that's a nice one-liner. So yeah, so we just read the file, and, you know, we have this testing YAML. So that's what we did. So now we're just going to create this github... we're going to create this git repository.
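Reading a file into its string representation really is a one-liner with `pathlib`; the helper name here is just for illustration:

```python
from pathlib import Path


def read_file_contents(path: str) -> str:
    # pathlib makes "read this file as a string" a one-liner.
    return Path(path).read_text()
```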
Oh, it's in my vim history. So we're gonna create this git repository type; we're gonna instantiate an instance of it as an input. Read definitions... how are we doing on time? Okay, we're doing good.
I really have no idea why we didn't just do that; that doesn't make any sense. There must have been some... it might have been for the importing and exporting. Man, this thing is slow today. Okay, so this is the git repo spec object; it takes a directory and a URL, all right. Well, we don't have a URL, so that's fine; since it'll default to None, we're just going to instantiate the spec. So basically, we grab...
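A spec like the one described, which takes a directory and a URL where the URL can default to None, might look like the following. The class name and fields mirror the transcript, but this is a sketch, not DFFML's actual spec type.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class GitRepoSpec:
    """Points at a checked-out repo; url stays None for local-only dirs."""
    directory: str
    url: Optional[str] = None


# We don't have a URL, so it just defaults to None.
spec = GitRepoSpec(directory="/tmp/some-repo")
```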
You know, we do an import star, which is, you know, obviously not the best behavior, but this is a test and we're just writing it in as few lines as possible.
So this function, opimp_in: you can pass it something, and it'll just go through it and look to see if it has any operation implementations in it. And, of course, anything you decorate with the op decorator is going to be an operation implementation, because it'll create the operation object with the arguments that you pass into op, and then it'll create the implementation class and implementation context class out of the function that it's wrapping. Okay, so then, you know, when we create that op...
Let me just show that. So when we do that op, right, we set the inputs, and here's the inputs: contents. And so this op actually becomes a property on this; this also gets a property called .imp, and .imp is an operation implementation instance, and you can read about that in the conceptual docs. So all of this is covered.
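The mechanics described, a decorator that builds an operation object from its arguments and attaches it (and an implementation) as properties on the wrapped function, can be sketched in plain Python. This illustrates the pattern only; it is not DFFML's real `op` decorator, and the `imp` stand-in here is just the function itself rather than a generated implementation class.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict


@dataclass
class Operation:
    """Metadata describing an operation: its name, inputs, outputs."""
    name: str
    inputs: Dict[str, str] = field(default_factory=dict)
    outputs: Dict[str, str] = field(default_factory=dict)


def op(**kwargs):
    """Build an Operation from the decorator arguments and attach it
    to the function, so discovery tooling can find implementations."""
    def wrap(func: Callable) -> Callable:
        func.op = Operation(name=func.__name__, **kwargs)
        func.imp = func  # stand-in for the generated implementation class
        return func
    return wrap


@op(inputs={"contents": "str"}, outputs={"present": "bool"})
def github_workflow_present(contents):
    # The wrapped function body is the operation's implementation.
    return {"present": bool(contents)}
```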
This is where... so I'm working on a series of tutorials about async and events, like basically event loops, and an event-based approach flowing into a dataflow-based approach, to explain, you know, why we're doing what we're doing, I guess. I think that can get lost very easily.
This is a very good page to read if you want to know how the execution of a dataflow works. Now, this is one particular way we could execute a dataflow, but it shouldn't be the only way. So we need to figure out how to get that interface going. I think right now it's just implementing a new orchestrator, but I'm not sure if there should be something separate.
And I'm not sure... we need to look into that, because we need to be able to support the use case. I'm thinking it was more of, like, a linear flow, right. So basically you just put a bunch of operations in there and one goes to the next, but I'm not sure if that's even different; I don't know, it's a use case...
...that's come up that I haven't had a chance to look at. But the point being: think about how you would do this differently, right, and let me know, because there have got to be different ways to do this in different cases beyond this. This is based on making sure every permutation of every function's inputs gets called, and I'm not sure where that breaks down. We need to find out where that breaks down, because it must not work for all use cases, right.
So I'm pretty sure it breaks down somewhere, but we need to figure that out. Okay, so that is the... and let me put it in the notes. So this is the explanation of... where is the operation implementation... over here: operation implementation network, and then the operation, which is the operation implementation context, which would all be the .imp. So this is .imp, and then this is the context that .imp creates when you do the double context entry.
Okay, let me put that in the notes.
Come on... okay, whatever, should be saved, right; Google Drive. All right, so now we... so then the spec. So this is the definition, right, and here's the definition, and we looked at the spec object earlier, so we're instantiating that named tuple, all right. So inputs: contents, all right. Well, we renamed that, so I'd better not have an input named contents there. We...
All right, great, we can do that. That's interesting; I'm not sure if that's... that's not pretty. See, if they were all just regular definitions it would be fine, but I don't know if that's gonna work; we should try it, all right. Hey, look, I'm seeing true at the end of that thing, so it looks like we do have it.
All right, look at that: so we determined that the workflow was present. Okay, so just for shits and giggles, now we're gonna point it at arbitrary repos. Did it try to print? Do you see this? Look at this.
So this is... this is just great. No... all right, okay. So let's just recap.
All right, so shouldi... shouldi is a meta static analysis tool that we built as a demo. Basically it does something very similar to this: it goes in, looks for the presence of different files, and then runs things, you know, based on the presence of those files. So this is very, very similar to what we're going to do here, and the goal of all of this is to basically be able to throw it all together, right, at a moment's notice, and so that's what we're gonna do.
So we did a demo here: we make this tool, we make a bunch of operations.
And so this shows you the flow. So basically we get a Python package: we go to PyPI, we grab the metadata about it, we grab the download from the metadata, we download it, we run this bandit utility; once we have the version here from the package metadata, we run this safety check utility. That's the dataflow, right, and that's all these operations that we're implementing here. So then you'll remember the automating classification example.
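The first steps of the shouldi flow just described (fetch PyPI metadata, pull out the version and the source download URL) can be sketched against the PyPI JSON API. The extraction is split into a pure function so it works on any already-fetched metadata dict; names here are illustrative, not shouldi's actual operations.

```python
import json
from urllib.request import urlopen


def fetch_pypi_metadata(package: str) -> dict:
    """Grab the JSON metadata for a package from PyPI."""
    with urlopen(f"https://pypi.org/pypi/{package}/json") as resp:
        return json.load(resp)


def version_and_sdist_url(metadata: dict):
    """Pull the version and source-download URL out of PyPI metadata.

    The version feeds a tool like safety; the sdist URL is what gets
    downloaded so a tool like bandit can scan the source.
    """
    version = metadata["info"]["version"]
    url = next(
        u["url"] for u in metadata["urls"] if u["packagetype"] == "sdist"
    )
    return version, url
```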
We basically... we go in, we clone git repos, and what do we do? Well, this is what we do: we count the authors; we determine the diversity of authorship, that's the work metric, and we have this dataflow. Now, in this tutorial, we combine them, and so we look: here's the git repository checked out operation, here's lines of code by language, lines of code to comments, which I believe we use in that one.
...which we just looked at, right. And so now we're going to go and we're going to take... first of all, we deploy this. What do we do here? Yeah, we create the... what is that, the overlay? Yeah, the override... override... overlay? I need to work on that wording. So look, now you can see, you know... this is the pypi devs... that shouldi utility, and then this is, you know, part of that dataflow that we just saw, right. So now we're just gonna...
Stick them together; we're not going to write any more code, right. So we wrote all these functions; we're not going to write any code, we're just going to stick them together now, right. So now we're going to do the same thing here, right. So pypi package contents: that is a directory, and so this is what I was talking about. I didn't want to do this; I didn't want to go create this mapping, because it's just an extra step, right.
So that's why I used the repository: because these guys accept a repository with the directory property. So you have to create a mapping object, which then gets converted... so if you do this create mapping... "create a mapping" is like a more generic, programming-language-agnostic term...
...for a dictionary, right. So if you do this, you'll create a dictionary, and its directory property will be the directory where the package contents of the PyPI package were downloaded. Then you set the key to be directory, and now, all of a sudden, you have this dict object with directory and then the path to, you know, what is effectively your downloaded git repo. But there's no .git directory, which is why it's not a real repository object; we're just faking the creation of it, and then we can pass it to anything that takes a repository, right.
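Faking the repository this way is just building a dictionary whose `directory` key points at the downloaded package contents. A sketch, with an illustrative path and helper name:

```python
def fake_repo_from_directory(directory: str) -> dict:
    """Build a mapping that quacks like a repo object.

    It has a `directory` key, but no .git directory backs it, so it's
    a fake: anything that only needs repository.directory can consume
    it, even though the contents came from a PyPI download.
    """
    return {"directory": directory}


repo = fake_repo_from_directory("/tmp/downloads/example-package-1.2.3")
```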
So that's it: there's like a little transform that happens there; it transforms one data type into another data type. Okay, so, right, we're going to download a bunch of git repos. So what do we do here when we do this? Okay, so we need to create the dataflow that does the download, and then we need to create the dataflow... so I'm going to open the tutorials and we're just going to do this. So what does the tutorial tell us?
Just want to make sure we have all the operations; it should just be git clone... clone git repo. Okay, oh, does it need "check if git repository valid"? Okay, so we're gonna clone the git repo.
Okay, test operations, test run. Okay, I think we can say that this is good, so I'm just going to fix up this test here.
Okay, so that just grabs all the outputs. Let's see, and here's the dataflow, okay. So, all right, so we're going to move the dataflow to the global namespace in this file, and we are going to...
And then it should be true, right, so let's try this again. All right, did we put that there? It's going to want a dataflow.
All right, so this... we're just building, you know, what the output dictionary should be. So it's now 42, right, so that before we continue we're not in a weird spot. Okay, so let's commit that, all right. So this is "operations: innersource", and how are we doing on time? All right, we've got eight minutes. All right, we're gonna, we're gonna do this!
We're gonna run it against some git repos. So, "operations: innersource", and this would be, you know, "fix tests", or "switch to downloading", or "switch to checking for presence of workflows dir".
Oh yeah, that's right, we're messing with that! Oh yes, that's the remote and local execution; I've had a breakthrough on that one. Okay, so let's go write this in the tests. We can just look... no, we don't need to look at that right now. All right, so test operations, and we'll call this, you know, maybe...
I'll just say test dataflow for now... or test on real repos. All right, great, so we're going to test it on some real repos, all right. So, you know, I'm realizing we should have a basic test function. Okay, wow, that would be really, really good; how would we not have this yet? Okay, we're going to write that. Wow, really, real boneheaded move. Okay, so if not...
Can you do that? Yeah, you can.
All right, I don't like to use assert. Basically, if Python byte-compiles the files with optimizations and you run from the byte-compiled ones, then assert statements go away.
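That's standard Python behavior: running with `python -O` sets `__debug__` to False and strips `assert` statements from the compiled code, so checks that must always run should raise explicitly. A minimal sketch of the explicit style:

```python
def check_output(actual, expected):
    """Raise even under python -O, where bare asserts are stripped."""
    if actual != expected:
        raise AssertionError(f"{actual!r} != {expected!r}")
```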
Oh my god, this is really going to be... how nice this is. Look at this: we're just going to add this async test case. This is going to be fantastic. No, that's a dumb idea, that's a dumb idea, let's not do that! It should be standalone... should it be standalone?
All right, so I think we do have this somewhere. So we're just going to make this a tuple, right, and so you're going to put your inputs for that context, and then your outputs for that context. And this should have been done a long time ago. So where's the inputs... man, look at this, yeah. We definitely have this somewhere; why we didn't think to put it in a function is beyond me.
Yeah, I like to do that; I like to name the variable instead of just using the underscore. I think it's better; I think it's more clear, just so you don't forget what it is. If you're analyzing the code later and you're like, well, why is this not working... well, I'm not using the thing that, you know, I obviously should have been using, right.
I'm not checking that value; it's more obvious that way. All right, so we're going to go through and we're going to run every context.
So, basically, .items() is going to give us the key and then the value as a tuple, and we're expanding the inner tuple, which is the list of inputs and the outputs. Man, why did we not have this? This would have made life so much easier. All right, and then, yeah, all right, I think we're good to go. And then we go through and we do our assertion error, you know, if it doesn't match what it should be.
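The loop just described, where each context maps to an (inputs, expected outputs) tuple unpacked straight out of `.items()`, looks roughly like this. The data shape follows the transcript; the function and variable names are illustrative.

```python
def check_contexts(results: dict, cases: dict) -> None:
    """cases maps context name -> (inputs, expected_outputs).

    .items() yields (key, value) tuples; we expand the inner tuple
    into the inputs and the expected outputs in the for statement.
    """
    for ctx, (inputs, expected) in cases.items():
        actual = results[ctx]
        if actual != expected:
            raise AssertionError(
                f"{ctx}: with inputs {inputs!r}, got {actual!r}, "
                f"expected {expected!r}"
            )


cases = {"analyze dffml": (["<workflow contents>"], {"answer": 42})}
check_contexts({"analyze dffml": {"answer": 42}}, cases)
```

Naming the inputs instead of discarding them with `_` keeps the mismatch message useful when a context fails.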
No, we're just going to add it in the async test case; let's just put it there. I don't know why I'm hesitating putting it in here. I don't really like it... I don't like that, I don't like that. No, we can't do that, because... no, we can't do that. Okay, all right, fine, let's just put it in there, all right.
All right, yeah, and this is why... so I think some very saintly person went and recombined the async test case and the integration CLI test case, and now we have just this beautiful, beautiful class that really handles a lot of things for us.
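The combined async test case pattern is close to what the standard library provides with `unittest.IsolatedAsyncioTestCase` (Python 3.8+). A minimal sketch of adding one more async test method; the dataflow run here is a stand-in coroutine, not the project's real helper:

```python
import asyncio
import unittest


class DataFlowTests(unittest.IsolatedAsyncioTestCase):
    """Async test case: each test method can await coroutines directly."""

    async def test_dataflow_output(self):
        # Stand-in for running a dataflow and checking its outputs.
        async def run_dataflow():
            await asyncio.sleep(0)
            return {"answer": 42}

        results = await run_dataflow()
        self.assertEqual(results["answer"], 42)
```

Because the base class already owns the event loop setup and teardown, adding one more method is all it takes.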
So I can't remember exactly who did that, but thank you very much; you've made everyone's life so much easier. So we're just going to add it here, because we already have all those methods on it, so we can just add one more. And because we're already using things like the make dynamic temp file... make temp dir... yeah, anyways, we'll just do this. Okay, so let's "run dataflow, check results", okay... or what about the run...
Oh yeah? How are we going to do that? Okay, where is that... oh yeah, it's in high level.
Cool's going to kick me off my one-person meeting, all right. Okay, we'll come back to this.