So what happened here? Oh yeah, I think I realized something when I looked at this required field. So this was not here; this is what we had before. I looked at it and thought it didn't look quite right, and I realized: okay, it's complaining. So what's the error message? The error message is... okay, well, now it's different.
"Data flow default missing type." So why is that? Field modifications, get field name.
Okay: missing... add missing else to update non-dict and non-list values. Okay, so basically, what happened here? It looks like, when we merge, if the key already exists we detect whether it's a dictionary, and if so we recursively merge the dictionaries. If it's a list, then we append to the list. Otherwise... we forgot to actually just update the key. So, oops. But now we do that.
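The merge behavior described above, including the previously-missing else branch, can be sketched roughly like this (a minimal illustration; `deep_merge` here is a hypothetical stand-in, not DFFML's actual helper):

```python
def deep_merge(dest: dict, src: dict) -> dict:
    """Recursively merge src into dest in place.

    - dict values are merged recursively
    - list values are appended to
    - anything else overwrites the existing key (the
      previously-forgotten else branch from the bug above)
    """
    for key, value in src.items():
        if key in dest and isinstance(dest[key], dict) and isinstance(value, dict):
            deep_merge(dest[key], value)
        elif key in dest and isinstance(dest[key], list) and isinstance(value, list):
            dest[key].extend(value)
        else:
            # The forgotten case: plain (non-dict, non-list) values
            # must still be updated
            dest[key] = value
    return dest
```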
And what do we do here? That's right, oh yeah, we did that testing stuff. Oh great, we finally did that; this will help with that.
All right, so we refactored and we changed behavior at the same time, which is not allowed. So what was the change? This was the change, so let's commit just the refactor first. And you know, you can see it's not actually running.
Okay, so: test... strings... refactor.
Cat that: graph TD. Yeah, it's like, okay, there's nothing here.
All right, so what are we doing? So we're saying: dump that, remap to the response data flow, tee it to this... write the output via tee to the dffml main package overlay.yaml, and then pipe that through dataflow diagram, taking the input from standard in. File not found. Hey, hey, how's it going?
Okay, okay, oh yeah, great, you can hear me. Okay, so I'll just keep doing what I was doing. All right. So basically, what we're doing here is: we're exporting this overlay, we're dumping it through, and we had a problem with the... So basically, I introduced a bug with the diagram command.
Well, I don't know if I would argue that I introduced it, quite honestly. Yeah, let's see what this looks like. Okay, good: single. Okay, this looks good: data flow to be overlaid.
Okay, so we added some more auto definition. So basically, we did some auto definition where we're beginning to unify the typing stuff with Python's type system. So basically you can take, you know, a new type from typing.NewType, and you can turn these into definitions.
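The idea of deriving a definition from a `typing.NewType` might look roughly like this (an illustrative sketch only; `new_type_to_definition` and the `AlpineVersion` example are assumptions, and DFFML's real converter may differ):

```python
from typing import NewType

# A NewType carries a name and a base type, which is enough
# metadata to derive a dataflow definition from it.
AlpineVersion = NewType("AlpineVersion", str)

def new_type_to_definition(new_type) -> dict:
    """Hypothetical converter from a typing.NewType to a
    definition-like dict (name plus primitive type)."""
    return {
        "name": new_type.__name__,
        "primitive": new_type.__supertype__.__name__,
    }
```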
Okay, so see: get single, spec, spec, get single, alpine, orange. Okay, so this is going to go through and... okay. So this basically grabs the data flow, and this is the data flow to be overlaid.
So this converts it into the thing. Okay. So then, any data flow here from src, from seed, will get merged in, and then this merged one is the one that... We need a way to link this across the output stage, to say that, like, hey, this thing will go get this... well, maybe... wait a minute!
Okay, I just realized something really ridiculous. These output operations didn't need to be like this at all. Okay, that's really funny! All right! This didn't even have to be like this.
Oh wow, okay, so this can actually just take directly from the data flow. Wow. That makes things a lot easier. Wow, duh! Okay, let's see what this looks like, then.
So this, in hindsight, seems extremely obvious; for the past three years I have not been doing this. Okay, wow.
Well, all right, then. Okay, so basically, you know, we had these output operations, but there was really no... there was really, well. This simplifies a lot. This makes a lot of things a lot easier.
So we were talking about how we're going to make these data flows... so basically, we're going to support a data flow as a class, and then we're going to call different output operations based on, you know, what we want out of it. So basically, we send different inputs to the data flow to run it, like when the class is instantiated with the double context entry.
The data flow starts running, right? Every time we call a method, we add specific inputs to it under a given context, and then we overlay specific outputs... then we call the specific output flow, which will then grab, you know, the resultant data. And so, wow, I didn't realize how easy this was. Okay, well, that's fantastic! So once the processing stage is complete, then we move to the output stage.
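The lifecycle described above — enter the context to start the flow, call methods to feed inputs under their own contexts, collect results at the output stage — can be sketched like this (all names here are hypothetical, not DFFML's actual API):

```python
import asyncio

class DataFlowClass:
    """Hypothetical sketch: entering the (double) context is where
    the data flow starts running; each method call adds specific
    inputs under a given context; results are gathered once the
    processing stage completes."""

    def __init__(self):
        self.contexts = {}
        self.running = False

    async def __aenter__(self):
        # Entering the context starts the data flow running
        self.running = True
        return self

    async def __aexit__(self, *args):
        # Leaving the context moves us past the processing stage
        self.running = False

    async def method(self, ctx: str, value):
        # Each method call adds specific inputs under its context
        self.contexts.setdefault(ctx, []).append(value)
        # A real flow would schedule operations here; we just
        # return the accumulated inputs for this context
        return self.contexts[ctx]

async def main():
    async with DataFlowClass() as flow:
        return await flow.method("ctx0", 42)

results = asyncio.run(main())
```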
So this dest is an object in memory, and if we were to look at the definition for it in this dffml main package overlay yaml, we would see that, yeah, the primitive is object. And so what do we know about things whose primitive is object? It basically means that we're not sure if we can serialize them across the network. That's really all it means right now, so we'll get more details on that as we flesh it out. So, wow.
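One way to think about "primitive: object means we're not sure it serializes across the network" is a simple JSON-serializability check (purely illustrative; this helper is an assumption, not how DFFML decides):

```python
import json

def network_serializable(value) -> bool:
    """Return True if a value can be JSON-serialized for transport
    across the network. Values whose definition has primitive
    "object" are exactly the ones that may fail this check."""
    try:
        json.dumps(value)
        return True
    except TypeError:
        return False
```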
I feel very dumb for not realizing this for years. I don't know how many of these I've written. Okay, wow. All right, then.
Okay, so what is this service dev export? And what are we even doing? Oh, we're running alice version. That's right! Okay, boom! All right! So now, what's happening here? So when we run the alice... oh, I think I have to fix my... let's see, yeah, I think I have to fix the overlay. Boop.
Oh, okay... or not. Okay, so.
Okay, all right! Well, that's interesting. Okay, so we can basically just say... okay, so what are we doing here? So basically, this is the default overlay, applied when we run any data flow, and our example data flow — vim, entities, alice — is going to be the alice cli.
You know, what version of alice are we on? And we talked about how, eventually, we're going to unify the command line stuff and the operations, and it's going to be the same thing, and under the hood it's going to run a data flow. And we talked about the top-level system context: basically, the top-level system context is this first data flow that we execute. And then, within that, we have this calling convention, which is, you know, the system context, which is the...
"What do you want me to do?" is the data flow — or the open architecture, which will then maybe have a data flow in it — and then the inputs, and then the orchestration. So: how do you want it done? Where do you want it done? So yeah, "where do you want it done?" is the orchestration.
"How do you want it done?" is the data flow, and "what do you want me to do it with?" is the inputs. Okay? So we merged; now we're running run dispatch.
Okay, so this is going to do the same thing right now, because the typing is very simplistic.
And we'll just call it dataflow type. So: TODO, fix — or TODO, resolve or unify this.
Can we wrap it there? I thought it auto-wrapped it? Oh, no, it doesn't auto-wrap it if you pass it... So if you pass it here, to merge, it'll auto-wrap it, but it doesn't auto-wrap it if you pass it via the other one. Okay, because this is an opimp... and, let's see: new type to definition, not to mine.
Data flow merge: let's try this. All right, so the auto export of everything is complaining. Okay, where are we at? So: data flow from dict. Okay, so it went through and it merged it. Okay, great!
All right, so... it didn't like the lambda, interesting. So we have a data flow now, and it looks like it output a real data flow. That's fantastic. So this is the one with the overlays applied.
Are we still on default? We still are on default. So remember, now, the overlay process is twofold. First, if we're not provided with an overlay on run, then we're going to go in and load the default overlays as they are registered via the entry point system.
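Discovering default overlays through the entry point system could be sketched like this (an illustrative sketch using the standard library; the `dffml.overlay` group name comes from later in this session, and DFFML's own loader may differ):

```python
from importlib.metadata import entry_points

def load_default_overlays(group="dffml.overlay"):
    """Load everything registered under the given entry-point
    group, mirroring the default-overlay discovery described
    above: if no overlay is supplied on run, fall back to
    whatever installed packages have registered."""
    eps = entry_points()
    # Python 3.10+ exposes selectable entry points; older
    # versions return a dict keyed by group
    if hasattr(eps, "select"):
        selected = eps.select(group=group)
    else:
        selected = eps.get(group, [])
    return [ep.load() for ep in selected]
```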
So now we can say: add, you know, the data flow. So basically we have a blank data flow, which is to be overlaid on the default flow, so we're instantiating all of the default installed overlays. But what are we overlaying them on? We're going to combine them all together into one data flow, and then we're going to execute them.
All of these come from the entry points, right? So we're going to loop over all the entry points, and in this case we're providing this dataflow type, which is going to be passed to merge... which is going to be here, see: data flow. So we're going to provide everything from class.load, which means anything registered under dffml.overlay will be merged into this empty dataflow, and then we'll be returned this dataflow object.
So we merged it, and the thing is, we're merging it with itself. Remember, because this is the only one that's installed. So we manually grab it here, right? Or we grab... yeah, we hard-code grab it.
We use it because this thing will take any data flows that are supplied, merge them together, and return the merged one. So then we add all the data flows to merge together, which end up being the same data flow, because it got loaded, right? And so now we're like: okay, well, what is the result? This should be this, overlaid, right? So: outputs, data flows; to data flow from dict; outputs, data flow. I think this might be because we changed... okay, so data flow from dict outputs a data flow, so we're adding a blank data flow.
Now this data flow is just not used. Okay, so the data flow being overlaid is dataflow 2b overlaid... okay. So that's a bad name.
A
D
D
A
C
A
A
C
B
A
Okay, so it should be in... okay, sure. Okay, so — and this is where it's like, okay, I think maybe... maybe this relates to data flow as class methods, as if they were allowed list or arguments.
Okay, so there's a place to edit right now: default to the same context as the operation for the method was executed in. Okay. So when we call a method, we're going to create a new context, which will have a unique identifier, and it will be related to, basically... okay. So.
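The per-method-call context described above — a unique identifier, related back to the context the method was executed in — might look roughly like this (names are illustrative, not DFFML's actual API):

```python
import uuid

def new_method_context(method_name, parent=None):
    """Create a uniquely-identified context for a method call,
    optionally related back to the parent context the method
    was executed in."""
    return {
        "id": uuid.uuid4().hex,   # unique identifier per call
        "method": method_name,
        "parent": parent,         # link to the calling context
    }
```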
This thing is a... okay, yes. So it loads the data flow, and then it's not gonna... yeah. So: results, overlaid. And we'll just say: overlay.