Ready? All right, let's go. So the operations could have an issue: the meta issue body output result could not be determined. Okay, so these things are mad at us right now, and we had opened the df base file, which I'm going to close right now, and this one that we're looking at is...
Oh, I just made a new file. Let's name this... quote. All right, this is the recommended community standards overlay, alice operations github issue. There's no overlay; actually, that's not okay. So the path is entities alice alice please contribute recommended community standards, alice operations github issue. Okay, so what we did was: we made these things output operations, and we added this collect one in, and this collect one.
That's going to give us a list of other ones, and then... okay, so meta issue body? No. Okay, so that's not what it is.
And let's see what happens, right. So let's eliminate one variable: okay, so path... all right, 234. So not this file path, all right! Okay, so we have run into the dreaded unification-of-types problem again. There's yet another place where the types aren't unified, and this is exactly why I was probably so stressed out, because now I'm not anymore. Okay, so, all right. It's all fine! Okay, so: recommended community standard contribution NamedTuple. So what is this NamedTuple? And what...?
Third, I think the problem with this is that we don't have a serialization for it. Yes, so the spec on the export of the import object fails. So let's grep dffml df types, and let's check it out.
All right, so let's see: how do we make this? Because I'm not satisfied with just throwing. So what we could do, and why I haven't done it, is: I want this to become what it should be, and not to become the half thing that it is right now, because it's time for this stuff to change. So, all right. Okay, so I think it's a config dict, which is the stuff that we just did. Okay, traverse config set. Remember this patch? Actually, did we... maybe this is the one.
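A rough sketch of what a traverse-and-set helper over these nested config dicts could look like. This is an illustration of the idea only; the real traverse config set helper in DFFML may take different arguments, and the config shape here is made up.

```python
# Hypothetical traverse-and-set helper for nested config dicts.
# All arguments except the last are keys forming a path; the last
# is the value to set. Intermediate mappings are created as needed.


def traverse_config_set(config, *args):
    *path, last_key, value = args
    current = config
    for key in path:
        # Walk (or create) each nesting level along the path.
        current = current.setdefault(key, {})
    current[last_key] = value
    return config


config = {"plugin": {"label": "default"}}
traverse_config_set(config, "plugin", "timeout", 30)
# config["plugin"] now carries both "label" and "timeout"
```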
What we did was: we came in here and did a little refactoring to make the class name dynamic, because then we're going to take a list, and then we go in and we add one that's like, hey, you know, alice, should I contribute? Because I don't have time; I want to make sure everybody can do...
Should I contribute, and everybody can do please contribute, right. And hopefully, if we have docs on should I contribute and please contribute: that's a single action and a single response. And then, you know, one is safe mode and one is not safe mode. Because should I is more of, like, collect the data, and please contribute... so the please family of commands will be like, hey, please do something; the should I is, you know, me asking for your recommendation. And then we'll have other ones, you know, but we'll...
You know, with the overlays, the cli is just, like, one interface, right, so we'll plug and play stuff. Okay, so, all right, okay, so, yeah. So we added that, and then we basically said: okay, we formatted the file with black, because apparently it hadn't been formatted, and then we refactored out the object operation stuff. So basically, this function that was here within the entities alice alice cli: we pulled those object operation functions out, and we put them in dffml util df internal, and that was the move to one file. Okay. So then we came in here, and we just real quick hacked in the collector data flow within the InnerSource operations.
Right, so remember we did the tutorial series on the InnerSource stuff. It was back in, like, december or something, with the sap... where we build off the sap thing. Actually, I can show the docs now, because I realized the docs for main are working, right; so these docs are working, I realized. So where was that? Yeah?
Okay, all right: this is this image. Okay, well, I don't know if the images are working or not. Well, let's go to retrogame.
Working... why aren't they working, okay? No, so they didn't get rewritten. Why didn't they get rewritten? Oh... oh, because then it started erroring out. Yeah, oh god, that was dumb. Okay, okay, so, but anyways. So here's the software portal demo, and, you know, once again, this builds on this talk here. And where are we at? Here we're talking about... okay, yeah. So this is basically... it's...
It's an alternate: automating classification by using the sap InnerSource portal, right. And then we talked about how we're going to do some of this stuff in the build tree, or with our, you know, our mentors, who have now gone off to start their own work there. So, all right.
It really drove a lot of this living threat model stuff, and I've worked with him, and, you know, we're seeing where this stuff goes. And so, with alice, we're going to be the living threat model, right, and so that's what we're going to do with our should I contribute: we're going to feed the living threat modeling with our alice.
Please contribute: we're going to take action on the analysis that we do based off of the should I results, and we may trigger, you know, people to manually do those actions, or we may have alice do it in an automated fashion, right. And so the code will be... the living threat model will be alice. Okay, so it all wraps up into one thing; it all wraps up into the same thing.
Okay, so directly include alice. Okay, so this is basically just doing that same overlay load, which you're seeing here, which we already did for the recommended community standards. So this is our second top level command: alice should I contribute. So now we have those two things, and we have the ability to load basic overlays on them, which are really just the auto... whatever auto flow does: data flow, dffml df types, data flow auto, and then the auto flow keyword argument there.
Okay, so this auto flow here, right. So, okay, self.autoflow: so there's an autoflow operation, right. So this autoflow operation: what it does is it goes and looks at all those definitions, right, all the different operations, and what their inputs and their outputs are, and their conditionals, and it links them up into this internal structure called the flow dict, right.
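The linking step described here could be sketched roughly like this: match each operation's input definitions against other operations' output definitions, falling back to the seed for inputs nothing produces. This is an illustration of the idea, not the actual DFFML auto flow implementation; the operation names and dict shapes are made up.

```python
# Sketch: link operations into a flow dict by matching output
# definitions to input definitions.

operations = {
    "clone_repo": {"inputs": {"url": "URL"}, "outputs": {"repo": "Repo"}},
    "count_files": {"inputs": {"repo": "Repo"}, "outputs": {"n": "FileCount"}},
}


def auto_flow(operations):
    # For every definition, record which operations output it.
    producers = {}
    for name, op in operations.items():
        for defn in op["outputs"].values():
            producers.setdefault(defn, []).append(name)
    # Link each input either to its producers or to the seed (inputs
    # supplied at the start of the run).
    return {
        name: {
            inp: producers.get(defn, ["seed"])
            for inp, defn in op["inputs"].items()
        }
        for name, op in operations.items()
    }


flow = auto_flow(operations)
```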
So, "the dictionary," because all the mappings in python are called dictionaries. And so then, yeah, this prefix or suffix of the type name is another thing that I've been playing around with, when we think about refactoring and getting into automated refactoring, because you can really start to look at: if you start to name types based on a certain convention, you can then begin to pull out structures. And so we'll have alice...
A
Do
this
she'll
infer
based
on,
like
you,
know,
the
the
patterns
of
usage
of
different
variables,
which
ones
really
are
sort
of
about
the
same
data
and
then
name
them
appropriately
and
then
create
structures
out
of
them,
which
is
exactly
what
we've
just
done
right.
So
we'll
do
this
automated
refactoring
process
eventually,
so
so
it
links
them
up
into
this
flow
dict
and
it
throws.
What is that, seed conditions? Okay, that's the origin; okay, that's fine. And the origin concept is that whole, you know, "where does this thing come from?" Okay, so, basically, yeah, so it's included in the auto flow. So now we have the overlays there. We have overlays on both of our key entry points.
Should I and please contribute: should I contribute and please contribute. And then we went through and we fixed this one, and I think we just did this one together, and this is the "auto import: use variable names instead of generated names" one, okay, yeah. So this is the init py problem, okay, and then, yeah, okay. So we might have done this one together; yeah, we went in and we... or did we? Yeah. The problem here was basically that, yeah, the name was not the same.
And here, the thing was that we need this little helper for these config dictionary things, and that's what we're using right now, these config dictionary things, right. So the format for those is: let's go to the doc site. Okay, so what are we doing right now? So Definition.spec is... the type loading information is... so what are we doing, and why are we doing it this way? So, yaml: if you saw the yaml... by doing it this way, by separating, we separate the config structure from the initialization of the object, because the initialization of the object happens on enter, and therefore we can apply the overlays, and we can apply, you know, any sandboxing that needs to happen.
You get the same functionality, you get a cross-language thing, and you get it by doing this nested config structure, right, because then you can... and you get it by accessing... non-entry... okay, so Definition.spec, at least, let's hope, because or else this is going to be not fun. So that's what we're about to find out. Okay. Need to move to plugin slash config dict style serialization, leveraging traverse config set, recently changed. In any case... oh, was that why we did this? In case we...
Okay, no, this is fine. I mean, it's good, it's good! It's fine! What happens when we do dataclass? Okay, great, okay, here's the bug; so, yeah, because or else it basically just says, oh...
Okay, all right, let's try. I think it might work; we'll just see where it takes us. So...
NotImplementedError: it's a good thing; it's better than implementing it wrong. And it lets somebody else know: if you want this functionality, here's exactly where you insert it. So, okay, so NotImplementedError here.
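The pattern being praised here, failing loudly on unsupported types rather than serializing them wrong, might look like this in a hypothetical export helper (the function and its type coverage are made up for illustration):

```python
# Sketch: fail loudly at the exact spot where support for a new type
# would need to be inserted, rather than serializing it wrong.


def export_value(value):
    if isinstance(value, (int, float, str, bool)) or value is None:
        return value
    if isinstance(value, dict):
        return {key: export_value(item) for key, item in value.items()}
    if isinstance(value, (list, tuple)):
        return [export_value(item) for item in value]
    # Whoever hits this knows exactly where to add the new case.
    raise NotImplementedError(f"No serialization for {type(value)!r}")
```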
Okay, subspec is meant to denote when the top level is a list or a mapping itself... or, I think, yeah, is a list or a mapping itself. Yeah. So if the primitive is array, or the primitive is mapping, for a dictionary, that will be stored in the primitive value of the definition, and then subspec and that are used together to determine whether it, you know, needs to be converted from an array or from a mapping, with spec as the internal type. Okay. So, if... elif isinstance... okay, so this is NamedTuple.
All right, so I want to add this dffml df types file, git add, if staged. All right, so this is what we've got. So we modified this a little bit, and we basically just said, you know, we're going to raise an exception if we don't know what to do with this thing, right. So we added some input validation, rather than just blindly doing it there. So, git commit... or, actually, let's run the auto formatter on that.
Remember, it's all going to combine. So we're going to use... we're going to have the init function include, you know, like, the schema reference for the config of the object being created, and when we do that, we have the inner function. Okay, so we have the inner function. Okay, so...
Object, which is a manifest, so it should be... oh, the unified config. The unified... what was that? The unified theory of config, the unified theory of configuration, or the unified configuration methodology, or something. Saksham, you will probably remember this. You'll probably notice.
There's... when we did... oh man, where is that? Okay, well, it doesn't matter now, but basically what we said was... but this must be... this is, yeah... oh, this must be where that... yeah, this is where this came from. Okay, okay, so effectively, the hypothesis was: if you have three fields, you can...
So let me read my notes on the archetype real quick while we're at it, because I think we figured this out. Let me put it in the TODO, and then let's, let's do it. So...
Okay, so: archetype. Okay, so we'll cover the archetype, but okay. So the archetype is a construct for data... set, a system context, strategic principles, config, technology, okay; plugin was the other thing, and the other one... it was plugin config. Yeah, okay.
And this is where it's a DID as well, because then you would add the DID in that, and the DID doc. Okay, so the DID doc would be the upstream, and the DID would be the overlay, I guess, in that context.
Okay, it's like it's all right here, but it's so close! So close, okay, okay! So what are we running into? So let's deal with the problem at hand, and we'll move towards that, and I think we'll end up...
Okay, so what am I... because I'm thinking right now: I think this input stuff will lead us to the right decision with the data flow export, because I'm debating: every time you call export, do you really want to know what the data is like? Do you really want the nested structure?
In this case, context would be the parent object, right. So the parent object has the context (I have two now, I have two). The parent object has the context, and so the parent object should decide whether the child needs to be wrapped with schema and plugin name.
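That parent-decides rule could be sketched like this: the parent, holding the context, chooses whether the child's export gets an envelope carrying the schema reference and plugin name, or goes out bare. All names here are hypothetical, not the actual export API.

```python
# Sketch: the parent decides whether a child config export is wrapped
# in a schema/plugin-name envelope.


def export_child(child_config, *, wrap, schema=None, plugin_name=None):
    if not wrap:
        # The parent context already identifies the child; emit bare.
        return dict(child_config)
    return {
        "$schema": schema,
        "plugin": plugin_name,
        "config": dict(child_config),
    }


bare = export_child({"timeout": 30}, wrap=False)
wrapped = export_child(
    {"timeout": 30},
    wrap=True,
    schema="https://example.com/scanner.schema.json",
    plugin_name="scanner",
)
```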
Well, I would write down what I just said before, and then that will lead us to the next step. Okay, so, when...
The methodology is, right: enumerate everything you know; enumerate, of the things that you know, which ones are processes and which ones are assets, or processes which you could interconnect; enumerate all the possible interconnections, all the possible graphs, right; and then do a tuned brute force to find the best solution within the given time frame. This is the methodology.
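That methodology could be sketched as a toy brute force: enumerate candidate graphs over known nodes, prune by some stand-in for the strategic principles, and keep the best-scoring graph seen before a time budget runs out. The scoring and pruning rules below are placeholders, not the actual strategy.

```python
# Sketch: tuned brute force over candidate graphs with pruning and a
# time budget, keeping the best solution seen within the time frame.

import itertools
import time


def best_graph(nodes, score, prune, budget_seconds=1.0):
    deadline = time.monotonic() + budget_seconds
    best, best_score = None, float("-inf")
    # Every possible directed interconnection between nodes.
    edges = [(a, b) for a in nodes for b in nodes if a != b]
    for r in range(len(edges) + 1):
        for graph in itertools.combinations(edges, r):
            if time.monotonic() > deadline:
                return best  # best solution within the given time
            if prune(graph):  # "strategic principle" pruning
                continue
            s = score(graph)
            if s > best_score:
                best, best_score = graph, s
    return best


# Toy strategy: prefer more edges, prune any graph touching "c".
result = best_graph(
    ["a", "b", "c"],
    score=len,
    prune=lambda g: any("c" in edge for edge in g),
)
```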
Okay, and then let me write a little note here, and this is in the thread, but... right, everything you know... and which are processes, how, what... okay, so: using tuned brute force, pruning the trees based on strategic principles.
Okay, and these are the top level conscious... you know, we're also thinking of these as conscious or conceptual layers, or maybe even layers in a neural network, and these are the strategic plans, which, you know, are being... successfully... so, basically, anywhere. So where's our giant graph, right? So where's the giant graph?
Not helpful, okay, anyways. You see this, right, and you can sort of make out here how there are different layers, basically, as we move through, and then we're going to have the stages, right. And so, what we need to do is: we need to link up the stages that we just created, that we set to output.
What we really need to do is something that's more like that shim based approach, where we say "next phase," right, or "the next phase parser," right, and we really just mean: call this thing after everything in the last phase has completed, right. And so that will... because what's not shown in this diagram, I think, is output operations and cleanup operations, and those are the successive stages, right. So we need to have this rolling... okay, we need to have this rolling stage. This rolling set of stages. Okay, we're gonna have this rolling set of stages, and, yeah, and then it just rolls forever. Okay, yeah, you just keep rolling. You keep rolling. You keep rolling.
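The rolling set of stages could be sketched minimally like this: each stage runs only after everything in the previous phase has completed, then rolls back onto the end of the queue, forever (bounded here so the example terminates). The stage names are assumptions matching the discussion above.

```python
# Sketch: a rolling set of stages. Each stage starts only after the
# previous one returns, then re-joins the end of the queue.

from collections import deque


def rolling_stages(stages, rounds=2):
    ran = []
    queue = deque(stages)
    for _ in range(rounds * len(stages)):
        stage = queue.popleft()
        ran.append(stage)       # "run" the stage
        queue.append(stage)     # roll it back onto the end
    return ran


order = rolling_stages(["processing", "output", "cleanup"])
```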
You need to communicate with the input network and the operation network; you need to trace through the code in df memory. You see gather inputs and the dispatch operations; you need to see where the data flow is used, and you need to be able to re-communicate a new data flow, an update to the data flow, to all of the network components within the orchestrator, that the orchestrator uses. So the lock network and the redundancy checker and all of this stuff: anything that might use the data flow needs to have a refresh on it, okay, and that would be able to signify that there's a new stage.
You may have different inputs from different origins, and so you could then trigger via a subflow, or the launching of a context. Yeah, you trigger via the launching of a context.
Okay, so here: data flow having overlays. Data flow: we are applying overlays to it by running overlay data flow and passing it as an input. This is the one that actually is the real data flow that we're putting in here; this is the upstream. This should just say upstream. This should say upstream. This should say upstream. How dumb is that? Okay, excuse me.
Or, no, it's not! Okay, it's just... okay! So this is basically... this just didn't get deleted. Okay, so: load default deployment environment, input definitions, list input definitions. Okay, inputs, items, okay, yeah. So, short names: it is going to do it on short name, so we can do upstream. So, okay, so we can do upstream.
All right: what is the system context? The system context is the upstream, the overlay, and the orchestrator. So the upstream in this case is... overlay is not None, so overlay system context equals create... so, let's say, create a system context to...