From YouTube: Weekly Sync 2021-07-27
Description
Meeting Minutes: https://docs.google.com/document/d/16u9Tev3O0CcUDe2nfikHmrO3Xnd4ASJ45myFgQLpvzM/edit#heading=h.u95utp8c4aef
A: Okay, right, so did you test these links? These?

B: Yeah, they work on docs. They do? Okay, great.
A: Okay, and transfer learning was good, so don't we just need to update the changelog?

B: It was the transfer learning models that were taking long in the test notebook. Okay.
A: Yeah, my concern is just that, obviously, if we have stuff that takes a really long time on the main test suite, we should move it out of the main test suite, or, you know, somehow segment it like we've done with the other tutorials, because that...

A: It's nice to be able to run the majority of the test cases very quickly, where you can catch any potential problems iteratively, fast. Okay, so this is, yeah, this is probably too long. So we need to think about moving these out, but we can do that at a different time. So, let's see, now we're done with the other ones.
A: I think we probably want to do something similar to what we have here with the tutorials. We probably want to go through and leverage the same type of thing we're doing here.

A: So, let's see, this is the console test. I don't know, something like what we're doing with the tutorials, because if those are going to be long-running like this, then it would be nice to not have them run by default with the rest of the main test suite. So, something like how this run-console-test works.

A: Okay, so: test notebooks, do not run the notebooks as part of the main test suite.
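A common way to segment long-running tests out of the default run, as discussed above, is to gate them behind an opt-in environment variable. A minimal sketch with the stdlib `unittest`; the variable name `RUN_LONG_TESTS` and the test class are illustrative, not the project's actual names:

```python
import os
import unittest

# Hypothetical flag name; long-running notebook tests only execute when
# the developer opts in, so the main suite stays fast by default.
RUN_LONG = os.environ.get("RUN_LONG_TESTS", "") == "1"


@unittest.skipUnless(RUN_LONG, "long-running; set RUN_LONG_TESTS=1 to enable")
class TestTransferLearningNotebook(unittest.TestCase):
    def test_notebook_runs(self):
        # Placeholder for executing the notebook end to end.
        self.assertTrue(True)


def run_suite():
    """Run the suite and report how many tests were skipped."""
    suite = unittest.defaultTestLoader.loadTestsFromTestCase(
        TestTransferLearningNotebook
    )
    result = unittest.TestResult()
    suite.run(result)
    return len(result.skipped)
```

With the variable unset, the notebook test is skipped and the rest of the suite stays quick.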
A: Okay, other than that, though, let's see, let's get this merged.

A: All right, I think that might... I think putting them all under one CI job might actually keep things going faster.

A: Thanks for making those tweaks on all those; we had a bunch of little tweaks to make. So, let's see, thank you. Hey, no, thank you! All right, so, let's see... all right, there we go. We have all three of those merged in. Let's see.

A: Okay, then, let's see. I want to try to go through the backlog first here. So, okay, how is this commit message formatting... saying go ahead... if we touch...
C: Yes, we talked about how we would implement two, or four, out of five of the enhancement issues, and I've done that for a couple; it also went up by two percent, but then we never looked back.
A: Yeah, yeah, so that you can see, like... sounds good, sounds good, perfect! Okay! So, let's see, let's just double-check this, because, yeah, this would be great to get in here.

A: You know, let's see... it looks like you didn't push it.
C: How come the lint comment is showing problems in the merge? It is not showing the same on my side; it's showing that this branch has no conflicts.

A: I don't know, let's see... oh, maybe because I have the rebase-and-merge button checked.
D: So, in that file, those are actually the cleanup operations.

A: Okay, so do you have plans to show these in a tutorial or something?

D: Yes, we have, like, datasets on which I want to use them. Okay, okay.
A: Great, great, all right. Okay, so let's just do... so, I think I updated the skel stuff. Okay, let's see.

A: There we go. Okay, so we should try dffml service dev create operations, and then this was dffml operations cleanup, or data operations. Data.
A: We talked about how you might have to iterate over datasets twice to find out all the things, right? Like, to find out all the categories and stuff, right. So that's what you're planning on doing here, it looks like. Is that correct? That's going to be on the dataflow side of things, when you're passing... yes, you're getting categories. Okay, yeah, categories, okay.
D: It has ten separate operations. So, one thing I had to ask about these data cleanup operations: the data which I am expecting in the cleanup operations is in the form of a matrix, like a list of lists, is what it's expecting. So will that be fine with the dataflow, I mean?
C: So can't we, like, handle matrix data the way users want? Can you say that again? Like, the cleanup ops handle matrix data in dictionary format? In dictionary format?
A: Yeah, I'm not able to visualize what you're talking about here, but, I mean, that sounds fun. I mean, converting things to a dictionary... a dictionary is fine, I think. Yeah, that's pretty much what we are doing, right? Like most other places, we usually are putting things into that features dictionary.
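As a rough illustration of the conversion being discussed, turning a matrix-style list of lists into one features dictionary per record might look like this (the column names are made up for the example):

```python
def matrix_to_feature_dicts(columns, matrix):
    """Convert a list-of-lists matrix into one features dict per row."""
    return [dict(zip(columns, row)) for row in matrix]


# Hypothetical columns and rows, for illustration only.
records = matrix_to_feature_dicts(
    ["years", "salary"],
    [[1, 20], [2, 30]],
)
# records[0] == {"years": 1, "salary": 20}
```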
A: So, yeah, I don't know, I mean, yeah. Everything being a list definitely does not seem like... it seems like you're going to need another layer of indirection there.

D: So, yeah, I was actually thinking, like, maybe I can take one of the datasets and make an example out of it, and see if things are working or not.
A: ...end up doing it as a separate operation. I don't know, but, yeah, all right. So, let's see... yeah, all right, cool, perfect. Let's do that, yeah! So we'll just make a note that we're going to... okay. I just want to do this.

A: Okay, all right, what is going on here? Okay, so, if you could do... if you could do what... I just wanted to see what kind of issues that creates. But it looks like we have some things that we have to deal with here. So, if you could go through and... maybe I'll just post a diff of this or something. But let's see, because it should be...
A: We wanna... let's do the re-creation from skel again, because I think that the skel stuff has changed since the last time. I'd like that out of this.

A: Okay, anything else on that one?

A: Yeah, this looks... this looks familiar. Okay, so, yeah, we're doing this whole thing: property to remove...
A: Sweet, no way, okay! And this is all done? Awesome, all right. Well, this is fantastic. Okay, so then, yeah, we don't really need a changelog entry, I think, because we already said that we're adding the accuracy stuff in the changelog since the last release. So, let's see... lint on docs is failing. What's up with that? "'NoneType' has no attribute"... what? Okay.
D: What do we need to do when we find that one of them does not have the docstring?
A: So, let's see... if inspect config, or, let's see, let's see: numpydoc string args, let's see, so...

A: All right, so, see... I would say, okay, so: check if make-config-inspect works. In this case we don't, or we wouldn't want to, or... let's see.

A: Then we create the data class and, that way, while we're doing that... I think there's some... oh, because we could then remove the missing ones. Oh, I see: what is make-config-inspect? It's spec...
A: No, I thought we did have a way to change make-config-numpy to do something with those properties. Maybe not... oh, yeah. We didn't do it, because then we just pulled out the numpydoc string args. Okay, so, okay, because, yeah, if it doesn't have a docstring... why does it not have a docstring? So one of these scorers does not have a docstring.

A: Could you use... so the thing is, like, okay, yeah: if we can use make-config-inspect, then we can skip this. I think there should be a way to... let's see.
A: Yes, okay, so somehow the CI says that there's no docstring for a certain model. Okay: docstring args.

A: Docstring over here... so that is odd. So, something in the wire coming through twice? Let's see... no, okay, these are just different. Okay, it's fine.
A: Okay, and, great, so the scorers are being tested, which means that they're all loading fine, so...

A: I wonder if this is what's going on: if the daal4py stuff messes with the docstrings of the scikit-learn stuff, so...
A: Well, then they need to fix that. So, let's see... we can probably hack-fix it, but, all right, so...

A: All right, yeah, my guess is that the daal4py stuff is screwing it up. So we could... that's what it's... let's see what happens if we do this.

A: Yeah, I think what happens is that the way that they've done the daal4py patching...
A: ...they don't preserve docstrings and things like that, and I think that's probably what's going on. We should probably keep that in mind, too, as we wrap stuff, because this is how you end up losing information. So, for example... or, let's see... well, okay, let's... okay, let's do this first one! Okay! So, if we do...
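This is the classic failure mode with plain wrappers: unless the patching code copies metadata over, for example via `functools.wraps`, the replacement function has no `__doc__`. A minimal sketch of both styles:

```python
import functools


def original(x):
    """Original docstring."""
    return x


def naive_patch(func):
    # A bare wrapper: __doc__, __name__, etc. are lost.
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper


def good_patch(func):
    # functools.wraps copies __doc__ and __name__, and sets __wrapped__.
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper


lost = naive_patch(original)
kept = good_patch(original)
# lost.__doc__ is None; kept.__doc__ == "Original docstring."
```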
A: Okay, yeah, see, now it's broken. Okay, so, something with daal. I think it's the daal4py stuff, because I updated daal4py, and now it's broken. There we go, so, yeah: they have incorrectly wrapped something. So... well, "incorrectly"... well, not completely: "'NoneType' object has no attribute 'split'". So, yeah, my guess is, if we look at these things, we will find that...

A: Yeah, okay, indeed, it has been wrapped, so...

A: Okay, so we can just add a quick fix for this, but then we need to go report this.
A: Yeah, okay, so now we've confirmed that. All right, so: fix for the daal4py wrapping function. Yeah, because it looks like the message they've been dumping to the console has changed recently, and they probably have updated the things that wrap in a different way.

A: Okay, this happens when a decorator is used to...
A: Yeah, maybe... well, I'm not... these are actually the classifiers themselves, not the accuracy. Okay, well, they're... probably we don't have to go too deep into it, but I'm just curious, for curiosity's sake, so that if you guys run into this again, you're making sure that you know that this is what's going on. But we can probably safely assume that this is what happened. So: if hasattr(method, "__wrapped__")...

A: Okay, not wrapped, yeah. Okay... all of a sudden we've got this daal thing. Wow, they really made this more difficult. Okay, so...
A: Well, yeah, I mean, this is... so we're in the scorers, right, and we're trying to grab the docstring from the scorer, right, and it's saying that there is no such docstring. So, top_k_accuracy_score: it's saying... it's saying top_k_accuracy_score is not wrapped. It tries to grab the... it's about to try to grab the docstring from it. But, let's see... actually, let's confirm that it does.
A: ...the scorers, and it's come across... it's come across, you know, one of them that has been patched by daal4py, and this roc... let's see, what is this? roc_auc_score, and daal4py has changed this into the one that's present in its own library. So, daal4py...

A: So, okay: daal roc...

A: Okay, and they don't have a docstring on it. So... so they replaced it, and they didn't preserve the docstring. Thanks, guys. All right. So, let's see, what should we do here? So, let's... maybe... okay, if I continue, I'll just break out of this loop.
A: Okay, what do we do here? So this thing... ultimately we really need this to... okay, I wonder if we can do this. So we could just... we can do it at this level, and we can say... so this is the original, you know, scikit-learn metrics ranking roc_auc_score, as the multi-class roc_auc_score. So, ideally, what they would do is they would just, you know... I think... I think that something like this works.
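The hack-fix being sketched here is, roughly, to copy the docstring back from the unpatched scikit-learn function onto the patched replacement. Simulated below with stand-in functions rather than the real sklearn/daal4py objects:

```python
def sklearn_roc_auc_score(y_true, y_score):
    """Compute Area Under the ROC Curve."""
    raise NotImplementedError


def daal_roc_auc_score(y_true, y_score):
    # Stand-in for a patched drop-in replacement that forgot to carry
    # the docstring over (its __doc__ is None).
    raise NotImplementedError


# The fix: reattach the docstring from the original implementation.
if daal_roc_auc_score.__doc__ is None:
    daal_roc_auc_score.__doc__ = sklearn_roc_auc_score.__doc__
```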
A: Now that, hopefully, will solve the problem, but we'll see here. So, okay, so that... okay, okay, so now we've got this roc... okay: "parameter not in docstring: max_fpr". Okay, so, well, okay! So now we were able to successfully get the docstring attached, but it looks like they have added another parameter to the docstring. So what is this max_fpr here? So, remember: not in the docstring.

A: What the hell, guys? What are they doing? Okay, we can create, like, an alternate mapping, you know, if we don't... if we... if there are some of these things that, you know, that only...
A: So, all right. Well, I think we're pretty close here, all right. So, what else do we have to cover today? Because I think... I think we're close, but let's make sure we get everything. So: support for multi-output; so, archive storage models?

C: See, I have just put in the changes which I've made in the model base class, so I would just need some input on, like, how I'm using the dataflows and what we need to change. Okay, okay.
A: Let's see, all right, let's just see... okay, and then, all right. Okay, this should be quick here, so... and then, so: the use case and stuff for multi-output. Okay. So there's a lot to review here... okay... oh, no! It's because you've added a use case; that's why there are so many more lines.

A: Okay, okay, it looks like we're good... it's... okay, it's in progress, okay: features.
B: Yeah, so, basically, we agreed upon creating different entry points for different multi-output models, right? But then I was looking into the models, and it seemed like, you know, adding an unnecessary layer to the whole structure.

B: So I decided to go with just using the wrappers for either case, for regression or classification, and this way we are able to support all the models for multi-output, and you don't have to, you know, worry about whether a specific entry point supports multi-output or not. And there are different types of multi-output, like binary and multi-label and all that; you don't have to care about that, you just kind of pass that on to the multi-output wrapper. Great.
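In the spirit of the wrapper approach described, which scikit-learn provides as `MultiOutputRegressor`/`MultiOutputClassifier`, here is a stripped-down pure-Python sketch of the idea: fit one clone of the base estimator per output column, so any single-output model gains multi-output support. The `MeanEstimator` is a toy stand-in, not a real model:

```python
class MeanEstimator:
    """Toy single-output estimator used only for illustration."""

    def fit(self, X, y):
        self.mean_ = sum(y) / len(y)
        return self

    def predict(self, X):
        return [self.mean_ for _ in X]


class MultiOutputWrapper:
    """Fit one instance of the base estimator per output column."""

    def __init__(self, make_estimator):
        self.make_estimator = make_estimator

    def fit(self, X, Y):
        # Y is a list of rows, each row holding one value per output.
        n_outputs = len(Y[0])
        self.estimators_ = []
        for i in range(n_outputs):
            column = [row[i] for row in Y]
            self.estimators_.append(self.make_estimator().fit(X, column))
        return self

    def predict(self, X):
        # Transpose per-estimator predictions back into rows.
        per_output = [est.predict(X) for est in self.estimators_]
        return [list(row) for row in zip(*per_output)]
```

The caller never needs to know whether the underlying estimator supports multiple outputs; the wrapper handles the fan-out.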
A: So, let's... let's just make sure that we test multi-output for each model, if we say that it wraps each model, and let's test the various different modes as well, where appropriate. So, does that sound good?

B: You don't have to review it in detail, but just, like, whether I'm on the right track.

B: Because, like we discussed two weeks ago: not having a separate class for multi-output.
B: Let's see... other than that, there's the scorer thing that I have to figure out for multi-output. I had figured it out before the rebase, and now I'm just, you know, taking the mean of all the prediction accuracies of the predictions through the scorer.
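The interim approach described, one accuracy per predicted feature, averaged into a single score, can be sketched as follows (pure Python, with exact-match accuracy standing in for whatever metric the real scorer uses):

```python
def per_output_accuracy(y_true, y_pred):
    """Fraction of exact matches for a single output."""
    hits = sum(1 for t, p in zip(y_true, y_pred) if t == p)
    return hits / len(y_true)


def multi_output_score(Y_true, Y_pred):
    """Mean of the per-output accuracies across all predicted features."""
    n_outputs = len(Y_true[0])
    accuracies = [
        per_output_accuracy(
            [row[i] for row in Y_true],
            [row[i] for row in Y_pred],
        )
        for i in range(n_outputs)
    ]
    return sum(accuracies) / n_outputs
```

For example, with one output fully correct and the other half correct, the combined score is the average of the two.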
A: Yeah, I would say... I mean, this seems like the type of thing where we might want to get these scikit scorers merged first, because, yeah, just having MSE for everything is not the best. So we definitely want to make sure that we have different options here, so you can preserve, you know, some of the stuff that you might have been intending to do, right? Because then switching... switching everything to a different mode of accuracy system is definitely going to be different, right? So, all right.
A: Yeah, I don't know... you know what I mean? It's just... I think that it makes sense from an ease-of-use perspective, but I think introducing... you know, the more times you overload arguments, the more times things can get confusing. So, because you basically just moved this loop into here, rather than moving the loop out to the caller, you know.

A: So that leads to an inconsistency. I would say we should move it out, unless anybody else feels strongly that it should be in here. I'm definitely... I think, you know, I'm open to it being in here, but I also think that...
A: Can we explain it? I mean, all you've really done here is take this for loop, right, and put it within the predict method, right? If you move the for loop to... if you move the for loop to right here, it would be the same functionality, right?

A: Well, I mean, I guess it depends how we want this to work, but right now... right now, if you use a model and you ask for it to make a prediction, it'll set the feature name that you want to predict in the record to, you know, contain that value and that confidence for that prediction, right? And so, if you... if you predicted multiple things... let's see... I mean, where you have... yeah. What you have here does the same thing, right?
A: It goes through and it says: there are multiple predictions, and they're all of this value, with this confidence, right? Yeah, it should be the same thing, right? Okay, it's really just moving this loop out. All right, yeah, okay, and then, yeah. So then we'll wait for the scorers here, and hopefully we can wrap these scorers up.

A: Okay, so then, okay, and then this stateful stuff. So, hopefully... okay, so... okay, yeah, we talked about how we're going to take the... yeah, we're going to set location as a separate property. Awesome, okay. So: temporary directory... let's see, get directory, get directory.
C: Okay, but the temp directory is created only when location is a file, right?
A: But if it's not a file, when we exit the usage of the model, then we won't remember to free the temp directory. So, just in case, because there's that possibility, you have the close location. Perfect, very nice.
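The cleanup concern above is exactly what context-managed temporary directories solve: the directory is always freed on exit, even when an exception fires. A stdlib sketch of the save/load-through-a-temp-dir pattern (file names are illustrative):

```python
import os
import tempfile


def use_model_state():
    """Work inside a temp dir and guarantee cleanup when usage ends."""
    with tempfile.TemporaryDirectory() as tempdir:
        state = os.path.join(tempdir, "model.json")
        with open(state, "w") as handle:
            handle.write("{}")
        # ... train / predict using files under tempdir ...
        existed_during_use = os.path.exists(state)
    # The directory and everything in it is freed here, on any exit path.
    return existed_during_use, os.path.exists(tempdir)
```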
A: Okay, so I think we've said that the zip and tar are operations right now already, and, like, tar already supports compression. So I think we're going to leave the compression operations off of these dataflows.
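For reference on the point that tar already supports compression: with the stdlib `tarfile` module, the mode string selects gzip, and `"r:*"` auto-detects it on extraction, so a separate compression step isn't needed. A small round-trip sketch:

```python
import os
import tarfile
import tempfile


def roundtrip_tar_gz():
    """Archive a file with gzip compression, then extract it back."""
    with tempfile.TemporaryDirectory() as workdir:
        payload = os.path.join(workdir, "weights.txt")
        with open(payload, "w") as handle:
            handle.write("model state")
        archive = os.path.join(workdir, "model.tar.gz")
        # "w:gz" writes with gzip compression; "w:bz2" and "w:xz" also exist.
        with tarfile.open(archive, "w:gz") as tar:
            tar.add(payload, arcname="weights.txt")
        outdir = os.path.join(workdir, "out")
        with tarfile.open(archive, "r:*") as tar:  # "r:*" auto-detects
            tar.extractall(outdir)
        with open(os.path.join(outdir, "weights.txt")) as handle:
            return handle.read()
```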
C: Right now, the thing is that, like, if I want to create a tar.gz, I need to pass the... so, if you want to tar... which is not really there right now. So, just to make it uniform, I made both of them, like, open-and-extract and packing, both as dataflows. But it could happen simply.
A: Okay, so you... so... so, just to clarify again: you're saying that on the creation... if it's on the creation, you'll need the compression; on the... if... yeah, right, on the creation you need the compression, but on the... okay, wait: I don't need compression? You don't need it, but you've included it for symmetry. Okay, yes, all right, great!

C: So I would think of putting it in a more concise form, like... right now it's looking pretty verbose.
A: I don't think we should include the compression stuff in this, because the goal... so, so, let... let me just say... so, the next thing that I was going to say is that I think what we should do here is have... so, remember: we talked about how we'll have the location property, and then we'll create a dataflow for extraction, and we'll create a dataflow for archiving, right? And so that's what you've done, right, but to some extent we want to...

A: We want to tweak it a little bit, and we want to have two more properties, ones that would be, like, you know, location-save and location-load, and those would be dataflows, and their default value would be None. And if they are None, and location is a file, then we create the dataflow and we run the dataflow.

A: If they are not None, then we run the dataflow, and we pass the temporary directory and the location to the dataflow. So... does anyone have any questions on that?
A: Okay, cool, yeah, and that's just a slight tweak to what you have here. So, yeah: so, basically, add two more properties, right, and, you know, dataflow data type, and default None. If... if it is None, and the location is...

A: Yes, the model config, yeah. So, can we add...

A: Okay, yeah: in the same way we check for the existence of the location property, don't assume that a model config has a location property, and don't assume that it has location-save or location-load.
A: These will be of type dataflow. Okay: don't assume that it has location-save or location-load, of type dataflow, right, so, because not all... and then... so, don't go add these to each model yet; just implement it for a test model, and that could be something like... let's see, do we have...

A: First... okay, so: let's add two more properties, from autofit... check, and make sure, don't assume. Okay, so, which is essentially, you know, hasattr.
A: Okay, so: don't assume that location-save or location-load exists within the config; use hasattr to check; they're of type dataflow. Just implement it for the SLR model; we'll add it to the other model configs in a separate, or in the next, PR, right. So we're implementing the dataflow support, we're testing it with the SLR model, and then we're making sure that it works for all the other models.
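The "don't assume, check with hasattr" guidance might look roughly like this; the config class and property names follow the discussion but the shape is illustrative, not the project's real API:

```python
import dataclasses


@dataclasses.dataclass
class ModelConfig:
    location: str
    # Optional dataflow hooks; None means "use the default behavior".
    location_save: object = None
    location_load: object = None


def resolve_save_flow(config, default_flow):
    """Use the configured dataflow when present, else fall back."""
    # hasattr guards against configs that never defined the property;
    # the None check covers the declared-but-unset default.
    if hasattr(config, "location_save") and config.location_save is not None:
        return config.location_save
    return default_flow
```

With `location_save` unset, the default flow is chosen; a config that sets it overrides the default.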
A: So, okay, so, yeah, and then, let's...

A: Let's take the dataflows... you have the dataflows you are creating; let's make them static... statically defined somewhere, likely the archive... or, let's see, okay, so, yeah, let's make them statically defined somewhere. So, okay, so, yeah, because we could just... so: we will statically define these dataflows, and then, instead of adding the input path and the output path and the seed values, we'll just add them when we're running. So we'll add them basically on the call to run here; we'll add those inputs.

A: Small-sized dataflows.
A: Okay, then you just pick the correct dataflow, and you... yeah, you just pick the correct dataflow and you run it. You know, you can use the same if-block to pick a static one. Let's see, so: compression op. Okay. So, let's see what's going on with the compression op... so: get operations.

C: We already check if it is a file or not; so only if it is a file do we end up in that situation.
A: Okay, okay, so let's just make sure that that's supported. So let's make sure that, if this is the first time we're using a model, then we create an archive of the saved state. Okay, yeah, okay, yeah, I mean, I don't know, yeah... I guess, let's see... so, yeah, that compression op stuff: I don't know, you might just want to move that... instead of statically defining it, you might want to just move this get-operations thing into its own, like... yeah, move this into something else and then have it return...
A: Yeah, well, you know, me neither, with the whole dataflow thing. There is... we've talked about this many times before: there's got to be a less verbose way to define these dataflows; we just haven't invested any time into figuring out what it is yet. So you're... you've joined the boat; we're all... we're all wishing there was a quicker way to define these. So for some time I was thinking...

A: Yeah, there's not... that's... it is pretty verbose, yeah. And that's, you know... the shorthand command and stuff like... all of this stuff is like: well, how do you... how do you... how do we end up with something that's less verbose? We're not sure yet, right? So, yeah, you may... you may come up with some stuff as you do this. So, let's see, so...
A: Yeah, yeah, you may have... hopefully... hopefully you can come up with some ideas here. All right, so: let's take the dataflows you're creating, and put the helper function that creates them down in its own file.

A: Okay, so, yeah, so we'll move this stuff into the helper, into its own file, and then we can at least, you know, refactor it from there, and not have it clutter up the model stuff. And then we'll make sure that you can do it from something that doesn't exist, so, like, the file doesn't exist yet, and then...
A: Okay, and then we'll do the... where did my comment go? Okay, there it is. And then, okay, and then, okay: we archive it, save it to the location.

A: Okay, so, and then I think we need to make definitions for these. Let's see... I'll show with this... okay, so: input path, output path.

A: So I think we probably need a way to also define... this is going to be a thing, so...
A: Okay, we're going to have to think about this. Okay, so what I'm realizing is, and this is something we'll have to deal with later, but this will probably be spun off of the shorthand stuff and all of the rest of everything. So we'll probably need to have a way... this will probably need to be the way that we run the dataflow; it will probably need to support all of the rest of the dataflow-running stuff: so, for example, adding extra inputs, or running it in a different orchestrator, or things like that.

A: So, let's see what happens when we run a dataflow run command config. So...
A: Yeah, we'll need to be able to configure the orchestrator; we'll need to be able to configure the inputs. Okay, we'll have to think more on this, but we'll... we'll implement the basic support for now. But I think that, you know, sort of long-term, we'll need something that's like... maybe we'll just... we'll probably... okay.

A: This will probably be a good thing when we... okay, so when we unify... okay, perfect. Okay: when we get all the config stuff unified, what we can do is we can take the run-dataflow command and the run-dataflow... or the run-dataflow operation...
A: The run-dataflow operation could be the same as, like, the run-single/run-context from the command line, and it'll have the config, and then we can reuse the operation; we can reuse the config, so that, when you say location-save, it can actually be the config for a... you know, it can actually be this whole config type of thing, and it just runs the... it runs, essentially, the operation underneath it, using the configured... the configured operation. Okay, okay, okay, we'll be fine, we'll be fine.
A: I just wanted to think about that for a little second there. Okay, so, okay! So, let's just... okay! So we need to make definitions for these. So let's make sure... let's just make sure that the... let's make sure that we always pass these to the dataflow under some standard definition name. So, for example, maybe...

A: Like... like, yeah: model-location and model-tempdir, or something like that. And then we'll probably... then we'll try to make this all configurable when we get to a later stage, but we'll just, you know, set the definition names as that. And then let's make sure that these dataflows, you know, are updated; that you update the flow to grab from, you know, model-location or model-tempdir, right? Does that make sense?
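A toy sketch of the standard-definition-names idea: the statically defined flow reads its inputs by agreed-upon keys (`model_location` and `model_tempdir` here, as floated above), and those inputs are supplied only on the call to run, not at definition time. This is plain Python dispatch, not the project's real dataflow API:

```python
def extract_flow(inputs):
    """Stand-in for a statically defined extraction dataflow."""
    # The flow grabs its parameters by standard definition names.
    return ("extract", inputs["model_location"], inputs["model_tempdir"])


def run_flow(flow, **inputs):
    """Add the per-run inputs on the call to run, not at definition time."""
    return flow(inputs)


result = run_flow(
    extract_flow,
    model_location="model.tar.gz",
    model_tempdir="/tmp/model",
)
```

Because every flow agrees on the input names, any of the statically defined flows can be picked and run with the same call shape.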
A: Okay, all right, I think... are we good on this? Do you have enough to... to chew on here? Yes? All right, great, cool. Okay, so we talked about that; we talked about... okay, so the one thing that's left here is still: we've got this scikit stuff that's being annoying, and... okay, oops.

A: Let's see, we've got... here, we do this.

A: Let's see... okay, great. Man, why wouldn't it let you guys back in?
A: That's weird. Okay, all right, yeah! So, let's... let's see... and I was going to try to figure out some scheduling stuff, so we could do a one-on-one, Sudhanshu. So I'll get back to you on that. So... weird that it wouldn't let you guys in. So, all right. So, all right... the one... what else is there? Anything else we... we must talk about today? Because I think we might try to wrap this up.
B: So, is it okay if I'm, you know, taking the mean of all the predictions for multi-output... of the accuracies, or the predictions?

A: Okay, are you talking about for the confidence, or... what... wait, for... oh, for the accuracy. Oh, on multi-output.
B: Because it was a lot simpler before we got it merged, because multi-output had its own scorer, but now we are using our scorers. So we have to come up with something.
A: So, all right, so this brings up an interesting question now. I think we may want to make the scorers take a feature to score, like, the feature that they should care about, because right now, grabbing into model predict... I mean, everything should be... you know, if we grab into model predict, and then all of a sudden model predict supports arrays of features, then, yeah, okay, well, then we can just change that. But it seems like, yeah...
A: Yeah, I'm just thinking here, all right. So... do you guys see what I'm saying? I think... I think maybe we should, because the model says predict, right, and then the scorer is just going to infer that we want to grab whatever the model was predicting, right. But I think it might make sense to have the scorer...

A: ...take a config where it says, you know, the feature that you wanted to do the score of. And then, in this case, we would say, you know, feature equals salary, or, like, you know, score some... something, right. The feature that we wanted to care about is salary, and then the scorer knows about that, right, and then we...
A: Yeah, I believe so... I... I think so, because this... this... this example on the front page...

A: Okay, so, but the thing is, from the implementation-of-the-scorer side of things, right: what is the scorer going to do if it sees multiple predictions, right? So, in here, you would say, you know... so, what does this classification accuracy do? Essentially, if it... if it... I mean, I guess it would iterate over... so the thing is, it would have to iterate over self.parent.config.org.

A: Oh, yo, that's what that was. Okay, so...
A: Two... yeah, yeah, because the underlying model... yeah, okay, okay. Let me just take a look... let's... let's... I mean, whatever you've done obviously is working right now, so I need to... I didn't get a chance to read the notebook. So, so, let me just get back to you, then. It sounds like... it sounds like that... that makes sense, yeah. Cool, cool, all right. So, let's see, what else do we have? So...
A: Basically, this stuff here, the numpydoc string args: it's mad about... it's mad about this guy not having the max_fpr parameter... "parameter not in docstring".

A: Yeah, well, now it looks like there's no... no max_fpr, so, yeah, that's weird! All right, I don't know what to do with this yet, but, you know, you could always... you could always...
A: You could always throw it through a try block, or something. So, let's see... I would say, yes, so there's this numpydoc one, and then you could say...

A: Yeah, I don't know what to do with these right now. I would say, if you can, try to grab... I would say: install the scikit... the daal stuff; see if you can get it with it patched; see if you can grab... see if you can... see if you can somehow create the parameters based off of the real scikit version of things. And, you know, if not, don't spend a ton of time on this, but... but we'll see; we'll just see what happens.