From YouTube: Implementing Dataset Sources 2021-03-04
Description
Yash and John met and worked out how existing datasets could be integrated into DFFML as sources.
We came up with a way to leverage the existing source infrastructure. We wrote a new WrapperSource (and WrapperSourceContext) which can be used to wrap an existing instantiated source.
We also came up with a decorator which allows us to wrap functions (context managers) and create sources out of them.
We implemented a source for the Iris training dataset.
Design decisions behind the existing source infrastructure were touched on, as well as the config infrastructure and the importance of unification there. Entrypoints and plugins in general were also touched on.
Let's see, let's do csv and then the dffml source. Maybe we should make a directory here and do a source cached, or maybe this is, like, yeah. Let's just say: what are these? These are datasets, right, a source dataset, so let's make a folder called dataset.
So let's take a look at that tutorial that we have: docs, tutorials, model, tensorflow, or, yeah. What is it... iris is a good name for it, all right.
Let's see, what's the best way for us to do this? Do we subclass from the CSV source? Now, we want to be a source, and then we will sort of provide transparent access into the CSV. So.
Okay, so yeah, there's really no config for this thing, because it just is what it is. Or, I guess we could have a config with the...
All right, so yeah, we could have, you know, `url: str = field(...)` to say where to download the dataset from, since there are multiple places you could get this from now, and I think we actually have it cached within...
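The config idea described above could be sketched with plain dataclasses. DFFML's real `config`/`field` helpers behave much like dataclass fields with attached documentation; the class name, URL, and hash here are illustrative, not the project's actual values.

```python
import dataclasses

# Hypothetical config for an iris dataset source. Each field carries
# a default so the source works out of the box but stays overridable.
@dataclasses.dataclass
class IrisDatasetSourceConfig:
    # URL to download the dataset from (several mirrors exist)
    url: str = "https://example.com/iris_training.csv"
    # Hash used to validate the cached download
    contents_hash: str = "0" * 64

config = IrisDatasetSourceConfig()
print(config.url)
```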
I think we have it uploaded to GitHub issues actually, so that we can avoid problems with the download site. Where did it go? It might be in the tests.
No... all right, I swear we put it in there, but oh well! It's in here.
Oops. And then we should just be able to basically take the cached download unpack archive function, or the cached download function, and... let's see. So it would be on `__aenter__`, so load fd, temp fd. So this is memory based, so we just want to wrap. Then, do we have an existing source that wraps? Yeah, we do: we have the dataflow source, df, and that's actually a good one to use as a base here.
So the source would be named DatasetSource, and then you have a config option for what dataset you want to fetch, like iris for Iris, and then we have a dictionary of datasets or something like that, where it checks what dataset we are fetching, and then we fetch it however we want with Python.
Well, I think that's the thing: the way that you implement fetching it would be with a source, right? Like, if we're thinking about what's the DFFML way of doing this, when you say choose the correct implementation, choosing the correct implementation is, you know, making a source. You know what I mean?
It's kind of like... this ends up being just like the scikit stuff, just like the way that we implemented the scikit stuff.
So, having a separate source for each and every dataset, it doesn't sound unintuitive to me, but what I particularly thought when I mentioned this was: if you have this DatasetSource, and then you specify source=dataset (or whatever source) and then what the dataset name actually is, and you pass it iris, and then, since iris is CSV (or I don't think it particularly matters), we can just fetch iris in whatever way we want and then feed it through the source.
Okay, I think you'll find that the implementation of that ends up being this.
I see what you're saying: you're saying basically key off of, you know... look, you have sort of like maybe a dictionary that says, okay, what do I do for iris? Right, okay: download it. And then I download it, I do the string transforms on it, and I open a CSVSource, right.
Let's see. So you end up saying: okay, this source is an alias, okay, so you have this essential source. So basically we would have a key here, and you'd say, you know, which one do you want me to do? And then, within the `__aenter__` method here, we would say: okay, if iris, do blank. So yeah, let's see, I'm just thinking, because these may...
So the way you are doing this here, you make the training url and test url config options, right? But shouldn't the training dataset be a separate source, and then test should be a separate data source?
Okay, let me just... let's see, because yeah, we would have iris train and iris test. I'm just thinking... I think part of the reason why this might look like it does is because we just need these base classes first. So now, if we have this DatasetSource and DatasetSourceContext, right, then we end up with...
This. So url and url.
You could do it like that, yeah, you could do it like that. It's just, then, how do you... okay? So then we essentially end up with this giant dictionary of things that map iris training to iris. It's just... we've ended up with a new way of doing a plug-in system now, you know what I mean? Yeah. And you can effectively get the same thing by doing it this way; you just might want to change the syntax, right?
So, for example, I see what you're saying: from an ease-of-use implementation perspective, you're saying it's easier for me to implement a source if I do it this way, right? Like, it's more straightforward. But yeah, what I'm saying is you can create the same thing by doing a wrapper around the existing source implementation, right? And now you've avoided creating a new plugin system, because otherwise you end up with multiple instances of our plugin system.
Right: you have one instance that's, you know, the way that we do it, and then you have this other way that's like, okay, now let me key off this thing and load this other thing. And you run into this thing where, if you have an edge case... say you have an edge case wherein a dataset is authenticated behind something, right, say you're doing Kaggle, and now you need some kind of authentication parameters to download this dataset.
Does that make sense? Yeah, okay. So the other thing is... okay, so yeah, that's one thing. I think the thing is that we can make this in such a way that it doesn't end up being as cumbersome as it is now, right? So let's look at the way that this would ideally work. So let's say `def iris_dataset`, or `dataset`...
This would look more straightforward if it wasn't a decorator, but you have to decorate it, right? You know what I mean? Yeah, yeah, okay. So then we just say return func... or no, we do return CSVSource.
func, cached download, contents hash, and then pathlib Path, home directory... `.cache/dffml/datasets/iris`.
Right, okay: so we download the file, we return the CSVSource with, you know, the appropriate... or yeah, we don't need that.
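A minimal sketch of the shape being worked toward here, assuming a `cached_download` helper like DFFML's and a function per dataset; the URL is illustrative, and the yielded path stands in for the `CSVSource` the real code would construct.

```python
import contextlib
import pathlib
import urllib.request

def cached_download(url: str, target: pathlib.Path) -> pathlib.Path:
    """Stand-in for DFFML's cached download helper: only fetch the
    URL when the target file does not already exist in the cache."""
    if not target.exists():
        target.parent.mkdir(parents=True, exist_ok=True)
        urllib.request.urlretrieve(url, target)
    return target

@contextlib.contextmanager
def iris_training(
    url: str = "https://example.com/iris_training.csv",  # illustrative URL
    cache_dir: pathlib.Path = pathlib.Path(
        "~/.cache/dffml/datasets/iris"
    ).expanduser(),
):
    # Download (or reuse) the cached file, then hand it to the caller;
    # the real implementation would yield CSVSource(filename=str(path)).
    path = cached_download(url, cache_dir / "training.csv")
    yield path
```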
Yeah, download, so iris training, right. So with this... is this kind of like what you would be looking for? So say cached download was really just, you know, this function here, then this would be the kind of simplicity of implementation that we're after, right? Is that correct, or is there a simpler implementation?
Okay, yeah. So this is what it looks like here, and then I think, you know, we also have to do... what is this? We also have to make... where's that sed command? Yeah, okay, there's that sed command.
And I had to do this regex sub thing the other day, and I totally screwed up the order of the arguments and couldn't figure out what was going on. I was like: why is it not... like, I can't find it. I totally screwed up the order of the arguments. Okay: replacement and then string. Yeah, that's not intuitive to me, I don't understand that, but okay. I feel like it should be string and then replacement, but I'm not gonna burn that into my brain anymore. Okay, re.compile, okay, now!
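The argument order that tripped things up, for reference: `re.sub` takes the replacement before the string being operated on, the opposite of `str.replace(old, new)`.

```python
import re

# re.sub takes (pattern, replacement, string): the replacement
# comes BEFORE the string being operated on.
line = "5.1,3.5,1.4,0.2,setosa"
fixed = re.sub(r"setosa$", "Iris-setosa", line)
print(fixed)

# A precompiled pattern works the same way, minus the pattern argument:
pattern = re.compile(r"setosa$")
print(pattern.sub("Iris-setosa", line))
```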
Here's the thing: if that file exists... then, if that file exists, okay, we want to download the file. This is a trick, yeah. We want to download the file, and now the problem... the problem with the cached download function here is that we're immediately going to end up here. Let's just create a new file: we'll immediately end up with an original that we're gonna modify as soon as we download it. You know, that's no good! So all right, let's just do regex.
The replace... oh man, we don't need to do that. Or, yeah, we do: we need to say, okay, replace path equals original path dot...
Do you want to keep doing this, or... should we take...
Cool, all right. So yeah: entry point name, replace. So we do replace, dot title, and then we replace, so we're creating a class name here.
Right, so now it should be formatted camel case, and we'll say, you know, plus DatasetSource. So now we have iris. So this should give us IrisTrainDatasetSource.
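The class-name generation just described can be sketched as a one-liner chain of `replace` and `title`; the helper name and the `DatasetSource` suffix follow the discussion, not any fixed DFFML API.

```python
def make_class_name(entry_point_name: str) -> str:
    # "iris_training" -> "iris training" -> "Iris Training"
    # -> "IrisTraining" -> "IrisTrainingDatasetSource"
    return (
        entry_point_name.replace("_", " ").title().replace(" ", "")
        + "DatasetSource"
    )

print(make_class_name("iris_training"))
```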
It would be great if we could auto-create a config, because that would allow us to configure the operat... that would allow us to take any function and create a config out of the arguments. And that's powerful, because then we can obviously use the existing config infrastructure to instantiate, or call, in this case, that function, right? And so this is something that we don't really have yet, but I guess we'll just have it right now. Where's my stupid caps lock... come on... okay. So, because we have this make config function.
This is just a wrapper around this, yeah. We need to document that thing. Okay, so what's the syntax here? Okay, yeah! Okay, just so we parse the arguments.
I think, yeah, there weren't type hints or default values, so what we had to do is parse the numpy docstring to grab the default value. So we have code to do this: make config numpy will do this. Did we get... I think, actually, where is... oh, we may have code to do this, actually: model... dffml util.
make config numpy, make config tensorflow, which parses a Google-style docstring, but we don't have any... Oh my god, did we really not do, like, the most basic case? I think so. I think there should be one in model pytorch. I think Saksham was going to implement this in model pytorch.
Oops, yeah. All right: make pytorch config, inspect pytorch params. Okay, we just need to take this out of pytorch, because this is already what we want. This is exactly what we want, because none of this is pytorch specific. So let's just take this out of pytorch. Oh, fantastic!
All right, so where's make config... so make, util config, util config numpy. So let's just call it inspect... or, well, we don't want to name it inspect; it might get confused.
I think this is what we want here. So, let's see: config is... dot util, dot... dffml, dot... okay, sweet, yeah, wow, right, hey! That was easy. Okay, so now we just need to call that. So we can just say make config inspect and then pass it the, okay, entry point name, and then we need a... we need a func to decorate here.
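What a signature-based `make_config_inspect` could look like, sketched with `inspect.signature` and `dataclasses.make_dataclass`; the helper name mirrors the one being discussed, but this is a from-scratch illustration rather than DFFML's actual implementation.

```python
import dataclasses
import inspect

def make_config_inspect(name, func):
    """Build a dataclass config from a function's signature, using
    type hints and default values instead of docstring parsing."""
    fields = []
    for pname, param in inspect.signature(func).parameters.items():
        hint = param.annotation if param.annotation is not param.empty else str
        if param.default is not param.empty:
            fields.append((pname, hint, dataclasses.field(default=param.default)))
        else:
            fields.append((pname, hint))
    return dataclasses.make_dataclass(name + "Config", fields)

# Hypothetical dataset function whose arguments become the config:
def iris_training(url: str = "https://example.com/iris.csv",
                  contents_hash: str = "abc123"):
    ...

IrisTrainingConfig = make_config_inspect("IrisTraining", iris_training)
print(IrisTrainingConfig())
```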
So we wrap... yeah, we do: we wrap func and we return wrapped. So this is a wrapped func, and now these are the args and kwargs.
So these are now, you know, these guys. So args, that's this. So then we call...
You know, should we be doing a context manager here? That's my question, because then we can yield, and that's usually helpful. We should actually be doing a context manager here. So let's yield CSVSource, because, you know, there are always times when you want to open temporary resources and stuff, and if we just do that now, then we won't need to worry about it later. You know what I'm saying?
Yeah, so when the source closes, we could clean up stuff, right? For example, we could just destroy this. Yeah, we could keep this modified file in temp if we wanted to. So let's just... and that simplifies this case, because what we're always doing is an async gen, because otherwise you have to handle the case of whether it's a coroutine... or, well, you wouldn't, because you're just looking for a return: isasyncgen, async generator.
So then we can just do yield, okay. And so: wrapper, wrapped, functools.wraps(func), and we should also say, you know, func = contextlib.contextmanager(func). Okay, so we create a con... So we wrap func: we take the entry point, we create the class name, we wrap func, we turn it into a context manager, because otherwise it's just going to be a generator; turn it into a context...
We yield to it. So our wrapper, or wrapped function, yields the result. The wrapper returns the wrapped function, and now we make a class: we make a DatasetSource class where the config is, let's see... the config is, you know, whatever the config for the function is, and... okay, wrapped func over func? Okay: return wrapped!
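The decorator steps just walked through (wrap the generator function as a context manager, preserve its identity with `functools.wraps`, return the wrapped function) can be sketched as follows; the decorator name and entry point string are assumptions for illustration.

```python
import contextlib
import functools

def dataset_source(entry_point_name: str):
    """Sketch of the decorator: turn the generator function into a
    context manager and return a wrapper that keeps the original
    function's name and docstring via functools.wraps."""
    def decorator(func):
        # A plain generator function becomes usable in a `with` block
        cm_func = contextlib.contextmanager(func)

        @functools.wraps(func)
        def wrapped(*args, **kwargs):
            return cm_func(*args, **kwargs)

        return wrapped
    return decorator

@dataset_source("iris.training")
def iris_training(url: str = "https://example.com/iris.csv"):
    yield url  # the real code would yield a CSVSource here

with iris_training() as source:
    print(source)
```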
This wrapper we're writing, this decorator here, takes the function and, you know, creates a source out of it. So now, if you look at the op decorator, what the op decorator does is it adds the implementation that it creates: it creates an operation implementation and it sets it as a property on the function that it was wrapping, and that way we don't actually change the function.
So if you return... for example, if we were to create a class and return the class, now iris training, that variable (because the function is also a variable), the variable iris training is a class now, and I can't just call that class anymore, right? So actually, yeah, we have a bit of a... we have a bit of a thing here, because we've wrapped it... we've wrapped it as a context manager.
All right, so right now I could do `with iris_training as`, you know, `iris`, or `with iris_training as source`, and now there's no `source = CSVSource`, and, you know, I can go from there, right?
CSVSource. And so we can do this, right? In this case, we return wrapped here, right? So this is... and we'll just delete this for now. So this is this case. Now, we have the option of returning the class, right, and in that case we will create this class, and when we, you know... then, if we call iris training... or, iris training itself, in this case, is an instance of this DatasetSource subclass.
So the advantage of this is that when we list all the entry points... yeah, when we list all the entry points... or, the advantage of this is, when you register the entry point, it's clear that you're pointing directly at this iris training function.
And so then the third option is, you know, set it as a property, right? So now you can say, you know... and this is what we do with op.
So now we have... so source is an instantiated CSVSource, or, you know, iris train dot class, where iris training dot class is, you know, IrisTrainDatasetSource.
And in this case we say, you know, we return wrapped, and we say wrapped dot class equals the class, where wrapped.class is this, right? And now we have access to both things. So I think that this actually sort of answers our question here with regards to, like, what should we do in this case? You preserve... and this is why op does it the way it does: you preserve the ability to still use the thing, right?
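The op-style option being settled on here, returning the still-callable wrapped function and attaching the generated class as a property, can be sketched like this; the attribute name `source` and the stand-in `DatasetSource` base class are assumptions, not DFFML's exact names.

```python
import contextlib

class DatasetSource:
    """Stand-in base class so the sketch is self-contained."""

def dataset_source(func):
    # Keep the function directly callable by returning it (wrapped as
    # a context manager), and attach the generated source class as a
    # property, the way the op decorator attaches the operation
    # implementation it creates.
    wrapped = contextlib.contextmanager(func)
    class_name = (
        func.__name__.replace("_", " ").title().replace(" ", "")
        + "DatasetSource"
    )
    wrapped.source = type(class_name, (DatasetSource,), {})
    return wrapped

@dataset_source
def iris_training():
    yield "training records"

with iris_training() as records:  # still usable as a plain function
    print(records)
print(iris_training.source.__name__)  # the class is reachable too
```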
Yeah, okay, so yeah! So now we have this class. Now we have this, yes: so now we have the class, which is the train dataset source, and we have, you know, training, and we can just use that to create a CSVSource if we wanted to, right? And that is nice from the perspective of creating unit tests, right, and it's also just... because this is the code that's there: like, if I want to be able to call the code that's there...
You know, that's nice to be able to do from an end user, like, from a random developer perspective, looking at the code: hey, what if I just wanted to call this function and not have any of the wrapped stuff?
All right, so WrapperSourceContext: this is just a wrapper around, you know, a SourceContext... pass-through, complete pass-through, enter and exit, whatever. And here's WrapperSource, and it is, you know, a wrapper around whatever its config's source is. Let's see... and actually, maybe we should now make that config source... we should probably make that `source`, because yeah, because we don't know if this thing is going to have a config somehow. So now, basically, whatever we said is... okay, yeah.
As source... bam, I think this is it: hey, `__aenter__`, yeah, DatasetSource. Okay, so I think this is... I think this is it, yeah, okay, cool. So, all right, yeah: so we've got this source here, that's the WrapperSource, right, and this basically just says, okay, you know... my... okay, well, yeah. So let's do source. Okay, so this would be an instantiated source, so self.source equals the instantiated source. Okay, yeah... in our `__aenter__` method... a method should always have self first, so that's not really necessary.
We can just await that, so yeah. So basically the WrapperSource enters the source; the WrapperSourceContext enters the source context. Very straightforward, just pass-through on everything here, and then the DatasetSource basically says...
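The pass-through pair just described could look roughly like this; the class names follow the discussion, while the fake inner source exists only so the sketch runs standalone, and the real DFFML async-context protocol has more to it than shown here.

```python
import asyncio

class WrapperSourceContext:
    """Complete pass-through: delegates to the wrapped source's context."""

    def __init__(self, parent):
        self.parent = parent

    async def __aenter__(self):
        # Enter the wrapped source's context and hold on to it
        self.sctx = await self.parent.source.__aenter__()
        return self

    async def __aexit__(self, exc_type, exc, tb):
        await self.parent.source.__aexit__(exc_type, exc, tb)

    async def records(self):
        async for record in self.sctx.records():
            yield record

class WrapperSource:
    """Wraps an already-instantiated source rather than a config."""

    def __init__(self, source):
        self.source = source

    def __call__(self):
        return WrapperSourceContext(self)

# A fake inner source so the sketch can run on its own:
class FakeSourceContext:
    def __init__(self, records):
        self._records = records

    async def records(self):
        for record in self._records:
            yield record

class FakeSource:
    def __init__(self, records):
        self._records = records

    async def __aenter__(self):
        return FakeSourceContext(self._records)

    async def __aexit__(self, exc_type, exc, tb):
        pass

async def main():
    async with WrapperSource(FakeSource(["a", "b"]))() as sctx:
        return [record async for record in sctx.records()]

print(asyncio.run(main()))
```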
Okay: I have this wrapper property, you know... or this wrapped property, right, which is this function that we've wrapped, and I have some kind of config, right. And so my config, at the point of `__aenter__`, is instantiated with all of its properties. So yeah, the config will be instantiated with all of its properties, so we can just say export, and that will return a dictionary, and we can expand...
We expand the dictionary with keyword argument expansion, and now we're calling this wrapped function, which is this guy, wrapped, and we'll do keyword argument expansion, which calls iris training, which does keyword argument expansion, which says, you know, url equals url and contents hash equals contents hash, which are already set from their default values. So they'll probably just, you know, get the same thing unless they're overwritten, and then we call the function. So we do this cached download. We can optimize this, or we can...
We can make this more user friendly later. So, cached download decorator: we decorate the func, and that way, when we call func... so we need to call func. So we call func, and that results in the cached download happening. So this downloads... this runs the download. We say the replace path is... okay, this file in the cache is the original; we resolve that path.
You know, that will reference the contents of this replace path, and it doesn't have that property set, so it should just end up with, you know, filename equals replace path, and yeah, and then that ends up here. So this gets yielded here as source; source equals source, super dot `__aenter__`. I think this is what we want. So let's find out... let's find out if this explodes spectacularly. So, run the tests... let's see... boom: test source, test dataset.
So now we have sort of more like what you wanted, right, which is just sort of like, you know, a lookup table, right? And so now... the other thing is that we can do what Saksham has done with the pytorch models and add a dynamic enumeration of possible plugins. So not only will you load from entry points, right... we can override the base source load classmethod, because right now the base source load classmethod, I think, is just a pass-through to the entrypoint load.
When we look in setup.py, that is all of this stuff, right? So that's these keys here. So the one in question is dffml.source, so we can...
That load classmethod... we're not going to do it now, but we can say, you know, source super dot load, and then, you know, there's handling here based on whether you did, you know, loading equals none, right, so, basically, whether you did one source or multiple sources, and we can say: okay, if you fail to load it, then look through...
We can enumerate all the sources that we created within this file dynamically, just like we did with the scikit classes, and sort of say, you know... we can just sort of dump the contents of this file here, this source dataset, or everything within dffml source dataset, and we can say: hey, you know, let me load...
Let me go grab any source classes that reside within dffml source dataset, and if the load fails from the entry point, see if we can load... you know, see if that exists in something that we've defined in that file, right? So that's something we can do later, and that way... that way, anything that's defined in here gets auto-exposed as a source. So this, yeah: import from dffml.util.testing... yeah, or yeah, AsyncTestCase. How are we doing on time?
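The fallback-load idea sketched above (try the entry point first, then enumerate the source classes defined in the module) might look like this; the base class, the two source classes, and the name-matching rule are all stand-ins for illustration.

```python
import inspect
import sys

class BaseSource:
    """Stand-in for DFFML's base source class."""

class IrisTrainingSource(BaseSource):
    pass

class IrisTestSource(BaseSource):
    pass

def load_source(name: str):
    """If the entry point lookup fails, enumerate the source classes
    defined in this module (like the scikit plugin enumeration) and
    match on the generated class name."""
    # ... the normal entry point lookup would run first ...
    wanted = name.replace("_", "").lower() + "source"
    module = sys.modules[__name__]
    for attr_name, attr in inspect.getmembers(module, inspect.isclass):
        if issubclass(attr, BaseSource) and attr is not BaseSource:
            if attr_name.lower() == wanted:
                return attr
    raise KeyError(name)

print(load_source("iris_training").__name__)
```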
Okay, cool: import AsyncTestCase.
Okay: config, BaseConfig, field... I thought we got this somewhere, but maybe not. Oh, did we get entrypoint? We didn't get the entrypoint, yeah, that's a good thing to also get; contextmanager, dataclasses. Okay, we don't need that. asyncio, don't need that. itertools, okay. We need inspect and contextlib.
All right, so the entrypoint decorator is the right-hand side of this... or the left-hand side. So this will be, you know... for example, once we do @entrypoint, that's setting the... that's setting the entry point. So remember that entrypoint within dffml util entrypoint, this entrypoint here. So this is, you know, dffml.source, and that is grabbed from... since this is... so the Source class subclasses from Entrypoint.
So the entry point label is getting set to dataset.iris in this case. Okay, where was I? And then type is not...
B
Callable,
let's
pray
this
dataset
source:
oh
yeah,
return.
This... great, that is also what it should be, and then... assert the class. Okay, let's see, I think we're now into assert class, okay. What do we want to know now? I guess let's just, like, try to instantiate it... iris, so, like, test... but let's see: test.
Okay, so what do we want to test here? We want to test, like, the wrapper... we're going to try to test this part here, this DatasetSource stuff. So I guess let's just enter the source: so async with... async with, yeah. Let's just... let's just start here.
Oh yeah, all of this was not async here... wait a minute. Yeah: await load, wrapped... wrapped, okay, with wrapped, okay. So wait: this is still "async generator can't be used in await expression". Load... let's just make sure that this instantiates the right thing first.
Okay, so url, yeah, okay. So it looks like kwargs is correct there, and there's no... or wait, okay, looks like args is getting... oh, it's self! Is it being passed self? Yeah, there we go, there we go. That should... there you go. So now we're not getting self; I always forget that. All right, so: "generator context manager has no attribute `__aenter__`"... so that source... oh, we need to return self.
Yield func... oh, that's what it is: with func as result, yield result. Yeah, obviously, okay. And then we need that call to... or, we need that decorator.
"This func... locals... func was never awaited." Okay, yeah: that cached download function only works on... this cached download function only works on...
Okay, where is that... my favorite urllib function? This is great. I love this thing; this thing is great. urlretrieve... I have posted on there; they were, like, considering removing stuff, and I was like: please do not remove this thing, this is great. Where'd it go... urlretrieve, yeah, check this out. Where's...
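For reference, the function being praised is `urllib.request.urlretrieve`, which downloads a URL straight to a local path in one call. The demo below uses a `file://` URL so it needs no network; real use would pass an http(s) URL.

```python
import pathlib
import tempfile
import urllib.request

# Prepare a local file to "download" so the example is self-contained.
workdir = pathlib.Path(tempfile.mkdtemp())
original = workdir / "iris.csv"
original.write_text("5.1,3.5,1.4,0.2,Iris-setosa\n")

# urlretrieve(url, filename) fetches the URL into filename and
# returns (filename, headers).
filename, headers = urllib.request.urlretrieve(
    original.as_uri(), workdir / "downloaded.csv"
)
print(pathlib.Path(filename).read_text())
```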
We don't need this replace path... all right, yeah, we do. Let's just do it like this. Let's just change as few things as possible, right?
So it looks... looks pretty correct to me, except for: "str object has no attribute filename". So the config... oh, it's just... yeah, okay, maybe we did; let's just set filename.
No... are you kidding me? I'm never this successful. This... this worked out really well; this went so smoothly. Hey, we got all our records, and I have to go to my next meeting. That was, like... I can't believe how well that went. Anyways, okay! Well, let's sync later; I'm gonna post this video. I can't... like, this went really smoothly. I'm very excited about this. Well.
Yeah, let me push this to a branch. All right, hey, that was fun; thanks for indulging me there. I said it'd take two minutes, but I guess it took an hour and a half... but hey, fun stuff. I hope I didn't bore you here. Oh no, no, no! All right, I will push this. Thanks, Yash, have a good one.