From YouTube: Coach Alice: Volume 1: Chapter 1: Part 38: Boto3 Source
I think... but we got the upload.
Okay, so, convert.
So we'll say: source, for storing.
I don't like it down here, but we're gonna do that. We're not supposed to do this, this import style thing! Okay, so it's against the general guidelines. Okay, so actually, let's see, so.
So, update records. Okay, so records: it does nothing.
We need to do what? On a record key, we need to make it a valid file name, so.
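A minimal sketch of the record-key-to-file-name idea; the stream doesn't show the actual code, so the helper name and the safe character set here are assumptions:

```python
import re


def record_key_to_filename(key: str) -> str:
    """Turn a record key (often a URL) into a valid file name by
    replacing anything outside a conservative safe set with '_'."""
    return re.sub(r"[^A-Za-z0-9._-]", "_", key)


safe_name = record_key_to_filename("https://example.com/a key")
```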
Wait a minute, wait a minute, wait a minute.
At the top level, quite honestly, I think, okay: dash p, overlays, alice, good eye, contribute.
Okay, so, instead of main, this becomes upload to bucket. It's a dffml dot off. Let's read in, actually, the df b. If, well, examples... should I, should I... Python, high-level operations, right. So this is the.
So we can just say: results is the dffml GroupBy output, and then this will be triggered anytime there are... so.
That output, it's also an output operation, so it's going to be run at the output stage. And now this is kind of the little funky part here, where we're going to simplify this, effectively. What we do here is we take this client creation and we go ahead and just move it to a little helper function, contextlib, so we use contextlib.asynccontextmanager. It takes the config, and then it can say, it can yield.
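The helper-function pattern described here can be sketched like this; `FakeBucketClient` is a stand-in for the real aioboto3 client (the real helper would open `aioboto3.Session().client("s3", ...)` instead), and the function name `connect_bucket_client` is the one used later in the stream:

```python
import asyncio
import contextlib


class FakeBucketClient:
    """Stand-in for the aioboto3 S3 client, so the sketch is runnable."""

    def __init__(self, endpoint_url):
        self.endpoint_url = endpoint_url
        self.closed = False

    async def close(self):
        self.closed = True


@contextlib.asynccontextmanager
async def connect_bucket_client(endpoint_url):
    # Create the client on entry and guarantee cleanup on exit,
    # so callers just write: async with connect_bucket_client(...) as c:
    client = FakeBucketClient(endpoint_url)
    try:
        yield client
    finally:
        await client.close()


async def demo():
    async with connect_bucket_client("http://127.0.0.1:9000") as client:
        pass  # uploads would happen here
    return client


client = asyncio.run(demo())
```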
So, okay, so. And we work in typed pipelines a lot, so this is where we're trying to make this as native Python as possible. So basically there's no difference between doing this with the orchestration or without the orchestration.
Right, because we want to focus on grabbing stuff that already exists, right, so we can make the orchestration as lightweight as possible, because we don't really... we don't want to be a framework, right. I said framework earlier because we have, like, we have within, that's our internal API, effectively. We want the external API to be basically like nothingness, just regular Python code, right. So, is dffml used in here anymore? Okay, GroupBy output is used.
Does that run without the orchestration? So, upload to bucket config, upload to bucket config stage. Okay? So basically, if we instantiate that upload to bucket config object and pass it to the upload to bucket function, the upload to bucket function would take that and it would be happy. It would be a happy camper. So, let's see. So, and that means we can throw that method directly on there, all right. So let's try this, okay. So let's play with this.
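A sketch of the config-object idea: a plain config instance passed to a plain function, so it runs the same with or without the orchestrator. The field and function names here are guesses for illustration, not the project's real schema:

```python
import dataclasses


@dataclasses.dataclass
class UploadToBucketConfig:
    # Field names are illustrative guesses, not the actual config schema
    bucket: str
    endpoint_url: str


def upload_to_bucket(config: UploadToBucketConfig, key: str, body: bytes) -> str:
    """Stands in for the real upload; only the config plumbing is shown.
    Returns the object URL the upload would target."""
    return f"{config.endpoint_url}/{config.bucket}/{key}"


# Works the same whether called by the orchestrator or as plain Python:
url = upload_to_bucket(
    UploadToBucketConfig(bucket="alice", endpoint_url="http://127.0.0.1:9000"),
    "records.json",
    b"{}",
)
```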
All right, so far so good. We're playing with the API, right. So, stage output: so we're going to get rid of that.
Rolling stages comes in here, right, and so this is where we talked about, hey, you know, I want to basically be n, n plus one, right. That's output, that's cleanup, right. So n plus two is cleanup, n plus one is output, if you're in processing at n, right. And so... but you're just rolling, and you keep going, right. So it's like, you know, the next... I think the next scheduled conscious...
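The n / n+1 / n+2 rolling scheme could be sketched like this; the function and stage names are illustrative, not the project's actual API:

```python
def stage_at(n: int, processing_tick: int) -> str:
    """Which rolling stage runs at tick n, if processing runs at
    processing_tick: output at n + 1, cleanup at n + 2."""
    return {0: "processing", 1: "output", 2: "cleanup"}.get(
        n - processing_tick, "idle"
    )
```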
Basically you have subconscious and conscious states, right, and so you can jump, but... yeah, there's different triggers for going in and out of the subconscious and conscious states, right. And so basically what we can do is we can say: a subconscious state is effectively something that will run within the next tick.
Tick, tock, right, for that relative time slice, right. Within the near field to the state-of-the-art train of thought, or the state of consciousness that you're in, right. Because, remember, time spent in field determines the speed of time in that field. So, basically...
Yeah, when you do this rolling stage thing, it basically says, hey, consider me, consider me for execution next, you know. And that border is where we can do things like collect all the outputs for the list and stuff, because it's sort of like a defined transition between... it's a context switch. It's a context switch! Okay, that's a context switch, okay, yeah. So when the stage rolls over, that's really a context switch.
Stage, okay, so what did we decide about the stage thing? I just... I don't like this. I don't like this, the way it looks right now. It's not right. There's something that's not right about it. So the aiohttp ClientSession needs to be entered already. Okay, so... oh.
Okay, this is good. Okay, so we're seeing something here, we're seeing something interesting, which is why this is getting awkward. So what we're seeing is that the imprinter is actually something that should be defined, yeah, as an input to this thing, so this upload to bucket upload, okay. This is why I keep doing this, and I'm not supposed to be doing this. Okay, okay, so I keep forgetting. It's just because we have to think differently now, right, and it's like it's all confusing. It's not like you used to think with programming. So, okay, so this connect bucket client.
Okay, so resource is not what we want.
Perfect example: so we're creating a new type. This is what the underlying type is. The library doesn't tell us, but we had to look it up. There are no type stubs. Maybe there are, I don't know. And so there is this variable here, there is this type name here, and so now we can say connect bucket client, and it should produce us this object here, right. So then we can say: aioboto3 client.
And see, I don't like this, because the type information is in two places now with this. So I don't think this numpy-style docstring is a good way to go long term. I think that there needs to be some sort of evolution of this without the type information, because the type information is now stored within that, within the... so we'll want something lighter weight, I think, long term. Okay, okay.
Just disabling the auto-format.
And we'll instrument everything else automatically by doing all these scans. So we're going to run all these scans, and then we're going to map the models. We're going to grab, build intermediate representations of all the programs, we're going to map them to the models, we're going to map them to the threat models, and then we're going to build this machine-human mapping of intent, and we're going to apply it to different code based on this cross-domain conceptual mapping, and we're going to apply it to all this other stuff.
It's going to be great. So this is where we start with connect bucket clients. So, okay, so we started a long time ago, so: async with connect bucket client, all right. So let's throw in some of these.
Okay, so, for... okay, so async def, look, whatever! So what am I doing right now? I'm automating the spin-up of the S3 bucket, and then we're going to shoot the other thing at it. So how do we create a user? That's what we need to know: if we can create a user first. Deployment recommendations, blah blah.
Okay, so I think we're good. I think we're good. I think we're just going to blow up that AWS access key ID and AWS secret access key, and I bet it'll just take the data.
Okay, so I think we should probably pass them in, yeah. We should be... we shouldn't be passing them in as environment variables. Actually, that's a really good point: that would be bad. So let's pass them in as files, because you can see them, there's the potential to see them. So with files we have Unix access control.
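A sketch of the files-instead-of-environment-variables approach; the helper name is made up, and only standard-library calls are used. The point is that the secret never appears in the environment or argv, and file permissions decide who can read it:

```python
import pathlib
import tempfile


def read_secret_file(path) -> str:
    """Read a credential from a file; expanduser() means both
    '~/...' and absolute paths work unchanged."""
    return pathlib.Path(path).expanduser().read_text().strip()


# Demo: write a password file with owner-only permissions, then read it back.
with tempfile.TemporaryDirectory() as d:
    secret_path = pathlib.Path(d) / "password"
    secret_path.write_text("hunter2\n")
    secret_path.chmod(0o600)  # Unix access control instead of env vars
    password = read_secret_file(secret_path)
```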
So where are our... so, user file, password file? Okay, pull secrets, okay. So, and also support absolute paths?
So, oh god, I love these people. You all rock. This is just fantastic. Okay, that saves so much time. Okay, that should be it. That's a solid best practice right there; we should all do that. Okay, all right! So: if event in subprocess stderr readline, and result starts with API, split it, grab the URL. All right, so endpoint URL, that's going to produce endpoint URL there. So this is going to be, okay, so we want a three.
This is the bucket library, so: bucket. No, we'll call it aioboto3.
It actually worked out. Oh, maybe that's why I did that. Oh yeah, sometimes I do that, I think, subconsciously. Okay, so: region name, endpoint URL, key ID, access key. All right, there are all our types. So this is the type that this thing needs to return; that's what that means. Right, and now we have our mystery.
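Collecting those parameters into client keyword arguments might look like this; the keyword names on the client side are the standard botocore ones, while the helper itself is hypothetical:

```python
def bucket_client_kwargs(region_name, endpoint_url, key_id, access_key) -> dict:
    """Map the four connection parameters named in the stream onto the
    keyword arguments a boto3/aioboto3 S3 client expects."""
    return {
        "region_name": region_name,
        "endpoint_url": endpoint_url,
        "aws_access_key_id": key_id,
        "aws_secret_access_key": access_key,
    }


kwargs = bucket_client_kwargs(
    "us-east-1", "http://127.0.0.1:9000", "minioadmin", "minioadmin"
)
# Usage (requires aioboto3 installed):
#   async with aioboto3.Session().client("s3", **kwargs) as client: ...
```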
And then, yeah, just ignore it on the tuple. Accept a list as a list; do the auto conversion on the list, because people will usually mean to return lists. They don't, sort of, mean to return tuples. They just do, because it's what you do in Python, and so then that type annotation will end up reflecting that.
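The tuple-to-list auto-conversion could be as simple as the following sketch (the function name is illustrative):

```python
def normalize_output(value):
    """Auto-convert tuples to lists, since callers usually mean to
    return a list; leave every other value untouched."""
    if isinstance(value, tuple):
        return list(value)
    return value
```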
Well, I think the reason why I was saying that you wouldn't want to do that is because you want to allocate the memory yourself, probably, and then pass it to the thing that initializes, given the memory region. Okay, so, and the bounds check, okay, so the size of the memory region. Okay. But that's inside the... that's beside the point. Okay, so.
So, and there was a weird bug with the output operation, so I'm not sure what's going on there. So hopefully... when they had the Optional on it, but this doesn't say Optional. I think it was just when the annotation is on there, but not when it necessarily has a default. It's what, the combination thereof? Okay, so.
This thing is going to take that from this tuple, okay. So, and why did we care about it being a tuple? Well, because if it's a tuple, okay, so. And this is the thing with the type, the... I don't know what you would call that. You're just mixing and matching, you know, you're just finding all the possible combinations and permutations, like...
Is it recursive? Probably, I don't know. You're traversing all the trees of possible permutations of the type, yeah, you're traversing all the possible trees. You build the set of all the possible permutations of all the possible trees of types, and then you prune based on which ones are valid inputs. But I don't have time to do that right now. So this is why I'm leaning towards: interpret the tuple, and that'll just be shorthand within the Python syntax.
Is that what I said, right? I think it is, but I'm not... and now it's gone. So, and this is just... we just want to get something soon, and that's just not going to happen soon.
Okay, so now, and an inclusion of this operation within an overlay, with the current overlay mechanisms...
Works. Token.
We yield, and it kills it on exit for us. So at least run command events, I believe, should kill it gracefully. So, I mean, you never know with Docker. You never know with Docker, but run command events should kill it gracefully, or else maybe it'll just stay around, but we'll find out. So I hope it kills it gracefully. So, TODO: audit. We'll just put security on it.
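What we hope the run-command helper does on exit can be sketched with plain asyncio: terminate first, then force-kill if the process (for example a Docker container) doesn't go quietly. This is a sketch of the general pattern under those assumptions, not dffml's actual implementation:

```python
import asyncio
import contextlib


@contextlib.asynccontextmanager
async def run_server(*argv):
    """Start a subprocess, yield it, and clean it up on exit:
    graceful terminate first, force kill as a fallback."""
    proc = await asyncio.create_subprocess_exec(*argv)
    try:
        yield proc
    finally:
        if proc.returncode is None:
            proc.terminate()  # polite: SIGTERM
            try:
                await asyncio.wait_for(proc.wait(), timeout=10)
            except asyncio.TimeoutError:
                proc.kill()  # impolite: SIGKILL
                await proc.wait()


async def demo():
    async with run_server("sleep", "60") as proc:
        pass  # the server would be used here
    return proc.returncode


rc = asyncio.run(demo())
```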
A lot of use... it's just gonna be annoying. That's what it's gonna be. Okay, so, on a use of...
Okay, and also, TODO: yeah, you know, audit what happens. Does this kill the container successfully, aka clean it up?