From YouTube: Airflow Setup Walkthrough with Taylor and Magda
A: So a lot of the commands that are going to be run are going to be set based on this Makefile. So there's, yeah, there's init-airflow and there's actual airflow, and you can see this is where that git branch comes into play. So what should be happening, the first thing I would do is export my git branch.
A: No, it doesn't... I mean, well, actually, yeah, prior to that you should push it out, because Airflow, if you run a job, is going to look for it. It's going to clone the repo for that branch, based on the branch you set.
Okay, so I just push... push the empty branch, then do make init-airflow, and it should...
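The flow just described can be sketched as a few shell commands. The variable name GIT_BRANCH and the Makefile targets are assumptions inferred from the conversation, not verified against the actual Makefile:

```shell
# Sketch of the setup flow described above; exact variable and
# target names are assumptions based on the conversation.
export GIT_BRANCH="my-feature-branch"   # the branch Airflow jobs will clone
echo "using branch: $GIT_BRANCH"
# git push -u origin "$GIT_BRANCH"      # push first, even if the branch is empty
# make init-airflow                     # first-time initialization
# make airflow                          # start the local Airflow instance
```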
A: Well, no, just the first time. So, yeah, just the first time. So the clone of analytics is a shallow clone. It basically just creates a new database with the proper name, and we can look in the logs here. It's not actually a real clone, because that would take... we have Snowplow data on the analytics side, and it was taking a long time. And so, if you need a real clone of analytics, we have a job for that.
A: There's the, yeah, there's the clone analytics real one, and that actually does a full Snowflake clone. This one is just the shallow clone, which basically makes the name, like your branch name, underscore analytics. This job runs every time you push code, but all it's doing is just checking that the database exists, essentially. When you run the clone raw job, it only runs...
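A hypothetical illustration of the shallow-clone naming scheme just described, branch name plus "_ANALYTICS". The exact casing and separator handling are assumptions (Snowflake upper-cases unquoted identifiers):

```shell
# Hypothetical sketch of the shallow-clone database naming described above.
BRANCH="my-feature-branch"
CLONE_DB=$(echo "${BRANCH}_ANALYTICS" | tr 'a-z' 'A-Z' | tr '-' '_')
echo "$CLONE_DB"
# The shallow-clone job then effectively just ensures the database exists:
#   CREATE DATABASE IF NOT EXISTS MY_FEATURE_BRANCH_ANALYTICS;
# (an empty shell with no data copied), whereas the clone analytics real
# job would use Snowflake's full CLONE instead.
```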
A: So in the sidebar here we should see the testing branch analytics database, and that's the one that was triggered by the automatic job. Then it's in the process of creating the clone for raw, and it'll show up in the sidebar here, so you'd be able to come in and see... it would just be a match to raw. But, like I said, the analytics one is just a shallow clone. There's literally nothing in here, but that way dbt will run against this clone instance instead of the actual real instance.
A: Okay, so let's just pretend the raw job was done. I'm gonna do... let's just do sheetload, so we can go ahead and turn it on, and it should automatically trigger a job, just because once you turn the DAG on, Airflow, you know, starts trying to schedule everything. Yeah, but it'll take a minute. So, yeah.
A: So this is the main cluster, and then the Workloads tab is typically where the more interesting aspect is. Most everything runs in the default namespace. So we have the deployments: this is the nginx ingress controller, and then, well, the webserver is a service, but you can see this is the incremental gitlab.com extract running in default. Then this is an old pod that's running in the testing namespace. So when you test locally, it should all run in the testing namespace.
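The namespace layout just described can be inspected with kubectl. This is a hedged sketch assuming access to the cluster from the walkthrough; the commands are standard kubectl but the resource lists are inferred from the conversation:

```shell
# Hedged sketch: the two namespaces from the walkthrough, and the kubectl
# commands you would use to inspect them (cluster access is assumed).
NAMESPACES="default testing"
echo "$NAMESPACES"
# kubectl get deployments -n default   # e.g. the nginx ingress controller
# kubectl get pods -n default          # scheduled extract / dbt pods
# kubectl get pods -n testing          # pods from your local test runs
```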
A: It runs... so if you're actually changing something about any of these extracts, you'll have to push your code up to the remote to get the job to see it, but if you're changing the DAGs locally, it should just be based on the local changes. And we can test that: if I just delete this DAG and then come in here, it should... so that was the snowplow full refresh. Actually, I take that back. It'll still show up, because there are entries in the database.
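As noted above, deleting a DAG's .py file leaves its rows in the metadata database, so the DAG still shows up in the UI. Airflow 2.x has a CLI command for clearing those rows; the dag_id here is hypothetical, taken from the walkthrough:

```shell
# Hedged sketch: after removing a DAG file, its metadata entries persist
# until deleted separately; Airflow 2.x provides a CLI command for that.
DAG_ID="snowplow_full_refresh"             # hypothetical dag_id from the walkthrough
echo "would delete metadata for: $DAG_ID"
# airflow dags delete "$DAG_ID" --yes      # removes the DB records, not the file
```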