From YouTube: Migrate Apache Airflow Ops to Dagster Jobs
Description
Many teams want to migrate off legacy orchestrators and onto newer best-of-breed orchestration platforms, but the idea can be daunting.
Well, it's now easier to make the switch. Odette Harary shares details on the new dagster-airflow library and perspectives on how to successfully migrate off Airflow by converting Airflow Ops to Dagster Jobs.
You'll see how to take all of these DAGs and migrate them into Dagster. You don't need to set up any infrastructure, and you'll be able to run and test everything locally very simply. The example is available on GitHub, so you can follow along and use it as a template for migrating your own DAGs.
What I have here is a set of different folders. I'll start off by showing you a few of the different DAGs, beginning with this tutorial DAG.
This should look like standard Airflow code. What we'll be doing is taking this Airflow DAG and turning it into a Dagster job, very simply, so you won't need to change any of this code. Instead, we'll wrap Dagster around it, and you'll see that there are no code changes and no infrastructure setup needed to edit and run this locally, as well as on Dagster Cloud. Next, let's take a look at the tool that makes this happen.
So I have this dagster-migration folder over here, as well as this definitions file. We'll be using a library called dagster-airflow, which takes all of this Airflow DAG code and doesn't change it a bit; you don't need to write any Dagster code either. What it does is create the definitions from those DAGs in one or two commands.
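As a sketch of what that definitions file might contain (the `dags/` folder name is an assumption here, not taken from the repo), dagster-airflow can load a whole directory of untouched Airflow DAG files:

```python
# definitions.py -- hypothetical sketch of loading unchanged Airflow DAGs
# into Dagster with the dagster-airflow library.
import os

from dagster_airflow import make_dagster_definitions_from_airflow_dags_path

# Point dagster-airflow at the folder holding the untouched Airflow DAG files;
# each DAG in that folder is surfaced as a Dagster job.
defs = make_dagster_definitions_from_airflow_dags_path(
    os.path.join(os.path.dirname(__file__), "dags"),
)
```

The Airflow code itself stays exactly as it was; only this small entry-point file is new.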
Now that we've peeked behind the scenes at the different Airflow DAGs, as well as at dagster-airflow, the tool that's going to make this happen, let's open up the README file and see how to run this locally.
What I have here are two commands that we're going to run. The first installs dagster-airflow, all of its dependencies, and the dagster-migration package; the second is dagster dev, which launches the UI. So let's get started and execute these two commands.
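The two README commands likely look something like the following; the editable-install form is an assumption about how the example project is packaged:

```shell
# Install dagster, dagster-airflow, and the local migration package
pip install -e .

# Launch the local Dagster UI (serves on http://localhost:3000 by default)
dagster dev
```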
Whatever workflows you had up and running keep running, and now you can transition over to things like Dagster assets, which Tim will be demoing in the next video, gradually, with everything working and your core business operations untouched.
Let's run it. Right now we're running all of these Airflow DAGs locally; a couple of easy commands got us here, so you'll be able to try this out on your own Airflow DAGs. We do recommend that you start locally before going to Cloud, just to make sure everything's working. And that's it: everything has now been migrated to Dagster jobs, and we're executing them locally.
Now that we've seen how to run this locally on Dagster, let me show you how to run it on Dagster Cloud. This is our serverless option.
We also have hybrid, so you'll be able to use your own infrastructure, but our Dagster Cloud serverless option requires zero infrastructure setup on your end. What I'm going to do is run a couple of commands, and you'll see that everything is pushed to Dagster Cloud, where you'll be able to run all of the different Airflow jobs.
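The push to serverless can be sketched with the dagster-cloud CLI; the location name below is hypothetical, and the exact flags depend on your organization and deployment:

```shell
# Install the Dagster Cloud CLI
pip install dagster-cloud

# Build the project and push it to your serverless deployment
dagster-cloud serverless deploy --location-name dagster-migration
```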
Okay, so here I have my Dagster Cloud instance, and you'll see that we have different code locations, which will enable your teams to really collaborate. No longer do you need one person editing code on their own, unable to see what else is going on in the rest of the organization.
Here you can have all of your different code locations, which can correspond to the different projects being worked on, giving you strong collaboration across your data team as well as your analytics and machine learning teams. Here we also have our dagster-migration folder: you'll see all of the different Airflow DAGs that have been converted to jobs, and you'll be able to run them here as well.
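Code locations are typically declared in a `dagster_cloud.yaml` file at the repository root; the location and package names below are hypothetical, one per project:

```yaml
locations:
  - location_name: dagster-migration
    code_source:
      package_name: dagster_migration
  - location_name: analytics
    code_source:
      package_name: analytics
```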
The last thing I'm going to show you today is how to unit test your code. In Airflow, you might have been pushing changes to production with low to no confidence in how they would actually work, without much testing to make sure that everything was working fine with your data and that your production instance stayed intact.
If you have any side effects being created as part of your existing Airflow DAGs, you'll be able to test for them, and once you move into Dagster and its assets, you'll be able to run even more robust tests on what you're expecting from your data pipeline or machine learning pipeline. You'll really have the confidence of unit testing every little piece of your pipeline before moving to production, and when you do go to production, you'll know it has been tested and there should be no issues.
So here I have a bunch of code, and you'll see that we're testing to make sure everything was successful. We import the dagster-airflow package, which loads in all of our different DAGs as definitions, and what I'm going to do is run this to check that everything executes successfully.