From YouTube: DataHub 201: Introduction to DataHub Actions Framework
Description
Join Hyejin Yoon from the DevRel team at Acryl Data as she introduces the DataHub Actions framework. In this video, Hyejin explains what the framework is, how it works, and why you should use it. She also provides a real-life use case and demonstrates how to create actions using the framework.
Watch this video to learn how to access and manipulate data in DataHub (code included), and learn about the various types of actions you can perform, including notifications, propagations, and more.
Learn more about DataHub: https://datahubproject.io
Join us on Slack: http://slack.datahubproject.io
Follow us on Linkedin: https://www.linkedin.com/company/acryl-data
Okay, so what is the DataHub Actions framework? The DataHub Actions framework basically takes actions on changes happening on DataHub. So what changes, and what actions? Let me briefly explain.
By changes, we mean the events happening on DataHub, and by actions, we mean notifications, propagations, and much more. I'm going to go through this a little deeper throughout this session.
Okay. So what does it do? Before we dive into that, let's start with this question: how do we get data into DataHub? To get data from data sources like BigQuery or Snowflake into DataHub, you use our metadata ingestion framework. This could be a simple command via our CLI, or some lower-level manipulation using our APIs and SDKs.
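As a sketch of what that simple CLI path looks like, an ingestion recipe is a small YAML file naming a source and a sink (the project and server values below are illustrative placeholders, not from the video):

```yaml
# recipe.yaml -- sketch of a metadata ingestion recipe
source:
  type: "bigquery"
  config:
    project_id: "my-gcp-project"        # illustrative value
sink:
  type: "datahub-rest"
  config:
    server: "http://localhost:8080"     # your DataHub GMS endpoint
```

You would then run it with a command like "datahub ingest -c recipe.yaml".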
On the other side, you might want to get data out of DataHub. There could be a lot of events happening on DataHub: entities get created, tags get attached or removed, entities get deleted, etc. You might want to subscribe to these kinds of events and get notified, or you might want to sync those changes across systems, or you might want to propagate aspects on entities across the whole lineage. This is where the DataHub Actions framework comes into play: when you need to get data out of DataHub and take actions based on it.
So what kinds of changes do we support currently? We support the Entity Change Event and the Metadata Change Log, and for event sources, we currently support Kafka as the only event source.
Basically, it'll be executed like a normal ingestion using the CLI, except you have to install the acryl-datahub-actions module, use "datahub actions" instead of "datahub ingest", and configure an action config file. It looks something like this. This is a very simplified version of the action config file: you define the name of the action pipeline here, and define the source, which is Kafka. You could also set some filters, and lastly, you have to configure the type of the action, which basically tells it what actions to take on events.
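A minimal sketch of such a config file, assuming a local Kafka broker and schema registry (the connection values and the filter shown are illustrative, not from the video):

```yaml
# hello_world.yaml -- simplified action config sketch
name: "hello_world_pipeline"            # name of the action pipeline
source:
  type: "kafka"                          # currently the only event source
  config:
    connection:
      bootstrap: "localhost:9092"                # illustrative value
      schema_registry_url: "http://localhost:8081"  # illustrative value
filter:                                  # optional: only pass matching events
  event_type: "EntityChangeEvent_v1"
action:
  type: "hello_world"                    # what action to take on each event
```

After "pip install acryl-datahub-actions", this would be run with "datahub actions -c hello_world.yaml".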
If you configure hello_world like this, it'll simply print all the events it receives as JSON, and if you configure slack as the type of the action, it will send notifications to a Slack channel you've already configured. And here's the awesome thing: we support propagation from version 0.0.13, so it'll let you propagate tags, terms, and more. I'll give a quick demo on this later.
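A sketch of what the Slack variant might look like; the config field names below are assumptions from memory and may differ by version, so check the Actions documentation for your release:

```yaml
# slack.yaml -- sketch of a Slack notification action
name: "slack_notifications"
source:
  type: "kafka"
  config:
    connection:
      bootstrap: "localhost:9092"        # illustrative value
action:
  type: "slack"
  config:
    bot_token: "${SLACK_BOT_TOKEN}"      # assumed field name
    default_channel: "C0123456789"       # assumed field name
```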
Also, customizing actions is possible, which gives you a lot of flexibility. Within the framework, you can basically make your own action by following these steps. First, you define an action by extending the Action base class and implementing and overriding its functions. It'll basically have three functions, create, act, and close, where the act function contains the main logic of the action. This is a very simplified version of a customized action, where it just prints out all the events as it receives them.
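The three-function shape described above can be sketched as follows. In a real deployment you would extend the framework's own base class (imported from the datahub_actions package) and act() would receive the framework's event envelope; here a minimal stand-in base class is defined so the sketch runs without DataHub installed, and the event is a plain dict:

```python
import json

# Stand-in for the framework's Action base class, so this sketch is
# self-contained. In real code you would import the base class from the
# installed acryl-datahub-actions package instead.
class Action:
    @classmethod
    def create(cls, config_dict, ctx):
        raise NotImplementedError

    def act(self, event):
        raise NotImplementedError

    def close(self):
        raise NotImplementedError

class HelloWorldAction(Action):
    """Prints every event it receives as JSON, as in the video."""

    @classmethod
    def create(cls, config_dict, ctx):
        # create() is the factory the framework calls with the parsed
        # config and a pipeline context.
        return cls()

    def act(self, event):
        # act() contains the main logic of the action.
        print(json.dumps(event, indent=2))

    def close(self):
        # close() is called on pipeline shutdown; nothing to clean up here.
        pass

action = HelloWorldAction.create(config_dict={}, ctx=None)
action.act({"entityUrn": "urn:li:dataset:example", "category": "TAG"})
action.close()
```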
Lastly, you can run the customized action like this: you declare the custom package name, the custom file name, and the custom class name.
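In config terms, that means the action type points at your own code instead of a built-in. The package, module, and class names below are hypothetical, and the exact separator syntax may vary by version:

```yaml
# custom_action.yaml -- running a custom action (names are hypothetical)
name: "custom_action_pipeline"
source:
  type: "kafka"
  config:
    connection:
      bootstrap: "localhost:9092"        # illustrative value
action:
  # custom package name, custom file (module) name, custom class name
  type: "my_actions_pkg.custom_action:HelloWorldAction"
```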
So, as we've covered the different types of actions, I'll show you a quick demo of the tag propagation action. Before we start, what is tag propagation? Let's say you attached a tag on dataset A, and dataset A has a lot of downstream assets; for example, it has a thousand downstream datasets. It would be completely inefficient to add the tag on all the downstream entities manually. That's where tag propagation comes in.
The action pipeline will listen for the tag-added event, use the lineage to find the downstreams, and attach the tag to them automatically. I'm going to show you a live demo of this. As you can see, I've already deployed a DataHub instance locally, and I've already configured the action config file here. As you can see, I've named the pipeline tag_propagation, defined the source, and configured the type of the action as tag propagation. This is a very simplified version of the tag propagation config.
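A sketch of the config described in the demo; the action type string and connection values are assumptions based on the narration, so verify them against the Actions documentation for your version:

```yaml
# tag_propagation.yaml -- sketch of the demo's pipeline config
name: "tag_propagation"                  # pipeline name from the demo
source:
  type: "kafka"
  config:
    connection:
      bootstrap: "localhost:9092"                # illustrative value
      schema_registry_url: "http://localhost:8081"  # illustrative value
action:
  type: "tag_propagation"                # assumed built-in action type
```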
So you can run this action by typing "datahub actions", dash c, and the name of the file, and basically it will say the action pipeline with name blah blah blah is now running.
Okay, so let's see how it works. As you can see... actually, let's remove this first. Remember, there's no tag object attached to these datasets. Now, let's say I want to attach a tag here on this upstream dataset, and let's see what happens on these downstream datasets.
So what it does is: it detects the tag-attached event, detects the downstreams of the dataset, and attaches the tag to those downstream datasets, like this.