Description
Justin W. and Radovan Bacovic discussed restructuring the data delivered by the automated SQL instance service ping, and the development and testing process for Python and DBT changes.
A: Here, in another great session, we try to do a kind of knowledge transfer regarding the automated service ping, and it's worth doing now, because it will be the only option very, very soon. Here is our issue. What we have, and what I think makes sense, Justin, is to try to separate this into two steps. Step one is actually what we should cover today.
A: It's perfectly clear, if you agree with that; I mean, I like your idea, because this is definitely a must-have. The second step is that we should align and discuss with Israel and Miles, but I think it's also a good part. What you said is kind of low-to-medium impact, but we should still consider a couple of options. So let's do step one; meanwhile, we can reassess the requirements for step number two. Okay, so you know the drill.
A: This is the issue, and connected to the issue — you'll see it in a second — we have a merge request based on this code as a previous step. What I promised you is that I incorporated the code, which is much cleaner and easier to test (or possible to test at all) compared to the previous version. I incorporated everything in the current MR, ran the tests locally and in Airflow, proved everything is correct and actually produces the same output, then ran the tests again. Everything passed fine, so this is your baseline to start moving forward from.
A: It's a different story, and that's intentional. What's going on is that I have a different project related to service ping, but it's a very specific use case: we want to deliver a bundle of our code to another team, and they will spin it off themselves. They don't need the full repo, it's a different technology, and it's not connected except at this point here. Later on we will connect everything together, but this is kind of a first iteration.
A: So that's the reason why we have a separate repo. Also, something that I think is not, I would say, organic or very easy to recognize here — and it's what you pointed out — is: if you go to the issue...
A: Okay, you're confused about why this MR is connected to this issue. It's because in this MR I mentioned the issue; if you somehow mention an MR in an issue (or the other way around), they will be connected automatically, without any intention on my part. So I just mentioned something in a comment, and the GitLab product hooked it up and said: aha, okay, it's connected to this issue. That's how you end up with related merge requests here, even though this one is not connected with this issue except that I mentioned it in a comment. But that's the way things work, so be careful about that. I'm always confused by this.
A: Yeah, this is good just as a warning for future issues, because I always fall into the same trap: if you're not careful, you just click through and wonder what's going on, and it's a completely different story and a different timeline. Okay, as I said, I incorporated the code from another MR. It's already tested, checked locally, run through Airflow, and I would say we are good to go from this point. So that's that.
A: We transform them into Snowflake syntax, add a few more details, execute each of these SQL statements on Snowflake, and get the results. Instead of "SELECT something" we will have 5 or 10 or whatever we actually calculate, so it's kind of a hybrid approach: we don't natively just insert the data. Instead, we take the statements, execute them, collect the results, and then push that JSON into the raw saas_usage_ping schema.
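A minimal sketch of that execute-and-collect loop, with hypothetical names (calculate_metrics, the metrics dict) standing in for the real code, could look like this:

    import json

    # Hypothetical sketch of the hybrid approach described above: execute
    # each metric's Snowflake-flavored SQL and collect the results as JSON.
    def calculate_metrics(connection, metrics: dict) -> str:
        results = {}
        for name, sql in metrics.items():
            cursor = connection.cursor()
            try:
                cursor.execute(sql)          # run the query on Snowflake
                row = cursor.fetchone()
                results[name] = row[0] if row else None  # keep the value
            finally:
                cursor.close()
        return json.dumps(results)  # pushed to the raw saas_usage_ping schema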
A: Later on down the road, we transform that, and then Miles will take care of it. The focus for you and me now — and mainly for you, once you're equipped with everything you need — is to change the structure of the JSON file. You already got that point, which is really cool, and I fully agree with your idea. This step one is super simple, as you said: instead of flattening, you will keep the same structure and replace each SELECT with its result, so you're actually transporting everything there with the numbers.
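As a rough illustration of that step-one restructuring — field handling is hypothetical, assuming the payload nests metric definitions — the idea is to walk the nested JSON and swap each SQL string for its computed value:

    # Hypothetical sketch: keep the nested structure of the usage-ping JSON,
    # but replace every SQL string leaf with the value it computes.
    def resolve_queries(node, run_query):
        if isinstance(node, dict):
            return {key: resolve_queries(value, run_query)
                    for key, value in node.items()}
        if isinstance(node, str) and node.lstrip().upper().startswith("SELECT"):
            return run_query(node)  # swap the SELECT statement for its number
        return node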
A: I think that part is a typical programming task, nothing special. I'm here to help, but I got the feeling you're very confident about it, which is great. What I want to show you now is how you should test something, how you should run it in Airflow locally, and how you should create what you need. Let's start from scratch. Okay, fantastic.
A: I will just repeat what I did in the morning, and I'm also pretty confident everything should work. So let's start. Once you create the merge request, change your code, and have an initial commit from your source system — I'm using git, of course, through Sourcetree, since visually it's easier for me — you push some changes on the initial commit here.
A: Can you see my screen now? Now it's much easier to track — sorry for that. As I said, once you push the initial commit, you will get a set of pipelines here, divided by their nature. For this kind of task we need the Python pipeline, and we also need some Snowflake stuff here. I already did some homework, because these run automatically.
A: For that purpose, I said: I know I need this raw saas_usage_ping schema when I want to run Airflow locally, because the data will end up there. To mimic the production setup, I need to run the clone-raw-specific-schema job with the right parameter. You have everything in the handbook — each job is nicely described there — but the to-do exercise for you is saas_usage_ping. I will not run this now, because the schema is already created, so nothing would really happen; actually it would fail, but it's already there for you.
A: Actually, there is one ping I already calculated, to be sure everything passes fine with the new code, and I got the result. You can run this n times per day, whatever you want — just alter it again and again, or add more records. So that is the first step if you want to run or test something locally; that part is fine. We will get to this piece of code later. Then, I'm on my branch here.
A: This command will build Airflow locally for me and use the code from my branch. Everything behind the scenes is already set up for me automatically, so no worries about this — you can't screw up anything in production. It will refer to your schema and to your branch, and because what we run here is the local Airflow, if I go to localhost:8080 I will get the local version of my Airflow. By default, all DAGs are set to off.
A: You never know, okay — I will turn it on, and of course the typical trick here is to clear the state, and then it will run everything. I don't need this namespace task for the moment, so I would say: mark success. This thread is as fast as lightning — it's already done with the download; the rest will follow and be uploaded there — and this SaaS usage ping metrics task took 20-25 minutes. That is why I made the note here.
A: If I hit any error, I will just push empty brackets to the server in this table, so it will be just square brackets with nothing inside, like an empty JSON. I also put in requests' raise_for_status, so it will be handled automatically. Why is this important? Because last week the token was expired — or someone changed the token — and everything went boom, and we were only informed after the data landed here, which is bad. I had failed to recognize that this part was not handled properly; for the last year everything was perfect, always on the happy flow.
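A minimal sketch of that error handling, assuming the requests library and a hypothetical endpoint and function name, might look like:

    import requests

    # Hypothetical sketch of the fail-fast handling described above: an
    # expired or changed token now raises immediately instead of being
    # discovered only after bad data lands in the table.
    def fetch_payload(url: str, token: str) -> str:
        response = requests.get(url, headers={"Authorization": f"Bearer {token}"})
        response.raise_for_status()   # 401/403/5xx fails the task right here
        return response.text or "[]"  # fall back to empty JSON brackets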
A: In the happy flow it's all good, but here you can see this pipeline will fail. This is the logging for it: it goes one by one — get the SQL, transform it, execute it, gather the results — and after 20 minutes my pipeline will push one more record here, ready to go. So that's the story of how you run your Airflow locally and, in particular, how you run the instance SQL metrics DAG or task. Also, for this task, or for this MR, it would maybe be good to change...
A: This project was started a long time ago. I inherited it, and we already have an issue to do a kind of naming-convention cleanup, but I think now it's safe — it's just a couple of minutes. Don't do it heavily; just spend a reasonable amount of time, as I said, to also, let's say, improve the code.
A: So, generally speaking, we have a couple of kinds of tasks. One is changing Python code or something regarding Airflow. The second kind can be off-the-shelf tools like Monte Carlo, where you just do some settings. And the third one will be DBT. For DBT-style testing and the test environment we should use a different approach; we can cover it today if you have time — we need just a few minutes. I don't know: did you have a chance to run DBT here in GitLab, and do you know how to do that?
A: More than happy to do that, because, as I said, one day you will just have something to change in DBT and need to run the models. I have a good example now, so we can do it; then you'll know the entire flow. Long story short, this was the very brief introduction to getting a local test Airflow up and running. Also, for Airflow, once you change your code you should push it to the remote repo. Why? Because this task will get everything from your branch and spin up its Kubernetes instance.
A: That's one story about how you should approach the testing strategy and check your execution; that's the first part, regarding Airflow. Before we move on to DBT, I want to show you two more pieces of code, or functionality, that we should cover — and it's good to know. I always insist on having proper test cases, especially since you are able to run them locally as well as on the remote GitLab MR. You will see they run automatically there, and they can pass or even fail; but locally you can ensure the quality of your code and also run the test cases.
A: This file is called usage ping, and it takes care of these tasks — you see there are Redis, SQL, and namespace; forget about namespace for now. I need to change something there, probably by the end of this month, and I will include you; just to give you the brief overview, we have instance SQL metrics and instance Redis metrics — merging those together is step two. For now, focus on instance SQL metrics inside the usage ping structure.
A: Let's go back to the code, in the SaaS instance ping. As I said, it covers namespace, Redis, and SQL — the instance ping is actually the SQL one. Stick with the naming notation if you can. I don't know what tool you prefer, but in PyCharm it's fairly easy to do this. The routine is very, very simple, I would say, and it does the thing for us: it actually executes each query. One step back, there is another file, and it's called transform postgres to snowflake. This is actually...
A: I just wrapped the code up into this kind of better shape, because before it was one very long unit with no possibility to test it, check it, extend it, or reuse it, whatever. So this is working fine — I proved that. By the way, this transform will reshape it from Postgres to Snowflake logic. It's in there; instead of the JSON flatten you should probably use nothing, and it will do the thing for you. Just a second.
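A toy sketch of what such a Postgres-to-Snowflake transform can look like once it's wrapped in small testable functions — the rewrite rule shown (the regex match operator) is purely illustrative, not the real mapping:

    import re

    # Illustrative only: one small, testable rewrite rule per function,
    # instead of one long untestable unit.
    def rewrite_regex_match(sql: str) -> str:
        # Postgres `column ~ 'pattern'` becomes Snowflake REGEXP syntax.
        return re.sub(r"(\w+)\s*~\s*('[^']*')", r"\1 REGEXP \2", sql)

    def transform_postgres_to_snowflake(sql: str) -> str:
        for rule in (rewrite_regex_match,):  # add more rules as needed
            sql = rule(sql)
        return sql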
A: As I said, you have everything you need; probably you should just readjust the code a bit and run it. Then the next step, I think, is super important for us: I always insist on test cases, and I introduced, I think, most of the ones for this unit. Everything is here.
A: Everything runs automatically, so pytest will execute all test cases in the entire analytics repo, plus the complexity checks, Black, and Pylint. This one failed because of some other units, but for now I usually stick to my own changes. In a perfect world this would be a showstopper: if it doesn't pass, you can't merge your merge request. But for now we're still trying to clean up our code base.
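A minimal pytest-style case for the hypothetical transformer sketched above (the import path is an assumption), just to show the shape of such tests:

    # Hypothetical pytest case for the illustrative transformer sketched earlier.
    from transform_postgres_to_snowflake import transform_postgres_to_snowflake

    def test_regex_operator_is_rewritten():
        postgres_sql = "SELECT 1 FROM users WHERE name ~ '^gl'"
        snowflake_sql = transform_postgres_to_snowflake(postgres_sql)
        assert "~" not in snowflake_sql
        assert "REGEXP" in snowflake_sql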
A: So yeah, we allow it — let's say we're generous about that — but in the future I think we should make it more strict. As I said, usually when I change a piece of code I do a full analysis on that piece of code, and usually it's okay, so you can rely on this too. Feel free to add more test cases, feel free to change them or tear them apart, whatever you feel is necessary in order to get things done. And that's the story about that. So, generally speaking: from Airflow we spin up our Kubernetes pod.
A: We put the code there and execute it in a scripted manner using Fire, and then the data, when it's done, is uploaded to Snowflake. Long story short, the same goes for Redis, for instance SQL, and for namespace.
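Fire here is Google's python-fire library, which turns a class or function into a command-line entry point; a minimal sketch (class and method names hypothetical) looks like:

    import fire  # google/python-fire turns Python objects into CLIs

    # Hypothetical sketch of a Fire-style entry point for such a task.
    class UsagePing:
        def saas_instance_sql_metrics(self):
            """Run the instance SQL metrics extraction end to end."""
            print("executing metrics queries...")

    if __name__ == "__main__":
        fire.Fire(UsagePing)
        # e.g. python usage_ping.py saas_instance_sql_metrics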
A: Now we should focus on this part only: instance SQL metrics. What I put in the MR is the main requirement, the must-do; but also, if you have the time and the will to do it, just make this a little bit more beautiful and align it with the naming convention. We need a fresh pair of eyes to take a look
at what is not in good shape, and improve it. As I said, most of these things are crying out for improvement, because this was initially created by other people. I try to make it better, but of course technical debt is there, so maybe get rid of that, unify the code, and also take my observations here into account to ensure code quality. Feel free to just run the script inside your project, and focus on your SaaS usage ping — so, isolate it.
B: So the first step was the CI/CD, and you already ran this part. But generally, when you interact with Snowflake, you need to run a clone of the raw specific schema, yes?
A: Yes.
B: Okay, makes sense, yeah. And you already ran that, so I'm not going to do it; we already have it in Snowflake, we can see it. The next step is to test Airflow locally: we simply just run Airflow, and this allows us to test it on localhost:8080.
B: We find the DAG that you described, saas_usage_ping, we turn it on, we run it, we wait 20 minutes, and then we can see the result and so on.
B: Okay. And then we also want to be running pytest locally as we develop, just to make sure we're passing everything at some point. You also mentioned the different modules that need to be edited. So now we're looking at the final part — just some extra things I can do if I have time — and one of the things you mentioned is linting. I'm just wondering:
B: How is that done — is it automatic when I run the make Airflow command? When does that happen when I want to test it locally?
A: Take your time and don't rush; the point is to deeply understand what's going on, so this is not an extreme emergency. My other intention, besides just sorting out the problem, was for you to get familiar with the project. The make python code quality target will run everything on the entire project, but, as I said, some things are not in perfect shape, so I tried to scope it down locally. In the Makefile you can see all the options we have, and we can run them one by one.
A: You can run them in a bunch, whatever you want, but I just extracted here — oh sorry — I just extracted the checks for the folder we have, and I'm also ignoring what is not super important for us, or where there are some, I don't know, firefighting problems we need to get rid of in the future. But at this stage, after a first iteration of linting, this is more than sufficient, I would say.
A: I don't think so — yeah, you rephrased it in a very good way, and I'm super happy if I can, yeah, broadcast this to you and cover everything. Also, as I said, probably after one or two days you'll want to rewatch this session — ignore my son in the background. You can definitely do it, and of course I'm here, so we can do pair programming if it's necessary from your side, or if you're stuck somewhere, or just, okay.
A: There is no special wisdom there; actually, I'm always looking for more ways to improve this Makefile — stuff is starting to change here. So if you have any proposals, just say so and create an issue; we can contribute to that. For now it's very simple, but it does the work for us. As I said, this is the typical organization inside DBT: transform, snowflake-dbt, models, then subfolders. You want to go, let's say, to sources, saas_usage_ping, instance_sql_metrics. Yeah — I just forgot one thing: the SQL errors.
A: Queries with SQL errors will land there; the rest will succeed and feed the downstream modeling. For that reason I'm making that bundle to proactively deliver to the PI team, so they can easily do proactive testing and we will avoid this kind of failure. In a perfect world this table will always be empty, by the way. When it comes to a DBT model — we have, let's say, the saas_usage_ping instance model — I create this model, alter this model or whatever, go to my analytics repo, and run make run DBT, enter.
A: For this purpose you also need, let's say, prep and prod databases — in your case your own prep and prod clones; it's your copy of the production and prep layers. The source for this will by default be raw, from production. But for some reason you may want to model something yourself and redirect it, not from raw but from somewhere else.
A: In that case, in this file — .zshrc, or whatever it's called — there is one variable that should be exported, and it's called SNOWFLAKE_LOAD_DATABASE. By default this is RAW, but earlier today, to do some exercise gymnastics, I pointed it at this one, 14337. It means that when I run my DBT models I will not get the data from the production RAW database; I will get the data from this database and the appropriate schema, because I told DBT: okay, the SNOWFLAKE_LOAD_DATABASE variable is looking here.
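For reference, a sketch of that redirection, assuming SNOWFLAKE_LOAD_DATABASE is the variable dbt reads for its source database (the database and model names below are hypothetical):

    import os
    import subprocess

    # By default dbt reads from the production RAW database; exporting this
    # variable (normally done in ~/.zshrc) redirects it to a clone, e.g. one
    # named after an issue number (hypothetical value below).
    os.environ["SNOWFLAKE_LOAD_DATABASE"] = "RAW_14337"
    subprocess.run(["dbt", "run", "--models", "saas_usage_ping_instance"],
                   check=True)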
A: It goes to rbacovic_prep or rbacovic_prod, depending on what kind of model you're running, but usually the architecture is raw, prep, and prod, from layer to layer. In this case I know — because I made this model — that it will run from raw to prep. So if I run this, theoretically I think it will be fine: I would say dbt run, model saas_usage_ping_instance, and boom, it will end up in my schema.
A: You can see everything on the console; here are the logs, like "create table rbacovic_prep..." and so on, and that's it.
So this is the easiest way to run tests and develop something locally when it comes to a DBT change. When you push your code to the remote repo, you will also find a set of pipelines you can run from there. What is the difference? There, I think, it will not end up in your private prep and prod schemas; it will end up in a schema named after the issue, with an underscore-prep suffix.
A: In the data team handbook you can see all the CI pipelines and CI jobs, how they run, and what each parameter you need to pass means. Long story short, everything regarding DBT should have the DBT models variable set, because you probably want to run just one model. This is going fine for now.
B: Yeah, that's fantastic. That was actually very, very helpful, because I went through some of this stuff during onboarding, but seeing you do it, with an example, helps a lot. I did have one question, actually. When you ran dbt run with models, I noticed you specified just one model; you didn't add any of the parent dependencies. Maybe that's because this particular model doesn't have any dependencies?
B: Yes — like, if you were developing on a model that has parent dependencies, do you generally run the parent dependencies as well?
A: Well, yeah — I know this is an isolated model, still kind of in test. It usually depends on the logic: if I'm working on something in the prod layer and I want to go from raw through prep to prod, then I put a plus here and everything upstream of my model will be executed. You got the point — you know this trigger. If you put the plus on the left side, it means everything my model depends on will run; if that's 100 models, it will run 100 models. Or, on the other side,
A: you should run this — especially since it was a problem before with the 0.x DBT; now, with version 1, you say full refresh to purge everything that was there before, start from scratch, and fill the table. But yeah, to answer your question shortly: you should use the plus. You're also able to run DBT models by tag, or to exclude something; but for, I don't know, very small changes of one, two, three models, I usually use this notation. It depends on the use case, but yeah, good advice, definitely.
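A small sketch of the selector semantics discussed here, shelling out to dbt from Python (the model name is hypothetical; the `+` prefix and `--full-refresh` flag are standard dbt CLI):

    import subprocess

    MODEL = "saas_usage_ping_instance"  # hypothetical model name

    # Run only the model itself.
    subprocess.run(["dbt", "run", "--models", MODEL], check=True)

    # `+` on the left side: run every upstream parent first, then the model.
    subprocess.run(["dbt", "run", "--models", f"+{MODEL}"], check=True)

    # Rebuild from scratch (dbt >= 1.0), purging what was there before.
    subprocess.run(["dbt", "run", "--models", MODEL, "--full-refresh"],
                   check=True)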
B: Okay, that's very useful. So I think you've covered Airflow and DBT; I'm hoping to ask you about one more thing.

A: Yes.
B: Yeah, I saw you guys had a heads-up and then merged that MR. So Justin Stark had — I don't know — like, ideally you update the requirements, but then I think Justin was like, oh shoot, it's Friday; Radovan's probably going to update the requirements on Monday, maybe we should do a temporary fix. So I guess I'm kind of hoping to ask you: Justin's fix was to update the Python code in the DAG to just comment out the Monte Carlo line.
B: Just for something that simple, would you still go through the Airflow steps you talked about? How would you test that?
A: Yeah, that's the pain point: for some things in Airflow it's super difficult to test that kind of stuff. You can run it and maybe you'd get some crazy results, but usually we rely on our experience, like, okay,
A: we will not do this, or we just run it blindly. Regarding this kind of stuff, it's plug and pray, honestly. I mean, I trust Justin Stark — you know what he's doing; he's definitely super experienced in DBT and Airflow, both things. But, to be honest, try to do as much testing as you can locally,
A: if it's possible, right. Like, I know there's one task for me where, when DBT runs, it renders the manifest.json file, so it's technically almost impossible to run it locally, because you'd need to run the entire DBT flow, and that takes six or seven hours. You'd just, I don't know, waste a day watching how things are going. So I cut the scope: commented everything out and kept only two or three models to run.
A: So it's a kind of forced, temporarily minimal set of data from your local environment, and maybe that's the answer to your question. That was how I tested it before: I rely on, okay, this cut-down scope will run two models instead of 100, and I see the result, or I feel, okay, I'm super confident this can move forward. And there is also another story about the approach to development.
A: You saw the DBT test task stuck because of the Monte Carlo issue. I spoke with the Monte Carlo team and they suggested: okay, upgrade the library to the latest version. For that reason you have to go to a different repo, called data-image, where our data images are built. If you go there, you'll see you need to change the definition, or the version, of the package that should be installed and used in DBT.
A: ...of the DBT image. So now, when we spin up Airflow locally and run DBT, we will run this version of the package. Locally you can run it because, while production is on 105 or 107, locally in your branch it's 109: you call that version when you spin up your local Airflow instance, and when you run DBT it will run with it. That's the way, and you're able to test it.
A: So actually you combine two things here. One is the change in data-image: the merge request is merged to the main or master branch, and then you create a tag. The next thing, in the analytics repo, is to do your thing: create a merge request and update the version of the DBT image — in this case; it can be, I don't know, the data image or any other image, there's no limit here, but in this particular case it's the DBT image — to refer to 109 instead of 105. That's it, and then you're actually able to execute and do the Airflow tests locally.
A
B
B
B
If
airflow
Works
locally,
then
this
Mr
gets
pushed
to
production,
the
CI
CD
jobs
get
run
and
we're
okay.
Do
we
have
to
do
anything
for
the
CI
CD
job
in
the
remote
for
this
new
tag?
No.
A: No, no, no. Actually, when you go here — "add Monte Carlo data library" — this is a fairly small change; I edited it directly from the browser. Then the pipelines run: the images get built, and this one is a deletion. If that passes, they approve it and check that everything is okay — it was a very minor change. Then, if you go to the repo and then to Tags, once it's merged you should create the 109 tag from the master branch — and see, now you have tag 109. Then, as you said nicely, you should use it in the DBT
A: image locally once it has been merged. So that's it, actually. You have two pieces of the story: the data-image build — if it's built properly, then it's okay, someone should merge it, your code goes to the master branch, and you're ready — and then, in your local MR and your branch in the analytics repo, you're able to use 109.
B: Okay, that's a great explanation. I think that gives me a lot more clarity, yeah.
A: Sorry — I was most confused by this part myself: where and how to change something outside of the analytics repo, you know. The analytics repo is a typical programming repo with Python code, that's okay; but if you need to touch the infra part — upgrade, update, delete, improve the infra part — you have to go to data-image.
B: Yeah, 100%, and I feel you — that's the part that's more confusing for me, because...
A: When you get the overview it's fairly simple, but I spent a lot of time and effort to sort this out for myself. As I said, this data-image setup is a great approach, because you're actually able to use two different versions of images inside the same repo — okay, for, let's say, this part I will use this one, and for testing I will use that one. So it's fairly easy, I would say, once you get the point. We actually have five or six images; the DBT and data images are the main ones for the team.
B: Yeah, 100%. I'm sorry, real quick, can we just go back — because I know that fixing the requirements file is the best way of doing it, but just using the example of commenting out the DAG as a temporary fix, I'm just trying to imagine what... because you said it's better to test as much as possible, yep.
B: So it sounds to me like — oh, it finished, yeah.
B: Okay, so it sounds to me like, if you're updating the DAG, you have to test both the Airflow section locally on localhost:8080, but then you also have to test DBT at that point, because the original issue is that DBT test is failing. So if you comment out the Monte Carlo part, you want to make sure that DBT test works. Then I guess I would just run the DBT test Airflow task, and that's...
A: Exactly. Your problem is related to DBT test, and it's part of the DAG. For now our setup says: okay, we have one DBT DAG for all tasks, and that's where the coupling is — which is not great, because in my previous company we had a decoupled hierarchy of DAGs and tasks, which was super great and easier to manage: fewer dependencies, lower latency. But now, as I said, your problem is inside the DBT test.
A: You should test that part inside your Airflow to make sure, because you don't call the test in isolation; you call DBT with some parameters, and it's impossible to run that locally — that's a restriction. So I would say: inside Airflow, test as much as possible. What Justin Stark did is probably — if you go here, to the DAGs, to, I don't know, the test one; let's say it's called dbt test or something.
B: I think it's within dbt.py, there.
A: Because the DBT-with-Monte-Carlo integration says: okay, if you want to include the Monte Carlo stuff, wrap it in the Monte Carlo library — and this library is defined in our data image. So if you remove this, I would say with 99% confidence that everything will pass fine. Why? Because before, the code was exactly the same.
A: Mm-hmm, yeah, this is the full command. Sorry — the Monte Carlo library says: import the DBT run results. The DBT run results is actually a file from DBT with the set of results; we also import it into Snowflake — you can see it in the DBT schema, just to get a feeling for what it looks like. But yeah, I get the point now, and I conclude: okay, this command is set up for the Monte Carlo sync somehow, so it's part of the integration — the DBT run results file should be uploaded to Monte Carlo, and then Monte Carlo can connect things. That's my assumption.
A: Myself, yeah. Definitely, yes — I would say all of us have some way we work; it can be wrong or right, but we each have our own approach. What he did, he said: I will do this as a kind of, okay, reminder for me to put it back later. But I would do the same, to be on the safe side. That's just my approach, nothing else.