Description
An introduction to the handover of the mechanism to the PI team for the new app created by the Data team.
A
Okay, folks, I think this is the first knowledge session regarding how to check Service Ping on Snowflake and how to expose the new app created by the Data team. Are you able to see my screen or not? I don't want to skip ahead, not yet... probably now, right? Yeah, very good.
A
We got a set of SQL queries in Postgres syntax, and we actually use the same queries, after transformation to Snowflake syntax, to execute them on the result tables in the Postgres pipeline. Actually, it's a copy of the production database. So the plan is to get plenty of SELECT statements here, and this is good, because we transform them, rename the tables so they are adjusted to Snowflake syntax, execute them, calculate the metrics, store them somewhere, and then, down the road, analytics engineers can use this as a source of truth for automated Service Ping metrics.
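The transformation step just described could be sketched roughly like this (a minimal illustration; the table names, the mapping, and the `to_snowflake` function are assumptions for this sketch, not the application's real code):

```python
import re

# Hypothetical mapping from Postgres table names to Snowflake objects;
# the real rename rules live inside the application.
TABLE_MAP = {
    "users": "prep.gitlab_dotcom.users",
    "projects": "prep.gitlab_dotcom.projects",
}

def to_snowflake(pg_query: str) -> str:
    """Rewrite Postgres table references into Snowflake notation."""
    query = pg_query
    for pg_name, sf_name in TABLE_MAP.items():
        # word boundaries keep "users" from matching "users_settings"
        query = re.sub(rf"\b{pg_name}\b", sf_name, query)
    return query

print(to_snowflake("SELECT COUNT(*) FROM users"))
# SELECT COUNT(*) FROM prep.gitlab_dotcom.users
```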
A
But of course, this is in a perfect world. Things are changing a bit, very quickly I would say, from time to time, so it's very easy to overlook. If you add new metrics or change the code for an existing one, then we have a boom when we want to calculate something and the query is not correct for any reason. Usually it's like: okay, some table is not there yet in the Postgres pipeline.
A
So usually we reactively fix things: introduce a new table or objects, or a new column, or whatever, and then rerun the pipeline, and everything is fine. We want to shift a bit, to be more proactive, and deliver something to you, so you can easily do the test in a confident way and then confirm that everything is fine. You can move on without asking us or putting an MR label on any issue, so it will be, I would say, an efficient way of communication. In that case, you will easily recognize it.
A
Whether something is wrong or not with the objects in Snowflake, you will play with Postgres syntax, and let me show you how things are going. Also, before we move forward: this is, let's say, the general mock-up of how we try to organize our application. If you introduce new metrics on the PI team, on your side, and you have a Postgres query for that metric, you should call the RESTful API with that query, and our application will do the rest. Under the hood it will transform, as I said, Postgres to Snowflake notation, with some renaming and a few additional calls.
A
We need to calculate the metric, execute the query, and show you, not the result, but just whether it is correct or not, because we want to hide the result for the sake of security and everything. And then, if the response code from the RESTful API is 200, it means everything is fine and you can proceed; otherwise, you should just send us the result of that RESTful API call, and we can prepare, in advance and up front, everything you need.
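The call-and-check flow above might look something like this from the PI side (a sketch only; the endpoint URL, JSON field names, and helper names are assumptions, the real ones are in the generated API docs):

```python
import json
import urllib.error
import urllib.request

# Hypothetical endpoint; check the app's generated Swagger docs for the real path.
API_URL = "http://localhost:8000/validate"

def build_payload(metric_name: str, pg_query: str) -> bytes:
    """Body for the validation call: the metric name plus its Postgres query."""
    return json.dumps({"metric_name": metric_name, "query": pg_query}).encode()

def validate(metric_name: str, pg_query: str, token: str) -> bool:
    """Return True when the service answers 200, i.e. the query runs on Snowflake."""
    req = urllib.request.Request(
        API_URL,
        data=build_payload(metric_name, pg_query),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
    )
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status == 200
    except urllib.error.HTTPError:
        # Non-200: send the JSON body of the response back to the Data team.
        return False
```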
A
Then later on, you can check it again and, if it's fine, you can proceed and deploy your code. The only limitation, which is annoying for me and probably will be for you, is about credentials and how to get hooked up to the Snowflake database. In this iteration, it will open a web page for you, and you should log in with your credentials.
A
It will be done very quickly, because the project is small. In case you don't have anything you need to run this application (it's Poetry and Python libraries), you will just run the make command, Python Poetry install, and magic will happen here: everything will be installed for you in advance. The next thing we need to do is just to put the secrets in, so I will go into the folder.
A
This is my username, because I want to hook up here, for the purpose of this demo, with my account. The password is empty, because I will use SSO login to Snowflake. My role and account will be the GitLab-level ones, the same for you, and I use a very small data warehouse. This access token is not needed at the moment, I would say, but I put it there for some future purposes, once we put everything on the cluster and have proper security, a bearer token and everything; but for now, it's just a dummy value.
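Put together, the .env from the demo might look roughly like this (every variable name here is an assumption for illustration; the real keys are in the project README):

```
# Illustrative .env; the variable names are assumptions, check the README.
SNOWFLAKE_USER=your.name@gitlab.com
SNOWFLAKE_PASSWORD=                # empty: SSO login through the browser
SNOWFLAKE_ROLE=ANALYST
SNOWFLAKE_WAREHOUSE=DEV_XS         # a very small warehouse is enough
ACCESS_TOKEN=dummy                 # unused for now, reserved for the future
```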
A
Let's make our life a little bit harder. I don't know why I didn't remove it, but not a problem. I save everything I need in the .env file, and then I need to run the make run command. I mean, I'm starting from scratch, so for the first time it will take some time. Under the hood, it's interesting to say, we are using Python Poetry, which is great, so you can easily isolate the environment.
A
Despite what you have on your machine, it will not screw up the rest of your projects, your environments, or anything else. Also, we use the FastAPI framework to create the RESTful API. This tool is very, I would say, impressive for us, because you get the Swagger documentation automatically, directly from the code. Same goes for dbt: even as you type the code in dbt, implicitly you get the documentation ready, so there is no duplication or that boring part of the work with the documentation. When I run make run, everything is installed, and then the libraries I need are picked up.
A
So if you open the browser, you'll see typical documentation for RESTful API calls. As I said, this is automatically generated for us, which is great. I have some dummy "Try it out" option, like Hello World, which is okay: see, this is the body, and this is the code. And what is the most interesting for you is this part, and you need that bearer token here, just to make everything work properly. I want to try it from the web page and execute it.
A
The input for me is the metric's name, which can be literally anything, but for your sake you put your real metric's name, so you can easily communicate, and the Postgres query, in this case SELECT 1. I will use something more real, let's say a counter, for example, and go to the command line. You can use cURL, you can use Python, you can use urllib, you can use whatever you want. If I'm here and do cURL, and then again log in and close it, I can see the results.
A
And I want to show you: this is my well-formatted good example. This is the input, what you put there. This is the output: the status is OK. This is the Snowflake query, how we transform this into something digestible for us and for our data warehouse. The error message is null, which means everything is fine, and this is the timestamp.
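The response just described can also be read programmatically. A small sketch follows; the field names mirror the demo output (status, snowflake_query, error_message, timestamp), but treat the exact keys as assumptions:

```python
import json

# Example payload shaped like the demo output; the field names are assumptions.
raw = """{
  "status": "OK",
  "snowflake_query": "SELECT COUNT(*) FROM prep.gitlab_dotcom.users",
  "error_message": null,
  "timestamp": "2023-01-01T12:00:00Z"
}"""

def is_valid(response_body: str) -> bool:
    """Everything is fine when status is OK and error_message is null."""
    body = json.loads(response_body)
    return body.get("status") == "OK" and body.get("error_message") is None

print(is_valid(raw))  # True
```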
Pretty much, if I go here and call it again, but instead of that Postgres query I use a SELECT COUNT of something, and I put a fake table name which does not exist, and execute it, then comes the annoying part.
A
If your status is failed, you have the Snowflake error there, so it means something is wrong: we miss some object, a table does not exist, whatever. So actually, in case something is wrong and it's not 200, you will send this to us. We can, let's say, read this and introduce the new objects, and later on you can retest it. If it's fine and you got the OK status here, 200, it means we are ready to go.
A
Nothing special, very simple. As I said, there is an annoying part with this logging in through the tab in your favorite browser, by your choice, but everything else is fairly easy. We are thinking, in the future, as I said, to put this on the cluster; you will get a URL and you can fully automate the testing process. So for now it will be kind of semi-automated, somehow. But for me, the most important thing is to get the...
A
It's very, very difficult for me, or for the other callers, to check when new metrics are introduced whether we miss something or not. So it's a kind of fundamental process, subject to failures or, I don't know, making some errors, and this is the way we can communicate fairly easily regarding the automated Service Ping pipeline, or a piece of the code. So actually, this is the very, very long, very short story about what we actually built.
A
We can help you to install this and to start using it; I'm more than happy to contribute and to support you. But as of now, this is a brief introduction. Installation is fairly easy: check out the code, make run, and you will get your local application server, embedded and built in; go to the page and you have the full documentation, and you also have our README. I think it's more than enough to get the point. As I said, for you the most important things, or the only two things you should add, are the metric's name, which is not very important, and the Postgres query.
B
And we can't wait for this to be somehow automated, because it will be the most efficient, and it would allow us to, like, self-serve all of that on GitLab without too much process, so we can just automate it and react only in case of errors, not in any other case. But this is a great starting point. Yeah, I have one follow-up question, because I noticed that you are configuring and using your Snowflake account.
B
You configure the Snowflake access in the .env file, and I just want to confirm that anyone who wishes to use this application, even locally, needs to have their own access granted to Snowflake, because I'm not sure if that's the case for all developers in the Product Intelligence group. If not, then we probably could open the access requests to get such access for the rest of the team. But first of all, we need to know if it's required.
A
Yeah, actually, I put the pre-requirements here; I knew you would ask me, because for me it's fairly easy, it's my, I don't know, bread-and-butter job, and I don't even think about it. But yeah, it's very important to know: you should use a Snowflake SSO account, not a username and password. No one will give you the grant for using a password, because of compliance, and in the future we'll have a stricter policy for this. So probably you should have your personal SSO account, or we can consider...
A
If you create an access request issue, we can consider one read-only access with very humble grants for this purpose, for your team, in case not all of you have your personal account; in that case it will be something like p18 something at gitlab.com. But if you have your own personal account and can connect through Okta to Snowflake, it's more than sufficient; then the query will be executed as your user. Here, for the purpose of this demo, I use my own account; I exposed that part.
A
But if you don't have it, you should create that, let's say, access issue, or we can fix it. Okay, maybe on the team level we can have one account with read-only privileges, and we are fine with that, because of security, compliance and everything else. I spoke with Dennis about this. It's kind of complex, because initially I used...
A
Let's say, this combination, and I wanted to put everything in Docker, but this is forbidden for me now, and if you want an SSO account to be executed in Docker, it's almost impossible, so I chose to switch the technology and use Poetry, which is kind of okay. But, as I said, let's start from something and try to deploy this on a Kubernetes cluster; then we will have this username and password flow sorted out in that iteration.
A
If there is anything, or, I don't know, a way to make this smoother... but yeah, long story short, the bottom line prerequisite is that you need to have an SSO account accessible via Okta, or we should do it on the team level. Probably, for the sake of security and compliance, you should have your own account, but I need to check that. I'll put that down as an action point for me and inform you about it.
B
We can consider adding this step to our review process, the Product Intelligence review process, to check new metrics. Right now we are applying the "data warehouse impacted" or "not impacted" labels, so maybe we can alter the process so that we actually run this check and only apply the "data warehouse impacted" label if we got an error response. If we got an OK, then we don't bother you with the impact check twice. So I think it would be a nice action point for us, definitely.
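That labeling rule could be as simple as the following (a sketch; the label text and the function are hypothetical, not the actual review tooling):

```python
def labels_for(status_code: int) -> list[str]:
    """Apply the 'data warehouse impacted' label only when the check fails.

    A 200 from the validation service means no label and no extra review
    round; anything else flags the change for the Data team.
    """
    return [] if status_code == 200 else ["data warehouse impacted"]

print(labels_for(200))  # []
```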
A
Yes,
especially
process
yeah,
especially
here,
if
everything
is
fine,
don't
put
the
label,
it
means
yeah.
We
are
confident
because
this
is
our
responsibility
to
make
this
up
to
date.
In
case
something
is
right.
Okay,
we
are
behind
that
and,
as
you
said,
if
something
is
wrong,
just
paste
this
Json
and
say
impact
it,
and
then
we
can
easily
check
okay,
why?
This
is
why
the
problem
occurred.
So
we
can
introduce
what
is
needed
in
some
extreme
cases.
A
We can easily communicate with you, but we know, and we have a cornerstone to discuss. I'm here for any suggestion, any help, any support. Nikolai, once you introduce or change the process of reviewing, ping me; I'm more than happy to help you. I will also attach this recording session to the agenda, so you can easily find it whenever you want.
B
Right, just a question about how it currently works, in terms of how we flag the metrics that aren't data warehouse compliant. So that's the label we use, and that gets picked up by the Data team, or what? Like, what happens then?
A
For now, the process is: you put that label, and we check visually whether it's okay or not, or whether there's any problem, just assuming, or going by our experience. So you can imagine, for someone who is a new starter it will be very hard, or very difficult, to see what's going on if something has changed. But with this change, as I understand from Nikolai, I think it sounds like a good idea. With this iteration...
A
You should use our application to check the new metric's code, and if everything is okay, you don't even put a label there, because you know, you're confident, that everything is right. If something is wrong, I think you can probably introduce your own piece of the process, but this JSON file, as a request, will be more than sufficient for us; put a label like, okay, "impacted", and we can see, yeah, because of that, because we know this application produced that result, and...
A
So we can easily execute it once everything is there, like, okay, you introduced a new column or a new table; we can rerun it and say it's fine, and for you, just rerun the code, and then it will 100% be fine. As I said, also, people from our team like this approach of using a RESTful API to communicate with other teams and some other, let's say, hidden, isolated environments. We also have one more use case to introduce, and probably it will be a good motivation for us to deploy everything on our cluster, which is also great. As I said: a one-stop shop for you, just a URL address, access, and that's it, a bearer token, and you can use it even more easily. But for illustration, yeah, you should do it this way.
B
It would require just sending me a lot of context on how this currently works. Yeah, I don't have access to the agenda; I need access, so I cannot paste them in, so I'll just use some chat.