From YouTube: 2023-09-04 Analytics Section Meeting
A
Welcome to the Analytics section sync on the 4th of September. We have one topic for our Show and Tell, and Ankit here has the stage.
B
Yeah, so this is related to the issue I think we got two or three weeks back, where someone tried sending all the events without the SDK, and it actually blocked the pipeline: no other events were going through. So I'll be presenting the fix we did for that. I'll share my screen.
B
So basically, this is the issue, and in that issue the payload is present. The only thing that was missing in the payload was the stm field. It is a timestamp of when the tracker sends this event to the collector.
B
There are different types of timestamps, and this is one of them. That timestamp is actually optional for Snowplow, but in the analytics configurator we made that timestamp mandatory: it should be a date field, it can't be anything else. That is why the pipeline was blocked, and ClickHouse was not picking up anything else, because it was throwing an error for that particular event. So the fix we added for that was the following.
B
If that timestamp is not present - so this is the JavaScript - we added a Snowplow JavaScript enrichment. What it does is: when we send the event to the collector, there is an enrichment process that happens, and it basically filters out whether it's a good event or a bad event. With a JavaScript function we can provide a condition like: consider this a bad event. So here...
B
We are checking whether that field is present or not. If it's not present, we just pass the event to the bad events with this particular error message. So that is the fix we did.
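A minimal sketch of what such a Snowplow JavaScript enrichment might look like. In Snowplow, throwing inside `process()` routes the event to bad events; the getter name `getDvce_sent_tstamp()` (the enriched-event field that `stm` maps to) and the error string are assumptions, since the actual script wasn't captured in the transcript - check the enriched-event field list for your Snowplow version.

```javascript
// Snowplow JavaScript enrichment sketch: reject events missing the
// sent timestamp (stm, surfaced as dvce_sent_tstamp after enrichment).
// Throwing here sends the event to the bad-events queue.
function process(event) {
  // Assumed getter name - verify against your Snowplow version
  var sent = event.getDvce_sent_tstamp();
  if (sent === null || sent === undefined || sent === "") {
    // This message is what would surface on the bad event
    throw "sent timestamp (stm) is not defined";
  }
  return []; // good event: no derived contexts to add
}
```

Used like this, a payload without `stm` never reaches ClickHouse as a good event; it lands in bad events with the message above.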
In Snowplow, the way we add that is: we create a JSON file with the custom JavaScript Snowplow enrichment configuration, and that particular code which I showed is base64 encoded and shown here. We added it to the enrichments folder, and it gets picked up from there.
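The packaging step described above - encode the script and embed it in the enrichment JSON - can be sketched as follows. The schema URI follows Snowplow's `javascript_script_config` enrichment (verify the version you run); the script body here is a stand-in, not the actual enrichment.

```javascript
// Sketch: wrap an enrichment script into Snowplow's JavaScript
// enrichment JSON, with the script base64-encoded as required.
const script = 'function process(event) { return []; }'; // stand-in script

const enrichment = {
  schema: "iglu:com.snowplowanalytics.snowplow.enrichments/javascript_script_config/jsonschema/1-0-0",
  data: {
    vendor: "com.snowplowanalytics.snowplow.enrichments",
    name: "javascript_script_config",
    enabled: true,
    // The actual code is only readable after base64-decoding this field
    parameters: { script: Buffer.from(script).toString("base64") },
  },
};

const enrichmentJson = JSON.stringify(enrichment, null, 2);
// Write enrichmentJson to a file in the enrichments folder that
// Snowplow enrich reads on startup.
```

This is also why the config isn't readable at a glance, as noted later in the meeting: the script only appears in encoded form.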
B
Here we are getting those events. In the payload we can see the stm is present. So if I remove the stm and send it again - I'll try this with Postman.
B
Well, I guess I'll just share this Chrome tab. I'm just removing the stm from the payload and sending it. I have sent that, and if we look in the ClickHouse Snowplow bad events table...
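The Postman experiment - resending the payload with `stm` removed - can be sketched like this. Field values are illustrative; the envelope follows Snowplow's `payload_data` tracker protocol as used on the collector's `/com.snowplowanalytics.snowplow/tp2` POST endpoint.

```javascript
// Build a minimal Snowplow tracker payload, then drop the stm
// (sent timestamp) field to reproduce the event that was rejected.
const payload = {
  schema: "iglu:com.snowplowanalytics.snowplow/payload_data/jsonschema/1-0-4",
  data: [{
    e: "pv",               // event type: page view
    p: "web",              // platform
    dtm: "1693833600000",  // device-created timestamp (illustrative)
    stm: "1693833600123",  // sent timestamp - the field under test
  }],
};

// Remove stm, as done in the Postman demo
delete payload.data[0].stm;
const body = JSON.stringify(payload);
// body can then be POSTed to the collector, e.g.
// fetch(collectorUrl + "/com.snowplowanalytics.snowplow/tp2",
//       { method: "POST", body })
```

With the enrichment in place, this request should end up in the bad events table rather than blocking the pipeline.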
B
It will come here and show the exact error message which I showed: stm is not present. It just shows "this timestamp is not defined", and the event has been put into the bad events category. What happened after that before the fix was: once this event was there, the pipeline was blocked and we were not able to send any new events. So if I show you - we are already seeing new events now, so I'll just click a few more, and so...
B
The pipeline now just moves it to the bad event state. So this was the fix for that. There are a bunch of other timestamps as well, so I'm planning to add those validations too in this configuration.
D
Okay, so in the issue we also discussed moving these scripts to a git submodule. Do you still want to explore this idea, maybe in a separate issue, or do we just leave it for now?
B
About that - so basically what we thought is: we need to copy these things into both the dev kit and the analytics stack, right? So there is a chance that they both can go out of sync.
B
Like if someone forgets to create an MR for one repo and not the other. So for that, we have created another repo.
B
It will serve as a single source of truth - that is the custom enrichments repo.
B
So this repo will basically serve as a single source of truth, where we put all our custom enrichments, and I've added that in the README section as well. From there we'll copy all those things, so if they ever go out of sync, that repo would be the source of truth, yeah.
C
So I was looking at it - there are some fields where we can skip the rows if there's a parsing error, but I couldn't manage to make those work; I think they're more for CSV imports. What I did find is that there is an option on the Kafka table itself called kafka_skip_broken_messages.
C
We didn't set it up; I'm testing it now. In theory, as I've understood it, if we set a number there, it should skip those non-parseable messages to the bad queue automatically. I think we can also investigate that.
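For reference, a sketch of where that setting lives on a ClickHouse Kafka-engine table. The table name, topic, and columns are illustrative, not from this pipeline; `kafka_skip_broken_messages = N` tolerates up to N unparseable messages per block instead of failing the consumer (whether skipped messages are surfaced anywhere is exactly the open question discussed below).

```sql
-- Illustrative Kafka-engine table with the setting under discussion
CREATE TABLE snowplow_events_queue
(
    event_id String,
    dvce_sent_tstamp Nullable(DateTime)
)
ENGINE = Kafka
SETTINGS
    kafka_broker_list = 'kafka:9092',
    kafka_topic_list = 'enriched-events',
    kafka_group_name = 'clickhouse-consumer',
    kafka_format = 'JSONEachRow',
    -- tolerate up to 10 unparseable messages per block
    kafka_skip_broken_messages = 10;
```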
A
Yeah, yes, I think that would be awesome if it actually skips the rows, but what we have to keep in mind is that those then wouldn't automatically be sent to the bad queue - in my understanding, at least.
C
So in the documentation it's not clear; on Stack Overflow it was mentioned that it would move it to a bad queue, which is good, but I didn't test that yet. I will need to check it. Okay, okay, yeah.
A
Yeah, because that is definitely, as I said - I think that's something we discussed before. Something we certainly need to do is prevent, in every way possible, that one malformed event, however malformed, trips up the whole queue and leads to everything just stopping. If we can in any way, we just need to say: okay, just jump over it if you encounter something that's so malformed that you cannot deal with it, and then we can deal with the consequences of trying to figure it out later.
A
Maybe one note also on the repository setup: that we now have it in three different repositories is definitely not ideal, I think. One thing that you also saw is that in the config that you give to Snowplow, the code is also base64 encoded, which means that the actual enrichment config isn't really readable.
A
So we need to put the actual code somewhere separately, where it says: okay, this is the actual code - and then you pass it through a base64 encoder, and then you can actually turn it into this enrichment JSON. But we wanted to go for as minimal a solution as possible right now: we have it documented what the solution should look like, and then we copy it from one place, a single source of truth, over into the analytics stack and the dev kit.
A
If we're required to work on this more often - if we find that one has to repeatedly change this or add more validations, as Ankit said - then I think we definitely also need to look into how to make this process more smooth.
A
Whether we use actual git functionality for that or some other mechanism, we will need to see. As far as I know, there is just no Snowplow-provided functionality for packaging up an enrichment in a portable way, so that you could install it like a package or something - that's just, unfortunately, not provided.
A
So we need to come up with our own solution and, step by step, make it more automated than it is now. But at least it's kind of working, and we can see this as a first step.
A
All right then, thanks a lot, Ankit, for walking us through those things. I think that's very valuable - that's exactly the kind of issues and problems we should share in Show and Tell, since they are so clearly at the intersection between our two groups: you're working on the one thing, we're working on the other thing, and it's where we pass things over.