From YouTube: 2022-05-05 meeting
B: All right, cool, so I guess we can get started. All right. Yeah, looks like right now we don't really have any top-priority issues or PRs. I guess we're pretty much just heads down on metrics work. Diego, did you want to go through the metrics board review? Yeah.
A: Okay, pretty good job, people. Thank you, everyone, for your effort and reviews.
A: We are very close; only one issue is in progress. The implementation of timeouts, I think, has been making some progress lately.
A
Irrigation
aggregation,
class
and
and
monotonicity
as
the
spec
requires.
We
have
discussed
this
before
and
I
also
added
a
bunch
of
these
cases.
All
these
that
are
instrument
match
conflict,
one
up
to
six.
Seven
yeah,
like
eight,
is
cases
or
something,
and
I
asked
the
gmat
to
review
these
test
cases
right
now.
So
please
take
a
look.
This
should
fulfill
that
as
requested
of
using
all
those
characteristics
when
comparing
the
instrument
matches,
and
finally
I
have
this
one.
A: I submitted it yesterday as well, no, two days ago, and I requested, I think, Sarah for the review. I am right now addressing your comments. Pretty much, I'm changing this class so that the attributes are in the Point object, not here, and I'll be submitting this soon. When that's done, I'm probably going to ask you for reviews. And Aaron, do you have any status on these other issues?
C: So, I think we agreed on the actual issue, while having the discussion, on the way to move forward. The only thing is, I'm probably not going to make the change for the OTLP exporter. I'm going to leave it as is and address it in a separate PR, because I don't think it's an outstanding bug, and there's some common shared code with the tracing stuff.
C: So I'd rather address that separately. But basically there are some bugs with the current OTLP exporter, and in fact the batch span processor isn't even respecting the timeout that's passed to it; it only uses it in force flush. So I'm just going to add the parameters in a PR to close this issue, and I'm going to open a separate one for OTLP, and then we'll also have to go through the...
C: I guess OTLP is the only metric exporter we have right now, so that's probably the only one that will be important for the metrics stuff. Regarding the timeouts for the callbacks, that's pretty much the same: I'm just going to add a timeout parameter, I guess, for the callbacks. And yeah, I don't know, I think that's basically it.
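The timeout plumbing described above can be sketched roughly like this (a minimal illustration with a hypothetical exporter class and an assumed `timeout_millis` parameter name, not the real SDK API):

```python
import time
from typing import Sequence


class SketchExporter:
    """Hypothetical exporter showing one way to honor a timeout parameter,
    in the spirit of what was discussed for callbacks and the batch
    span processor."""

    def export(self, batch: Sequence[object], timeout_millis: float = 10_000) -> bool:
        # Compute an absolute deadline once, then re-check it while
        # working through the batch instead of blocking indefinitely.
        deadline = time.monotonic() + timeout_millis / 1000.0
        for item in batch:
            if time.monotonic() >= deadline:
                return False  # signal failure rather than hang
            # ... serialize and send `item` here ...
        return True
```

A caller could then bound a flush with, say, `export(points, timeout_millis=500)` and treat a `False` return as a timed-out export.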
A: All right, sounds good. So this is the status of the board right now; as you can see, we're very close. Do you think that if we get all these three issues mentioned merged, we can get a release candidate?
A: Can't see why not; I think that's reasonable. Great, great, okay. So yeah, I'm going to be focusing on getting these two merged as soon as possible. Hopefully today I will have fixes for your comments here. And do you think the PR for this can be submitted today or tomorrow?
C: Yeah, I'm going to be updating it today. Sorry.
A: All right, great. Yeah, I'll review it immediately. My goal is to have a release candidate this week; we were planning on doing that last week, but yeah. If we can get that released by the end of tomorrow, that would be fantastic. So please ping me immediately if you need any reviews or anything, and I'll jump right in to review your PRs.
C: Yeah, quickly, can we discuss the view name conflict one? Yeah. So, I didn't look too closely at the implementation, but it's still really nested. I think that was my outstanding comment, which still sort of applies: the actual code that is doing all the checking and logging and stuff is getting really...
A
Okay,
yeah
yeah
yeah.
Sorry,
I
haven't
addressed
that
at
that
comment.
I
I
will
definitely
move
this
into
a
function
so
that
it's
not
that
deep
yeah.
Sorry
I
was
just
I
was
focusing
yesterday
just
on
implementing
this
part
yeah,
but
but
yeah.
Definitely
I
will.
I
will
implement
that
because
yeah,
it's
getting
really
really.
C: Could I just ask about the semantics of this? If we make it the equality check for the view instrument match, so that the instruments match based on essentially the identity of what's in the data model, it's a little confusing to me to do it as equality, because the view instrument match is also holding a set of instances of the aggregations for specific label sets.
C
So
it
just
might
be
kind
of
confusing
to
say,
like
okay,
this
one's
equal
to
this
one,
but
then
they
have
completely
different
data
inside
of
them.
It's
more
like
it's
more
just
like
a
check
of
the
like
metadata
properties
that
won't
that
are
immutable.
A
Yeah,
I
could.
I
can
rename
this
method
to
something
else
like
I
don't
know
something
more
descriptive.
C: Like you...
A: Yeah, I mean, instead of using the equality operator, to avoid that semantic confusion I can rename this to, I don't know, view instrument match comparison or something, some other name that's more descriptive and doesn't mislead readers into thinking that. Yeah.
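The rename being discussed might look something like this (a toy stand-in with invented field and method names; the real ViewInstrumentMatch holds more state than this):

```python
from dataclasses import dataclass, field


@dataclass
class ViewInstrumentMatchSketch:
    """Toy stand-in for a view/instrument match: immutable identifying
    metadata plus mutable per-attribute aggregation storage."""

    name: str
    unit: str
    description: str
    # Mutable collected data; deliberately ignored by the check below.
    aggregations: dict = field(default_factory=dict)

    def conflicts_with(self, other: "ViewInstrumentMatchSketch") -> bool:
        # Compare only the immutable metadata, under a name that says so,
        # instead of overloading __eq__ and implying the stored data match.
        return (self.name, self.unit, self.description) == (
            other.name,
            other.unit,
            other.description,
        )
```

The design point is simply that a descriptively named method avoids the surprise of two "equal" objects carrying different aggregation data.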
C
Okay,
I'll
leave
a
comment
then,
for
that
the
only
other
thing
I
was
gonna
say
was
in
the
in
the
test.
I
just
looked
at
them
again.
I
think
it
would
be
really
good
to
add.
I
have
like
a
integration
test
folder
or
integration
tests
like
pattern,
sort
of
thing,
there's
a
few
other
tests
like
that
to
do
this
at
like
a
higher
level.
I
think,
would
be
good
because,
like
this
is
this
is
testing
sort
of
like
the
implementation
of
the
sdk.
C
Does
it
like
more
like
a
unit
test
and
we're
mocking
the
dependencies,
whereas
sort
of
it'd
be
good
to
capture
the
test
at
a
higher
level
so
that
if,
if
the
implementation
changes,
the
behavior
is
the
thing
that
we're
testing
really
like,
I
think
for
josh
mcdonald
to
review
this.
It
would
be
easier
if
it's
like
a
full
scenario
like
here's.
Here's,
the
set
of
views,
here's
the
specific
measurements
that
I'm
making
and
knowing
that
like
nothing,
is
mocked
out.
C: We can have more information in those cases. Okay, yeah, that would be my recommendation, just so that it's easy to codify how we're implementing the spec.
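The higher-level test style being suggested, with nothing mocked, can be illustrated with a toy counter (all names here are invented for illustration; a real test would drive the SDK's meter provider, views, and an in-memory reader):

```python
class ToyCounter:
    """Toy instrument used to illustrate the integration-test pattern:
    exercise the public API end to end and assert only on collected output."""

    def __init__(self) -> None:
        self._points: dict = {}

    def add(self, amount: int, attributes: tuple = ()) -> None:
        # Accumulate per unique attribute set, like an SDK counter would.
        key = tuple(sorted(attributes))
        self._points[key] = self._points.get(key, 0) + amount

    def collect(self) -> dict:
        return dict(self._points)


def test_counter_end_to_end() -> None:
    # No mocks: if the internals are refactored, this test still
    # describes the required behavior rather than the code's shape.
    counter = ToyCounter()
    counter.add(1, (("route", "/home"),))
    counter.add(2, (("route", "/home"),))
    assert counter.collect() == {(("route", "/home"),): 3}
```

The payoff is exactly what C describes: the scenario (measurements in, collected points out) is the contract, so implementation changes don't invalidate the test.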
E: I have a question. I was under the impression that we are going to create a separate issue for each framework that we are also going to support with the metrics SDK. Is that plan still active, or is there any change in that?
F: So that's a different issue. We will still be creating issues for the frameworks; this is just for the main API and SDK, and releasing them, trying to get them to GA. In the contrib repo we still have the issues created, you know, for adding the metrics instrumentation for the frameworks.
E: So we are going to do that after this API and SDK are GA?
C: All right, that sounds good. Is that something you're working on, like adding instrumentation in the contrib repo?
E
For
a
minute,
not
right
now
but
as
I
was
working
on
one
of
the
metric
issue,
but
I
mean
in
last
last
segmenting
it
was
I
mean
something
to
discuss
that
we
are
going
to.
You
know,
create
a
separate
issue
for
each
framework
for
metric
instrumentation.
So
I
was
interested
in
working
on
that.
So
that's
so
nice.
B: Hey, thank you. I guess we don't have to talk about this right now, but there are metrics implementations for some of them; I believe I worked on this like a year ago. What kind of things are you following or including? Do you have an idea?
E: As of now, I mean, I haven't dug deep into it; I haven't started it yet. So I was just checking whether we have the issue already created, so I can get some clarity from it, but...
B: There are issues in the contrib repo; they were like "add metrics back to certain instrumentations". You can start off there.
F
Yeah,
like
last
time,
we
discussed
the
goal,
was
to
restore
them
and
then
use
the
latest
like
make
it
sync
with
the
latest
metrics
api,
and
then
you
know,
if
there's
any
additions,
that
we
can
follow
later,
that
the
goal
was
to
restore
the
previous.
Whatever
we
had,
you
know,
duration.
Whatever
we
were,
you
know
capturing.
B: Yeah, it's greatly out of date, so, you know, yeah.
C
That'll
be
cool
yeah.
I
think
I
think
it'd
be
awesome
to
have
some
instrumentations
as
part
of
like
this
ga.
So
if
we
can
build
some
against
the
rc
and
then
we
have
something,
people
can
actually
use.
That
would
be
awesome.
B: Do we have a tentative release date, do you think?
A
Yeah,
I
was
just
with
aaron
that
we
we
get
this
three
issues
merged
and
then
we
make
a
release
candidate
tomorrow,
end
of
day.
A
Well,
j
mcd
is
looking
at
already
right
now,
at
least
at
this
part,
I
can
prepare
a
document.
You
know
like
defining
our
our
interfaces
so
that
maybe
he
can
take
a
look
at
them
as
well.
C: I think we can make the RC before we do that. I'm relatively confident that there isn't any breaking API stuff that we've misimplemented; I'm more concerned about the behavior, I think. Except maybe the metric reader; for that one, I think it would be better if Riley took a look, if that's possible. I don't know if you guys still work together, Leighton, but yeah.
G: I was wondering, is there a way to run the tests on a repo, not for everything but for individual projects? Because the tests for everything, or maybe I just don't know, but just for everything, they take a long, long, long time, and it is tough.
A
Yeah
you
can
you
can
do
this,
for
example,
if
you
run
it
like
this,
you
will
only
run
it
with
one
python
version
and
you
will
only
run
the
test
cases
in
this
folder.
G: Yeah, okay, okay, makes sense. Do you mind pasting these two lines? I'll message you on Slack.
G: Okay, so what you're saying is I should clone the main repo, and then the tests will run from the main repo; and when you run them having just cloned contrib, they are trying to reach it online? Is that it? Sorry, I didn't quite get what you meant.
A: If not, I guess we can put some time back. Thank you for joining, and see you next week. Thanks, guys.