From YouTube: 2022-03-24 meeting
A
Good morning. I usually lead this meeting a little bit, and lately we've been having this situation where, because of the OTel metrics finalization, not much is happening for me. I always show up to this meeting a little bit nervous because of that. Two weeks ago there was a great discussion about rate limiting; Spencer was there. I admit that I am here to be present for this meeting and have nothing much to contribute today.
I know that last time we were speaking about rate-limited sampling. There's been a little bit of movement in our repositories: my Go sampler reference implementation for probability sampling is close to merging, because we unblocked a bunch of other stuff. But it's only because we unblocked other stuff, not that I did anything.
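The probability sampling mentioned here works by keeping each span with some probability p and attaching an adjusted count of 1/p to the survivors, so consumers can still estimate totals after sampling. A minimal sketch of that idea (purely illustrative, not the otel-go contrib implementation):

```python
import random

def probability_sample(spans, p):
    """Keep each span with probability p; each kept span carries an
    adjusted count of 1/p, representing the spans it stands in for."""
    assert 0 < p <= 1
    return [{"span": s, "adjusted_count": 1 / p}
            for s in spans if random.random() < p]

def estimated_total(kept):
    # The sum of adjusted counts is an unbiased estimate of the
    # original (pre-sampling) span count.
    return sum(item["adjusted_count"] for item in kept)
```

With p = 0.25 every kept span counts for four original spans, so the estimate hovers around the true total.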
B
Which repository were you just referring to?
A
The otel-go contrib repository is where I often work, because I'm a Go hacker these days, and go-contrib had several pieces of sampling infrastructure move in the past two weeks. I'm not sure that they've entirely merged. Let me... yeah, they have. Well, no, okay, they have not yet merged, but I'm going to put a link so you can follow along; in this repository we have it presently pending.
...but it's been buried by all the other little PRs that get merged. I can't even find the number. They were pending forever, which is why finally somebody said "we need this merged" and then people paid attention. I'm not finding it; that's weird. Anyway, if you ask me what...
A
It's
so
old
yeah
I'm.
I
believe
that
is
it.
You
say:
9
36
yeah
or
a
thousand
hours
behind
wait.
I
can't
I'm
like.
Anyway, oh yeah, that's the one that merged after forever. So if you ask me what's the roadmap for OTel sampling: Jaeger remote is probably the leading OSS-blessed configurable sampler, and it has a lot of what I think the users want, but it lacks probability built in. And the reason we were talking about...
...rate-limited sampling is that that's the facet, or the feature, of the Jaeger sampler that we don't have a drop-in for, so that's why I find it to be interesting. But back to what I said at the top: I've only been following this at a distance, and I'm absolutely supportive of spec'ing out something to get a probability sampling implementation of Jaeger sampling, but I'm just here to cheerlead at this point.
B
Yeah, I'm curious what either of you, or anyone, thinks. What is your opinion on, like, how much do you weight the existing systems that are pretty ready to work off of, off of Jaeger-based stuff, versus... there are some... apologies for being sort of vague.
B
I
think
what
I'm
saying
is
true,
but
I
haven't
I'm
not
gonna
state
it
well
now,
but,
like
I,
I
think
I'm
skeptical
that,
like
the
jaeger
pattern
or
like
it,
will
be
super
extensible
for,
like
the
the
more
like,
dynamic
or
interesting
ways
that
that
I
think
I've
also
seen
so
like
we
didn't
get
to
talk
about
dyn's
sampler
last
week
right
but
like
I'm
wondering
I'm
not
sure
yet
if,
if
if
that
could
be
extended,
and
so
I'm
kind
of
anticipating,
you
know
if,
if
like
a
config
sort
of
data
model
or
like
exchange
kind
of
protocol
thing,
if,
if
like
a
new
one
did
have
to
like,
would
be
warranted
to
would
be
required
to
accommodate
the
like,
more
powerful
or
flexible
or
whatever
stuff
that
gets
mentioned,
sometimes
how
bad
how
how
un
intolerable
would
that
be?
A
Personally, I think it would be tolerable. The premise that you just gave sounds compelling to me, but probably only because I've actually seen a dynamic sampler; I kind of know what you're talking about. My guess is that the larger community has very little grasp of the differences that you're imagining, and some of my own enthusiasm about, say, the VarOpt sampling algorithm was that I finally understood how you could do better dynamic sampling, and that was just like: wow, I'd like to see that out there in the world.
A
So
I
I
I
guess
my
my
leading
statement
about
jaeger
was
based
on
just
having
seen
the
larger
community
kind
of
solidify
around
it.
But
it's
also
I've
seen
how
much,
how
little
understanding
there
is
of
what
you
can
do
is
why
we're
talking
about
probability
in
the
first
place,
you
know
so,
if
you're
asking
spencer,
whether
I,
whether
I
support
the
claim
that
perhaps
a
new
sampling,
a
new
configurable
sampling
protocol
or
an
extension,
that's
kind
of
backwards
that
fits
backwards
into
the
jaeger
model.
I think it'd probably be worth knowing, at a very firm level, the differences between the X-Ray and the Jaeger approaches. I think there's some stuff that X-Ray has done; they're completely focused on their own technology right now. There's basically two tracks here: there's the Jaeger sampling and there's the X-Ray sampling. There's something about...
There is something about being adaptive, in the notion of a centralized reservoir there, that might tie into what you're looking at. I wonder if there's a sort of grand solution. I'm not hoping for that, but if there was a direction that could converge the X-Ray and Jaeger approaches with the Honeycomb one, that would be... it sounds nice, at least.
B
Yeah, yeah. I forget the phrase you just used; it was, you know, a comparison, or maybe I'm now putting words in your mouth, but like a "comparative study" or something. That's one of the things: I have very rough notes that, among other things, have done a survey, at a pretty low level, of exactly how X-Ray, Jaeger's various local and remote modes (like file-based, and then adaptive), and then also Refinery work.
I don't know if we've talked about that here, but yeah, Refinery. Okay, and yeah, so that was definitely helpful. All of those: X-Ray, you know, is a centralized thing, and then both Jaeger and Refinery effectively share memory somehow. Jaeger you have to point at either Cassandra or something else for its adaptive mode, and then Refinery peers discover each other via, I think it's DNS or something... or Redis, sorry.
They all have the hostname of a Redis cluster, and then they know how to find each other through that Redis cluster. But anyway, yeah, so that's definitely a desirable criterion that I've sort of identified: if you state a throughput limit in your policy, will that be enforced no matter how many actors there are in the system making sampling decisions?
So yeah, there are some popular samplers for which that is not true. And in fact, actually, I might have misspoken: I think Refinery's support is limited, in the sense that, if you read the docs closely, they say this throughput limit, or throughput target thing, is per Refinery node. And so what I assume people would most like is a way to express, for my entire system...
...this is the max throughput, and then somehow all the peers in this horizontally scaled thing, perhaps an OTel gateway cluster or a bunch of OTel agents, cooperate with one another to ensure that. So anyway, that's kind of one aspect of something I was looking into, and I wanted to turn that into a question.
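The behavior being asked for here, a single system-wide throughput target that cooperating peers divide among themselves, could be sketched like this (an illustration of the idea only; the function and its proportional policy are assumptions, not how Refinery or X-Ray actually behave):

```python
def allocate_quotas(global_limit, observed_rates):
    """Split a global spans-per-second limit among peers, proportional
    to the traffic each peer has recently observed."""
    total = sum(observed_rates)
    if total == 0:
        # No traffic seen yet: fall back to an even split.
        return [global_limit / len(observed_rates)] * len(observed_rates)
    return [global_limit * rate / total for rate in observed_rates]
```

Each peer would periodically report its observed rate to some shared coordination point (a gateway, or a Redis cluster as in Refinery's peer discovery) and receive a fresh quota back, so the per-node limits track the global target as traffic shifts.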
A sort of logistical or process question, which is: does this group have any preference or recommendation on how to share and get feedback on, like...
I don't want to call them design docs, because this is kind of, I mean, it's kind of turning into that, but there's a lot of background study and synthesis and comparative this-and-that on features, and it is kind of heading toward, potentially, a sort of proposal about, you know, criteria that a system should satisfy, and a sampler format that could achieve all the things that are nice.
I have that currently in a markdown file that I've been working on. I've also seen, elsewhere in the OTel system, these sorts of primordial pieces of documentation in Google Docs as well. And I was wondering if this group had any preference on, like, when I would like to share this stuff I'm working on, and probably share it in smaller pieces too, logistically, is there a preference on where or how?
A
Yes, I think that the OTel process of this OTEPs repository is the right way to go. Not everyone has always done it; there are definitely draft phases where people use Google Docs. But I would like to propose that you...
It sounds like you've drafted, or started thinking about drafting, an OTEP that would be like: I've studied the landscape of samplers, and here's the type of configurable sampler that extends, that sort of incorporates, what Jaeger's done, which is widely known, and then adds some sort of new fanciness that comes from Honeycomb, but also doesn't ignore X-Ray. I think X-Ray has this... it's a little bit opaque, that's the problem we're having, but it does have this.
I think it's a two-level, load-reservoir throughput sharing. I think there's a central service that allocates throughput to each node, like the Refinery situation as well, so that, yes, for a short period of time each X-Ray node has its own static throughput limit, and then that gets adjusted over time to balance the global system. So I don't think it's very fancy or very sophisticated, but it's more than Jaeger has, in some sense, or it's different enough.
It's just a different way of thinking. I would absolutely support an OTEP; that would excite me, in fact.
B
Yeah, I don't know if you've looked into it, but you just cited it: it's a word that X-Ray uses, they call it a reservoir. Have you had a chance to look into it? Because it...
Yeah, I was trying to look into it, but their docs weren't in as much detail as I wanted. So I think I dug into the JavaScript SDK, and from what I could tell, it does resemble a sort of token bucket with certain parameters. That was interesting; I didn't expect that, because of the word, yeah.
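A token bucket of the kind described here, admitting a steady rate while allowing a bounded burst, can be sketched as follows (a generic illustration, not X-Ray's actual reservoir code; the class and parameter names are invented):

```python
class TokenBucket:
    """Admit roughly `rate` items per second, with bursts up to `capacity`."""

    def __init__(self, rate, capacity):
        self.rate = rate          # tokens refilled per second
        self.capacity = capacity  # maximum stored tokens
        self.tokens = capacity
        self.last = 0.0

    def try_take(self, now):
        # Refill in proportion to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True   # a token was available: sample this item
        return False      # rate exceeded: drop or fall through
```

In the sampling setting, a True result corresponds to taking an item from the "reservoir"; traffic beyond it would typically fall through to a plain probability sampler.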
A
Terminology can be hard, yeah. Very supportive of an OTEP, and I mean, someone has to do some work here, if they want to see OTel have something, I guess. And I'm so enthusiastic, Spencer, about your interest here. I just said it at the beginning of the meeting: I feel like, because of the metrics stuff, I can't really put in my own cycles, but I'm glad to support and would love to see it.
B
Okay, great, so then that'll be the form that I take. Are OTEPs typically precipitated by an issue, a GitHub issue to accompany them, or not necessarily?
A
Usually they are big enough that one issue would be as big as the document, kind of, and so they precede a PR to the spec; you're either going to put an issue in the spec or you're going to put an OTEP in the OTEPs repository.
So I think you're fine to just create an OTEP and go from there.
Okay, yeah. And we've been talking about how to not let efforts fall through the cracks; we've been talking about formalizing this idea that there's a technical committee sponsor for every SIG. I am definitely your sponsor, and happy to shepherd and help here.

B
Cool, thank you.
A
And there was nothing else on the agenda. I'm glad to see someone's typing in the notes. I know that last time we were talking about some stuff, but I think there was this nugget of a proposal that we could continue with, which is the use of a "c" value in the trace state to convey non-power-of-two adjusted counts that were made independently of a consistent random sampling decision. I still support that notion, but it doesn't feel urgent to me, since the bigger picture is not really solved yet.
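The consistent-sampling work referenced here encodes a power-of-two sampling probability in the `ot` entry of the W3C trace state (a `p` value, where the adjusted count is 2^p); the "c" value is the proposed addition for adjusted counts that are not powers of two. A sketch of how a consumer might derive an adjusted count under that proposal (the `c` key is the proposal under discussion, not an adopted spec, and the field layout here is simplified):

```python
def adjusted_count(ot_value):
    """Derive a span's adjusted count from an 'ot' tracestate value,
    e.g. 'p:3;r:10'.  'p:N' encodes sampling probability 2**-N, so the
    adjusted count is 2**N; 'c:N' is the proposed escape hatch for
    counts made independently of the power-of-two scheme."""
    fields = dict(kv.split(":", 1) for kv in ot_value.split(";"))
    if "c" in fields:
        return float(fields["c"])
    if "p" in fields:
        return float(2 ** int(fields["p"]))
    return 1.0  # no sampling information: the span counts as itself
```

The point of the `c` key is exactly the case described above: a sampler that made a non-power-of-two decision (say, keep 1 in 10) could still convey an honest adjusted count of 10 alongside the consistent-sampling fields.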