From YouTube: Notary Learnings
I'm going to start here by just sharing, starting at kind of the basic level, for everybody in the room who's already a notary, plus people who want to be notaries, and then people outside who are wondering how this process works. So at Textile, I guess one of the first things to state is that this process can add obstacles for people who want to join the network and use it quickly, and so I've kind of put the hat on, in this conversation among notaries, of trying to fight a little bit for simplicity and for removing some of those barriers here and there. I think we're getting a lot closer, and Deep and everybody else have been doing some awesome work, the dashboard, for example, for transparency.
But let me just show you where we are today; I think there are still challenges here, and I'll point them out. When people come to Fil+, the default pathway is that they open a ticket on GitHub, and I'll show you one of the results of those in a second. When tickets are opened, for me, I jump right into a rubric to score them. I have a set of questions that I ask them, and then their responses let me work through this rubric and come up with a score. I don't need to go into the details of the equation here. This is actually based on another notary, Neo, who did the groundwork to set this up. He shared it, and I said: okay, that's awesome, that's the kind of structure that we need to get going. Basically, my rubric just works through, for the semi-anonymous people sometimes applying for DataCap, what you can start picking away at: are they from an organization that's known, either through social profiles or other activities online? What is their reputation through past projects, both in our community and externally? And then what are their stated intentions for using the network? Because if they haven't already used DataCap, or they don't have an address with storage history, you're going to have to trust them a bit the first time and just ask them what they're planning to do.
So there should be some extra leeway there to let them experiment. Then, basically, I work through all the answers to this rubric, and on the other side it outputs the amount that I feel comfortable giving them, and that amount is what I would grant. Actually, overwhelmingly, this rubric has erred on the side of favoring applicants; I don't think there's actually a case where somebody who passed and got DataCap didn't get the amount they wanted.
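The scoring flow described above can be sketched roughly as follows. To be clear, the question names, weights, and allowance tiers here are hypothetical placeholders for illustration, not the actual rubric:

```python
# Hypothetical sketch of a weighted rubric for scoring DataCap applications.
# Question names, weights, and tiers are illustrative assumptions.

RUBRIC = {
    "known_organization": 2.0,   # identifiable via social profiles / online activity
    "past_reputation": 3.0,      # reputation from past projects, in and outside the community
    "stated_intentions": 1.5,    # clarity of their stated plan for using the network
}

# Allowance tiers (minimum total score -> GiB granted), also illustrative.
TIERS = [(0, 32), (3, 100), (5, 1024), (6, 10240)]

def score_application(answers: dict) -> float:
    """Each answer is rated 0.0-1.0; multiply by the question weight and sum."""
    return sum(RUBRIC[q] * rating for q, rating in answers.items())

def recommended_datacap(answers: dict) -> int:
    """Map the total score onto a DataCap allowance tier (in GiB)."""
    total = score_application(answers)
    allowance = TIERS[0][1]
    for threshold, gib in TIERS:
        if total >= threshold:
            allowance = gib
    return allowance

answers = {"known_organization": 1.0, "past_reputation": 0.25, "stated_intentions": 0.5}
print(recommended_datacap(answers))  # 100
```

The point is just that answers feed into a weighted sum, and the sum maps onto an allowance; the subjective part is in rating the answers, not in the arithmetic.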
Applicants also state, like, "I want one terabyte" or "I want 100 terabytes," and the rubric will spit out a number, and the levels have matched really well, overwhelmingly. So what does that look like in practice? Here's JV, just to pick on one: he applied for a hundred gigabytes. I think this was using an old version of my rubric, but he basically worked through each of these questions, gave the answers, and then on the other end it popped out: we gave him the 100 gigabytes of DataCap, simple as that. Some obvious weaknesses to this: one is that a lot of these questions are still fairly subjective, the external reputation, for example.
There's no easy way to just say, okay, this number makes sense, and I think that's the biggest weakness here. But one thing I'd really love to see happen, as we want to onboard more notaries into the space, is for this kind of rubric to get reviewed by all the existing notaries, asking: what are the best practices here? Maybe it's only a subset of this that we could recommend to all future notaries, so that they start there and can build their own points of view from that starting place. This has already been shared on the notaries channels, so I'm happy to share it again, but perhaps we could make a small project, before we onboard the next batch of notaries, to give some guidance on what kind of rubric they could start with. Whether they choose to use it or not is up to them.
I know that there are other notaries with other rubrics too, so perhaps we can come up with the right Frankenstein here as a starting place. Some other interesting things: we at Textile also apply for DataCap, for projects that we're working through, and one interesting thing that emerges is the difference between how you judge a company applying versus the project that they're applying for, and so we've had to apply for multiple DataCaps.
That's something we should expect quite a bit of from larger organizations that don't want to mix their private keys among projects. You don't necessarily want all of your data being created by a single address, so you're going to need to apply for multiple DataCaps. So how do we deal with that? Do we have to make them apply again? Do we split up DataCaps? Do we have them define different projects?
Another thing: the notary that I applied to actually asked, when I applied for a second DataCap, "how has your first DataCap been going?" I thought that was really good, and so I used riba sushi's script (if you haven't seen it, I can share it in Zoom) to output some stats on my own address. This is actually covered in Deep's dashboard now, but I output the stats from my own address and posted them on the original DataCap application.
I think something like a bot that did this automatically after checkpoints would be a pretty amazing way to track the progress of a DataCap being deployed. So here's my application for DataCap; it was an old one, I applied for nine terabytes. Maybe a month or three months later, spitting out the progress on how that DataCap has been used would be great for transparency. It would make it easy for the notaries to just get the notification and see this unfolding. I thought that was really good.
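The checkpoint-bot idea could be sketched something like this. The `Deal` fields, the data, and the report format here are all assumptions for illustration; a real bot would pull the allocation and deal history from chain data rather than stubbed values:

```python
# Hypothetical sketch of a periodic DataCap usage report for an application.
# Fetching the actual deals is stubbed out; field names are assumptions,
# not a real Filecoin API.

from dataclasses import dataclass

@dataclass
class Deal:
    provider: str   # storage provider (miner) ID the deal was made with
    size: int       # padded piece size in bytes

def usage_report(granted: int, deals: list) -> str:
    """Summarize how much of a granted DataCap allocation has been used."""
    used = sum(d.size for d in deals)
    providers = {d.provider for d in deals}
    pct = 100 * used / granted if granted else 0.0
    return (f"DataCap used: {used} of {granted} bytes ({pct:.1f}%) "
            f"across {len(providers)} providers")

# Example: two 32 GiB deals against a 9 TiB grant.
deals = [Deal("f01234", 2**35), Deal("f05678", 2**35)]
print(usage_report(granted=9 * 2**40, deals=deals))
```

A bot posting a line like this back onto the original GitHub application at each checkpoint would give notaries the notification-driven view described above, including the spread across providers.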
The other thing, yeah, as Deep mentioned, I started building the stats here to spit out how all the notaries were progressing through issues. Basically, this time-to-DataCap is really important, and I wanted to start finding where people are getting caught up in the process and try to reduce this to, you know, a day, or days at maximum, because when you're a new user on Filecoin, it's a really bad user experience if the first thing you have to do is wait weeks to test your idea. I think most of this is now covered in the dashboard that's been built. The only thing that maybe is different and interesting is that this runs daily now, and it builds up a history of these stats, so if you ever want to go and do some fun analytics, you can go check it all and see how we've been progressing through this. I'll just leave this running for now; maybe it will become redundant with other places this history is being captured, but it could be an interesting data source for people who want to dig into what's going on. Some other things that I think are really important: two things combined.
I found that having to review people applying for small DataCaps is sort of pointless; we should just be letting them play on the network. So we got this approved, which bumped up the Glif DataCap deployment to 32 gigabytes, which is, I think, a really reasonable amount to start playing with. At least they get to fill a sector or send off a few deals. I think it could actually go higher; I think we would be pretty safe going higher.
I did the calculation someplace in here, actually, I think I shared it: you'd have to create something like 30,000 fake GitHub accounts to get a petabyte of data onto the network with the 32 gigabytes. So you could imagine we could probably bump that by 10x or 100x and be pretty safe. Anyway, that's for consideration later.
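That back-of-the-envelope figure checks out, assuming a flat 32 GiB grant per account:

```python
# Rough check of the claim: with a 32 GiB automatic allocation per GitHub
# account, how many fake accounts would it take to put a petabyte of
# DataCap onto the network?

GIB = 2**30
PIB = 2**50

per_account = 32 * GIB              # 32 GiB granted per verified account
accounts_needed = PIB // per_account
print(accounts_needed)              # 32768, on the order of the ~30,000 mentioned
```

By the same arithmetic, a 10x bump (320 GiB) still requires over 3,000 accounts per petabyte, which is the basis for the "pretty safe" claim.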
Then there's this FIP, which would allow the top-up of DataCaps, and the reason that's critical is that those two things can combine: we could actually start letting people begin with small DataCaps, and come up with a process where they can automatically top up based on outputs like this, outputs that actually show the distribution of their data across a fair set of miners and that they're using the network in an interesting way. Those things all really excite me.
So hopefully this rubric can get slimmer and slimmer, and we can base it more on these automated processes that are connected more to the economics of Filecoin. And with that, I guess those are all my points. I'm very excited for the future, and I really want to start a conversation about these rubrics, getting new notaries onboarded, and keeping that client experience minimal and short.