From YouTube: Digital Identity Attestation (February 17, 2021)
A
Other things, just in case somebody decides to send me the great grand secrets while I'm sharing my screen here.
A
All righty, okay. So in theory we can all see your screen, see your slides. Can someone else confirm to me? I...
D
We tried a number of different solutions, did a lot of research. Constantine's feedback, although indirect, was crucial to getting to where we are right now. So I'm going to run through, like, the first, I don't know, ten slides or so pretty quickly. Just know that in summer 2019 and summer 2020 we had student mentors that tried a bunch of things, right? So, next slide.
D
So the goals when we started were to modify git to give it the ability to sign and verify signatures of any kind, right, not just gpg. And as a secondary goal, what we wanted to do was normalize the command line switches and configuration setup for handling signing and verifying, because there's a lot of legacy, you know, --gpg-this and gpg-that, in both the command line and the configuration. But we're absolutely going to maintain backwards compatibility; that's just a hard requirement, always. So if you had existing tooling and you were using the old config values or the old command line switches, it would still work, but it would assume the same defaults. So gpg would always be one of the options, and if you used --gpg-sign, it would configure git as if you had used the new system and defined gpg. Alright, next slide. So our approach was to abstract away... so this was our first approach.
D
Okay, two summers ago we wanted to abstract away the signing infrastructure so that it was sort of a driver model, right. It's kind of like what the kernel does: you had a struct with function pointers, then we had implementations of that struct with concrete functions for each of the methods.
D
We
succeeded
in
doing
a
lot
of
cleanup
around
the
config
switches
and
the
the
sorry
the
command
line
switches
in
the
config
settings,
and
we
wanted
to
make
the
signatures
self-describing
in
the
sense
that
they
would
identify
what
signing
tool
they
used
and
what
tool.
So
then
you
can
then
git
could
infer
what
tool
to
use
and
all
those
things
okay
and
then,
as
far
as
preserving
backward
compatibility,
we
converted
the
existing
code
for
gpgsm
and
gpg
into
drivers
that
followed
this
thing,
this
this
new
abstraction
next
slide.
D
So I think you guys all kind of know how signing works, right? Git pipe-forks to gpg; it's hard-coded. It uses standard in and standard out to pipe the contents that need to be signed to gpg, gpg signs it, and the signature is sent back to git over gpg's standard out, which is git's standard in. Then any failures are represented in the exit code and standard error messages, right. And then there is some really... well, let's see, I don't want to judge it, because it was done and it works, but it hard-codes a hand parser, looking for gpg sig lines and things like that, or, sorry, messages in the standard error log, to decide whether it worked or not, right. Okay, next slide: verification. Same kind of thing: it uses string matching functions to detect the signature.
D
That's
in
the
git
object
extract,
it
stores
it
in
a
temporary
file,
then
pipe
forks
to
gpg's,
verify
command
and
then
oh
wait.
I'm
sorry!
I
made
a
mistake:
it's
the
verification
that
has
the
hand
coded
parser
for
the
phrase.
You
know
good
sig,
the
string
good
sig
to
determine
if
the
signature
was
good
or
not
so
it
works,
but
there's
kind
of
a
breakdown
around
that
in
terms
of
the
signature,
parsing
and
stuff
like
that.
D
So if gpg changes its error messages for any reason, it would break its integration with git, because it's hard-coded. Okay, next slide. So yeah, in 2019 the mentor's name was Ibrahim. He was great.
D
Approach one, okay, right: so this was abstracting away into the drivers. And then the point here is at the bottom: those are actually live links to the submissions to the mailing list that contained his patch set, and most of this is reusable and probably should land with whatever happens in the future, because patches one, two, and three, I believe, clean up and normalize the command line switches and the configuration stuff. All right, next slide.
D
This was just a picture that he gave in his presentation. Again, we abstract out a giant generic signing interface and had drivers for each of the different signing tools. Next slide. We cleaned up config, so it looks something like this right now. We later found out through research that this was not sufficient, and I'll explain that later; we tried for simplicity, but it needs to be a little more complicated than this. All right, next slide. So then in 2020 we had another project, with Jimit.
D
Oh yeah, sorry, before we move on to 2020: the result of the submission to the git mailing list was "we don't want to have drivers written in C," because of the fear of an explosion of C code that would have to be vetted. Because this is the signing infrastructure, it would require extra security checks and extra maintenance, and they were afraid of, you know, orphaned C code, which is a lot harder to deal with than orphaned external scripts.
D
Okay, so moving on to 2020: we had Jimit. Jimit picked up the existing patches, spent a long time trying to understand them, going through all of the mailing list feedback, working with several of the key maintainers in the git project, trying to negotiate: okay, we can't do drivers, so what are our options? So next slide, please. What are our options?
D
Which
one
should
we
push
forward
on
and
then
that
took
long
enough
that,
in
the
end,
what
jimin
ended
up
doing
was
working
closely
with
me
to
come
up
with
a
proposal
and
that's
what's
the
rest
of
this
slide:
deck,
okay
and,
of
course,
constantine's
feedback,
and
a
couple
of
your
like
one
of
your
blog
posts
about
carving
commits
into
sig
chains
or
something
like
that
also
influence
the
thinking
here.
So
thanks
constantine.
D
So initially we wanted to try to standardize the pipe-fork interface: basically, we would accept the gpg pipe-fork interface that's hard-coded now as the de facto standard for talking to external tools, and we were going to write bash scripts, essentially, that would mimic gpg's behavior but for other signing tools. We tried to do this; we went down...
D
So then we looked for other options, and we did a big survey of different signing tools, and the one we landed on is how gpg handles coordinating between the pieces in their ecosystem: what gpgme uses.
D
So we did a lot of research and basically said: look, there's a small set of fundamental problems. Okay, git has to be able to identify what needs to be digitally signed, and with what identity. Now, it already knows how to identify what needs to be signed, but identifying the identity of the signing key, or the signer, is hard-coded right now as name and email, because that's what gpg uses. We can turn that into a generalized string that identifies the signing key for any arbitrary signing tool. So that's solved.
D
Okay,
then
the
data
needs
to
be
sent
to
the
signing
tool
to
be
signed,
and
then
the
the
result
needs
to
be
read
back,
so
the
resulting
signature
needs
to
be
read
back
and
the
and
the
success
or
failure.
Okay,
now
the
azure
one
protocol
solves
that
it's
very
clean.
D
D
In
a
way,
so
every
digitally
signed
object
needs
to
say
well,
okay,
I
was
digitally
assigned
with
this
tool
with
this
identity,
and
these
are
the
options
that
need
to
be
passed
to
a
verifying
tool,
along
with
the
signature,
along
with
the
identity,
to
for
the
verification
tool
to
do
its
job.
So
we
have
to
transmit
data
through
time.
D
Essentially
and
again,
the
azure
protocol
supports
this
and
then
the
other
problem
for
verification
is
like
I
said
you
have
to
send
the
signature,
the
data
that
was
signed,
the
identity
of
the
signer,
plus
the
options
needed
for
that
particular
signing
tool,
and
it's
the
options
that
are
the
key
to
making
constantine's
concerns
his
requirements,
something
we
can
meet
with
this
design.
Okay,
I'm
going
to
pause
there.
Any
questions
at
this
point
before
we
dive
into
some
examples.
C
That's the description of what we're trying to accomplish. Basically, it's a way to attest the patches outside of git, so that before they actually make it into the git repository, how do we make sure that patches are not modified, or, well, how are we able to give that attestation to developers.
D
Right, right. And I will get to a very specific example that I think meets Constantine's requirements, because he's proposing using minisign, which is a different signing tool, and with the limitations of email submission, right, I think we can meet his requirements. Okay, so next slide.
D
The config, the config shown, is a simplification for a protocol-based approach. It looks something like this, where you have a section (these names are arbitrary; I just picked "sig" because it was short and easy) and then subsections for each tool.
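A hypothetical sketch of what such a config could look like; the section name "sig" comes from the talk, but the key names and paths here are illustrative assumptions, not the actual proposal:

```
[sig]
	default = gpg

[sig "gpg"]
	program = /usr/bin/gpg
	options = min-trust-level=marginal

[sig "minisign"]
	program = /usr/local/bin/git-sign-minisign
	options = sec-key=/home/user/minisign/minisign.key
```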
D
Verbatim
now,
the
ones
in
the
config
or
from
the
command
line
get
past
verbatim
to
the
signing
tool.
Then
the
signing
tool
is
responsible
is
responsible
for
passing
back
options
that
will
get
stored
in
the
signed
object.
That
will
then
later
be
passed
verbatim
to
the
verification
tool
and
that's
how
we
connect
the
dots,
okay-
and
I
wanted
to
highlight
here
for
in
the
config
that
you
can
have
a
single
tool
that
handles
both
signing
and
verification.
D
This is how this is done in, like, banking systems and existing certificate authorities and things like that: they have hardware HSMs that are only accessible over the network, and they have to use, like, TLS connections. So the cert there is the client cert to use for the TLS connection, for authentication and authorization, and then the option, obviously, is saying: hey, go ahead and use, say, the Hyperledger TLS certificate to sign the data object. Okay, next slide.
D
The
signing
tool
is
the
server
it
uses,
options
to
pass
those
options
from
the
config
and
command
line
to
the
signing
tool
uses
the
sign
command.
Okay,
let's
go
to
the
next
slide
here.
I'll
give
you
a
better.
I
think
that
the
protocol
will
give
you
a
better
understanding
here
so
lines
that
start
with
s.
Colon
are
the
server
which
is
the
signing
tool.
D
In
this
case,
c,
colon
is
the
client
which
is
git
in
this
case,
so
git
would
pipe
fork
and
the
upon
execution
of
the
signing
tool,
the
site
to
signal
that
it
was
ready.
It
would
send
okay
to
get
okay
and
then
git
says.
Okay,
here
are
my
options
and
it's
doing
all
the
options
from
the
confe,
the
config
line,
so
this
is
using
gpg.
This
is
an
example
of
using
gpg
and,
if
you
remember
from
the
config,
these
were
the
options
there.
D
It
says:
hey:
here's,
the
identity
of
the
signer,
here's
the
mint
truss
level,
which
is
this
is
for
backwards.
Compatibility
by
the
way
git
supports
this
already
as
a
command
line
switch.
So
that's
why
it's
here
in
this
example,
and
then
the
hard
coding
of
gpg
at
the
moment
causes
armored
detached
ascii
armored
detached
signatures
to
be
generated.
So
this
is
the
default
options
for
gpg
in
this
new
system
to
support
backwards
compatibility
with
existing
signatures.
D
Okay, so after doing all those options, the server is confirming OK, OK, OK for each one. Git then issues the SIGN command, and then it issues D lines, which are the lines of the data to be signed. There's a maximum number of characters per line, and I don't believe there's any encoding necessary; it can just go verbatim, line after line after line. There is a convention for passing binary data, yeah, because gpg does this, right? It passes blobs between different tools over the Assuan protocol. Okay. So then, when it's done sending all the data lines, it sends an END command, which tells the signing tool: okay, that's all the data, you have all your options, go ahead and sign it.
D
The
server
then
replies
immediately,
so
the
signing
tool
applies
immediately
with
data
lines
and
it
starts
with
the
options
that
need
to
be
in
this.
The
options
that
need
to
be
stored
in
the
signed
object,
along
with
the
signature
itself,
and
the
idea
here
is
that
these
lines
will
get
stored
verbatim
in
the
the
get
object.
D
So
these
make
the
signatures
self-describing
sig
type,
it's
an
open,
pgp
signature,
sig
options
is
the
list,
comma
separated
list
of
options
that
must
be
passed
to
the
verification
tool
and
then
the
sig
lines
are
the
lines
of
the
signature
data.
This
is
very
close
to
how
it
works
already,
except
we
don't
have
self-describing
and
don't
have
options,
so
that
would
be
something
new
and
then,
after
the
signature
was
done
or
it
was
done,
sending
the
signature.
D
It
says:
okay,
the
cl
the
client
git
says:
buy
the
service
is
okay
and
then
the
egg,
the
signing
tool
will
exit
you'll
get
an
exit
return
on
whether
it
succeeded
or
failed,
and
at
that
point
you
would
be
able
to
take
the
replies
and
parse
them
and
show
an
error
or
something.
So
if
you
get
a
return
code,
that's
less
than
or
that's
greater
than
zero
or
not
zero.
I
mean,
then
you'll
have
received
error
lines
in
the
protocol
that
you
could
then
display
to
the
user.
D
The Assuan protocol will reply with E lines, carrying error codes, if there are errors.
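Putting the pieces described above together, one signing exchange might look something like this. This is a reconstruction from the talk, not the actual wire format: the command names (OPTION, SIGN, END, BYE) are taken from the description, while the specific option names and data values are illustrative assumptions.

```
S: OK
C: OPTION sig-key=author@example.com
S: OK
C: OPTION min-trust-level=marginal
S: OK
C: SIGN
C: D tree 4b825dc642cb6eb9a060e54bf8d69288fbee4904
C: D author A U Thor <author@example.com> 1613581200 -0800
C: END
S: D sig-type: openpgp
S: D sig-options: min-trust-level=marginal
S: D -----BEGIN PGP SIGNATURE-----
S: D ...
S: D -----END PGP SIGNATURE-----
S: OK
C: BYE
S: OK
```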
Okay, next slide: verification. Basically, git parses the sig-type, sig-options, and sig-key. It takes the config switches and the command line switches to figure out what command to run for verification, based on the type of signature, and then it pipe-forks, passes the options from the signing tool, and, if there's a sig-key, it will pass the key directly; otherwise it does the identification. And then it sends the signature, and VERIFY, and passes the data.
D
So here, let me show you an example of that: the verification of the signature we just created. Next slide, please. It's similar, right: "S:" is the server, which is the verification tool, in this case gpg; "C:" is git, the client. It's going to pass those options that were stored in the git object, so min-trust-level equals marginal, passing those verbatim to the verification tool.
D
Then it passes the signature lines, which it pulled out because those were the sig lines stored in the object, and then it issues the VERIFY command, followed by the D lines containing the data of the signed object. And then the server will reply with either D lines or E lines: D lines will be its verification message if it's successful; E lines will be the error associated with the signature.
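Again as a reconstruction (the SIGNATURE/VERIFY command names and all values here are assumptions based on the description), the verification exchange for the signature above might look like:

```
S: OK
C: OPTION min-trust-level=marginal
S: OK
C: SIGNATURE
C: D -----BEGIN PGP SIGNATURE-----
C: D ...
C: D -----END PGP SIGNATURE-----
C: VERIFY
C: D tree 4b825dc642cb6eb9a060e54bf8d69288fbee4904
C: D author A U Thor <author@example.com> 1613581200 -0800
C: END
S: D Good signature from "A U Thor <author@example.com>"
S: OK
C: BYE
S: OK
```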
D
So if the signature doesn't check out, or it was done with an expired key, or anything like that, you would get E lines. And then again we have the OK, BYE, OK. So let's go on and look at something a little more interesting; go to the next slide.
D
Please
so
constant
wanted
to
and
correct
me
if
I'm
wrong
constantine
okay,
you
want
to
have
email
submitted
patches
that
were
signed
using
minisig,
which
produces
very
small
compact
signatures,
and
you
potentially
also
want
to
link
you
know
because
mini
six
supports
having
verified
comments.
D
I would have: hey, here's my signing key, and there's the path, /home/user/minisign/minisign.key, so that gets passed to minisign. And then the next option is: hey, use this regular expression to extract a string from the data that is being signed, and make that part of the cryptographically verified comment that's associated with the minisign signature. This is a feature peculiar to minisign, and so in this case the verified-comment regular expression is there to extract the Signed-off-by line, okay.
D
So then my git tool would issue the SIGN, and this would be part of, say, git format-patch, for instance. So it would be giving the lines of the formatted patch as it would be put into an email, wrapped as an RFC 2822 message, an email message. And what we get back is again the sig-type line, which identifies this as a minisign signature.
D
The sig-key in this case is the public key needed to verify the signature, because minisign doesn't use a complicated signer identification; it's just "give us the public key," or, you know, the key pair, to sign or verify. Okay, so in this case we would be putting the public key of the signer into the signed object, and then the signature itself would also go directly into the object, and then you get the OK, BYE, OK.
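A sketch of how the minisign case described above could look on the wire. The option names (sec-key, verified-comment-regex) and all values are hypothetical, invented here to illustrate the idea of tool-specific options flowing through the same protocol:

```
S: OK
C: OPTION sec-key=/home/user/minisign/minisign.key
S: OK
C: OPTION verified-comment-regex=^Signed-off-by:.*$
S: OK
C: SIGN
C: D From: A U Thor <author@example.com>
C: D Subject: [PATCH] example change
C: D ...
C: D Signed-off-by: A U Thor <author@example.com>
C: END
S: D sig-type: minisign
S: D sig-key: <base64-encoded public key>
S: D <base64-encoded minisign signature>
S: OK
C: BYE
S: OK
```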
C
...the difference between the two, because version one basically just failed to be adopted, because I was trying to be too polite and saying: let's not bloat the headers of the emails, and have external documents for verifying patches. And for everybody else it was basically, yeah, it failed to take off. So for version two we just said: fine, you want to bloat the headers, so we're putting all the attestation information into the header of the email.
C
That was the biggest problem: preserving signatures between git and email and then git again, right. So currently the way it happens is: somebody writes code, they commit it to their local git repository, then they do git format-patch and send it off to the maintainer to look over. But we cannot directly translate that back into a git object, because almost always, 99% of the time, the maintainers of the project will want to add things to the commit message or to the sign-off.
C
You
know
reviewed
by
signed
off
by
and
all
that
stuff.
So
that's
where
that's
where
this
breaks
down,
because
I
I
had
the
same
idea:
can
we
can
we
preserve
git,
commit
information
between
through
the
the
entirety
of
this
process?
You
know
between
git,
then
to
email
them
back
to
git,
and
this
this?
The
response
to
this
was
that
this
is
one
not
possible
and
to
not
need
it.
C
Nobody
wants
this
because,
because
this
creates
the
problem
for
them,
that
they
can
no
longer
modify
the
commit
message
or
make
slight
minor
changes
to
the
code
like
for,
for
line
breaks
or
for
any
any
other
kind
of
minute
changes.
Everything
would
require
them.
I
found
this
tiny
error
or
missing
semicolon
or
bad
formatting,
please
resubmit,
and
that
just
creates
too
much
back
and
forth.
Nobody
was
really
interested
in
that.
C
So
the
what
we're
trying
now
to
accomplish
and
say:
okay,
so
we're
not
going
to
preserve
the
this
through
the
git
object
completely
through
through
email.
C
What
we
want
to
do
is
just
provide
the
link
to
with
based
on
the
message
id
to
where
somebody
can
look
at
the
the
entirety
of
the
message
and
then
to
whatever
attestation
verification
that
they
want
to
do
and
when
the
maintainer,
when
they
receive
the
patch,
they
will
be
able
to
run
local
verification
just
to
make
sure
that
at
this
stage
they
can
verify
that
this
was
submitted
by
the
developer
or
the
domain
of
the
developer
and
yeah.
C
Not even that, yeah. Well, we're pulling public keys from the DKIM record, so almost all email these days is DKIM-signed. DKIM is not fantastic for this, because it does some normalization that actually breaks stuff like Python, right: you can submit two different patches...
C
They
can
have
the
same
hash
because
the
all
the
white
space
would
be
collapsed
into
one.
You
can
imagine
that
for
python,
for
example,
a
return,
that's
indented
versus
return,
that's
unindent
and
can
have
a
completely
different,
meaning,
meaning
yeah,
and
there
could
be
a
security
problem,
but
the
yeah,
but
but
if
somebody
wanted
to
submit
a
gpg
or
any
other
kind
of
of
signatures,
they
are
able
to
do
so,
and
the
link
that
I
provided
there
we
look
into
each
of
the
scenarios
there
we
can
say
we
can
sign
this
directly
with
dkim.
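The whitespace-collapsing problem can be shown concretely. The following is a minimal sketch of DKIM's "relaxed" body canonicalization (a simplified reading of RFC 6376, not the full algorithm): two valid Python fragments that differ only in indentation depth, and therefore in meaning (the first prints on every loop iteration, the second prints once after the loop), canonicalize to identical text.

```python
import re

def dkim_relaxed_body(body: str) -> str:
    # Simplified RFC 6376 "relaxed" body canonicalization:
    # collapse runs of spaces/tabs within each line to a single space,
    # strip trailing whitespace, and drop trailing empty lines.
    lines = [re.sub(r"[ \t]+", " ", line.rstrip()) for line in body.split("\n")]
    while lines and lines[-1] == "":
        lines.pop()
    return "\n".join(lines) + "\r\n"

# print(x) inside the loop body (8 spaces) vs. after it (4 spaces):
patch_a = "def f(x):\n    for i in range(3):\n        x += i\n        print(x)\n"
patch_b = "def f(x):\n    for i in range(3):\n        x += i\n    print(x)\n"

assert patch_a != patch_b
# Both indentation runs collapse to a single space, so the canonical
# forms (and therefore the DKIM body hashes) are identical:
assert dkim_relaxed_body(patch_a) == dkim_relaxed_body(patch_b)
```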
C
We can sign this with PGP, we can sign it with minisign, and we can sign it with basically anything that we want, as long as we actually specify which one it is in the headers of the message. But we don't try to preserve this between, you know, git and the mailing list and git again; the objects will almost always be broken, so we don't.
D
Okay, so maybe I don't want to go down this road, since this is based on old information, but that still doesn't change the desire to normalize the interface between git and external signing and authentication, or, sorry, verification tools. Yeah.
C
Sure, no, this is correct. Being able to, for example, sign with minisign or with any other signature mechanism would be much better than just having gpgsm or OpenPGP, because with minisign, the upside is that, yeah, it is much, much simpler, and there are ways of distributing public keys via, like, DNS records, because the key is tiny enough to do this.
D
Anyway, I think that's the end of my presentation. I do want to follow up with you, Constantine, and see what you proposed in your second email, and see where the community is at right now, to see if there is anything that would modify this design. This is just a proposal to be able to teach git how to talk to any signing or verifying tool in a standardized way, and then be largely agnostic to what comes back from the signing tool and is stored in the object, and then to what it passes to the verification tool. So this is basically hacking git into being the tool that transmits signatures, and the associated metadata, through time to the verifier, right, and it just has a standard abstracted interface.
D
So I was going to say one last thing: Hyperledger's interest here is that we are moving towards self-sovereign identity, which is decentralized digital identity, and there have been proposals (written by myself, to be honest) to use git repos as the PKI mechanism and to make git repos fully self-verifying.
D
So
the
the
records,
the
provenance
of
digital
identities
and
signing
keys
is
tracked
in
the
git
repo
itself,
using
files
or
metadata,
or
anything
like
that.
I
that's
totally
separate
thing,
but
the
first
step
is:
we
should
teach
git
how
to
use
any
signing
tool
because
there's
significant
desire
for
this
across
industry
for
many
different
reasons-
hyperledger
is
just
one
get.
Our
google
has
their
own
gitlab
has
their
own
ideas.
The
lf
actually
has
their
own
ideas.
D
We
want
to
be
able
to
track
the
provenance
of
you
know,
using
digital
signatures
to
do
dco
and
cla
tracking
of
contributions
so
anyway
I'll
end
it
there
and
then
take
questions.
A
Yeah, so I have two, although, you know, I can take turns. So I had one just in general about the Assuan protocol. Do you have a URL? I presume it's written down somewhere. What's the license of that? Are there some legal issues we need to be worried about?
D
It's in gpgme already. I don't know what the license is; whatever governs gpgme. It's GPL, let's see here; it's published by the Free Software Foundation. I can post the link to the documentation real quick in the chat. We should do some investigation; if this doesn't work for some reason, licensing or whatever, I just propose we develop our own protocol and release it under Apache 2 at the LF, or whatever, or as part of the git project, and just say "we..."
A
Oops. So it's posted, it's posted in the chat.
A
In general, I will say, for your mods to git: the general policy, at least in the OpenSSF original technical charter, and I think what people generally do anyway for all of these things, is, you know, if we're contributing to an existing project like git, then whatever that license is, as long as it's an open source license, go, because there's no point in making code that won't be accepted because of licensing issues.
D
So far, no code has been written other than the first three patches from Ibrahim's patch set two summers ago, which largely do the yeoman's work of cleaning up the config and the command line switches. The code that normalizes the signing interface (that's in patch four, I think) is a good jumping-off point for cleaning up the signing interface and making it use a protocol, rather than the hard-coded pipe-forking to gpg.
C
So libassuan itself is GNU GPLv3, but that's the library and not the protocol. I'm not sure there is a separate license on the protocol, or how the library license would, you know, be applicable to the protocol itself or not, but it's v3, so theoretically the protocol is covered by it as well. I don't actually remember how that's handled. Yeah.
C
I have a comment about the submissions, because I think they were fairly well received by the git community, but the fact that it's coming from an intern, or somebody who says "you know, I'm just a student working on a summer project," I think that gives it sort of an amateurish patina, if you like. I think it needs to come from a more solid group saying: you know, we would like this implemented, for the following reasons.

D
Right. So there is explicit support from the Google FIDO team; I've talked to a number of them, we've had several meetings. They want to use this for, obviously, their external signing hardware and infrastructure. And yeah, I mean, me as a Hyperledger employee: we have industry partners like IBM and SecureKey who have interest in this as well. So at this point, you're right, it really just needs to have some broader industry buy-in, for whatever reason; it doesn't really matter what their reasons are.
D
If Google says "yes, we'd like this, we'd use it," and if IBM says "yes, we'd like this, we'd use it," you know, I think it would stand a good chance, especially if it got a lot of scrutiny and we went through the process of, you know, whatever the future process is, where it's a proposed addition: you can compile git with the support, we can test it out in industry and maintain it until everybody's like "yeah, we think we've worked out all the kinks," and then merge it upstream to the main branch.
D
Let me code this up and float it in front of, like, the FIDO team and the teams over at IBM. But currently, I'm, personally... Thursday is my last day at Hyperledger and at the Linux Foundation. I'm moving on to take some time off to take care of family. You know, my father's passing away; he's dying of dementia.
D
My mom needs help; they're elderly, and so I'm going to take an indefinite period to take care of my parents. But I will be on email, and I can still come to meetings. I would still be very much interested in being part of this and shepherding it through. I don't know that I have time to write the code, though.
B
Sorry to hear that, David, but thanks for...
D
I mean, the full story, you guys... you know, I seem very flippant, but my dad had a really bad head injury when I was 11; he had an on-the-job-site accident, and I was told by doctors that day that he wouldn't make it through the night. So I'm thankful to have had him for the last 30 years.
B
Maybe offline you can tell me who on the Google FIDO team you've been working with, so maybe I can follow up. I mean, I think a lot of us want to see this implemented. It seems like you need to take some time off, so I'm curious if you have other folks in mind that might be able to do the work.
D
No, I was gonna... I'm still able to propose (you guys aren't gonna like this), I'm still able to propose Hyperledger mentorships, as part of, you know, being an alum and part of the Hyperledger community. So, you know, I could sponsor one and mentor somebody this summer; they could, you know, be my hands. But I don't know, I just can't promise it. I've been trying to just say: look, I'll just sit down and write the code, right.
D
There's a significant community in the decentralized identity world that wants this, and I know that there are a lot of people who want to build self-verifying git repos. I've talked to several people at GitLab that want to build that into their toolchain, so that you can clone a repo and it can immediately go: yep, that whole thing is verified, you have the canonical version, the maintainers stand behind it, right, everything checks out, before you even try to build anything.
F
...sauce on that. I mean, as soon as you started talking about, like, "hey, you can pass options into the signing process, and then those options will be signed over and contained in the object, to be used for verification": a classic example of this, well, one that got a lot of attention anyway, was with the JOTs, the JSON Web Tokens, where the same sort of thing happened. You'd pass in headers, and those headers would be used to configure the verification mechanism.
F
One
of
the
headers
was
the
algorithm
that
you
were
using,
which
means
you
could
you
could
downgrade
the
quality
of
the
hash
algorithm
from
say,
like
a
strong
sha,
algorithm
to
say,
md5
and
then
to
make
matters
much
worse?
They
then
later
added
as
an
option
algorithm,
none,
which
meant
there
was
just
no
signature
at
all.
So
someone
could
then
take
a
jot
library,
and
you
know
rip
the
signature
out
put
al
nun
in
there
and
it's
going
to
verify
verify
I'm
going
to
naive,
verifier.
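The pitfall being described can be sketched in a few lines. This is a deliberately broken toy verifier (using HMAC-SHA256 and hand-rolled base64url, not a real JWT library): because it trusts the attacker-controlled "alg" header, a token with the signature stripped and "alg": "none" substituted still passes.

```python
import base64
import hashlib
import hmac
import json

def b64u(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def b64u_decode(text: str) -> bytes:
    return base64.urlsafe_b64decode(text + "=" * (-len(text) % 4))

def make_jwt(payload: dict, key: bytes) -> str:
    header = b64u(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64u(json.dumps(payload).encode())
    sig = b64u(hmac.new(key, f"{header}.{body}".encode(), hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"

def naive_verify(token: str, key: bytes) -> bool:
    # BROKEN on purpose: it trusts the attacker-controlled "alg" header,
    # which is exactly the historical JWT pitfall described above.
    header_b64, body_b64, sig = token.split(".")
    header = json.loads(b64u_decode(header_b64))
    if header["alg"] == "none":
        return True  # "no signature at all" is accepted
    expected = b64u(hmac.new(key, f"{header_b64}.{body_b64}".encode(),
                             hashlib.sha256).digest())
    return hmac.compare_digest(expected, sig)

key = b"server-secret"
good = make_jwt({"user": "alice", "admin": False}, key)

# The attacker strips the signature and rewrites the header to alg=none.
forged = (b64u(json.dumps({"alg": "none"}).encode()) + "."
          + b64u(json.dumps({"user": "alice", "admin": True}).encode()) + ".")

assert naive_verify(good, key)    # the legitimate token verifies
assert naive_verify(forged, key)  # but so does the unsigned forgery
```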
D
Right. So the way you do this is that any inputs to a verification tool have to be covered by the signature, so any modification of them would cause the signature to fail, regardless of any potential downgrades, right. So you would pass the options through. So if you go back to slide... let's see, what slide is it?
D
Right, you can see (I'll wait for you, dude): so if you look, it's passing "OPTION min-trust-level=marginal". That line, trimmed, so everything like the "OPTION " prefix would be trimmed off, and anything after "marginal" would be trimmed off, could be seen as the first line of the data that is being verified.
D
Okay, the reason they're in this order is because it makes sense to someone debugging it, but what you're really doing is: that's the first D line that would normally appear after VERIFY. Now, if we wanted to normalize this, we would just put, you know: okay, SIGNATURE, the D lines of the signature, then VERIFY, and the first D line of VERIFY would be "min-trust-level=marginal," and then the lines of the signed data object, so that the signature itself covers, you know, includes, the options used for verification.
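The "options must be covered by the signature" idea can be sketched like this, with HMAC standing in for the real (asymmetric) signature scheme; the option strings and data are illustrative. Prepending the verification options to the signed payload means any later tampering with the stored options invalidates the signature.

```python
import hashlib
import hmac

def signed_payload(options: list[str], data: bytes) -> bytes:
    # The verification options become the first "D lines" of the signed
    # payload, so the signature covers them as well as the data.
    return b"".join(opt.encode() + b"\n" for opt in options) + data

def sign(key: bytes, options: list[str], data: bytes) -> bytes:
    return hmac.new(key, signed_payload(options, data), hashlib.sha256).digest()

def verify(key: bytes, options: list[str], data: bytes, sig: bytes) -> bool:
    expected = hmac.new(key, signed_payload(options, data), hashlib.sha256).digest()
    return hmac.compare_digest(expected, sig)

key = b"shared-secret"
data = b"tree 4b825dc642cb6eb9a060e54bf8d69288fbee4904\n"
sig = sign(key, ["min-trust-level=marginal"], data)

assert verify(key, ["min-trust-level=marginal"], data, sig)
# A downgrade attempt that rewrites the stored options breaks the
# signature, even though the data itself is unchanged:
assert not verify(key, ["min-trust-level=never"], data, sig)
```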
F
I guess I'm wondering, like... this stuff is subtle.
C
Part of it... it's also part of the git commit, so it will also be hashed, and the chain would then be broken: you can't modify any of the options in the previous commit, right, so then your git repository will stop being valid. You could theoretically modify the latest, the tip commit, because that's not stored in the next commit yet, but then, you know, you can also modify any other part of the git commit, right, and it's kind of...
D
You're right. So the object hash is sort of a second factor. I mean, what you describe is theoretically possible; I'd have to dig through it. But digital signatures are not necessarily reliant on a hash. I mean, when you specify a hashing algorithm in a digital signature, you're not using modern digital signatures, because they don't do that; they use sponge functions, so...
D
All the other ones, they're not encrypting a hash anymore, and I think it's in response to things like what you're describing, actually, you know. So, if we're using modern signature algorithms, and the configuration of the verifying tool is covered by that signature... I mean, anything's possible, I won't ever say anything's perfectly secure, but I think that's probably good enough, because existing systems already do this, and they seem to think it's good enough.
D
Things like Secure Scuttlebutt do stuff like this, where the configuration of the verification is either hardcoded in the tool or is covered by the signature. And other secure systems do something very similar too, like the TLS handshake protocol: it's signed for the offers and acceptance.
B
D
Protocol negotiation. So that's really great; that's awesome feedback. I love that you're thinking that way, because, you know, I'm just one engineer, and I love having someone look over my shoulder and poking holes like that. I think what we should do is: I would love to just try to do a quick and dirty implementation of this and start iterating. Do an early release and start iterating; maybe set up a sub-branch in the git repo under the future branch or whatever it is.
D
What's git's mechanism where they have a whole branch of potential upgrades? That's how they handled the SHA-256 upgrade. What's it called? "next", I believe. Next, yeah.
B
D
A
This is a smaller point, and I think I've mentioned it to you before, David, but just quickly for the larger group: one challenge with the git configs, as you were showing them, is that, you know, most people, what they want to do is install software with package managers and not muck around with their config files to make things work.
A
I think this would be easily handled by also teaching the git program to look for a global directory, you know, like on a Unix-like system, /etc/gitconfig.d, and then all the configs in there. What that would mean is, instead of constantly trying to edit a config file, which is kind of risky, when a package gets installed it just installs: oh, here's another way to sign packages, and here's the configuration for how you do that with git.
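[Editor's note: this drop-in directory idea is the same pattern many Unix tools use. A minimal sketch of the merge semantics being proposed, assuming fragments are read in sorted filename order with later keys winning; git does not currently implement this.]

```python
import configparser
import pathlib
import tempfile

def load_config_dir(conf_dir):
    # Merge *.conf fragments in sorted filename order; later files override
    # earlier ones, so a package can drop in its own config without editing
    # any existing file.
    merged = configparser.ConfigParser()
    for frag in sorted(pathlib.Path(conf_dir).glob("*.conf")):
        merged.read(frag)
    return merged

with tempfile.TemporaryDirectory() as d:
    pathlib.Path(d, "10-defaults.conf").write_text("[signing]\ntool = gpg\n")
    pathlib.Path(d, "50-minisign.conf").write_text("[signing]\ntool = minisign\n")
    cfg = load_config_dir(d)
    print(cfg["signing"]["tool"])  # minisign
```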
A
D
Right, that's an awesome, awesome upgrade, and I don't know why git doesn't support this already. I see that as a parallel effort or a tangential effort, something that's not directly a dependency of this upgrade, but I think you're right: that would make it really awesome if it supported a gitconfig.d directory or whatever. That way, you know, I can install a package for minisig, and it can have a contrib package with git configs that just drop into place, right, for how to run the tool.
A
It's not dependent on it, but I think this is part of the overall... I mean, overall, not just this group, but all the OpenSSF working groups: I think we keep talking about not just making security possible, but making it, as much as we can, easy, or the default, right. Because, yeah: "oh, all you have to do is edit a configuration file", and you've already lost 90% of the people.
A
That's true! Every extra step is a disaster as far as I'm concerned. Every time we make it easier, you know, make that quote-unquote "easy button", we're more likely to actually get people to do it.
D
That's true, yeah, and the interactive package install stuff could be wizards, right. So, like, a dpkg-reconfigure, you know, minisig-git or git-minisign, right, would walk you through, you know, generating a key pair and pointing.
A
The first time you sign it: "oh, wait a minute, I don't see your key. Would you like to make one?" Yeah, exactly, you know. Exactly, I think that's exactly the way we need to be thinking: how can we make it so that there's, as close as we can get, zero pain, as automated as possible. We're never going to make everything automatic, but the closer we can get, the better it is.
C
A
Right, that's fair, but I think my point still stands: the easier we can make it, the better it is, and including that part: how do I get the word out that this is...
C
D
Exactly. So I don't want to dig into that, because I think there are modifications that need to be done. I think the DID method crap is overcomplicated.
D
I think the W3C came in and said it's a web technology and overcomplicated everything, you know, as they tend to do. The one thing that came out of that research and that work, though, that's really, really interesting to me, is that contributions to open source projects, digitally signed and stored in a git repo, are the only way I know of for there to be a human-scale proof of work.
D
So I could be, you know, a git developer who has published my keys in the git repo, like in some folder: hey, here's my signing key, and you can just call me Mickey Mouse, right (shout out to my friend Lawrence Lessig there). And, you know, I'm just known as Mickey Mouse, right, and here's the signing key, and over the years I make a number of really good, well-thought-out contributions to the project, signed by that digital identity.
D
Allow me to log in with my hacker reputation, for instance. So if the git repo maintainers have designations for people who have met some arbitrary threshold of being a consistent contributor or whatever, just give me the name "contributor", right: Mickey Mouse is a contributor for, you know, the git project, and here's a cryptographic proof of all of Mickey Mouse's contributions that earned them that.
D
So then, with that, I could use a URL to the canonical repo and my signing key to, you know, create a capability, you know, in a capability security model, to authenticate myself to, like, some open source developer service. Let's say it's a free email service that's only for open source developers and that does not screw up patch submission, for instance, right: it's configured so that patches pass through without any mangling, and it's only open to vetted open source developers.
D
You could see that you could use this service, or authenticate to this service, based on your reputation that's cryptographically verifiable in a repo. So it's the only way I can think of to do trusted anonymous digital identities, because you have to have a proof of work to defeat sybil attacks, and contributions over time, they're not easy, you know, getting them landed. We know this is hard, getting a patch into any...
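[Editor's note: the "contributor" designation described here could in principle be checked mechanically. A toy sketch follows; HMAC is used only as a stand-in for real public-key signatures (a real system would let anyone verify without the secret), and the threshold, names, and statement format are all assumptions for illustration.]

```python
import hashlib
import hmac

def sign(secret_key, commit):
    # Stand-in for a real digital signature: only the key holder can produce it.
    return hmac.new(secret_key, commit.encode(), hashlib.sha256).hexdigest()

def verify(secret_key, commit, sig):
    return hmac.compare_digest(sign(secret_key, commit), sig)

def is_contributor(key, signed_commits, threshold=3):
    # Grant the "contributor" designation once enough landed commits verify
    # under the pseudonymous key: contributions over time as proof of work.
    valid = sum(1 for commit, sig in signed_commits if verify(key, commit, sig))
    return valid >= threshold

mickey = b"mickey-mouse-secret"
history = [(c, sign(mickey, c)) for c in ["patch 1", "patch 2", "patch 3"]]
print(is_contributor(mickey, history))  # True
```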
D
It's just an interesting thing that we noticed came out of that research, and this project here would start to enable that. We could move into an area where we can have different personas for different projects, for instance, and so there are broader community and societal implications of that.
B
So wait, we have a minute left, and you just started on a topic that I'm sure we're going to have more discussions about in future meetings. One of the things is just making sure that Mickey Mouse and Daffy Duck aren't the same person. Like, how do we know if Daffy Duck and Mickey are the same person approving each other's commits? I don't know if that's possible in what you're describing.
D
Well, I mean, you would just have different signing pairs for them, and they could be the same person, but you can defeat that by doing, you know, social graphs, and there's other science about this. So, I mean.
A
I think that... I'm not aware of any way you can truly solve that just by digital keys. On the other hand, you can allow someone else to sign off and say, "I attest that these two are not the same", and now you're back to: do I trust the attester? Which is in fact a general problem for most of these; that's actually the normal case. How do I know that David Wheeler's David Wheeler? Well, the US government says I am. Well, do you trust the US government?
A
D
Well, in Hyperledger, our research basically got us to the point where we don't have to solve the ultimate trust problem. We just have to solve how to bind societal trust infrastructure to the digital trust realm. So a lot of what our work is focused on now is using regulated financial institutions to do KYC attestation, right, and that's good enough for governments. That's good enough for business. That's good enough for trillions of dollars in e-commerce.
D
I think that's good enough, right. So we could, in theory, have a bank, for instance, KYC me, KYC you, Kim, and then, without revealing identities, sign an attestation that my signing keys and your signing keys (I'm Mickey Mouse, you're Daffy Duck) are not the same person. That's all that matters: this bank is saying we're not the same person, and they could, under court order or something, under a warrant, reveal our identities, but that's not necessary for this.
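[Editor's note: the bank's "not the same person" attestation could be as simple as a signed statement over the two public keys. This is a toy sketch; HMAC stands in for the bank's real signing key, and the statement format is invented for illustration.]

```python
import hashlib
import hmac

# Stand-in for the bank's private signing key (a real system would use
# public-key signatures so anyone could verify without this secret).
BANK_KEY = b"bank-signing-secret"

def attest_distinct(pubkey_a, pubkey_b):
    # The bank, having privately KYC'd both key holders, signs a statement
    # binding the two pseudonymous keys without revealing any identity.
    a, b = sorted([pubkey_a, pubkey_b])
    statement = "distinct-persons:%s:%s" % (a, b)
    sig = hmac.new(BANK_KEY, statement.encode(), hashlib.sha256).hexdigest()
    return statement, sig

def check(statement, sig):
    expected = hmac.new(BANK_KEY, statement.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, sig)

stmt, sig = attest_distinct("mickey-pubkey", "daffy-pubkey")
print(check(stmt, sig))  # True
```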
D
You know, what we're trying to accomplish, which is integrity of an open source project... we're not designing military weapons or protecting, you know, dissidents or anything like that. You have to try to keep things in perspective when you do this. And, I mean, I spent two years going down all these rabbit holes with Hyperledger, and then we came back to: don't let perfect be the enemy of the good. Yeah, that's it.
B
Cool. Well, I've got to run, but thank you, thank you for the presentation. I think I'd like to follow up and figure out how we can push this along and figure out what's needed. So, okay.
C
If we get two things, one being the pluggable stuff that you talked about for signatures (and aswan or any other mechanism), and two, a way to get the developers' public keys in the repository itself, you know, with those we will accomplish great things already, because this will allow us to bootstrap trust based on the initial git clones.