From YouTube: Graph NFT Subgraph Community Call #1
Description
The first NFT Subgraph Community Call took place February 25th. Developers building NFTs with The Graph discussed their current projects, how they're using subgraphs, and what improvements can be made for the protocol to better support indexing and querying of data across chains.
The next NFT Subgraph Community Call will be March 4th at 10am PST. Join the community and learn more on Discord:
The Graph: https://thegraph.com/discord
The Graph NFTs: https://discord.com/invite/EdZE6EmVKC
A: Okay, great. So welcome, everybody, to the first NFT subgraph call.
A: So this was a group and a call that was put together relatively quickly. Alex Masmej created this NFT subgraph Telegram group, which has over 100 people in it now, and we decided to just get a bunch of people together so we can discuss some things live. So Alex, thanks for coordinating the group there. And I know there have been quite a few people in this group who have been really active in driving things forward for NFTs.
A: We really want to let people highlight their work; I'm just playing a supporting role here in getting the group together. So, for the agenda today, I wanted to start with a subgraph showcase, just so we can see what kinds of things people are already building on The Graph, and then discuss the open problems that people see and the areas for standardization.
A: What kinds of subgraphs need to be built, and any other work streams. Hopefully we can get to some kind of consensus on the biggest problems.
A: Cool, and we've got Paul with the quick trigger, so let's start with you.
A: Yeah, if you can. Let me know if I need to change something, but you should be able to share. Oh no, all participants. Here we go.
B: Okay. We've got Nick and Elpizo on the call too, so feel free to chime in. But here is the Foundation primary mainnet subgraph. It's currently missing some metadata; we're going to add more of that in soon. But here you can see some of the main entities we have: there are the accounts interacting with the Foundation subgraph, and if someone's a creator, there's a separate entity for that.
B: You can track what's happening with the NFTs here. We're doing our best to adhere to community standards, but we definitely want to drive that conversation forward, and we're open to tweaking these as people see fit. And then there's also auction data, so you can look at bids coming in, down in NftMarketBid.
A: Cool, awesome. Let's move on to the next one: Ryan Bell.
C: Sure. So we're doing NFT pricing, and I'm going to pull up this CryptoPunks one, if you've got my screen on here. So here's a standard we're working on for how we might be able to track different sale events and transfer events across these NFT entities.
C: So on here you can sort things by the average sale price or by the total number of sales, and you get analytics like the metadata off the CryptoPunks contract. So you can see, for example, that the Ape punks have a high average number of sales.
A: Cool. Yeah, just one thing I would point out here. I actually saw this for the first time with the Moloch subgraph, but I noticed that the metadata here is actually just a JSON object. For this kind of data, it's always good to parse it in the subgraph, so that you actually have a data object with those fields; then you can read the fields directly without JSON parsing on the client.
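The suggestion above, parsing the metadata JSON at indexing time so clients query structured fields instead of a raw blob, could look roughly like this. This is a plain TypeScript sketch standing in for an AssemblyScript mapping; the `TokenMetadata` shape and field names are illustrative assumptions, not any particular subgraph's schema:

```typescript
// Structured entity the subgraph would store instead of a raw JSON string.
interface TokenMetadata {
  name: string | null;
  description: string | null;
  image: string | null;
}

// Parse the raw JSON once, at indexing time, so clients can read
// `name` / `description` / `image` directly without JSON parsing.
function parseMetadata(raw: string): TokenMetadata {
  try {
    const obj = JSON.parse(raw);
    return {
      name: typeof obj.name === "string" ? obj.name : null,
      description: typeof obj.description === "string" ? obj.description : null,
      image: typeof obj.image === "string" ? obj.image : null,
    };
  } catch {
    // Malformed metadata: store nulls rather than failing the handler.
    return { name: null, description: null, image: null };
  }
}
```

The type guards matter because NFT metadata in the wild is inconsistent; a missing or non-string field becomes null instead of breaking indexing.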
A: Yeah, it's an interesting idea, and maybe that's the best way to do it, but it could be worth discussing what the different kinds are. I guess this is the point of standardization: if there can be a kind of type that you can switch on, then you could just parse the data itself.
A: It's always nice to have structured data, but this might be an indication of some standardization that could help in this area. Cool, awesome, thanks. All right, I've got James with KnownOrigin.
D: So I suppose we try and capture pretty much any on-chain activity that happens on the platform, from tokens being created, to this concept we have of a high-level collection of tokens that we call an edition, which we link through numerous fields, and we tap into the subgraph's search on that as well. From an implementation perspective, we capture offers: when they happen, who owns what. Again, we differentiate between creators and collectors; we just find that easier to work with. And we also have some pretty funky stuff, where we try to do per-day stats that we capture in the subgraph, which has some pretty wild logic, stolen from a C++ script, to work out whether the current timestamp is in a particular day.
D: I guess that's the KnownOrigin subgraph, really.
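For what it's worth, the timestamp-to-day check mentioned above does not need much logic if plain UTC day buckets are acceptable. A minimal sketch, assuming block timestamps in seconds since the Unix epoch (the actual per-day stats rules on the platform may differ):

```typescript
const SECONDS_PER_DAY = 86400;

// Map a block timestamp (seconds since the Unix epoch) to a UTC day index,
// usable as the ID of a per-day stats entity.
function dayIndex(timestamp: number): number {
  return Math.floor(timestamp / SECONDS_PER_DAY);
}

// Two timestamps fall in the same UTC day iff their day indexes match.
function sameUtcDay(a: number, b: number): boolean {
  return dayIndex(a) === dayIndex(b);
}
```

Using the integer day index itself as the stats entity ID also gives chronological ordering for free.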
A: Very cool, awesome, thanks. Next we've got Alexander from... is it Frosty?
E: Yeah, that's right. Let me share my screen.
A: And would you mind stopping your share? Yeah.
A: Okay, we can come back to you; you can go ahead and do that. All right, next let's see what Ronan has to share.
G: Yeah, so my subgraph is quite simple, but I can still share the screen. I guess it was the first generic subgraph deployed on mainnet on The Graph; it was made six months ago. I remember it took two months to sync, but it covered all the NFT standards, so its goal is to ensure that no one is left out. From my side, what is missing is basically some standards to add some things.
G: For example, I see in the other subgraphs that we saw a creator concept. Currently we are missing a creator standard. This, of course, is outside of The Graph, but I think if we as a community could push for having that, that would be great. I'll take the opportunity to mention what else I am missing: the other part, on the IPFS metadata, is something I would love to have, and it's something I requested a while ago, in issue 1077 on the graph-node repo: a callback mechanism for subgraphs.
G: I think that would solve the issue of metadata not being available at the moment of parsing. And apart from that, I would like to work on a combined ERC-1155/ERC-721 subgraph, because some contracts can support both standards, and there is this idea of a concept of collections that I hope to maybe publish somehow as part of the Sandbox work. Sandbox is one example of using a dual 1155/721. I guess that sums it up a bit.
A: Okay, awesome. Great input there, and we'll try to capture some of those suggestions for things that we can be working on.
A: And if Jerry isn't ready to go... you might be muted; we're not hearing anything. We'll give you two more seconds, and then we'll switch to Alexei from Rarible.
F: Yeah, I'm here, okay, so I could start then. Good to see you all; it's a lot of people here, which is great. So I have probably two links. One link is the repository for our subgraph.
F: It only gives you deals, a list of deals on Rarible, but what I would like to talk about is that we also created a project inside this repository, called nft-subgraph, and there are seven tasks there. We would love to reward the top contributors who close these tasks, and it's a bit of a different approach here.
F: One example I can share here is the EIP-721 subgraph. It looks like this subgraph allows you to query, like, all your... let's see, this is the one.
F: Right, so this is just one of the examples. We're very open to contributions and to doing what we can, and we can fund this project. Also, we'll probably have a final reward jointly with The Graph protocol, some kind of prize from The Graph and from Rarible, both of us.
F: So we are committed for $5,000 for this project, and The Graph is committed, I guess almost committed, for $10,000, so in total it's $15,000, and we would love to split this reward between the top five contributors to this repository. It would be great if we find a way to finish all these tasks.
F: Several NFT tokens, like... yeah, so here is more description, which Alex Salnikov provided here.
A: This is great. So maybe we can circle back around a little later in the call, when we start to define the future work that we want to take on here. But it looks like these are really well organized, and I'll open it to the floor to see if people have input on these specific issues.
I: Yeah, I was going to say quickly: I think attribution is a big thing here, because the creator is sometimes said to be the first minter, but sometimes, with lazy minting, the first minter is actually the buyer. So I think some coordination around who is the creator of the NFT versus the owner needs to be cleared up and coordinated around the industry. And then also media and market: I think Zora pioneered it, but I've seen the subgraphs on this call, and a lot of people are putting in the market, so I think there's also space to coordinate there, because it seems like this is where the space is going. And yeah, Alexei with Rarible has done the work the earliest, you really acquired it from day one, so I think your subgraph could be, like, one open standard that anyone can just pull-request into. So that's...
F: Yeah, this is what we would love to build: just support building a general subgraph for all the projects. This is just... we have to query all NFTs and, later on, support different sales, mostly sales contracts, to track omni sales from different platforms. But querying the owners of the different NFTs should be the base, with the specific contracts on top of that; it's just the whole space.
A: So, super promising work being done there. Let's just continue with the showcase, and then we'll start to capture some of these suggestions. So next is Hadrien with the 1155 subgraph.
J: I think it's something that I found very valuable, that most subgraph builders don't do, and you can basically just see it in this graph, for example. So this is an 1155, for example: it's a token ID where you can find the registry. If you go to the registry, you can have a list of all the tokens, which routes back here. You can have all the balances for all the accounts.
J
With
this
token,
if
you
go
to
account,
you
can
have
all
the
balance
for
all
the
registries
and
all
the
tokens
and
from
any
object.
Basically,
you
can
have
a
list
of
events,
for
example
all
the
transfer
from
that
this
balance
was
affected
and
from
the
transfer
form
you
can
obviously
get
the
transaction
id,
which
might
be
useful.
J
If
you
want
to
have
a
look
on
ether
scan
at
what's
happening,
but
you
can
also
have
a
look
at
all
the
the
particularities,
so,
for
example,
here
what
we
can
see
is
that
we
can
see
for,
like
the
first
five
accounts,
what
balance
they
own
so
which
token
registry,
and
which
so
we've
got
the
identifier.
The
registry
you've
got
the
value
of
this
balance
because
we
are
in
1155,
but
you
also
have
the
transfer
from
so
you
can
basically
know
which
transaction
we're
involved
into
building
this
balance.
J
Similarly,
this
is
a
721.
Obviously
there
is
no
balance,
but,
as
you
have
a
list
of
transfer
that
is
always
available
from
all
objects
or
list
of
transaction
events,
it's
always
available.
That
means
that
you
can
sort
by
timestamp
to
detect
all
events
that
affected
or
token
or
a
user,
or
something
like
that.
So
I'm
not
saying
that
that
basically
here
I
invented
something
perfect,
because
obviously
I'm
not
capturing
everything
relating
to
artists
or
a
platform
specific
approach.
J
For
example,
we've
got
the
stuff
that
is
pretty
standard
to
rt721,
but
you've
also
got
this
object.
That
is
decimal
value
or
event
that
you
may
not
be
aware
of,
and
decimal
value
is
just
something
that
came
from
tooling.
I've
built
to
build
sub
graph
and
I've
got
an
entire
npm
repository
of
building
blocks.
For
example,
if
you
got
any
event
and
you
want
to
create
an
event
id
to
identify
a
number,
I've
got
a
tool
link
for
that.
J
Also,
when
you've
got
values
that
are
decimal
number
decimal
value,
helps
you
get
both
an
exact
value
and
also
a
decimal
value
with
a
decimal
integral
approach.
That
way,
you
are
able
to
do
atomic
operations
in
a
very
resilient
way.
If
you
just
do
it
on
big
decimal,
sometimes
you've
got
underfloor
issues
so
yeah.
That's
why
all
I
wanted
to
show
today.
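The two utilities described above can be sketched in plain TypeScript. These are illustrative stand-ins, not the actual npm package's API: the transaction-hash-plus-log-index ID scheme and the exact-value-plus-decimals pair are assumptions based on the description:

```typescript
// Build a unique, deterministic event ID from the transaction hash and the
// event's log index (the usual way to key immutable event entities).
function eventId(txHash: string, logIndex: number): string {
  return `${txHash}-${logIndex}`;
}

// Store amounts as the exact on-chain integer plus the token's decimals,
// deriving the human-readable decimal form only for display. Arithmetic on
// `exact` (a bigint) stays lossless, unlike repeated BigDecimal math.
interface DecimalValue {
  exact: bigint;    // raw on-chain integer value
  decimals: number; // e.g. 18 for most 18-decimal token values
}

function toDisplay(v: DecimalValue): string {
  const base = 10n ** BigInt(v.decimals);
  const whole = v.exact / base;
  const frac = (v.exact % base).toString().padStart(v.decimals, "0");
  return v.decimals === 0 ? whole.toString() : `${whole}.${frac}`;
}
```

Keeping the exact integer as the source of truth means sums and differences are always reversible; the display form is derived, never stored back.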
A: Okay, that is super cool and really helpful. So maybe you could post in the chat some of those utilities that you've built, and people can see if they're useful for them.
A: Awesome, thanks for sharing. Let's see, is Alexei back on? Or sorry, Alexander.
A: Screen... I think this is the last one we have, and then we're going to jump into trying to capture some of the ideas that have been floated.
E: All right, yeah. So my name is Alex, and I'm CTO and co-founder of Linen App. In my spare time, mostly for educational purposes and for fun, me and a couple of other developers are doing this crypto media subgraph. So far we've managed to scan like five platforms, and we also have three coming soon, including Hashmasks. And, unlike all the other subgraphs, we are mapping each platform with custom mappings, and because of that we can get the creator name and creator address, fields that are not part of any standard right now, which is unfortunate.
E: I guess we should add it to some of the standards, but so far we need to parse each platform separately to get those. And, unlike other subgraphs, we also support EIP-1155 semi-fungibility. So, instead of having only one owner for each NFT, we have a list of every owner of an EIP-1155 token, with the quantity of the NFTs that they own. And so far we don't parse the token URI.
E: We just have the link, and that's what is lacking in this subgraph, but I guess we can unite our forces with other subgraphs that are focusing on parsing the metadata. Yes, and those are our plans for the future; and yeah, we're going to add some more cool platforms to the subgraph. That's basically it.
A: Awesome, cool, thanks for sharing. All right, so let's see if you can stop sharing there. Yes, yep. I'll pick up on this side, and we can turn this into a bit of a working meeting. So I'm going to share. We just did the subgraph showcase, and next I think it would be helpful to try to capture a list of what we think the open problems are. I think there are subgraphs that people want to have built.
A: Probably some areas of standardization. But it'd be nice to hear, as people have been building their subgraphs, what kind of blockers they've hit, or other things that you're aware of that would be open problems or worth tackling.
G: Yeah, I can jump in with the things I mentioned previously. For me it was IPFS metadata, the issue when it's missing; and the second one, which I think others have also mentioned, is a creator standard, at least, I think, for marketplaces.
A: Okay, so just to clarify: missing IPFS metadata is when the NFT refers to a hash, and the file is basically not available anywhere.
G: Yeah. So, for example, The Graph supports IPFS, but maybe the IPFS node that was used for creating the data is down, or whatever happened, and then The Graph will return null, which is fine, but then there is no way to bring it back. So one suggestion I had, which I put in issue 1017, is basically to have a callback mechanism, but we could talk about this potentially.
A: Yeah. I'll share some of the things that are on The Graph's roadmap, features that the Edge & Node team is working on, that might address some of these gaps. But for now, it's good to hear what people are seeing.
G: Yeah, and by the way, the callback mechanism can be used for other purposes as well. For example, currently, for ERC-20, or even for ERC-721, you have to scan everything on The Graph, and because there are both ERC-20 and ERC-721 contracts, you have to do some extra computation.
G: Another mechanism is that, instead of The Graph listening, someone can submit, saying, "hey, I am an ERC-721," and then The Graph can just start to compute on that address; and of course, if it's not a 721, it will not go through. And so you could still have the data, but the data can be injected without requiring transactions, instead of always having to be put on chain before. Because currently you could do that, but you require a transaction or something. I don't know if you follow what I mean.
A: Yeah, well, let's think about that. So we do support dynamic data sources, but I guess this is different from that, because what you're saying is: the contract's already been deployed, there's no transaction that's triggering anything, but basically, out of band, you recognize, "hey, we want to start indexing this contract that's already been deployed." I mean, it would still make sense to have some kind of registry. Yeah, that's true.
A: That's exactly what dynamic data sources handle, right? So you basically have a contract that says, "these are all of the ERC-721 contracts that we want to index," for example. Whenever there's a new contract you want to index, that is a transaction on chain, but then it dynamically goes back and just indexes that whole contract.
A: Is there any way... yeah, there's definitely something here, so we can discuss our latest thinking on this.
H: I just want to plus-one the IPFS metadata. I think our team, Nicholas, who I think is in this room, has tried a couple of things where there's a retry mechanism: we have a block handler, but it slows down the subgraph. So, figuring out a way to... I think your team has given us an API to hit, basically, to start pinning files on the graph nodes, so there are more robust ways to do that.
H: I think it'd be super helpful, because a really cool thing about being able to index both on-chain data and IPFS data is that suddenly you have everything in one place, and the clients just don't have to query multiple times. Big plus-one on this.
D: Yeah, one of the problems we found is, say you pin something to, say, Pinata or Infura: the subgraph's IPFS node runs on a really old version, and it's very difficult for it to communicate with the latest IPFS network. So you really have to pin to the platforms you want in the subgraph, when really you should be able to resolve it; it's just not on a new enough network.
C: But what would be really nice is if we had some kind of composability, where we could end up having this big old subgraph, but it's sharded at these standardized intervals into these different ones, like Rarible and SuperRare, and then all of that can fit under, like, the art branch of this thing. But as part of that, each branch needs to conform to the same kind of schema for how it handles metadata.
A: That's a great insight. Hadrien raises a hand.
J: Yeah, I just wanted to say that at OpenZeppelin, one of our projects is, alongside all the standard implementations that we provide, to additionally provide a standard subgraph. So for every smart contract that OpenZeppelin has, there would be a standard subgraph, and one of the issues that's preventing us from doing that for now is this composability issue: if you've got a standard subgraph for 721 and one for 1155, having a way to merge them.
A: Absolutely, and I think OpenZeppelin's already done a lot of pioneering there with, obviously, standardizing smart contracts, so why not also help to standardize subgraphs?
E: Actually, sometimes it might be not so useful, because you can have different subgraphs for the same smart contracts that scan different aspects of them. For example, one subgraph can scan the market, the other can scan the ownership. So yeah, I'm not sure if you can standardize the subgraph.
H: Yeah, this is Seva from Protofire. We built a couple of subgraphs based on NFTs, and we ran into a couple of issues. One is IPFS-related, metadata-related, issues; we already discussed that. But the second one is related to how most of the tokens are registered: there are a couple of registry contracts that provide information about NFTs.
H: The problem right now, with the way The Graph works today, is that we lose the events between when the NFT is created and when it's registered on the registry contract, because we don't know exactly the start block. That is a problem, because the registration is probably later on the blockchain than the NFT itself, and we need to, I don't know, start indexing from the exact block where the NFT was created, not just registered on the token registry.
A: Okay, so is this an issue with dynamic data sources, because you can't actually index historical data from before the registry was created, right?
A: And data sources don't allow indexing data prior to the... I forget what we call it, but the source contract being deployed, right? So that's the issue: the NFT itself might have been deployed at, you know, block 2 million, and you have this registry contract at block 4 million that's keeping track of all of the 721 contracts, but it can't actually start indexing from before that. Okay, yeah, and I remember that there was something really hairy in implementing this.
A: But if it's becoming a problem here, it's good to resurface that, and it's definitely worth tackling this one. Cool. More open problems? This is all really good.
D: I'd like to be able to set an end block on data sources, because really, after they've been retired, I don't really care about them if I've already indexed them. So I imagine, and I'm just making a guess, that that might speed it up. If I could almost say, "index between these blocks, and after that you can sort of ignore it," that would certainly help, or it might help the speed of our subgraph. It takes like a good...
G: I have... oh yeah, it's more for the standardization part. Another one is collections; it's related to the support for dual 1155/721. In this kind of contract, a 721 can be extracted from an 1155, and it becomes part of a collection, and I think all the marketplaces also support collections in other ways. Basically, a standard for collections, to know that one token is actually part of a bigger group.
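One way a subgraph or tool could detect such dual-standard contracts is through ERC-165's supportsInterface, using the registered interface IDs for ERC-721 (0x80ac58cd) and ERC-1155 (0xd9b67a26). A minimal sketch; the classification helper itself is illustrative, not something discussed on the call:

```typescript
// ERC-165 interface IDs registered by the two NFT standards.
const ERC721_INTERFACE_ID = "0x80ac58cd";
const ERC1155_INTERFACE_ID = "0xd9b67a26";

type NftKind = "erc721" | "erc1155" | "dual" | "unknown";

// Given the set of interface IDs a contract reports via supportsInterface,
// classify it; an indexer could use this to pick which handlers apply.
function classifyNftContract(supported: Set<string>): NftKind {
  const is721 = supported.has(ERC721_INTERFACE_ID);
  const is1155 = supported.has(ERC1155_INTERFACE_ID);
  if (is721 && is1155) return "dual";
  if (is721) return "erc721";
  if (is1155) return "erc1155";
  return "unknown";
}
```

In a real mapping, the set would be populated by calling supportsInterface on the contract for each candidate ID; here it is passed in directly to keep the sketch self-contained.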
E: Yeah, if you're talking about the semi-fungibility, right, we need to standardize how we track multiple owners in the same manner across different subgraphs.
H: Yeah, I was going to say that the metadata we store, that schema, probably needs to be standardized as well, and I think the one we currently have from 721 is outdated. It's name, description, and image, but I think we're seeing NFTs have all sorts of assets: you're seeing videos, you're seeing audio, and you'll probably see 3D and AR stuff like that. So figuring out how to store that stuff will be important.
H: Agreed. And I think, ideally, this has some kind of self-referencing versioning mechanism, where you can reflect on it and figure out how to render it based on itself.
H: Sorry, I was just going to say it's also worth calling out a nuance here: essentially, I don't think there's a world where you can enforce royalties on NFTs, which are important to creators, without hijacking transfers. And so the best that you can really get to is an open standard for an honor system, right? You self-report what the fee should be and who it should go to, and then marketplaces have to honor it, but it's open source and verifiable.
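The honor-system royalty idea described above usually boils down to a self-reported recipient plus a fee in basis points that marketplaces apply at sale time. A rough illustration; the `RoyaltyPolicy` shape and names are assumptions for the sketch, not a standard from the call:

```typescript
// Self-reported royalty: who gets paid, and the fee in basis points
// (1 bps = 0.01%). Marketplaces honoring the standard apply it per sale.
interface RoyaltyPolicy {
  recipient: string;
  feeBps: number; // e.g. 1000 = 10%
}

// Split a sale price into the royalty owed to the creator and the
// remainder for the seller. Integer math; the seller side absorbs
// any rounding dust so the two parts always sum to the price.
function splitSale(
  priceWei: bigint,
  policy: RoyaltyPolicy
): { royalty: bigint; sellerProceeds: bigint } {
  const royalty = (priceWei * BigInt(policy.feeBps)) / 10000n;
  return { royalty, sellerProceeds: priceWei - royalty };
}
```

Basis points plus integer division is the convention most on-chain fee logic uses, since floating point is unavailable in the EVM.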
A: Yeah, these are great suggestions.
C: One other umbrella for standardization is events. There are a few different kinds of events that can happen for an NFT, one being just an updated token address, like for its IPFS address. So, in the beginning, you might mint it with some address, and then, if you change that, that needs to be an event that others can listen for; and sale events, transfer events, things like that, bids.
A: Yeah, I guess it could be both, because, especially then, if you want to aggregate events, and you can get all of that metadata in a consistent format, then building feeds and things like that would be standardized.
G: So I have just another one; I just thought about it now. Because now we are moving to a system where we will have tokens on different chains, and I guess the current way is to have one RPC endpoint per chain, but it would be great to have a massive endpoint to get all the tokens, across all the chains, that a user owns, etc.
A: Yeah, for sure, cool. Okay, so let's do this one, and then I'm going to share just some of the things that the Edge & Node team is working on that I think could help address some of these issues, just so you're aware of the roadmap.
A: But before we jump there, it sounds like, generally, what people need is subgraphs that aggregate the 721s, the 1155s, the different marketplace contracts, and I'm not immediately sure how many of these there are out there. So we'd love to hear from people what their sense is for how many of these contracts are out there, what subgraphs need to be built, and how we organize to get these all built out.
A: Yeah, so let's chat about that. Okay, so subgraph composition is something that's been on our roadmap for a while. We did an initial pass at this like a year ago, but we had to put it on the shelf. So there are two different types of subgraph composition.
A: For example, we've recognized that if you have a single giant subgraph, as was mentioned up here somewhere, it takes too long to sync, and so it's much better to have smaller, more granular, specific subgraphs, but then still be able to roll them up into a logical subgraph that contains all of that. But it's still query-time composition, and so you wouldn't necessarily have the data interleaved, right?
A: Let's say, for a specific entity. But you would at least be able to have all of that data in one place.
A: So that's the query-time composition, where basically you can mix and match entities and parts of different subgraphs together, and compose them into, like, a larger subgraph.
F: Then there's kind of... oh sorry, yeah, sorry for interrupting you. I just wanted to ask, if you don't mind: what do you mean in terms of sync time, if you create, like, the large collection of all NFT 721s, for example?
F: Yeah, so could you just move up a bit? So yeah, here it is: "a large subgraph of all 721 contract state takes too long to sync."
A: The thing here is, let's say that there are fifty 721 subgraphs, and then you decide, "hey, let's add one more sub..." or contract, sorry. To do that, you would have to resync everything from scratch, and that would take a long time.
A: So if you have composition, you could add the 51st 721 contract, just sync that one contract, but then have an updated subgraph that has all 51 contracts indexed.
G: And mine is quite a simple one; if we use Hadrien's version of it, it might be even more, I'm not sure. So it's two months.
F: So yeah, in this case it's literally not really possible right now to index all the contracts in one subgraph, right?
A: Yeah, not in one subgraph, exactly. So you could basically index them individually, and then you can do this composition. Then you can kind of iterate: you can add new contracts, because you don't have to resync, as long as the data that's already indexed is in the format that you want it in. Then that kind of provides things, yeah.
H: But there are also...
A: Yeah, so that's what we're discussing; it would be through this composition. Now, I think that what you really want is indexing-time composition, which, I guess, is different from this query-time composition, and which would basically allow you to interleave the data. So you could have, like, a feed, for example, where, for the different contracts, it's all adding data to, like, a single...
A: ...collection in the subgraph. And so that's a little bit more complicated to support, but it's something that, I think, we should start looking at, especially for this type of use case.
A: So let us dig into it a little bit more, and we can chat with some of you to understand a specific use case and see if we can prioritize this, because I see that it would be really valuable. But, okay, we...
A: We only have eight minutes left. So, just on "what subgraphs need to be built": can someone chime in with what they think is a rough estimate for the number of contracts, whether it's 721, 1155, or marketplaces, that currently don't have good subgraphs? Is there a rough ballpark?
I: There are ten good ones: Foundation, Zora, Rarible, yeah. Also, I had a quick question around The Graph's throughput. So The Graph is still not in the fully GRT-ecosystem mode, right? It's still hosted for now, or did it move to the decentralized network? Because, I think, for us at Showtime, we are querying every NFT, and it seemed to us that, very short term...
I: ...the performance for querying... like, right now we are querying the URL of NFTs by reading the Google Cloud link from the OpenSea API, and it seems faster to us.
A: Okay, yeah. I would love to dive into that, because it should have good performance. So let's say: performance versus the OpenSea API.
I: For us... and I just dropped the link to our alpha version, and it loads pretty fast with OpenSea, so we're not in a rush, but, well, of course, you know this, since you're organizing this: obviously the OpenSea API is not the long-term solution, and we want to move. But it seemed to us that it was not as fast to query some other subgraphs; I think it's because it's not fully migrated yet. I'm not sure, though.
A: Yeah, so let's look; we'll do some testing here. Just to answer the question about where we're at in the migration: basically, at this point, we're just doing, you know...
A: ...basically, testing on the decentralized network. Edge & Node is working on a gateway that it's going to be releasing relatively soon, first in kind of a private beta and then for general availability, and that's when we're going to see dapps start migrating to the decentralized network. So just keep an eye out for updates there, and yeah, we're excited about the performance and different aspects, like reliability and all that, but yeah.
A: Let's see how the performance compares and what we can do there. There are additional improvements that we can make to the indexing speed, because we're hearing from people that that's a pain point. So, for example, one thing that we're looking at is parallelization of indexing: if there's data that we know doesn't need to be processed sequentially, we could do parallel processing, and that could speed things up. So there's a bunch of things like that that we can look at.
A: A problem with IPFS in general is that you don't know if a file is available, and so the indexers in the network need to have a way to come to consensus, so that they all agree whether a file is available or not. And then, just as was actually suggested here, there's going to be a callback kind of mechanism. So say the indexers...
A: ...they couldn't find the metadata for this NFT, and then, at some point in the future, the file shows up: then they could all update and, like, re-register and index the metadata that came online. So this is all part of IPFS data sources, which is something that we are going to be working on.
A: So we didn't have a chance to start organizing the work streams here, and the subgraphs, so I think this is a good agenda for the next call.
A: You know, I know that the NFT space is blowing up, and we all need to move fast. So let's have another call in a week, and we can start breaking this down into specific work streams. We can also carry on some of the discussion; it would be worth creating a channel in the Discord.
A: I think, since people are already in The Graph's Discord, it's a good place to have longer-form conversations around some of these problems. And, as mentioned, The Graph Foundation has expressed interest in funding development around NFTs. So I think what we want to do, maybe on the next call, is identify some of these work streams, try to get some definition around these, and find some owners for each of them who can really help drive these things to completion. And then, if we need to find external developers to help build some of this stuff, the Foundation can assist with that and with funding some of the work through grants.
A: I'll share things out in the Telegram after the call. Yeah, do people have any last words before we break here?
A: Okay, awesome. Well, thank you, everybody, for joining my first NFT call. Really excited to, you know, have the whole community coming together here, and hopefully we get some really dope stuff up and running soon. Thank you.