From YouTube: CDS Hammer (Day 1) - Import/Export // Local Import
Description
http://goo.gl/U4b70r
28 October 2014
Ceph Developer Summit: Hammer
Day 1
rados: Improve Export/Import Functionality // diff: Integrity Local Import
Danny Al-Gaaf
C
It also failed, depending on the file system where you try to export to, because if you have big files uploaded via the S3 interface to the cluster, then on export it reads some data and writes it down, and writes the metadata down into the extended attributes of the files, and those reach the maximum size, depending on the file system. So there are also cases where it fails completely, and the real problem, from my point of view, is that it depends on extended attributes.
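The point about extended attributes can be made concrete: per-value xattr limits vary widely by file system, so serialized object metadata that fits on one file system can fail outright on another. A minimal sketch (not Ceph code; the limit values are approximate and the function names are hypothetical):

```python
# Sketch: why storing per-object metadata in xattrs is fragile.
# Approximate per-file xattr capacity by file system (illustrative values).
XATTR_VALUE_LIMIT = {
    "ext4": 4096,        # roughly one block for all xattrs combined
    "xfs": 64 * 1024,
    "btrfs": 16 * 1024,
}

def metadata_fits_in_xattr(fs: str, metadata: bytes) -> bool:
    """Return True if the serialized metadata fits the file system's limit."""
    limit = XATTR_VALUE_LIMIT.get(fs)
    if limit is None:
        return False  # unknown fs: it may not support xattrs at all
    return len(metadata) <= limit
```

An exporter relying on xattrs would need a fallback path (e.g. a sidecar file) whenever this check fails, which is exactly the complexity the discussion below tries to avoid by dropping the file-system target altogether.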
C
So if you have a file system not supporting xattrs, it will fail completely. So, yeah. And we could improve the export performance by changing the code. I don't have the patch around at the moment, but I had to rewrite it to shuffle the objects that need to be exported before handing them to the background threads, and that improved the performance.
C
I guess because all the threads usually tried to read from the same source at the same time, so the throughput was limited. But we couldn't improve the import that easily, so on the other side it was still slow to reimport all the data into the new cluster. And in general it would be nice to have some way to export from one cluster or region to another cluster, without any file system in between. Oh yeah.
B
It doesn't work very well currently, okay, yeah. So, I mean, so...
D
I was going to say, we need a way to chunkify it so that we can assign some pieces of the object space to the different processes. And, well, obviously we can't be putting all the xattrs in the file. That's rough, though, yeah.
B
Okay, so I think the first and most obvious problem is that exporting RADOS to a file system is just, like, fundamentally broken, because a RADOS object has bytes, it has attributes that don't fit in xattrs, and it has omap key/value data which doesn't fit in a file or an attribute; it's neither. So I wonder if we should just get rid of that, right?
B
Just take that off the table, or make a new... maybe rename this, call it something else on export, like "rados dump to a file" or something, deprecate it. So I think then the export really should be writing to a stream, similar to what rbd export and rbd export-diff do, that you can either pipe to a file or you can pipe over the network to another process that's importing from that same stream. Yeah, and that, I think, will give us much, much more flexibility here.
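The stream idea sketched here, like the rbd export-diff format, comes down to self-describing, length-prefixed records that can carry object bytes, xattrs, and omap entries through a pipe. A minimal sketch of such a layout, purely illustrative and not the actual Ceph wire format:

```python
import io
import struct

# Hypothetical record layout for an export stream (not the real Ceph format):
# each record is a 1-byte type, a 4-byte big-endian length, then the payload.
T_DATA, T_XATTR, T_OMAP = 1, 2, 3

def write_record(out, rtype, payload):
    """Append one length-prefixed record to a writable binary stream."""
    out.write(struct.pack(">BI", rtype, len(payload)))
    out.write(payload)

def read_records(inp):
    """Yield (type, payload) pairs until the stream is exhausted."""
    while True:
        hdr = inp.read(5)
        if len(hdr) < 5:
            return
        rtype, length = struct.unpack(">BI", hdr)
        yield rtype, inp.read(length)
```

Because both ends agree only on the record framing, the same bytes can go to a file, to stdout, or over the network to an importing process, which is the flexibility being argued for.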
B
A byte format, a stream format that will stream. Well, I guess rbd export is just raw, but export-diff has its own little format. But either way, a stream-based thing, so you can send it out to standard out, which I think makes sense here. I think while we're at it, we should make sure that the rados export takes a snapshot argument, a pool snap argument, if there is a pool snapshot.
B
The snapshots are tricky, because there's two kinds of snapshots. So if there are pool snapshots, then do we want to export the pool state as of a particular pool snapshot, or do we want to try to reconstruct the state of the entire history of the pool in the target cluster? And that, I guess, is sort of a question. I don't know, do you have a sense of that, Danny?
B
Okay, and then basically for export we would just have optional arguments where you'd give it a start and end offset, and it would go do...
B
...that in parallel across a bunch, sending to the same pipe that's ingesting on the other end, or separate ones. I guess we don't really need to worry about atomicity in case you're exporting the whole pool as a stream, and you don't have to, like, write the whole objects carefully or anything like that. So it would be, like, an optional chunk size, I guess, yes.
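The parallel export with start/end arguments amounts to partitioning the object space so that several exporter processes each cover a disjoint slice. One simple way to do that deterministically, sketched here with hypothetical names and hashing by object name (the real tool might instead use placement-group ranges):

```python
import hashlib

def partition_objects(obj_names, workers):
    """Hash each object name into one of `workers` buckets so that several
    export processes can each cover a disjoint slice of the object space."""
    buckets = [[] for _ in range(workers)]
    for name in obj_names:
        digest = hashlib.md5(name.encode()).digest()
        bucket = int.from_bytes(digest[:4], "big") % workers
        buckets[bucket].append(name)
    return buckets
```

Since every worker computes the same assignment from the name alone, no coordination is needed between exporters, which fits the "no atomicity worries" point above.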
B
And since the metadata pool doesn't use any of the snapshot stuff, you could just use a pool-wide snapshot to do that, as far as taking a snapshot. But also you'd probably want the ability to copy it to another pool. In that case, I think it's... there would be this one here, where it would... you would...
B
Maybe we want something like rados --pool <pool> fingerprint, and if you give it a range of offsets, it would just, like, generate a checksum by iterating over the objects in a deterministic order. Then you could, in theory, take two pools and generate the same fingerprint across them.
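The key property of the proposed fingerprint command is deterministic iteration: if two pools hold the same objects with the same contents, hashing them in a fixed order must yield the same digest regardless of how either pool happens to list them. A minimal sketch of that idea (not a real rados subcommand; names are hypothetical):

```python
import hashlib

def pool_fingerprint(objects):
    """`objects` maps object name to content bytes. Iterating in sorted
    (deterministic) order makes the digest independent of listing order,
    so two pools with identical contents produce identical fingerprints."""
    h = hashlib.sha256()
    for name in sorted(objects):
        h.update(name.encode())
        h.update(objects[name])
    return h.hexdigest()
```

Comparing the fingerprints of a source pool and a freshly imported pool would then give a cheap end-to-end integrity check.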
D
We could definitely do that; I'm just not sure if the formats could be precisely the same. It might be, once you strip out all the PG-specific metadata that isn't relevant for an import anyway. So maybe there's a way to filter the stream. Actually, that would be the correct thing: you pass it through a filter that takes out all of the PG log and PG info stuff, and trims the object info, and turns it into a... yeah, that's actually the right thing.
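Filtering the stream, as suggested here, is straightforward if the export is a sequence of typed records: a pass-through stage simply drops the record types that are internal bookkeeping of the source cluster. A minimal sketch, with hypothetical record-type names:

```python
# Hypothetical record types: PG log and PG info are internal bookkeeping of
# the source cluster, so a filter can drop them before import.
PG_INTERNAL = {"pg_log", "pg_info"}

def filter_stream(records):
    """Pass through only the records an import actually needs."""
    for rtype, payload in records:
        if rtype in PG_INTERNAL:
            continue
        yield rtype, payload
```

Because it is a generator over (type, payload) pairs, such a filter composes naturally in a pipeline between an exporter and an importer.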
B
Also, an old blueprint; I'm not sure he's here. Let's talk about this at the same time, it's sort of a very similar problem, and it won't take more than 10 minutes. So the challenge there is that RBD export doesn't have any CRCs, as I understand it.
G
Not really, not directly. I think we talked about this before; I can't remember exactly what we thought about, but we thought about maybe gathering some kind of checksum of the image, but not reading all the data. Perhaps reading certain offsets of it, or some percentage of the disk size, something like that, and storing that along with the diff. Then you could compute that checksum when you're doing an import, and make sure it matches what the import says it should be.
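The sampled-checksum idea described here, hash a few fixed offsets or a percentage of the image instead of every byte, can be sketched as follows (illustrative only; window count and size are arbitrary assumptions, not anything Ceph implements):

```python
import hashlib

def sampled_checksum(image, samples=8, window=64):
    """Hash `samples` evenly spaced windows of the image rather than every
    byte. Cheap to record on export and to recompute and compare after an
    import; sampling the same deterministic offsets both times is essential."""
    h = hashlib.sha256()
    step = max(1, len(image) // samples)
    for off in range(0, len(image), step):
        h.update(off.to_bytes(8, "big"))  # bind each window to its offset
        h.update(image[off:off + window])
    return h.hexdigest()
```

The trade-off is that corruption outside the sampled windows goes undetected, which is why the discussion treats this as a lightweight sanity check stored alongside the diff rather than a full CRC.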
G
We do, but it's not really suitable; when you're creating your images right now, it's just automatically generated for you.
B
Doesn't it... can you do a diff export starting from the beginning? It would... doesn't it create... doesn't...
G
Good question. I think for hammer there are lots of other things going on already that might take up more time. Yeah.
B
You could also, like, search this; I think that it does that. So actually, I mean, the problem is that we don't have, like, an identifier in the image that tells you which image it is. I mean, there's, like... there's a name, but who knows, the name could be arbitrary? It doesn't have to be the same thing, whereas the fingerprint would just sort of give you a probabilistic fingerprint of the content that would give you some assurance that it's the right one.
G
Like, it can't be controlled by users when creating images.