From YouTube: Extending Ceph's Reach - Tushar Gohad, Zhong Xin
Description
Cephalocon APAC 2018
March 22-23, 2018 - Beijing, China
Tushar Gohad, Principal Engineer Intel
Zhong Xin, Eking General Manager of Cloud Services Business Group
Ni hao, and good morning. Thank you for the honor of speaking at the inaugural Cephalocon. My name is Tushar Gohad; I work in the capacity of Principal Engineer at Intel, and my focus has been software-defined storage for the last five-plus years. As part of leading these efforts at Intel, I have been involved with OpenStack and Ceph, as well as a few other storage technologies, including network-attached storage for media. Today I would like to talk to you about extending Ceph's reach beyond where it is today.
As the scientist John Lilly, of dolphin and human-sensory research fame, once said, "The only security that we have is our ability to change." At Intel we embrace these new challenges: we have invested in exciting new hardware as well as software technologies, including Ceph. So here is our vision for Ceph at Intel. As I said, we want to expand Ceph's reach, bringing new workloads to it. We have been a responsible community citizen, and we are actively working on growing Ceph and making it better.
Our Ceph efforts follow our open-source commitment. As you may know, Intel has been a top Linux kernel contributor for several years now. With Ceph in particular, we have a strong team of eight open-source developers who have been working in various areas. We also work with OEM and ISV partners such as Red Hat, SUSE, and Canonical on helping their customers use Ceph for various usages.
So let's look at where the data growth is coming from. We start with an average Internet user producing 10 gigabytes a day. If you extend that to smart factories, the amount grows to a petabyte a day. Or, as Google's research would tell you, a smart car of the future would produce a gigabyte a second, which amounts to 4 terabytes a day. All of this growth is fueling the cloud.
Public, private, and hybrid clouds are growing at an even faster pace: if you look at the trends, private cloud is growing at 24 percent and hybrid cloud at 26 percent. As a whole, it becomes essential that we have an open solution that powers these growing clouds. So let's take a look at who is using Ceph today, with the "logo wall," as I call it.
These are essentially all the companies that are deploying Ceph today, or are looking to, and that we have been working with. One common theme you would notice across this logo wall is that these are all "data companies," as I would call them: companies whose business is to store, process, analyze, or deliver data. These are the companies that had to make the transition to open-source, scale-out Ceph to meet the challenges of growing workloads and growing data demands, and to do so in a cost-effective manner.
Now, these companies have small armies of engineers who are expert Ceph users, or DevOps teams in many cases, and that is how they are able to do what they need to. But there are many companies outside this list that would easily fall into the private or hybrid cloud category, and for them we need to make Ceph more accessible.
Once we have the display back... but I'll keep talking in the interest of time. Let's look at some of the technology behind this, and in particular what Intel has been doing in the context of Ceph. Intel's involvement with the Ceph community dates back to about 2013.
We contributed to erasure coding as one of the first features, and we also worked with the community on the integration of various offloads into BlueStore. But let me focus on what we are doing with the community today. As Sage touched upon during his talk, non-volatile memory is the big theme right now, so we are working with the community on applying non-volatile memory programming models to the asynchronous OSD, as I laid out earlier.
To bring low-latency workloads to Ceph, client-side optimization is key. Essentially, one of the key areas we are looking at is caching data closer to the application, on the client, so that we can reduce the latency added by the network as well as by the OSDs.
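The client-side caching direction can be illustrated with a minimal read cache. This is a sketch of the general idea only, assuming a least-recently-used eviction policy; Ceph's actual client caches (for example, the RBD cache) are far more involved, and every name below is invented for the example.

```python
# Minimal LRU read cache: keep recently read objects near the application
# so that repeat reads skip the network/OSD round trip entirely.
from collections import OrderedDict

class ReadCache:
    def __init__(self, capacity: int = 128):
        self.capacity = capacity
        self.store = OrderedDict()

    def get(self, key, fetch):
        """Return cached bytes, or call fetch(key) on a miss."""
        if key in self.store:
            self.store.move_to_end(key)        # mark as recently used
            return self.store[key]
        value = fetch(key)                     # the slow cluster round trip
        self.store[key] = value
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)     # evict least-recently used
        return value

calls = []
cache = ReadCache(capacity=2)
slow_fetch = lambda k: calls.append(k) or b"obj:" + k.encode()
cache.get("a", slow_fetch)
cache.get("a", slow_fetch)
assert calls == ["a"]                          # second read hit the cache
```

The second `get("a")` never touches `slow_fetch`, which is the whole point: a hit costs a dictionary lookup instead of a network round trip.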
The third category of workloads, to the right, is where we are working with the community on compression and encryption support, for example exposing the offloads that some of the newer Intel platforms provide. Now I want to talk to you about three trends and why they are important for Ceph going forward. We shared this vision with Sage and the community, so I am just reinforcing the point here. I won't spend too much time on this slide because, as you all know, the industry is transitioning towards non-volatile memory.
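On the compression side, the transform itself is easy to see in software; the point of the platform offloads is performing the same transform in hardware. A minimal software-only sketch, using plain zlib rather than ISA-L or any offload engine:

```python
import zlib

# Highly repetitive data, standing in for a compressible object payload.
payload = b"ceph object data " * 512

# The CPU does the work here; a hardware offload engine would perform
# the same lossless transform without consuming host cycles.
compressed = zlib.compress(payload, level=6)

assert zlib.decompress(compressed) == payload   # lossless round trip
assert len(compressed) < len(payload) // 10     # repetitive data shrinks a lot
```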
What this chart shows, though, is that if you look at the green portion of each of these bars from left to right, we have plotted the latency for hard-drive media going down to 3D XPoint, which is the fourth bar to the right. What is changing about storage is that system software, as a percentage of total I/O latency, is growing.
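The trend on that chart can be reproduced with simple arithmetic. The media latencies below are rough public ballpark figures chosen for illustration, not the numbers from the slide:

```python
# With a fixed software overhead, the software share of total I/O latency
# grows dramatically as media latency shrinks. All numbers are illustrative.
SOFTWARE_US = 5.0                       # assumed system-software overhead (us)
media_us = {
    "HDD":       10_000.0,              # ~10 ms of seek plus rotation
    "NAND SSD":  100.0,                 # ~100 us
    "NVMe NAND": 80.0,
    "3D XPoint": 10.0,                  # sub-10-microsecond media, rounded
}
for name, m in media_us.items():
    share = SOFTWARE_US / (SOFTWARE_US + m) * 100
    print(f"{name:>9}: software = {share:5.2f}% of total I/O latency")
```

With a hard drive the software share is a rounding error; with 3D XPoint-class media the same fixed overhead becomes roughly a third of the total, which is why the talk argues the stacks need rethinking.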
In the hard-drive days, even if you drove your software latency to zero, it did not matter, because you had milliseconds of latency to the media. With 3D XPoint, the Optane media that Intel has introduced, the latency for just the media is in the sub-10-microsecond range, which is going to make us rethink software stacks as they have been written. So what has Intel been doing about this? A few acronyms: DPDK, SPDK, PMDK.
Many of you may be familiar with them. These are the development kits that Intel is putting out as the foundational building blocks for the non-volatile memory programming model, as well as asynchronous, polled-mode, lockless primitives to make network and storage stacks go faster.
These are all open-source projects, and they have had communities around them for a while. DPDK has been around since 2008, and it opened up the high-performance router market to standard x86 platforms. SPDK is in fact the team I belong to; we have had a thriving community for about three years now, and in fact we have an SPDK summit co-located with this event here in Beijing. There is also ISA-L, the Intel Intelligent Storage Acceleration Library.
The next wave is persistent memory, which essentially brings storage, or memory, closer to the processor than ever before. We recently renamed NVML, our persistent memory library covering block, object, and byte-addressable programming models, to PMDK, in the same spirit as DPDK and SPDK. These are the components at the disposal of the community, and Intel is investing a lot of resources to keep these communities alive and growing.
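The programming model PMDK targets treats storage as memory: a pool is mapped into the address space and updated with loads and stores rather than read/write syscalls. A loose analogy using an ordinary memory-mapped file (no real persistent memory, no transactions, just the store-then-persist shape; the `pmem_persist` comparison is only an analogy):

```python
import mmap
import os
import tempfile

# Create a small file standing in for a persistent-memory pool.
pool_path = os.path.join(tempfile.mkdtemp(), "pool")
with open(pool_path, "wb") as f:
    f.write(b"\0" * 4096)

# "Store" into the mapping instead of issuing write() syscalls.
with open(pool_path, "r+b") as f:
    m = mmap.mmap(f.fileno(), 4096)
    m[0:5] = b"hello"          # a memory store, not a block I/O
    m.flush()                  # loosely analogous to pmem_persist()
    m.close()

# The update is durable in the backing file after the mapping is gone.
with open(pool_path, "rb") as f:
    assert f.read(5) == b"hello"
```

With actual persistent memory the flush step is a CPU cache-line flush rather than a page writeback, which is what makes the sub-microsecond persistence the talk alludes to possible.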
If you look at today's data centers, they are based on homogeneous servers. If you take a look at, say, a hyper-converged or a typical data-center deployment, even with disaggregation via Ceph, you will see that every deployed node has fixed compute, network, storage, and accelerator elements, which becomes a problem when there is a need to upgrade. I'll give you an example: say you are running an AI (artificial intelligence) workload that makes use of FPGAs or discrete accelerators.
Intel has been working on an open specification to change this picture, and the reason is that in data centers built on the homogeneous server model, the utilization rates for compute, storage, and accelerators can be down in the 10-percent range. So what can be done?
I have shown an example here where I was able to compose a node that needed fast SSD storage and only a couple of compute units, but then I might have another workload that benefits more from accelerators and is not as storage-intensive, for which I am able to compose a node in a different way. This falls under the Rack Scale Design umbrella, an open standard that Intel is developing.
The next wave we see coming is NVMe over Fabrics, which is analogous to the iSCSI world, where you essentially transported SCSI commands over Ethernet. Going forward, we see direct-attached storage in the NVM Express format being pooled over an RDMA or TCP fabric, with the protocol being NVMe over Fabrics. This model brings a lot of good, essentially the ability to scale compute and storage independently of each other.
It lets us do a few interesting things with a storage stack like Ceph. Imagine for a second that your OSDs and NVMe SSDs were not in the same box, in the sense that they are not statically mapped to each other. An OSD runs as a stateless unit on a compute node and maps to remote NVMe devices as needed. In a containerized environment, for example, this enables some beautiful scenarios.
You could fail over, or make a node serviceable, by simply migrating the stateless OSD to another node, or you could choose to save power. Or perhaps you are running out of CPU cores on a node and your OSD is a high-performance, core-hungry OSD that you want to migrate to another node: you could do that by simply remapping your storage to another NVMe-over-Fabrics node.
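The remapping idea above can be sketched as a toy model: the OSD keeps its identity while its backing NVMe-oF namespace is re-exported from a different node. Every name below (the OSD id, node names, the NQN-style string) is made up for illustration; a real deployment would use the kernel NVMe-oF initiator and Ceph orchestration tooling rather than a dictionary.

```python
# Toy model of migrating a stateless OSD: the daemon moves to another
# compute node and simply re-points at the same remote NVMe namespace,
# so no data migration is needed. All names here are hypothetical.
osd_to_target = {
    "osd.3": "nvmf://node-a/nqn.2018-03.example:ssd7",
}

def migrate(osd: str, new_node: str) -> str:
    """Re-point an OSD at the same remote namespace on another node."""
    namespace = osd_to_target[osd].rsplit("/", 1)[-1]
    osd_to_target[osd] = f"nvmf://{new_node}/{namespace}"
    return osd_to_target[osd]

# node-a needs servicing: move the stateless daemon, remap the storage.
assert migrate("osd.3", "node-b") == "nvmf://node-b/nqn.2018-03.example:ssd7"
```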
So again, the message here is that Ceph is able to work in this mode today. If you look at holistic Rack Scale Design management, or data-center management, there are two standards at the heart of it: Redfish, a standard we are driving within the DMTF that addresses composability at the hardware level, and Swordfish, which is essentially a network and storage configuration-management layer.
Talking about Redfish and Swordfish brings us to the topic of manageability, which Sage touched upon. In the short term, to be really impactful, we need to make sure manageability is the key area we focus on. It is not just about provisioning, the day-one install; it is really about day two. As your workload grows, we need to make sure that monitoring, alerting, and tuning workflows are automated, to streamline operations.
We have talked about data companies and private and hybrid cloud deployments, but where do we, as developers, find the motivation? I wanted to tell you a story where Ceph has been a force for good. Intel has been working on a few initiatives where we are trying to apply AI, artificial intelligence, for good, to make the world a better place. One of those programs is called Safer Children.
This is a nonprofit that I have had some personal involvement with in their use of Ceph, which is why I chose this story to share with you. It is a nonprofit whose mission is to find missing children and, in general, prevent child victimization, which is a growing problem across the world. The National Center for Missing and Exploited Children works as a clearinghouse for these victims and their families, and helps law enforcement in preventing child victimization.
So what is the role of IT at this nonprofit? Obviously IT has to handle the usual IT endeavors, but it also has to support the mission. This is an analytics workload: the nonprofit had large sets of data to analyze and growing storage needs, just like every other company in this space, and it was becoming hard to keep up.
So, of course, they chose Ceph as a replacement for their existing enterprise environment, and one of the highlights of this effort is that it is among the first CephFS production deployments we have worked on. Not only has the National Center for Missing and Exploited Children switched to CephFS, they have been able to cut their database backup and restore times from hours to minutes.
They were able to contain costs by moving to a standard server model, and they are running their large data sets happily on CephFS. This nonprofit, thankfully, had a couple of DevOps engineers who were already expert users of OpenStack, which is why we were able to get them here. But here is what they said for people who are considering Ceph.
Essentially, they talk about demoing it, trying it, talking to a friend, and consulting the community, essentially making use of the Ceph APIs. They talk about the new ceph-mgr features, which they have been using with their monitoring setup, and that has worked great for them. But their message to the Ceph community, and let me read it out for you: the National Center for Missing and Exploited Children relies on open-source tools like Ceph, and the community that supports them, every day.
I'll leave you with a familiar message: just keep innovating. We want to be able to take on new workloads; databases and AI are workloads we have been looking at for a while, and with the asynchronous OSD efforts that Sage talked about, with Seastar, DPDK, and SPDK, as well as some of the client-side caching work we are doing, we are going to get there. We know non-volatile memory is here, so we need to embrace those new programming models.
We see disaggregation as the theme of the next-generation data center, and we need to get ready to make sure Ceph fits it. And last but not least, we need to lower the barrier to adoption for all those companies that want to move to Ceph. So again, thank you for all you have done.
Now I would like to invite Mr. Zhong Xin, who is the Vice President and General Manager of the cloud business at HNA Group. At HNA Group they have been deploying Ceph for a while now, for a variety of use cases, including video infrastructure as a service, as well as hyper-converged infrastructure for branch-office applications. So Mr. Zhong is here to tell us about their Ceph journey.
[The remainder of the session was delivered in Mandarin by Zhong Xin; the automatic transcription of that portion is unintelligible and has not been reproduced here.]