Description
When a supercomputing center installs a new system, users are invited to make heavy use of the computer as part of the rigorous testing. In this video, find out what top scientists have discovered using Edison, a Cray XC30 supercomputer, and how NERSC's newest supercomputer will accelerate their future research.
Subscribe so you don't miss a video - https://youtube.com/berkeleylab
Instagram: https://www.instagram.com/berkeleylab/
Twitter: https://twitter.com/berkeleylab
Facebook: https://www.facebook.com/BerkeleyLab/
More Berkeley Lab news: http://bit.ly/BerkeleyLabNews
Transcript

NERSC is the National Energy Research Scientific Computing Center. NERSC has lots of users: we have 4,500 users that run lots of different codes, and so we wanted to procure a system that would do well on all of those applications. We felt that Edison, with its architecture, was the best suited for our workload.
Informally, I'm referred to as a beamline scientist. I work at the Advanced Light Source, which is an x-ray facility that allows us to do x-ray experiments that you couldn't do anywhere else. The instrument that I work at at the ALS is an x-ray imaging instrument focused on high resolution and small parts, so we're usually not doing big things. We're imaging small features, sort of like a high-definition CT scan, basically, and we apply that technique to a whole bunch of different areas.
We're imaging things for earth science, for materials science, for biology, all kinds of different areas. What we're actually collecting is 2D images, and what we want to generate is 3D images, so there's a bunch of processing that has to happen, and we have to go to a supercomputer to achieve that. What this improved analysis allows us to do is collect fewer images to get the same results: we can collect data sets faster and get better time resolution, as well as just get better images in general.
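To make that 2D-to-3D step concrete, here is a minimal sketch of the kind of tomographic reconstruction being described, using Python and scikit-image's radon/iradon routines on a standard test phantom. The phantom, the angle count, and the filtered back projection choice are illustrative assumptions, not the actual ALS pipeline, which runs at far larger scale on machines like Edison.

    # Sketch: turn a stack of 2D projections (a sinogram) back into an image,
    # the per-slice core of CT-style reconstruction. Illustrative only.
    import numpy as np
    from skimage.data import shepp_logan_phantom
    from skimage.transform import radon, iradon

    slice_2d = shepp_logan_phantom()                      # stand-in sample slice
    angles = np.linspace(0.0, 180.0, 180, endpoint=False)

    sinogram = radon(slice_2d, theta=angles)              # what the detector records
    recon = iradon(sinogram, theta=angles)                # filtered back projection

    rms = np.sqrt(np.mean((recon - slice_2d) ** 2))
    print(f"RMS error with {angles.size} projections: {rms:.4f}")

Collecting fewer projections means fewer rows in the sinogram; better reconstruction algorithms are what let the same image quality come out of that smaller input, which is the "collect fewer images" point above.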
The Edison system so far has proved very popular with the users at NERSC. I think some of the reasons are the high-speed interconnect, which covers both access to the storage system and communication between the processors, as well as the Intel processors in the compute nodes, which have been performing well on many applications. At least, I've heard anecdotally from a number of the users that they're very happy with it.
A 36-hour production run produces about one second of simulated data, so that's sixteen 36-hour production runs, or over three weeks if it were continuous computing, and that was on Hopper. The thing about Edison is that we have found Edison to be about two and a half times faster for our particular code and our particular production runs, so that reduces that time to something on the order of just over a week, which is a lot more manageable, particularly when you have queue times in your workflow: getting through the queue, looking at the data, moving the data after a production run, in between production runs.
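The arithmetic behind those figures checks out; a quick sketch, where the only assumption is that the sixteen runs together produce the one second of simulated data:

    # Back-of-envelope check of the quoted run times.
    hours_per_run = 36
    runs_needed = 16                          # runs per second of simulated data
    hopper_hours = hours_per_run * runs_needed
    edison_hours = hopper_hours / 2.5         # Edison measured ~2.5x faster
    print(hopper_hours / 24)                  # 24.0 days: "over three weeks"
    print(edison_hours / 24)                  # 9.6 days: "just over a week"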
The reason why Edison is faster, we think, is that our code is memory-bandwidth limited, and Edison has larger memory bandwidth than Hopper. Or it could just be that it's plain faster, but two and a half times is significant. And I/O has not taken a hit: performance has been great up to a hundred thousand processors.
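For readers unfamiliar with the term, a memory-bandwidth-limited code spends its time moving bytes rather than doing arithmetic, so it speeds up when memory gets faster, not when cores do. A minimal illustration, my own sketch rather than the production code being discussed:

    # Sketch: a STREAM-triad-like kernel. Runtime is dominated by memory
    # traffic (roughly 2 reads + 1 write per element), not arithmetic.
    import time
    import numpy as np

    n = 20_000_000                        # large enough to spill out of cache
    b = np.ones(n)
    c = np.ones(n)
    a = np.empty(n)

    t0 = time.perf_counter()
    a[:] = b + 2.0 * c                    # triad; NumPy temporaries add some
    dt = time.perf_counter() - t0         # extra traffic, but it stays bandwidth-bound

    gb_moved = 3 * n * 8 / 1e9            # at least three float64 streams
    print(f"effective bandwidth >= {gb_moved / dt:.1f} GB/s")

A compute-bound code would speed up with faster cores; a kernel like this speeds up only when the memory system does, which is consistent with the Hopper-to-Edison comparison above.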
NERSC has always focused on sustained application performance when we have acquired systems. Since we have such a large community of users, and indeed at any one time we tend to run somewhere between one and 300 different jobs on the machine, it's important to us that the sustained performance of the applications be maximized.
This image is of one of the species, formaldehyde, a stable intermediate species that's a marker of some of the auto-ignition that occurs ahead of the lifted flame stabilization point, where we can see in great detail the concentrations of these species and how they interact with both the underlying velocity field and temperature, and some of the other major and minor species.
Only 5% of the matter in the universe is something that we understand at all. Another 25 percent or so is in the form of dark matter, which we have some idea what it is, but we don't know exactly what it is, and then an additional 70% or so is in the form of some mysterious dark energy, for which we really don't have any physical framework. It is actually a challenge for the Standard Model in physics.

By going after this cosmological modeling and studying the large-scale structure of the universe, we are trying to address what exactly the universe is made of. That is also one of the stated goals of the Office of Science in the Department of Energy: one of the missions is to understand all forms of matter and energy, so in a way this is directly addressing that.
One of the interesting aspects of the XC30 design is its configurability, the fact that you can replace the processors. At NERSC we are very interested in trying to both serve the current user community with their application needs, but also trying to move the community forward to advanced technologies, such as energy-efficient processor designs, and trying to balance the computing that they need to do today with where we think they're going to need to be in the next five to ten years.
The driver with this, the big science, is understanding the cosmos. We're addressing those questions that humans have asked ever since they've looked up at the sky at night: How did we get here? Where did we come from? What's the universe doing? What's its future going to be? What was its past? These are the fundamental questions that we're asking with these kinds of observations.
I've been working on analyzing data from observations of the Big Bang. These are the photons created in the Big Bang, which we now call the Cosmic Microwave Background. The expansion of the universe since the Big Bang has stretched these photons from being unimaginably hot at the beginning of the universe to being three kelvin, three degrees above absolute zero, today: very, very cold photons. And "background" because these are the primordial photons, the photons that come from behind everything else. Planck was first conceived in the late '80s, early '90s; it's really been a long-term plan.
We knew that we needed to do this extraordinary amount of computing. We were going to have this extraordinary amount of data, but there was no way we could handle it on the computers that were available at that time, and so we were just going to have to progress and rely on Moore's law in computing to carry us to where we needed to be in ten years' time.
So we've gone through five or six generations of supercomputers here at NERSC, most of which have been Crays, and with each generation we've been able to get one step closer to what we always knew the goal was going to be: sort of just-in-time computing. Edison is arriving just as we need to generate the largest simulation set to support the analysis of the data that will be published next June.
We don't just analyze the data; we have to generate simulations of the data in order to understand the statistics of what we're seeing. We have one real data set, and we'd like to have 10,000 simulations of that data, so that becomes a completely dominant calculation. We used to have to read in this huge volume of data, analyze it, and then write out some results. Now we do all of that on the fly, so that when you start running your analysis code, the first thing it does, completely unbeknownst to the analyst, is generate the simulation.
So you say, well, show me this data, and that data doesn't actually exist at the point where the analyst starts running their code. Under the hood, we've developed all the infrastructure and the interfaces to generate that synthetic data on the fly, avoid all the I/O issues, and really enable the kinds of speed-ups that we've needed to be able to exploit systems like Edison and to generate tens of thousands of realizations of the data.
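A minimal sketch of that generate-under-the-hood pattern follows; the function name, signal model, and sizes are invented for illustration and are not the Planck codebase.

    # Sketch: the analysis asks for realization i as if reading a file,
    # but the data are synthesized on the fly from a per-realization seed,
    # so no I/O happens and thousands of realizations never sit on disk.
    import numpy as np

    def load_realization(index, n_samples=1_000_000):
        rng = np.random.default_rng(seed=index)                      # reproducible per index
        signal = np.sin(np.linspace(0.0, 20.0 * np.pi, n_samples))   # toy sky signal
        noise = rng.standard_normal(n_samples)                       # toy detector noise
        return signal + noise

    # To the analyst, this looks like looping over stored data sets:
    variances = [load_realization(i).var() for i in range(100)]
    print(f"mean variance across realizations: {np.mean(variances):.3f}")

Seeding by realization index is what makes the synthetic data deterministic: any process can regenerate realization i on demand, so the 10,000-simulation ensemble never has to be written out or read back in.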
We're really excited to have some of the most energy-efficient computers in the world. We work very hard to make sure that the amount of science we get for each watt of power we put into our systems is as high as possible. Edison is an incredibly efficient computer system, and it's pretty novel in a couple of approaches. First of all, they have the air moving transversely down the entire row, which is extremely efficient for us: it means that we're using the same air to cool each compute cabinet as it's moving through the system.
With our previous generation we had 22 air handlers; with Edison we can do it with two. And then that heat is directly captured through a radiator that sits between each of the compute racks, so the energy and heat transferred directly into the water system is extremely efficient for us. We've got the entire cooling loop now down to right about 15 kilowatts of power use, compared to a two-and-a-half-megawatt compute system.
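Put in proportion, using only the two figures quoted (so this is the cooling loop's share of machine power, not a full facility-level efficiency metric):

    # Cooling-loop power as a fraction of compute power, from the quoted numbers.
    cooling_kw = 15.0
    compute_kw = 2500.0                                 # "two and a half megawatt"
    print(f"{100.0 * cooling_kw / compute_kw:.1f}%")    # 0.6% overhead

The cooling loop draws well under one percent of what the machine itself draws.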
I think the exciting thing about NERSC, and what we're all passionate about, is really enabling science that benefits society. I think a lot of us are really here for public service. There are lots of very high-paying computer industry jobs here in the Bay Area, but really what drives us, and what gives us passion, is that we want to see a broad range of scientific breakthroughs that really benefit all of us.