Description
CESM Chief Scientist, Gokhan Danabasoglu, gives the first session of the 2020 CESM Tutorial with an introduction to the Community Earth System Model and the components that make up the model.
The CESM Tutorial consists of lectures on simulating the climate system and practical sessions on running CESM, modifying components, and analyzing data.
For more information:
http://www.cesm.ucar.edu/events/tutorials/2020
Community Earth System Model (CESM) is a fully-coupled, community, global climate model that provides state-of-the-art computer simulations of the Earth's past, present, and future climate states.
For more information:
http://www.cesm.ucar.edu
A
So, let's go ahead with our first session of this tutorial, which is an introduction to the Community Earth System Model, and it will be presented by Gokhan Danabasoglu. Gokhan is the current CESM Chief Scientist, a Senior Scientist in the Oceanography Section, and the past co-chair of the CESM Ocean Model Working Group. The general subjects of his research are understanding the role of the oceans in the Earth's climate system and computational modeling of the ocean as a geophysical fluid, and he will be giving this session. Thank you, Gokhan, for doing this.
B
Thank you, Gunter. So I'll go ahead and start. Well, first I would like to welcome everybody to the CESM Tutorial. It's a virtual one this time, as Gunter indicated, and that's unfortunate, because this tutorial is particularly useful to early career scientists, both to learn about CESM and also to establish connections, with each other and also with NCAR scientists, or CESM scientists in general. Still, I'm hopeful about how we are going to do this.
B
So, what I would like to do, since, as Gunter indicated, everybody's background in this tutorial is quite different, is essentially provide a brief introduction to global Earth system models and what CESM is within that context.
B
Then I would like to briefly mention a few of our efforts on the Coupled Model Intercomparison Project phase six, that's CMIP6, and I'm pretty sure that you have all heard about CMIP and what it's all about. In the later part of the talk I would like to provide some updates on our ongoing activities. The last part of the talk will be more forward-looking: what are we doing with respect to our next model version, towards CESM3?
B
And
I
would
like
to
mention
many
things
here
at
least
introduce
them
to
you,
even
if
you
are
not
necessarily
interested
in
right
now,
at
least
it'll
be
in
your
mind,
it'll
you'll.
Remember
that
certain
things
is
existed
in
cesm
or
existing
cesm,
so
that
when
one
in
case
you
need
them
in
the
future.
B
As you know, global Earth system models represent a virtual laboratory for experimentation. We cannot do real experiments with the climate, and we essentially have only one realization of the real climate. So global Earth system models provide a numerical laboratory framework to improve our scientific understanding of observed events or climate change, and this can be historical or paleoclimate.
B
We
can
use
the
models
to
simulate
future
climate
change
and
its
impacts.
Another
example
is
essentially,
we
can
use
them
to
make
predictions
of
weather
and
more
on
the
side
of
right
now.
That's
in
actually
emerging
science
area
on
climate,
or
perhaps
more
correctly,
earth
system,
variability
and
predictability
or
prediction,
and
this
the
time
frame
for
that
effort
covers
from
sub-seasonal
to
all
the
way
to
decade.
Decadal
time
scales-
and
these
are
essentially
mostly
referring
to
initialized
earth
system
predictions.
B
So
global
earth
system
models
use
physical
equations
and
in
terms
of
with
respect
to
the
ocean
and
the
atmosphere,
for
example,
they
would
be
navier,
stokes,
navier-stokes
from
fluid
dynamics.
They
use
those
equations
to
simulate
key
fields
and
processes
in
the
atmosphere,
ocean,
land,
sea
ice
and
land
eyes.
B
These are of course continuous equations, but they are discretized on a model grid, and an example is shown here. This is a regular lat-lon grid; I'm not going to go into the problems associated with such regular lat-lon grids, but all these equations are discretized, and you end up solving them at particular grid points, which can be staggered or non-staggered. There are various methods to do that, simply because it's impossible to solve these equations at really small spatial scales.
B
It
is
just
too
costly.
We
end
up
discretizing.
These
equations
on
grids
usually
order
one
degree
in
the
horizontal,
so
that's
corresponding
to
order
100
kilometer
resolution.
That's
for
most
of
the
modeling
centers
100,
kilometers,
sort
of
the
workhorse
model
resolution
and
clearly
there
are
many
processes
that
are
shown
here
in
this
schematic
that
are
essentially
occurring
below
that
grid
resolution.
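To make the discretization idea concrete, here is a minimal sketch (not CESM code) of stepping a single diffusion equation forward on a coarse periodic grid; the 100 km spacing, the diffusivity value, and the one-dimensional setup are all illustrative assumptions.

```python
# Minimal sketch: discretize the 1D diffusion equation
#   dT/dt = K * d2T/dx2
# with a forward-in-time, centered-in-space scheme on a periodic grid.
# The ~100 km spacing mirrors the "order one degree" workhorse resolution.

def step_diffusion(T, K, dx, dt):
    """Advance the temperature field T by one explicit time step."""
    n = len(T)
    return [
        T[i] + dt * K * (T[(i + 1) % n] - 2 * T[i] + T[(i - 1) % n]) / dx ** 2
        for i in range(n)
    ]

dx = 100e3             # grid spacing, m (order one degree)
K = 1e3                # eddy diffusivity, m^2/s (illustrative value)
dt = 0.2 * dx ** 2 / K # respects the explicit stability limit dt <= 0.5*dx^2/K

T = [0.0] * 20         # a warm anomaly that diffuses away on the grid
T[10] = 1.0
for _ in range(100):
    T = step_diffusion(T, K, dx, dt)
```

Note the time step is set by the grid spacing: halving dx forces a four-times-smaller dt, which is part of why finer resolution is so much more expensive.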
B
But
they
are
important
for
climate
so
that
we
need
to
include
their
impacts
in
the
in
the
model
simulations
and
the
way
to
do
that.
What
we
call
them
essentially
parametrizations,
these
processes
that
are
listed
here.
Most
of
them
are
essentially
represented
in
terms
of
resolved
physics
and
their
their
impacts
are
incorporated
into
the
model
equations
as
parametrizations,
for
example,
we
don't
use
molecular
diffusivity
or
viscosity.
B
Instead,
we
sort
of
reuse
mesoscale,
for
example,
in
the
ocean,
and
some
middle
scale,
essentially
mixing
effects
through
parametrizations,
and
they
are
usually
parameterized
as
diffusive
processes
in
the
models.
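As a toy illustration of that down-gradient idea (the numbers here are order-of-magnitude assumptions for seawater and mesoscale eddies, not CESM's actual parameter values):

```python
# Sketch: subgrid turbulent fluxes represented as down-gradient diffusion,
#   flux = -kappa_eddy * dT/dx,
# with an eddy diffusivity many orders of magnitude larger than the
# molecular diffusivity the continuous equations would otherwise use.

KAPPA_MOLECULAR = 1.4e-7  # molecular thermal diffusivity of seawater, m^2/s
KAPPA_EDDY = 1.0e3        # typical mesoscale eddy diffusivity, m^2/s

def subgrid_heat_flux(dT_dx, kappa=KAPPA_EDDY):
    """Down-gradient parametrization of the unresolved eddy heat flux."""
    return -kappa * dT_dx

gradient = 1.0 / 100e3            # a 1 K per 100 km large-scale gradient
flux = subgrid_heat_flux(gradient)  # flux directed down the gradient
ratio = KAPPA_EDDY / KAPPA_MOLECULAR
```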
So, CESM is one such model; it's one such global Earth system model, and I included a schematic here to show the components of the coupled system. We have the atmosphere; it has chemistry, and it has a high-top and a low-top version.
B
These individual model components exchange both states and fluxes through a coupler, so they communicate with each other. For example, the wave model passes its wave information to the ocean model, and the ocean model uses that wave information, for example, to parameterize or incorporate the impacts of Langmuir mixing. External forcing, such as greenhouse gases, anthropogenic aerosols, volcanic eruptions, and solar variability, is usually incorporated through the atmospheric model component. In the ocean model we can also account for geothermal heating, but in our regular applications we do not use that feature.
B
The
upper
part
upper
charts
here
refer
to
evolving
model
complexity.
A
coupled
sort
of
modeling
initially
started
as
just
atmospheric
ocean
two-component
system
and
that's
sort
of
in
mid-1960s
to
late
1960s,
and
you
can
see
that
over
the
years
they
evolved
a
lot.
They
included
many
additional
components,
such
as
the
sea
ice
components,
ice
sheet
recently
and
marine
ecosystems
they're
all
included.
So
there
is
an
increased
complexity
in
terms
of
additional
model
components.
B
But
if
you
look
at
the
right
upper
right
panel
here,
not
only
additional
component
models
came
on
board,
but
each
component
model
also
increased
in
their
complexity.
So
hopefully
you
won't
hear
I'm
at
home
in
the
basement
and
they're
cutting
our
yard
or
grass
out
there.
So
there
will
be
some
noise
coming
from
that,
the
so
each
individual
component.
B
This
is
actually
showing
as
a
function
of
time
the
model
complexity
and,
for
example,
if
you
look
at
the
atmosphere,
there
were
very
few
features
in
the
atmospheric
model
in
early
days
and,
as
you
can
see
by
the
thickness
of
these
sort
of
shades,
they
increase
in
their
complexity
as
well.
So
this
essentially
increases
the
cost
and
puts
more
pressure
on
your
on
your
resources.
B
Another
thing
that
you
need
to
consider
depending
upon
what
problem,
what
science
question
that
you
would
like
to
answer
is
to
consider
how
many
ensemble
members
you
would
like
to
essentially
perform
to
essentially
entertain
or
to
determine
your
signal-to-noise
ratio.
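A toy Monte Carlo sketch of that trade-off, with made-up signal and noise levels, showing the ensemble-mean error shrinking roughly as 1/sqrt(N):

```python
import random
import statistics

# A fixed forced "signal" buried in internal-variability "noise".
# Averaging N members shrinks the noise in the ensemble mean ~ 1/sqrt(N),
# which is what sets how many members a given science question needs.

random.seed(0)
SIGNAL, NOISE_STD = 0.5, 2.0  # illustrative values

def ensemble_mean_error(n_members, n_trials=2000):
    """RMS error of the N-member ensemble mean as an estimate of the signal."""
    errs = []
    for _ in range(n_trials):
        members = [SIGNAL + random.gauss(0.0, NOISE_STD) for _ in range(n_members)]
        errs.append((statistics.fmean(members) - SIGNAL) ** 2)
    return statistics.fmean(errs) ** 0.5

err_1 = ensemble_mean_error(1)    # about NOISE_STD
err_40 = ensemble_mean_error(40)  # about NOISE_STD / sqrt(40)
```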
B
How
and
I'll
show
you
one
example
of
that
being
later
in
the
talk
you
you
can
start
with
the
this
is
sort
of
I
mean
in
the
weather
scale,
it's
going
to
be
so-called
butterfly
effect
when
you
start
from
your
initial
conditions,
a
slight
perturbation
of
the
system
can
lead
to
different
sort
of
states
further
in
time
and
depending
upon
whether
you
want
to
essentially
assess,
for
example,
precipitation
likelihood
in
a
certain
region.
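The butterfly effect is easy to demonstrate with the classic Lorenz (1963) system, which serves here as a standard stand-in, not as part of CESM:

```python
# Two integrations of the Lorenz (1963) system from initial conditions
# differing by 1e-8. The trajectories diverge, which is why initialized
# predictions need ensembles rather than a single run.

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz system (dt chosen small)."""
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

a = (1.0, 1.0, 1.0)
b = (1.0 + 1e-8, 1.0, 1.0)  # tiny perturbation of the initial state

max_sep = 0.0
for _ in range(3000):       # 30 model time units
    a, b = lorenz_step(a), lorenz_step(b)
    max_sep = max(max_sep, abs(a[0] - b[0]))
```

After a few model time units the initially indistinguishable states end up in completely different parts of the attractor.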
B
T stands for triangular spectral truncation: T85 is 1.4 degrees, T42 is about 2.8 degrees and, as you go to T340, you are at less than order a few tens of kilometers in the atmosphere.
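Those numbers follow from a common rule of thumb for triangular truncation transformed on a quadratic Gaussian grid; the sketch below assumes that convention (about 3n+1 longitudes for a Tn truncation):

```python
# Rule-of-thumb conversion from triangular spectral truncation Tn to an
# approximate grid spacing in degrees, assuming a quadratic Gaussian grid
# with roughly 3n+1 longitudes.

def truncation_to_degrees(n):
    """Approximate grid spacing, in degrees, for triangular truncation Tn."""
    return 360.0 / (3 * n + 1)

res_t85 = truncation_to_degrees(85)    # ~1.4 degrees
res_t42 = truncation_to_degrees(42)    # ~2.8 degrees
res_t340 = truncation_to_degrees(340)  # a few tens of kilometers
```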
So, if you're interested in higher resolution, in a certain region or over the globe, you need to consider finer resolution. Of course it comes at the expense of more computer time, and that's why this schematic is given here: you need to allocate your resources based on what science question you want to answer.
B
Considering
how
complex
you
want
the
model
to
be
how
many
ensemble
members
you
would
like
to
have
and
what
resolution
you
would
like
to
have
and
some
modeling
enterprises
also,
you
can
do
a
regional
grid
refinement
and
in
our
atmospheric
model,
with
the
sc
spectral
element.
Dynamical
core
regional
grid
refinement
is
also
possible
and
you'll
hear
more
about
these
during
this
week,
another
so
just
back
to
cesm.
B
This
is
just
to
give
you
an
idea
about
the
organizational
structure
of
cesm
cesm
consists
of
12
working
groups
and
our
newest
working
group
is
the
earth
system
prediction
working
group.
All
the
ideas
essentially
come
at
the
working
group
level.
If
you
would
like
to
essentially
have
engage
with
cesm
community
and
if
you
have
ideas
about
sort
of
community
projects
community
integrations,
you
can
attend
these
working
group
meetings.
B
Usually
there's
one
working
group
meeting
for
each
working
group
during
the
summer
workshops
and
under
the
the
working
groups
also
meet
during
a
during
the
winter
time
or
spring
time
each
year,
and
then
there
is
a
scientific
stream
committee
about
the
working
groups
and
then
we
have
also
an
advisory
board
at
the
top
essentially,
but
the
real
essential
ideas
come
from
from
below
at
the
working
group
level,
and
I
just
wanted
to
also
mention
that
the
cbsm
project
has
been
in
existence
roughly
for
25
years.
B
So
it's
a
lot
of
experience
all
together.
So
csm
code
base
can
support
a
range
of
climate
science
goals
through
a
single
model
codebase.
It
can
be
used
in
single
column,
applications
or
course,
resolution
applications
and,
for
example,
if
you
use
it
as
a
single
column,
application,
for
example,
in
the
ocean
model
or
in
the
atmosphere.
B
We have coarser-resolution versions of the model, which can be used for long simulations of several thousand years, primarily for, for example, paleoclimate applications, and the same code base can be used in higher-resolution simulations.
B
All components in our default configuration are active, but every component model can also be replaced with a data model, a so-called data model. An example of this is ocean-only or ocean-sea ice coupled simulations forced with atmospheric reanalysis products. In that case the atmospheric model is simply a data model that reads in the data sets needed to force the ocean model or the ocean-sea ice coupled model; that configuration is available.
B
We
have
other
simple
model,
simpler
model
configurations,
for
example
aqua
planet.
We
have
several
dynamical
cores
in
the
atmospheric
model
and
I'll
show
you
examples
of
that,
and
you
can
run
a
slab
ocean
model
for
some
applications
and
there
are
many
and
numerous
options
are
available
within
each
component
and
we
are
also
sporting.
B
an increasing number of component sets and configurations, available out of the box, and I'll talk a bit more about our more recent efforts on simple models within the CESM framework. Next, the Coupled Model Intercomparison Project effort, CMIP6, just to make sure that everybody is aware of this effort. As you know,
B
Including
csm
many
modeling
groups
overwriting
right
now,
40
modeling
groups
are
participating
in
this
effort,
cmf6
effort
what
it
is,
what
it
is
actually
quite
time
consuming
both
it
requires
both
computational
resources
and
people
with
resources.
Also,
it
essentially
involves
at
the
core
so-called
deck
simulation
that
stands
for
diagnostic
evaluation,
characterization
of
climate.
It
requires
essentially
each
modeling
group
participating
modeling
group.
It's
like
an
entry
card.
You
need
to
run
order
500
years
at
least
a
pre-industrial
control
simulation.
B
One
percent
co2
increase
experiment,
quadrupling
instantaneous,
coupling
quadrupling
of
the
carbon
dioxide
concentration
and
emip.
These
are
atmosphere
only
simulations,
in
addition
to
that,
cmip
is
essentially
is
designed
to
address
certain
scientific
questions,
and
it
has
certain
themes,
and
these
themes
are
given
here:
systematic
model
biases
and
you
can
try
to
address
those.
There
are
certain
studies
looking
into
variability
predictability,
predictions
and
future
scenarios,
and
you
can
also
look
at
response
to,
for
example,
external,
forcing
and
around
those
themes.
B
There
are
so-called
endorsed
model
intercomparison
projects,
and
there
are
many
of
them
and
cesm
actually
participated.
Some
of
you
may
be
familiar
with
these
modeling
comparison.
Projects
csm
participated
in
about
20
plus
inter-comparison
projects,
and
we
provided
essentially
two
sets
of
model
simulations
for
cmh6.
B
The
first
more
extensive
set
uses
our
nominal
one
degree
horizontal
model
resolution,
and
we
are,
we
participated
both
with
low
top
with
limited
chemistry.
That's
the
cam
six
model
version,
community
atmosphere,
model
version
six,
and
then
we
also
participated
with
higher
top
and
with
more
extensive
chemistry
model
version
for
the
atmospheric
component.
That's
the
vacuum.
All
atmospheric
community
climate
model,
for,
as
I
said,
for
cheaper
applications
for
maybe
paleo
studies
that
can
be
used
for
carrier
applications.
We
also
participated
and
created
and
participated
a
two
degree.
Atmospheric
model
version.
B
All
the
other
components
are
using
the
same.
There's
like
the
ocean
and
the
sea.
Ice
models
are
still
using
one
degree.
Atmospheric
model,
sorry
one
degree
horizontal
resolution,
but
the
atmospheric
model
is
at
two
degree
a
resolution
here,
and
these
are
all
available
to
the
community
and
if
you're
interested
in
learning
more
about
these
solutions,
we
have
actually
an
agu
csm
special
virtual
special
issue.
We
are
expecting
total
70
masters
for
this
purpose.
B
40
of
them
are
already
published
or
submitted,
and
you
can
essentially
go
to
this
website
and
then
find
all
of
these
all
of
these
manuscripts,
the
ones
that
are
published.
They
have
a
doi
number,
the
ones
that
are
still
in
review,
they're
also
available,
and
you
can
get
their
pdf
copies
here
and
the
primary
overview
manuscript
is
the
community
or
system
model
version
two.
I
was
the
lead
author
for
that
and
that
has
been
available
from
the
website
as
well.
B
Many of the new models, CESM2 included, are producing a larger equilibrium climate sensitivity than their previous versions. For those of you who are not familiar with it, equilibrium climate sensitivity simply represents the equilibrium change in surface temperature when CO2 is doubled. You take the pre-industrial control simulation, which has a certain level of CO2; you double it; you run the model to equilibrium; and then, at equilibrium, you look at your surface temperature and difference it from the beginning state, the pre-industrial level. That gives you the equilibrium climate sensitivity, in practice.
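That definition can be sketched with a zero-dimensional energy balance model; the feedback and heat-capacity values below are illustrative assumptions, chosen only so the toy answer lands at a round 5 K:

```python
# Sketch: ECS in a zero-dimensional energy balance model,
#   C * dT/dt = F - lam * T.
# At equilibrium T = F / lam, so for the doubled-CO2 forcing F_2x the
# equilibrium climate sensitivity is ECS = F_2x / lam.

F_2X = 3.7   # radiative forcing of doubled CO2, W/m^2 (canonical value)
LAM = 0.74   # net feedback parameter, W/m^2/K (illustrative)
C = 8.0e8    # effective heat capacity, J/m^2/K (illustrative, ocean-scale)

ecs_analytic = F_2X / LAM  # equilibrium warming, K

# Step the model forward to confirm it relaxes to the same answer; the slow
# approach to equilibrium is why ECS is estimated rather than run out fully.
dt = 3.15e7  # one year, s
T = 0.0
for _ in range(500):       # 500 simulated years
    T += dt * (F_2X - LAM * T) / C
```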
B
Of
course,
it's
not
possible
to
run
it
for
thousands
of
years.
These
models
they
take
too
much
too
much
too
many
resources.
So
you
try
to
come
up
with
an
estimate,
and
I
just
this
is
a
chart
from
a
paper
by
jeremy
lettow.
It
shows
equilibrium,
climate
sensitivity
as
a
function
of
transient
climate
response
transient
climate
response
is
a
lot
easier
to
obtain,
and
perhaps
it's
more
relevant
for
society.
B
The bottom line here is that in all the older versions of the models, equilibrium climate sensitivity was not really larger than four and a half degrees, while in the new model versions it is now in excess of four and a half, in fact over five degrees in quite a few of the models. I circled the CESM solutions here. There is no single silver bullet explanation
B
Why
that
this
increase
happened,
what
we
found
is
essentially
in
our
studies,
small
changes
to
crop
microphysics
and
boundary
layer
parameters
can
essentially
lead
to
these
increase
in
equilibrium,
climate
sensitivity
and
what
happens
over
the
southern
ocean
latitudes
appear
to
be
quite
important
regarding
cloud
feedbacks,
so
I'm
sure
that
you
either
heard
about
this
thing
or
you'll
hear
more
about
it.
Another
plot
that
I
wanted
to
show
you
is
a
model
performance
summary.
B
For
example,
orange
variables
represent
energy
related
fields,
long
way,
net
short
way
met
at
the
surface.
So
in
this
and
there's
some
kind
of
based
on
this
comparison,
how
good
the
comparison
is,
there's
a
score
assigned
to
it.
You
want
to
be
on
the
red
side
here:
blue
numbers
or
green
numbers,
especially
blue
numbers,
are
not
so
good,
so
I
just
wanted
to
essentially
highlight
that
in
the
top
10
roughly,
I
I
believe
of
the
our
four
csm
contributions
are
highly
rated,
meaning
that
the
csm
model
simulations
are
among
the
top.
B
Coupled
model
simulations
in
representing
observations
based
on
observational
based
analysis
or
results
or
re-analysis
products,
so
it's
quite
good
for
cesm,
and
this
is
available
also
in
a
paper
by
john
fosulo.
Okay.
So
what's
going
on-
and
I
just
want
to
provide
some
brief
updates
now
we
have
essentially
the
main
csm2
2.0
version
of
the
model
was
released
in
summer
of
2018
and
since
then
we
have
had
three
what
we
call
incremental
releases.
These
are
not
answer
changing
releases.
B
They
are
essentially
adding
more
out
of
the
box
configurations
available
to
the
committee
and
we
are
anticipating
that
there's
going
to
be
another
release
in
september
next
month,
and
this
is
going
to
be
an
answer
changing
release,
it's
going
to
include
some
additional
features
on
top
of
what
we
had
released
earlier
and
there
are
some
changes
in
the
land
model
or
new
functionalities,
for
example,
and
then
the
ocean
model
is
going
to
also
include
a
version
of
its
new
ocean
component
modular
ocean
model
version.
B
As
I
indicated,
we
have
newly
formed
a
new
working
group,
earth
system
prediction
working
group
and
the
idea
here
is
essentially
this
working
group
will
serve
the
cesm
and
brother
geoscience
community
by
facilitating
and
coordinating
fundamental
research,
focused
on
understanding
and
advancing
research
on
initialized
earth
system,
predictions
on
time,
skills
from
sub-seasonal
to
multi-decadal,
and
we
will
perform
large
ensembles
of
initialized
predictions
that
cannot
be
afforded
by
the
individual
pi
and
just
like
our
csm1,
large
ensemble
or
csm1
decatur
prediction,
large
ensemble
they'll
be
available
to
the
community
and
we've
been
actually
doing
quite
well
in
terms
of
our
predictions.
B
On the decadal time scales we have a 40-member large ensemble, and it is being used a lot by the broader community, both at the national and the international level. I wanted to show you just one quick example here: this is showing skill in summer precipitation over the Sahel region, in the boxed region here, at three-to-seven-year, roughly five-year, lead times.
B
The
top
panel
here
is
the
anomaly
correlation
coefficient
is
shown
as
a
function
of
ensembl
size.
The
blue
line
here
is
uninitialized
predictions
or
uninitialized
simulations.
They
are
not
predictions
and
it's
just
uninitialized
runs.
You
don't
expect
any
skill
necessarily
and
the
dedicated
prediction.
Large
ensemble.
The
red
line
here
is
from
our
initialized
system.
So
for
this
metric
it's
easy
to
beat
the
uninitialized
system,
so
there
is
no
issue
there,
but
another
important
metric
for
prediction-
simulations
is
so-called
persistence.
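For reference, the anomaly correlation coefficient used in that panel is simply the Pearson correlation between predicted and observed anomalies (departures from climatology); a minimal sketch, with made-up series:

```python
import math

# Anomaly correlation coefficient (ACC): Pearson correlation between
# forecast and observed anomaly time series. The data below are invented
# purely to exercise the function.

def acc(forecast, observed):
    """Pearson correlation between two anomaly time series."""
    n = len(forecast)
    mf = sum(forecast) / n
    mo = sum(observed) / n
    cov = sum((f - mf) * (o - mo) for f, o in zip(forecast, observed))
    sf = math.sqrt(sum((f - mf) ** 2 for f in forecast))
    so = math.sqrt(sum((o - mo) ** 2 for o in observed))
    return cov / (sf * so)

obs = [0.3, -0.1, 0.5, 0.2, -0.4, -0.2, 0.6, 0.1]
fcst = [0.2, 0.0, 0.4, 0.1, -0.3, -0.1, 0.5, 0.2]
score = acc(fcst, obs)  # high positive: the forecast tracks the signal
```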
B
To beat persistence you need 30 ensemble members, so this is showing an example of why you need multiple ensemble members; of course this depends upon the geographical region and the field that you are looking at. This is showing a comparison for our decadal prediction system: blue is uninitialized, black is the observations in this case, and red is our decadal prediction system. So it's doing quite well in terms of predicting the low-frequency signals in precipitation in this region.
B
Okay,
I
I
mentioned
earlier
that
you
may
need
multiple
ensemble
members,
and
this
is
just
to
show
you
an
example
of
that.
As
I
mentioned
earlier,
we
have
csm1
large
ensemble
simulations
available
for
communities
use,
and
I
wanted
to
show
you
what
I
meant
by
that.
So
what
is
shown
here
is
the
seed
surface
temperature
variability
with
a
focus
in
the
north
atlantic.
It's
usually
referred
to
as
atlantic
multidecadal
variability,
but
you
don't
need
to
know
that
the
top
four
lines
here
or
four
panels
here
from
left
towards
right
are
observational
or
re-analysis.
B
Based
estimates
of
sea,
surface
temperature,
anomaly
its
low
frequency
component
and
what
is
shown
in
the
remaining
20
panels
is
just
by
changing
the
atmospheric
temperature
field
in
the
16
digit
in
in
its
initial
conditions,
you
can
get
different
representation
of
this
variability,
as
shown
here.
So,
if
you're
interested
in
what's
happening,
for
example,
in
the
south
atlantic,
this
is
showing,
for
example,
cold
anomalies.
This
is
showing
warm
anomalies,
so
you
need
to
be
able
to
run
multiple
ensemble
members
to
get
some
confidence
on
which
one
is
essentially
more
likely.
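A sketch of how such a low-frequency index is typically built: area-averaged SST anomalies smoothed with a running mean. The synthetic 60-year oscillation, the noise level, and the 11-year window are all illustrative assumptions, not the actual AMV methodology of any particular paper:

```python
import math
import random

def running_mean(series, window):
    """Centered running mean with an odd window; the ends are trimmed."""
    half = window // 2
    return [sum(series[i - half:i + half + 1]) / window
            for i in range(half, len(series) - half)]

# Synthetic 120-year SST anomaly series: a slow 60-year oscillation plus
# year-to-year noise standing in for internal variability.
random.seed(1)
sst_anom = [0.3 * math.sin(2 * math.pi * y / 60.0) + random.gauss(0.0, 0.2)
            for y in range(120)]

smooth = running_mean(sst_anom, 11)  # 11-year low-pass, as an example
```

The smoothing suppresses the year-to-year noise while keeping the multidecadal signal, which is the quantity the 24 panels are comparing.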
B
So
that's
why
you
may
need
to
use
it
multiple
ensemble
members,
but
I
wanted
to
highlight
that
this
kind
of
product
is
available
for
everybody
to
use
in
c
for
the
community.
Indeed,
you're
essentially
performing
a
new
large
ensemble
csm.
True
large
ensemble,
it's
in
collaboration
with
our
colleagues
in
south
korea,
the
previous
large
ensemble,
was
performed
only
with
40
ensemble
members.
This
new
one
is
going
to
be.
Is
being
performed
rather,
if
for
we
will
be
producing
100,
ensemble
members,
this
is
slightly
older
slide.
B
I'm
sure
that
you'll
be
using
this
data
set
a
lot
and
just
coming
back
also.
We
have
also
data
assimilation,
capabilities
within
the
cesm
framework,
and
we
use
data
assimilation,
research
testbed
for
that
purpose
and
our
current
sort
of
research
level.
B
We
have
a
prototype
of
what
we
call
strongly
coupled
data
assimilation
system,
meaning
that
the
atmosphere
and
ocean
model
use
a
single
covariance
matrix
rather
than
two
independent
ones.
So
a
prototype
of
this
data
simulation
system
within
the
fully
coupled
framework
is
running,
but
I
wanted
to
make
sure
you
understand
the
difference
here.
There
are
various
ways
of
performing
data
assimilation.
B
This
is
using
ensembl
common
filter,
meaning
that
each
component
model
requires
multiple
members
up
to
maybe
40
or
even
more
members
running
simultaneously
to
get
the
ensemble
spread,
correct
or
reasonable,
and
that
you
can
imagine
if
you
are
using
40
ensemble
members.
That
means
it's
going
to
cost
you
40
times
more
than
a
regular
coupled
system
to
run
this,
so
it
is
not
really
economical.
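A scalar toy version of the ensemble Kalman filter update just described (made-up numbers; this simplified form shifts every member deterministically toward the observation, whereas production EnKF systems like DART use perturbed-observation or square-root variants):

```python
import random
import statistics

# Toy EnKF analysis step for a scalar state: the forecast variance comes
# from the spread of the simultaneously-running members, and each member
# is nudged toward the observation by the Kalman gain.

random.seed(2)
N_MEMBERS = 40          # why this costs ~40x a single coupled run
TRUTH, OBS_ERR = 1.0, 0.3

members = [random.gauss(0.0, 1.0) for _ in range(N_MEMBERS)]  # prior ensemble
obs = TRUTH + random.gauss(0.0, OBS_ERR)                      # noisy observation

prior_var = statistics.variance(members)      # flow-dependent covariance
gain = prior_var / (prior_var + OBS_ERR ** 2) # Kalman gain (H = identity)
analysis = [m + gain * (obs - m) for m in members]

prior_spread = statistics.stdev(members)
post_spread = statistics.stdev(analysis)      # spread shrinks after the update
```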
B
I don't want to go into many details here but, in contrast to the ensemble Kalman filter approach, which requires running multiple ensemble members, this technique requires running only one member; the spread is based on some previous knowledge of the possible spread that comes from an existing simulation. This is rather affordable, even in high-resolution frameworks. Another thing that I wanted to mention:
B
We
do
also
have
a
high
resolution
version
of
the
model
and
the
results
from
this
are
already
available,
and
this
is
actually
a
collaboration
under
the
ihas
umbrella,
international
laboratory
for
high
resolution
earth
system
predictions.
It
is
between
us,
texas,
a
m
university
and
also
the
qingdao
national
laboratory
for
marine
science
and
technology.
B
We
are
using
a
slightly
older
version
of
tomorrow
for
this
purpose:
csm
1.3,
the
code
base
has
been
substantially
rewritten
to
run
on
the
machines
in
china.
They
are
sunway
systems,
they
are
somewhere
in
between
cpu
and
gpu-based
systems
and
that
codebase
is
available
and
then
the
manuscript
has
been
recently
accepted.
I
believe
I
think
it's
accepted
in
a
gmd
geoscientific
model
development
and
if
you
are
interested
in
using
these
data
sets,
what
we
have
is
a
500-year
pre-industrial
control
simulation.
B
We
have
a
1850-2100
transient
simulation
run
with
icp
8.5
scenario.
We
have
an
80-year,
1
co2
increase
run
and
we
are
also
four
cycles
of
ocean
sea
ice
coupled
forced
pinecast
simulations
for
the
1958
2018
period.
As
of
july
june,
8th,
we
made
some
of
these
simulations
already
available.
I
should
also
mention
that
we
have
also
completed
contributions
for
high
resmed
130-year
18,
1950
control,
1950
2050,
transient
simulation
and
1950
2050,
just
atmosphere
on
assimilation
and
it's
corresponding
low
resolution.
B
Simulations
are
also
available,
so
some
of
the
simulations
were
made
available
solutions
available
on
june8.
The
rest
of
the
data
sets
will
be
available
by
the
end
of
this
year
and
there's
a
manuscript
already
just
submitted
actually,
and
you
can
find
that
manuscript
from
the
ihasp
website,
if
you
type
ihas
on
google
you'll
get
that
get
to
that
webpage,
okay
towards
csm3.
So
this
is
the
last
part
of
the
talk,
and
I
just
wanted
to
give
you
an
idea
about
where
we
are
going
in
our
model.
B
So
we
are
essentially
developing
idealized
modeling
toolkits.
This
is
going
to
make
it
a
lot
easier
to
essentially
know
what
idealized
model
configurations
available
within
cesm
we'll
be
also
developing.
Some
new
idealized
model
configurations
such
as
extensive
extensions
of
the
single
column
ocean
model.
What
I
call
pencil
model
that'll
be
hopefully
including
the
edmund
component
as
well,
but
it
is
going
to
be
essentially
a
nice
framework
to
figure
out.
B
What's
going
on
with
csm
simple
modeling
efforts,
we
are
recently
essentially
looking
into
increasing
our
atmospheric
standard
atmospheric
model
resolution
higher
than
what
it
is
already
and
also
not
only
increasing
the
vertical
resolution,
but
also
we
would
like
to
increase
the
model
top,
and
this
will
essentially,
it
will
not
be
as
high
as
what
our
high
top
model
does,
but
it
will
be
extending
higher
than
the
present
low
top
version
to
improve
vertical
resolution
in
the
free,
troposphere
and
stratosphere,
and
the
idea
is
particularly
to
get
a
better
representation
of
the
qbo.
B
Another thing that we are looking into is changing our atmospheric dynamical core from the present regular lat-lon finite volume dynamical core, and we are considering three options. One is the SE dycore, which stands for spectral element dynamical core; it's a highly scalable hydrostatic dynamical core, you can do grid refinement with it, and it can also run the physics on a separate, usually coarser, grid for uniform-grid applications.
B
We
also
so-called
finite
volume,
3
fe3
dynamical
core,
that's
again
in
the
cube
sphere,
version
of
the
regular
latino
finite
volume,
dynamical
core
and
the
third
dynamical
core
is
pampas.
That
is
an
irregular
grid.
Dynamical
core-
and
it's
shown
here
in
a
sort
of
hexagonal
grid
configuration
here
in
the
lower
panel,
so
we'll
be
having
some
sort
of
a
backup
in
this
dynamical
cores,
either
later
this
year
or
likely
early
next
year,
and
as
I
alluded
to
earlier,
we
are
also
changing
our
ocean
model
from
pub2,
which
is
used
in
cesm2.
B
So we are trying to learn how this new model behaves. Lots of documentation and training opportunities are available; we are conducting webinars to make sure that people know about it, and there are certain practical use cases available on the website as well. If you have more questions about this, Gustavo is one of the organizers of this tutorial.
B
I'm
sure
he'll
be
happy
to
answer
any
mom
6
related
questions
and
just
for
your
information.
Also
when
I
said
we
are
going
to
release
csm
2.2
in
september,
like
the
september
time
frame,
there's
going
to
be
an
early
friendly
user
version
of
mom
6
functional
release,
version
of
mom
6
available
in
that
model
release
as
well-
and
this
is
just
to
show
you
that
model
is
working
and
then
we
are
getting
reasonable
solutions.
B
Okay,
with
that
40
minutes
exactly,
and
I
would
like
to
stop-
and
I
just
wanted
to
show
you
the
similar
well
gunter
showed,
if
it
weren't
for
the
virus,
we
would
be
actually
in
this
building
up
here,
conducting
this
tutorial
in
the
main
seminar
room.
In
any
case,
thank
you
for
your
attention
and
if
you
have
any
questions
I'll
be
happy
to
take
them
now.
Thank
you.
A
While you take the time to think of a question, I will ask the first question, a simple question hopefully: are the previous versions of CESM now becoming obsolete?
B
Well,
not
quite
because
we
are
still
sporting.
In
fact,
we
changed
our
recent
data
policy
to
support
some
of
the
earlier
code
versions,
going
all
the
way
back
to
ccsm4
roughly
10
years
ago,
and
there
are
several
reasons
for
that
is
university
users,
like
graduate
students
and
all
that
stuff
work
with
their
older
versions
of
the
model.
B
The
second
reason
is
that
csm
won,
for
example,
the
cato
predictions
and
the
large
ensembles
are
still
using
csm
1.1
code
base
and
that's
roughly
10
years
old
and
that
they
are
also
being
still
supported,
and
similarly
our
high-res
stimulation,
csm
1.3,
almost
eight
years
or
nine
years
old,
and
we
are
still
sporting
that,
because
that's
the
only
high-res
version
that
we
have
right
now.
So
they
are.
I
guess
the
short
answer
is
yes,
we
are
sporting,
the
older
versions
within
reason,
though,.
A
Alright,
so
we
have
a
couple
of
questions:
mata
go
ahead
and
meet
yourself
and
ask
your
question:
hi.
Yes,
so
well,
that's
a
question.
I
asked
you
gunther
on
my
email
because
I'm
curious
about
how
you
do
coupling
and
one
thing
the
question
I
have
is
that
if
you're
using
different
initial
conditions,
for
instance
ocean
realities
and
atmospheric,
how
do
you
make
sure
that
those
initial
conditions
are
consistent?
Do
you
like
use
a
protocol?
Do
you
check
that
before.
B
Unfortunately,
we
don't
have
anything
automated
at
this
point
and
you
need
to
be
really
careful
as
a
user,
how
to
do
that,
for
example,
from
the
ocean
side.
If
the
d.a
product
the
analysis
product
is
performed
using
our
ocean
model,
then
there
is
no
issue,
because
then
it's
done
in
a
consistent
way.
However,
we
have
also,
in
fact
I've
done.
I've
done
that
in
the
past
as
well.
B
But,
more
importantly,
you
need
to
obey,
for
example,
in
the
ocean
model
borotropic
or
a
clinic
split.
You
need
to
make
sure
that
the
vertical
integral
of
pyroclinic
mode
is
essentially
zero
and
the
vertically
integrated
of
integrated,
total
velocity
adds
up
to
the
barotopic
mode,
and
you
have
to
unfortunately
do
those
things
offline
and
make
sure
that
everything
is
consistent.
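Those two offline checks can be sketched in a few lines; the velocity profile and layer thicknesses below are made up for illustration, and this is not POP or MOM code:

```python
# Split a velocity profile into a depth-averaged barotropic mode and a
# baroclinic residual, then verify the two constraints mentioned above:
# the baroclinic part integrates to zero in the vertical, and
# barotropic + baroclinic recovers the total velocity.

def split_barotropic(u, dz):
    """Split the velocity profile u (on layers of thickness dz)."""
    depth = sum(dz)
    barotropic = sum(ui * dzi for ui, dzi in zip(u, dz)) / depth
    baroclinic = [ui - barotropic for ui in u]
    return barotropic, baroclinic

u = [0.30, 0.12, 0.05, -0.02, -0.04]   # m/s, surface to bottom (invented)
dz = [10.0, 20.0, 50.0, 100.0, 300.0]  # layer thicknesses, m (invented)

ubt, ubc = split_barotropic(u, dz)
vint_baroclinic = sum(ui * dzi for ui, dzi in zip(ubc, dz))  # should be ~0
```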
A
Yeah
but
well
it
it
answers
yourself.
The
question,
if
you're
using,
for
instance,
if
you
have
different
you
get
the
data
for
the
atmospheric
part
from
every
analysis,
but
then
the
data
for
the
option
part
from
another
re-analysis,
and
you
also
need
to
do
some
checks
to
make
sure
that
they
both
match.
B
I
showed
that
slide
on
couple
data
assimilation,
that's
one
of
the
ways
that
you
can
avoid
inconsistency
between
the
two
systems
and
I
think,
whatever
you
do,
there's
going
to
be
some
kind
of
initialization
shock
of
the
system
that
you
need
to
essentially
be
careful
in
the
interpretation
of
early
part
of
the
results
and
it'll
likely
end
up
using.
I
mean
you'll
end
up
likely
using
bias
correction
for
that
purpose.
A
Hi,
thank
you
for
the
presentation.
I
was
wondering
in
the
mom
six.
You
were
talking
about
how
there's
the
three
different
grades
that
you
could
use
and
I
was
wondering
what
would
be
the
benefit
of
using
the
hex.
I
think
I
believe
it
was
a
hexagonal
grid
versus
a
rectangular
grid.
B
So that was actually not MOM; those were the atmospheric dynamical core options. That is MPAS-Atmosphere, which comes with its own unstructured grid. One advantage of that configuration is that you can do regional grid refinement in any region you would like. So that's one advantage, but it's not necessarily the grid structure per se; it's a feature of that system: it's an unstructured grid, and it allows you to do that. I guess it comes with the grid. So that's one big advantage.
B
You can do regional grid refinement, and you can do it in multiple regions, in a single global application.
B
So that's one beauty of it. But it's very expensive: because the grid is unstructured, you don't know where your nearest neighbor is, and it becomes computationally extremely expensive. It can also potentially represent coastal regions much better. For example, you can do grid refinement in, let's say, the Gibraltar region: you can put many grid points there at higher resolution and then coarsen the grid away from it, so it will fit the coastal regions much better.
A
I'm sorry if I mispronounce your name; please correct me if I do. Yeah, can you hear me? Yes, yes. Thanks for the lecture. You mentioned refinement: when we need higher resolution, we can do refinement for a certain region, and meanwhile we can also use a regional climate model. So I wonder, what is the difference between the two approaches, and what are their pros and cons?
B
Well, the biggest difference that I can think of right now is that when you do regional refinement in a global modeling context, you allow two-way grid interaction. In other words, what's happening in the refined area can communicate directly with the rest of the domain, without any ad hoc issues, and everything is done in a conservative way, in a sense, because it's within the same system. As for doing it in a regional coupled model: in fact, we do have a regional coupled model version of it.
B
It's supported by Texas A&M as part of our collaboration. The issue is that it is not embedded, and if it is not embedded, then you have to provide some kind of boundary conditions; then you have to deal with sponge layers and all of that, and you are likely at the mercy of the boundary conditions that you are specifying. There is also a version in which the regional system is embedded in the fully coupled system, which you can use as well.
B
However, a big disadvantage of regional refinement in a global coupled system is that if you do, for example, North Atlantic refinement, you end up spending pretty much all of your computer time in that region. That is, your CFL condition will likely be dictated by the time step, or grid resolution, in that region. So you are going to be paying a lot of computer resources.
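The cost argument can be made concrete with a toy CFL calculation: with a single global explicit time step, the smallest cell anywhere on the mesh dictates the step for the whole domain. The wave speed and grid spacings below are illustrative numbers, not CESM values.

```python
def max_stable_dt(cell_widths_m, wave_speed_ms, courant=0.5):
    """Largest explicit time step allowed by the CFL condition,
    dt <= C * dx / c, evaluated at the most restrictive (smallest) cell."""
    return courant * min(cell_widths_m) / wave_speed_ms

c = 100.0  # illustrative fast-wave speed, m/s

uniform = [100e3] * 1000               # 100 km cells everywhere
refined = [100e3] * 990 + [10e3] * 10  # 10x refinement in one small region

dt_u = max_stable_dt(uniform, c)  # 500 s
dt_r = max_stable_dt(refined, c)  # 50 s: every cell now takes 10x more steps
```

A 10x refinement in even a tiny region forces 10x more time steps over the entire globe, which is why the refined region ends up dominating the cost.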
B
Oh, thank you. Would you mind talking a little bit about iCESM and which model version it is supported in? The one I see here, iCESM, that's the isotope-enabled version, yeah?
B
You can talk to Bette and Esther about that, and you'll probably get more accurate information from them. Unfortunately, right now it is supported only in an older version of the model; I think it's supported in CESM1 or CCSM4, I'm not 100% sure. We had a few meetings about bringing all of the features of the isotope-enabled version to a more recent model version.
B
Unfortunately, while there is some work going on, it probably won't be completed for another year or so, because it was more extensive than we originally thought. So I think, if I'm not mistaken, it's CCSM4- or CESM1-based at this point, and that's another reason that we need to keep providing support for that older version.
A
All right, thank you, everyone, for your questions. We have reached the end of our time for this session. Thank you so much.