From YouTube: SimPEG Meeting from January 5th
Description
SimPEG weekly meeting from January 5th, 2022
A: Before we get started, something I'd just like to have a chat about, which we can either do today, next week, or two weeks from now: have a thought on things that we, as a community, would like to try to tackle with SimPEG in the coming year.

A: Things we'd like to see added, areas that we kind of want to focus on. Just looking forward to the new year to see where people are at and what they're hoping to do with SimPEG. It'd be nice to take a poll and see what we're interested in. Obviously, whatever people want to tackle, they're going to tackle; this is just to have an idea of where people are.

A: Thinking about that, I'm assuming we don't have too many quick reports over the last two weeks. Myself, I spent the last few weeks with family down in the States. I don't think we caught COVID, because we were let back into Canada yesterday, so it's all good. I do have an article going through copy editing for Geophysics that should be coming out soon.
A: It's a separate project that I had. This paper has been in revisions, going through the whole process, for about a year and a half now.

A: It's on combining gravity and gravity gradiometry data, just a little bit more rigorously than almost every other piece of work that has ever been published on it: a more rigorous approach to how to adequately combine them, the thought process, and the general steps you should take when you want to combine them, why, and what they can do together.

A: I'd say it was my mini master's project that I did within my PhD; it had nothing to do with any of my PhD topics. Essentially it would have been a master's project on its own, but it was a neat, fun little thing to do off to the side. So that'll hopefully be coming out in Geophysics soon, and I should be able to share it shortly.
D: Hey, do you think we're going to be able to expand that to mag and mag gradiometry?
A: It's one of the things where we do have to actually properly weight the two different types of data sets, or else they just don't fit, and you kind of lose out on the benefits of combining them if you don't properly weight each of them individually.
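The weighting being described can be sketched as follows; this is a minimal illustration, assuming two independent, noise-normalized chi-squared misfits that are balanced by their number of data, with made-up arrays and noise levels (not the method from the paper itself):

```python
import numpy as np

def weighted_joint_misfit(d_pred_g, d_obs_g, std_g, d_pred_gg, d_obs_gg, std_gg):
    """Combine gravity and gravity-gradiometry misfits with per-survey weights.

    Each misfit is a chi-squared sum of squared, noise-normalized residuals;
    dividing each term by its number of data targets a chi-factor of 1 per
    survey, so neither data set dominates the combined objective.
    """
    phi_g = np.sum(((d_pred_g - d_obs_g) / std_g) ** 2)
    phi_gg = np.sum(((d_pred_gg - d_obs_gg) / std_gg) ** 2)
    # Balance by the number of data in each survey
    return phi_g / d_obs_g.size + phi_gg / d_obs_gg.size
```

With residuals exactly at the noise level in both surveys, each term contributes 1 regardless of how many data points each survey has.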
A: Other than that, the break was nice. It was all family that I hadn't seen in two years.
B: Quickly, I'm sure most people saw this, but I'll just point it out: if you know anybody who's interested in a postdoc looking at EM for UXO primarily, so combining inversion as well as machine learning, please put them in touch. EM is kind of the biggest thing; if they've never worked with UXO, that's fine, but if they've worked with EM or neural networks, that would be great.
C: Did you create a channel for it already, or will you create a channel once you have a PhD?
D: If I may ask, Lindsey, and you can not answer if you can't: is there a sponsoring company? Because I know that Ocean Floor is all over all of that stuff, right, what Peter was talking about.
B: It's a funded proposal through SERDP, which is, I forget actually what that stands for; it's a combination of the Department of Energy, the EPA, and the Department of Defense.
B
It's
us,
but
we're
partnering
with
black
tusk,
so
black
tusk
is
also
part
of
the
funded
group
and
so
they're
providing
data
and
sort
of
the
system
specs
and
helping
kind
of
motivate
and
shape
the.
C: Reports, yeah, maybe I have what I wrote. It was pure accident when I stumbled on this old Read the Docs link, and it took me a while to realize that it's very old documentation, and I know SimPEG. I thought, if another user stumbles upon this old documentation, it might be bad. So I don't know who has the login credentials for Read the Docs, whether that can just be removed, or if you have to update something.
D: I remember we encountered the same thing with the GIF tools manuals, right? People can still hit the old versions on Read the Docs, even if we completely pull the plug on them, yeah.
C: What is the oldest and the newest version on Read the Docs? Do you have any idea? Because it says latest, but…
A: It looks like the Read the Docs link does kind of forward to where the docs might be; it forwards to docs.simpeg.xyz with /en/latest at the end. Right now they're just all on latest; we haven't been keeping around older versions of the documentation.
B: Things are pointing at the same URL. Because on Read the Docs right now, you're right, we are pointing to the simpeg.xyz site, and we're doing that with GitHub, and so the fact that you have the /en/latest grabs Read the Docs, whereas if you leave that off it will grab GitHub. So that's super confusing.
D: I did spend two or three days working on the refactoring of regularization that we were starting to talk about, and made some good progress.

D: Now I'm kind of working down the unit tests to make sure they're still passing, and then adding deprecation warnings for all the name changes. I think there's so much stuff in there that we should probably have a quick overview together for people who want to do a review, because it's obviously a big module, the regularization. So we should probably agree, or disagree, on the changes.
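The deprecation warnings for renamed attributes can be sketched like this; a minimal toy, where the old and new names (`alphas` vs `alpha_s`) are invented for illustration and are not the actual renames in the refactor:

```python
import warnings

class Regularization:
    """Toy class showing a deprecation shim for a renamed attribute."""

    def __init__(self, alpha_s=1.0):
        self.alpha_s = alpha_s  # new, preferred name

    @property
    def alphas(self):
        # Old name kept temporarily so downstream code keeps working
        warnings.warn(
            "'alphas' has been renamed to 'alpha_s' and will be removed "
            "in a future release.",
            DeprecationWarning,
            stacklevel=2,
        )
        return self.alpha_s

    @alphas.setter
    def alphas(self, value):
        warnings.warn(
            "'alphas' has been renamed to 'alpha_s' and will be removed "
            "in a future release.",
            DeprecationWarning,
            stacklevel=2,
        )
        self.alpha_s = value
```

Old code keeps running, but every access through the old name surfaces a `DeprecationWarning` pointing at the caller.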
D: And then the other thing, I pinged you, Joe, right, on pydiso. Just before the break we were hoping to do a release, and then all our inversion unit tests started to fail, on Windows only, because of changes that happened in MKL.
A: You want, as in, it needs to be pinned to a compatible version, and it has to be rebuilt? I wonder if it needs to be rebuilt for each different version of MKL.
A
So
because
sometimes
well
you
can
do
this
thing
you
can
pin
compatible
for
numpy
and
it'll,
create
different
pre-built
things
for
different
packages
of
numpy
if
it's
needed
right
right.
So
I'm
curious
if
it's
that.
D: I re-ran it, and if I pin MKL to the right version, that works; but if numpy is ahead, like the latest, then it still doesn't work. So it needs to be a specific numpy version and then a specific MKL version. I'm guessing we're still using numpy to compile the C code, right, so.
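The shape of the pin being described, two libraries that each have to sit inside a compatible range, can be illustrated with a small runtime check; the version numbers below are invented for the example and are not the actual compatible pydiso/MKL matrix:

```python
def version_tuple(version_string):
    """Turn '2021.4.0' into (2021, 4) for a coarse major.minor comparison."""
    parts = version_string.split(".")
    return int(parts[0]), int(parts[1])

def is_compatible(runtime_version, minimum, below):
    """Check that a runtime 'major.minor' version sits in [minimum, below),
    the usual shape of a pin like 'mkl>=2021.0,<2022'."""
    rt = version_tuple(runtime_version)
    return version_tuple(minimum) <= rt < version_tuple(below)
```

In a build recipe the same idea is usually expressed declaratively (e.g. conda-build's `pin_compatible`), which is what triggers separate pre-built packages per pinned dependency version.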
E: I mean, I don't have anything else that's too much SimPEG related, but lately I've been working on a lot of UTEM data. This is a large inductive loop source, and the idea is you've got sort of a triangular waveform and you're measuring db/dt, but it's equivalent to measuring the b field for a step.
E: It's kind of an interesting setup, because to invert these data you basically have data collected during the on time. And looking ahead, I don't think there's a well-established strategy for inverting those data, and you're dealing with kind of the same issues we have when we do frequency-domain EM, where we have to account for a primary field. So it's kind of a fun project.
E: I got sent a data set to make a comprehensive workflow; there's tons of magnetic effects, there's all sorts of fun business. So, yeah, when I write this up, it's going to be interesting to later try and apply SimPEG and see if we can reproduce that result, because the data will be freely available.
C: That might be of interest: a lot of the stuff that was done in time-domain marine CSEM is usually done the same way. It's a continuous source signal, but it's not a simple waveform; it's a more complicated source signal, and it's then binned into time windows and used as time-domain data. But it's also measured, well, there's not an on and off time, it's always on and you measure at the same time. So there might be similar strategies to use.
G: Isn't that what they provide, Devin? As far as I understand, they do remove the primary field, and I think that's one of the important steps on their end, I guess.
E: They provide you with basically, well, depending on how you feel like representing the data, either db/dt for this highly regulated triangular waveform, or b for a step, and the conversion is basically a constant. And then the way that you would represent the data is through the latest time channel, which, assuming all the inductive effects have diffused, is kind of a good approximation for the primary field.
E: When you represent the data, you normalize by the primary field, essentially, to try and see what kind of anomalous structures are there. So I'm not sure if there is a standard; I've only really played with this one data set, but I have been given effectively this total-field data: I've just been given what was measured by the receivers, and it includes the primary.
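The normalization described can be sketched in a few lines of numpy, under the stated assumption that the latest time channel approximates the primary field; the array shape and channel ordering here are made up for illustration:

```python
import numpy as np

def normalize_by_primary(total_field):
    """Estimate the primary from the latest time channel and return the
    primary-normalized anomalous response.

    total_field : (n_soundings, n_channels) array of on-time b-field data,
        with channels ordered early to late. The last channel is taken as
        the primary, assuming inductive effects have diffused by then.
    """
    primary = total_field[:, -1:]  # (n_soundings, 1), broadcasts over channels
    anomalous = total_field - primary
    return anomalous / primary
```

The result is dimensionless: zero where the measurement is pure primary, and positive where induced currents add to it.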
E: Sure, but they've also given me a fairly, they've taken a bunch of data points to really define the path of the transmitter loop.
G: And what do they actually do? Like, what's that company, Lamontagne, I think, Lamontagne Geophysics?
E: Well, I haven't seen inversion results for this yet, and I know Scott Napier had a master's or a PhD thesis, like over a decade ago, where they played around with this system, but he inverted the total field, like he inverted all of it. And a lot of what we're doing right now, and what the sponsors are interested in, is what is the best approach, so I'm definitely going to try a few things. They want a consistent strategy for this type of data.
F: I did a project like this with the old company that I worked with. We were doing downhole step response, and yeah, Seogi, that's exactly what you do: you remove the primary, and then they would just put it into Maxwell and model plates with the on time. There wasn't an inversion at the time, but yeah, you removed the primary for sure.
F: That seemed to be pretty standard, but I was kind of reading a lot of stuff from the Lamontagne guys.
E: Yeah, I don't know exactly what they're working with, but basically, if you are modeling the total field with these codes, the errors associated with numerically modeling the primary field are larger than the uncertainties on your data, and that buggers everything. So I'm always wanting to try to invert anomalous and not total field. So far that's my feeling, unless somebody can convince me otherwise.
G: It's very worthwhile to read a little bit more, because, to me, what's the point, actually, of measuring on-time data? The beauty of the time domain, I think, is that you can actually measure the off time without the primary field. So I think there should be some physical reason why they are doing it, and actually understanding that would be quite nice.
G: It's interesting, because still, what you physically measure is the voltage. So does it really help? I think the beauty of the b field is that it's better for conductors, a very strong conductor. I guess that's my understanding.
G: So if you've got very late time gates where you want to measure a signal, just using the voltage will really suffer at those very late times, whereas with the b field, that was my understanding in general, you can have a lower frequency band. But here, by manipulating the waveform, you're getting the b field; does it really help?
E: Yeah, I mean, I can see that for synthetic modeling, that over a conductor the b field is decaying more slowly. So I can actually see that the rate of decay for the induced currents in those conductors happens more slowly.
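A toy numerical version of that point, under a deliberately simple assumption: model the conductor's step-off decay as a single time constant tau, so b(t) = exp(-t/tau) and |db/dt| = b/tau. The ratio b/|db/dt| is then tau itself, which is one way to see why the b field relatively favors strong (large-tau) conductors at late times:

```python
import numpy as np

def step_b(t, tau):
    """Normalized b-field step-off decay for a single-time-constant conductor."""
    return np.exp(-t / tau)

def step_dbdt(t, tau):
    """Corresponding voltage-style measurement: |db/dt| = b / tau."""
    return np.exp(-t / tau) / tau
```

For a strong conductor (tau = 10 ms) the b field at a 20 ms gate retains far more of its 1 ms value than it does for a weak one (tau = 1 ms), which is the "decays more slowly" observation above.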
A: Cool, cool, no worries. Okay.
G: Nothing in particular; I was completely off, playing with my son. Before the break I made some improvements in the 1D code, so now I'm actually pretty happy with the current version. It's actually optimized now, and it's quite a bit faster; I'm not sure, I haven't really checked one-to-one, but I think it's a couple of times faster, and the memory issue, I had a bit of a memory issue with the new version, I fixed it.
G: One example that I'm currently working on is seawater intrusion, with airborne data over quite a coastal area. It's a fairly large survey, a typical size of about a thousand, and it takes about a couple of hours to run with like 10 cores or something like that. So this was actually my first time using the new code for sort of the field-scale, large-scale EM inversion, and it seems to be working quite well.
G: At least it gives sort of the same result that I have gotten before, so I'm pretty happy. And this is kind of the typical steps that I do: I invert for smooth and use different reference models, and you can see pretty big differences. The red is the big conductor, the intruded sea water. But this part we have very low confidence in, as you can see, because I use 0.3 as a reference, or I can use 20 ohm-metres, for instance, freshwater resistivity, as a reference.
G: Then you can clearly see where we have very low sensitivity anyway. And what's kind of cool about this data:
G: It's time-lapse data, so we have 2017 data and 2019 data, so I'm going to slightly modify the code so it can handle the time-lapse inversion. And I had a question for Joe, because I need the same functionality that can link two models and have a constraint in between, which is basically the same problem as in the cross-gradient, joint-inversion process, where you've got two physical properties and a constraint linking two models. In this case I can expand that, because I've got many time-lapse…
G: …data sets, like 10 time-lapse data sets: I have 10 models that I want to connect, but connected for each pair of them. So, yeah, I could write it myself, but this is something that I need, so if there is an existing sort of base class, I wanted to use it anyway.
A: You want to make a similarity measure to a model that is not the same physical property, or is it? No, it's the same physical property.
G
It's
it's
the
same
physical
property,
but
set
it
as
a
different
model
because,
like
it's,
a
resistivity
at
2017
and
resistivity
in
2019..
So
although
it's
the
same
physical
property,
it's
it's
different
because,
like
there's
a
time
time
component-
and
I
think
it's
potentially
what
we
can
develop
as
a
general
like
a
class
like
a
time-lapse
regularization
class,
because
I
think
the
need
is
kind
of
like
a
getting
more
and
more
because,
like
ert,
for
instance,
there
are
many
time
lapse.
G: …data sets, like CO2 injection and other types of monitoring issues. So having something simple, basically putting the flatness or the smoothness in the time direction, so that in the easiest case, if the mesh is exactly the same, I think you can just use the existing mesh structure to put the time constraint. But sometimes the mesh can be different, and that is the case for me.
G: That's something that I could do as well, yeah; if there was no base class like that, I could just redevelop, revamp the existing class, right.
D: Yeah, I was actually trying to dive into that, Joe, the similarity measure class, and there is no reason why we only limit it to two, right? The Wires map could be n-dimensional, really.
A: Yeah, I think it's possible to extend that joint total variation one to multiple models; you can probably just keep adding them into the square root function, that one, yeah.
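That extension can be sketched in a few lines; this is a toy finite-difference joint total variation on co-located 1D models, not the actual SimPEG implementation, and going from two models to many really is just adding more squared-gradient terms under the square root:

```python
import numpy as np

def joint_total_variation(models, eps=1e-8):
    """Toy joint TV for any number of co-located 1D models.

    models : list of 1D arrays on the same grid. For each cell we sum the
    squared forward differences of every model, take the square root, and
    sum over cells; eps keeps the objective's gradient finite at zero.
    """
    grads_sq = sum(np.diff(m) ** 2 for m in models)
    return np.sum(np.sqrt(grads_sq + eps))
```

Coupling comes from the square root: a jump costs less if all models place it in the same cell than if each jumps somewhere different.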
G: That's sort of what I was thinking, Joe, because I basically need the two adjacent times, so, like, yeah.
G: Yeah, so I can just stack over a whole bunch of regularizations. Well, in this case I only need two, only one linkage, but if we've got many times we can just stack over them. So I thought that's an okay way to deal with it. So, Joe, there is a base class for that that I can use?
A
Is
that
right,
yeah
for
that
similarity
measure,
yeah.
G: Perfect, yeah, and that is a project that I'm working on for the next three months: we're basically constraining the hydrologic modeling, the seawater-intrusion modeling, using the airborne EM data. So the other item that I'm going to update is the 3D airborne EM simulation. I kind of wrote the local-mesh idea as a very specific example for the SkyTEM system, but I'm hoping to either bring it in or at least update it to be compatible with the existing…
G
That's
the
the
current's
impact
structure,
so
yeah
and
at
least
then
we're
getting
an
ability
to
invert
in
1d
and
then
simulate
a
similar
entry.
C: It's always frequency domain, so we would need like 20 runs to get the time domain, and I don't know how that scales with time, but I think we could maybe brainstorm at some point whether it's possible.
G: Yeah, yeah, I think at this point the easiest way is using it for frequency-domain data; I guess that is the easiest one. Moving to the time domain, I think, will be quite expensive.
G
So
not
sure
that
is
some
the
way
that
we
want
to
go,
but
in
terms
of
implementation
I
think
you've
got
all
the
the
ingredients.
I
guess
you
need
to
just
generate
many
frequency
domain
response
and
then
transform
transformative.
Yeah.
That's
true.
G: Yeah, so I made a bit of progress on inverting the deformation data that I presented sometime, several months ago. So I've got the inversion code, and it's actually working well, kind of working well in a sense. Let me, so, we actually inverted…
G: Whereas, I think, for the static case it's very similar to a potential-field inversion. It's not even 3D; it's actually just a 2D inversion.

G: You assume the physical property, the elastic property, of the subsurface is constant, and then just invert for the mass applied at the surface, which is basically the same kind of inversion. Whereas what I was doing is slightly different, because there's a time component, and you need to solve the diffusion equation, and some sort of nonlinearity comes in.
G: So this was actually the talk that I gave at AGU, and this is the InSAR data. This red is the big subsidence over five years; it's about a metre in five years, which is crazy. If you think about infrastructure, like a train, having a one-metre subsidence, it can kind of destroy it, and there are canals going through. So it's actually a big risk.
G: I stack a whole bunch of InSAR signals and take the mean; it's a proof of concept, so I just take the mean value. And then, you do need to know the subsurface properties, about the clays and what the storage coefficients are and stuff like that, so I assume I can get that information from other literature, and I fixed those. Then this is the data, the InSAR data, and the goal is recovering the…
G: …the changes in head: the head change is the model, and the data is the surface deformation, and that's what I was able to recover. So I fit the data, and this part is actually important: the data is not only affected by this point, but the historical head changes actually affect the data in the future. So somehow you estimate this; you need a good rate, the linear rate, but once you've got an okay rate, I was able to fit the data well. And here's my pitch.
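A toy linear sketch of that time component, under loud assumptions: a known step-response kernel g, so deformation is a causal discrete convolution of head changes with g, and noise-free data; the real problem requires solving the diffusion equation and is not this clean. It still shows the key point above, that deformation at one time accumulates contributions from all historical head changes:

```python
import numpy as np

def deformation_matrix(n_times, kernel):
    """Lower-triangular matrix G with G[i, j] = kernel[i - j], so that
    d = G @ dh: deformation at time i accumulates contributions from
    every historical head change dh[j], j <= i."""
    G = np.zeros((n_times, n_times))
    for i in range(n_times):
        for j in range(i + 1):
            G[i, j] = kernel[i - j]
    return G

# Hypothetical exponential step-response kernel for the clay layer;
# nonzero at lag zero so G is invertible.
n = 50
t = np.arange(n)
kernel = 1.0 - np.exp(-(t + 1) / 5.0)
G = deformation_matrix(n, kernel)

# Forward model a made-up head-change history, then invert by least squares
dh_true = np.sin(2 * np.pi * t / 25.0)
d = G @ dh_true
dh_recovered, *_ = np.linalg.lstsq(G, d, rcond=None)
```

In the noise-free case the least-squares deconvolution recovers the head-change history exactly, which is the idealized version of estimating head from InSAR deformation.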
G: So if you actually compare with the actually measured head close by, that's what you get: this is the actually measured head and how it is changing, and this red is where you've got the drought period. In the drought period the head is going down; that's the problem for groundwater management, the water level going down is a problem in general, and after the drought it kind of slowly recovers and oscillates. You can see kind of a good trend.
G: It goes up, goes down, goes up, and the measured data only has sparse points, whereas the InSAR data has much better time sampling. So the recovered head that we can get has much better temporal resolution, and what would be really cool is expanding that to the entire Central Valley.
G: Then you can actually monitor the head changes in the entire Central Valley with high spatial resolution, which will be absolutely challenging, because there are other remaining challenges, but I was relatively happy with what we were able to get, and the implementation wasn't too hard with the existing SimPEG structure, especially discretize. So, anyway.
D: So, Seogi, if instead of that, so before your observation you just use like a linear thing: if you approximate the measurements with a curve, do you improve your model?
G: I think so, I think so, but what I was hoping, yeah, the assumption that I made was kind of similar to the time domain, although the early time step was pretty long in the simulation process.
G: What's actually a good property of this kind of implicit time modeling is that you get better as time goes by, so even if you've got very poor time sampling of the early times, as you go to late times the accuracy gets better, and the actual simulated result is not bad. Okay, so that's something.
G: The history, it's not just one year ago, it's like tens, hundreds of years, but that impact gets reduced as time goes by, and then the inversion process can likely find an effective time history, in a way. And then there are many wells, so it's not only, we have some data that I can interpolate over there, that I can start from, so in the end we're going to invert all of them together.
D: You'll be, you'll be a superhero in hydrogeology.
A: Guys, again, if you have a thought about what you would like to tackle in the next year, we can chat about it soon.