From YouTube: SimPEG Meeting March 10th
Description
SimPEG weekly meeting from March 10th, 2021
D
Anyway, sorry for getting started a little bit late. I just got sidetracked doing something else for a second and then realized I was five minutes late, so that's my bad. Thank you all for being here. Let's get right into it! I'll share the meeting notes.
D
Okay, no one has posted any agenda items, so I'll just say: I think next week we'll probably have a Friday session for people who want to stop by and mess around. I'll remind everyone again next week as well.
D
Then we can go into some quick reports here, so I can start with what I've been working on. I was actually just putting together, for a lecture I'll be giving later today, an introduction-to-geophysics talk, showing a bunch of geological engineers how they can use geophysics to solve problems.
D
I've been working off a lot of the old material that we had, and put in a couple, or at least one, new example from my own research. It shows a really simple case: what's the problem, design a survey, do a few tests, see if we can collect the data, see if it actually works, and go through the general process.
D
And then I just ended up erasing a lot of IP code that was duplicating DC, just using the DC stuff directly. All the tests still work; it just means it's relying directly on the DC instead of copying it into IP every time.
D
Yeah, all the tests pass; obviously it works. Like I said, I got the apparent resistivity / apparent chargeability bug fixed that was causing a couple of issues. I tested a little bit more now; I think I now test some apparent resistivity and chargeability receivers that weren't getting tested before.
B
That's one it would be nice to check: we can rerun the Century inversion notebook on that and just double-check. Hopefully we should be getting similar results, but I know you did change some things under the hood. So I'm happy to run that.
D
It does; it's up to your receiver. What happens now is that the receiver for apparent resistivity stores the geometric factor, indexed by the source. That way, if it's an apparent resistivity receiver, when you pass the source to the evaluate function, it just looks up the geometric factor for that source.
D
And then, if it's apparent chargeability, it's all stored on the simulation, because it doesn't make sense to store it on any receiver: it will be unique to each source, as well as to the conductivity model and the mesh it's on. So there are a bunch of different things that lead to different DC voltages for the apparent chargeability calculation.
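The per-source caching described above can be sketched as follows. This is a minimal illustration of the pattern (a receiver holding geometric factors keyed by source, looked up at evaluation time), not SimPEG's actual receiver API; all class and method names are hypothetical.

```python
# Hedged sketch: an apparent-resistivity receiver caching one geometric
# factor per source, as described in the meeting. Names are illustrative,
# not SimPEG's real API.

class ApparentResistivityRx:
    def __init__(self):
        # geometric factor indexed by the source it belongs to
        self._geometric_factor = {}

    def set_geometric_factor(self, source, value):
        self._geometric_factor[source] = value

    def eval(self, source, voltage):
        # apparent resistivity = V / G, with the source current folded
        # into the stored geometric factor G for this sketch
        G = self._geometric_factor[source]
        return voltage / G


rx = ApparentResistivityRx()
rx.set_geometric_factor("src_A", 0.5)
print(rx.eval("src_A", 1.0))  # 2.0
```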
D
Great, yep. Let's move on to Devin, because that's all I've got.
F
Yeah, so I have the PR ready for a final review, updating all the DC and IP utilities, and I've got some more tests. One of the things I was able to finish was the IO for XYZ-formatted files and the octree code, DCIP3D and DCIP2D; I tested all that and made a couple more tutorials. We have a pretty quick-running 3D DC and IP inversion in the tutorials, which I'm pretty happy with. So I think we've made some advancements there.
F
So I'm just looking for a final review; I'll make some changes and hopefully get that in as soon as possible. And then there's my sort of periodic reminder of what's happening with getting EM1D into SimPEG.
F
That's kind of what's going on lately.
C
So I was actually using that for my teaching demo, and figured out that that function is actually just in the gwb repository, but not in the main IO functions. So did you actually bring anything in from there?
F
That was a major motivator, but I didn't bring in anything from the gwb repository. The other thing I really didn't touch was this DCIO class that you made; I'm not really sure what its purpose is and how it fits in, so I didn't touch it. I didn't want to mess around with something that was maybe being used somewhere else. But I think it would be nice to have a utility that automatically creates meshes based on survey geometry; that could be added any time.
F
Well, there's an io_utils that's under the general SimPEG utilities; that's meant to be a place for IO. And then within the static utils we just have one plot-pseudosection function for 2D that takes topography into account and does all the things we want it to do. And then, if you have the plotly package, you have access to a 3D pseudosection plot.
F
But yeah, any input from the group on what we want to do with that would be good, because it's kind of hanging.
C
Yeah, well, okay: the idea is at the level of somebody who really doesn't want to get into detail, but just wants to run the code quickly: load the data, enter the mesh, run. I think having something like that is quite useful, especially since those are probably the majority of the people using SimPEG. I think the name is not good, but having something like that is nice.
C
Even myself: when I'm building things and writing a tutorial, I start from scratch and do that. But when I'm fitting field data, I don't even want to worry about that part. So if I have something built up that can readily generate the mesh and so on, I would often skip that step. That's sort of my thought, though.
F
But are there any plans to move forward with the EM1D stuff? It's kind of stalling, I think. Last I worked on it, I had gotten it to a point where it was ready for some kind of review, and I think that Joe and Seogi maybe wanted to take a look and optimize a few things before we brought it in. So I just want to see if that's on anyone's radar at the moment.
C
Dev, I'm happy to now; I've got a bit of time and I'm happy to do so. I'll take a look this week. Yeah, I'll do that this week.
H
Hey Joe, real quick; maybe this is a little bit different from the topic. I don't know if you remember, but you and I had a conversation a long time ago, when we both were still young, about this IP thing. You talked about approximating the IP effect as a capacitor. Did you ever get to implement any of that idea?
D
It's just an interesting little thing: if you start thinking about it as a buildup of dipoles in the subsurface, it's built up with charges on both sides and little sources all around. So you can almost think of it as an SP-type problem, where you're looking for a bunch of different source distributions in the subsurface.
H
Yeah, I had a similar conversation with Misak many years ago. He thinks that the way we treat chargeability now, as a function of time, is strange. It's weird, so it should be something else, maybe like a capacitance, if we approximate it with a capacitor.
C
Yeah, isn't that basically similar to using a Cole-Cole or whatever parametric type of model? Once you have some sort of parameter defining the time dependency, yeah.
H
The whole idea is to remove the time dependence, because, according to Misak, if you think about conductivity, resistivity, density, velocity: only in extremely rare cases do they vary with time; in general they do not. So he thought that...
H
...we need a better, maybe more mathematical, theory for IP, so that it doesn't depend on time. Because right now, if you measure the full decay of the IP curve, you can invert for the 3D chargeability distribution at all the time channels: you end up with 10 chargeability models if you have 10 time channels. He thought that maybe we could have a better mathematical theory to capture everything, including the time decay. If you think about the time domain: our conductivity does not change with time, but still we have a time-varying response.
C
We use some sort of Cole-Cole type of model. I think I used the stretched exponential in this case to parameterize the time dependency, and it kind of does it. So there is code built up for that; maybe not exactly what you're calling a capacitance, but there's something similar in the current SimPEG code.
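The two parameterizations mentioned here can be sketched as follows, assuming the common conventions: a stretched-exponential decay eta(t) = eta0 * exp(-(t/tau)**beta) for chargeability in time, and the Cole-Cole model for complex conductivity in frequency. This is an illustration of those formulas only, not the actual SimPEG implementation.

```python
import math

# Hedged sketch of the parametric time-dependence models discussed.
# Parameter names (eta0, tau, beta, c) follow common convention; these
# functions are illustrative, not SimPEG's code.

def stretched_exponential(t, eta0, tau, beta):
    """Chargeability decay: eta(t) = eta0 * exp(-(t / tau) ** beta)."""
    return eta0 * math.exp(-((t / tau) ** beta))

def cole_cole_conductivity(omega, sigma_inf, eta, tau, c):
    """Cole-Cole complex conductivity:
    sigma(omega) = sigma_inf * (1 - eta / (1 + (1j * omega * tau) ** c))."""
    return sigma_inf * (1 - eta / (1 + (1j * omega * tau) ** c))

# At t = 0 the stretched exponential gives the intrinsic chargeability,
# and at omega = 0 Cole-Cole reduces to the DC limit sigma_inf * (1 - eta).
print(stretched_exponential(0.0, 0.1, 1.0, 0.5))  # 0.1
print(cole_cole_conductivity(0.0, 1e-2, 0.1, 1.0, 0.5))
```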
D
Yeah, all right, we're starting to get into a bunch of research topics here; no worries. So unless there's anything else from Devin, we'll go on to John, who has his name up next.
J
Yeah, okay, weird, my computer's being funny. I've got a meeting at 11, so this will be a little bit short. I've been profiling the DC tiled code on a cluster, and it's awesome.
J
I've done it on Kubernetes, and Kubernetes can handle the bigger problems better than SSH with Dask, but I'm still playing around with the MPI solution, because I can actually target the NUMA nodes, or the memory controllers, on each worker, and with five NUMA nodes I can get up to seven times...
J
...the speed increase. So you can just keep adding nodes: the way the tiled code is written, as long as you keep adding workers to it, you can really start cutting that time down, almost by factors of a magnitude. So it's actually pretty exciting and pretty fun. I've also got the tiled MT inversions going, or I've got a colleague working with those right now, and he loves the speedup as well.
J
The only thing is, he's finding there's a depth limitation compared to other MT codes. It just seems like about half of what you would expect to get as resolution at depth: it just cuts off, whereas, say, Occam or something just really starts forming structures down to...
J
...like the maximum skin depth, which doesn't seem to be quite happening. He's going to try out a few more receiver setups just to see if it's something on his end, but it's something I'm going to be pondering over the next little bit, to see if we can get those to match the other production codes. Yeah, a synthetic will probably help a lot.
I
You move it down and see how far you can catch it. Are you going to be able to share the Kubernetes setup?
J
Oh yeah, it's literally just a YAML file; the pain of it is just setting up the cluster. Putting in your code, you really just have to say: here is my cluster. You give the Dask cluster a YAML file, or a Kubernetes YAML file, and then it just loads it all up.
J
It's setting up the cluster that's the hard part: going onto each node and making sure that it's all set up properly, all the same code. That's really the annoying part, but yeah, I definitely will share all that for sure. I'm kind of memory limited, so I can't really test the big, big problems right now, but it's doing better than the SSH implementation. And that's on your local cluster, right? Yeah.
J
It's kind of complicated, just because you've got to generate those RSA files. I think they're RSA; they're just keys that you pass between all the computers, and you have to give them out, because on a normal cluster you'd have to sign in with a login name and then a user and password.
J
So you make these keys so that you don't have to use a username or password, and those are kind of annoying to set up and put in the right spot on each computer. But after that's set up, it's super easy, because it's just a YAML file and you just tell it exactly: I want these workers to do this and that. I just can't find a way to target the actual NUMA nodes like I can with MPI, but I've got both implementations going, so it's good to go.
C
John, so did you figure that out: you're storing J matrices, right? Then, potentially, you may need to distribute that onto different computers, I guess; or do you just use one single computer to load things up?
J
It's kind of on a share: you set up a shared drive; however, it doesn't all sit on that one local computer. It's like a virtual drive that's spread across all the nodes, and each puts its stuff onto its own local storage, but you can still access it from the other ones. Does that kind of make sense?
C
Right; that kind of brings up a bit of a speed issue. Let's say you have hard-drive hardware that you can load up quickly; you may just want to use that. Or, if you're just using all the local machines, then it could potentially cause this kind of speed decrease, yeah.
J
Yeah, no, that's correct. It depends on how you have the clusters set up, because I had the SSH Dask cluster set up, and as soon as I got to too big of a problem, too big of a J, it just failed. It just couldn't handle it. But the MPI solution seems to, no matter what.
C
So MPI is on top, and then you distribute, and then use Dask at each kind of local...
D
Xiaolong, I see you've been pushing a little bit on your joint inversion branch.
K
Yep, you're good, yeah, okay. So at this moment I've finished: I added a new directive for the joint inversion in the directives folder; the coupling .py file in the regularization folder; tutorial number 13, the joint inversion, in the tutorials; and, for the tests, a file in the base folder named test_cross_gradient. And this is the synthetic model I used for the cross-gradient joint inversion, which is exactly the same as the one Dom used for the gravity and magnetics, I think. And this is the...
K
...anomaly data maps for the magnetics and the gravity. And these are the jointly recovered models: the top figure is the susceptibility model, and the important part is the density model. And this figure is the normalized cross gradient, which can be used to measure the degree of structural similarity: the warm colors indicate different structure and the cold colors indicate similar structure.
K
So for the tutorial I will show the two recovered models and the normalized cross gradient. And this figure is a scatter plot for the jointly recovered density and susceptibility models; the right figure is for the separately recovered models. We can notice that the joint inversion can enhance this linear or non-linear feature.
K
...where again we observe the enhanced linear feature. So I think the field data application shows the same behavior as the synthetic example shown here: joint inversion can enhance the linear feature. So I think this synthetic example may be good to put on the website, first because the model is consistent with our other tutorials, and secondly because the result is reasonable.
H
Real quick: if you want to show the improvement from joint inversion, you might want to consider increasing the complexity of your synthetic model a little bit. So, instead of three points on the line, you might want to add another anomaly, for example another point in the upper left there.
H
If you run a separate inversion, you can imagine that you have scattered points in both directions on this main anti-diagonal line, and also some scattered points will be attracted to the upper left. So I expect to see a more obvious improvement from joint inversion in that more complicated example. Did you get what I said? Yeah, I see.
K
Finally, yeah, after using the field data, I can add the field data applications to the gallery, either in the publications or in an independent subsection. Is this in 3D yet, Xiaolong? Is that working?
K
It works well for the 2D; so it basically works for 1D, 2D, 3D... no, not the 1D, because for the 1D we...
C
I see. And I was actually curious: so you put in the two bodies, and both of them were anomalous in both susceptibility and density. Suppose you had one susceptible body and the other one not. I was just thinking about that again as an example, because the current example seems pretty obvious: just using the magnetics, it seems like you can see both of them.
C
So I thought that's not very exciting; suppose you put just one body for each of the two properties. I was actually curious: if you just have a single body for a single physical property, could this cross gradient generate some sort of artifact? Because of what the other physical property tells you through that cross gradient, there could be some sort of bodies...
H
Xiaolong, maybe I can... can you go back to your synthetic model? Yeah. What Seogi was talking about: so you have two co-located... oh, you don't have the true model here. So you have two bodies, right, and they are co-located, exactly coincident. What Seogi said is: what if you remove one, let's say the left body, from the susceptibility model?
H
Okay, and then you remove the right body from the density model, and then you run a cross-gradient joint inversion. In this case they are not co-located, right? I think what Seogi was asking is: in this case, would the cross gradient bring some artifacts into the density or susceptibility models? Is that...
C
...what would happen in a real case. I think, rather than this, actually having that seems more interesting to me anyway. Just a thought.
I think combining a little bit more, let's say DC and potential fields, something like that combination, seems more interesting than two potential fields. Because, let's say, you see the basement pretty well with DC resistivity or something like that; combining something like that with a potential field seems like a little bit more interesting example. Anyway, I'm actually excited to play with that code.
I
And it's a speed thing too. Can we go back to your structure? I like, Xiaolong, that you moved your directive outside of that master directives file. Should we do this generally: start parsing out the directives file into sub-files? Oh, you mean... yeah, yeah.
K
So are we... should I keep these separate, or merge the joint-inversion ones into the master file?
E
Xiaolong, just to follow up on a previous comment you made: why isn't cross gradient applicable for 1D problems?
H
So, in the 1D case, the gradient is always pointing either upward or downward at any location. So at any location, if you calculate the cross gradient, it's always zero: it's either parallel or antiparallel, and either case gives you a zero cross gradient. The cross gradient will be zero everywhere, regardless of the structure.
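The argument above can be sketched numerically: the cross gradient is the cross product of the two model gradients, and when both gradients have only a vertical component (the 1D case) every term of the cross product vanishes. The helper below is illustrative, not SimPEG's implementation.

```python
# Hedged sketch of the cross-gradient measure discussed:
# t = grad(m1) x grad(m2). In 1D, both gradients point only up or down
# (z component only), so the cross product is identically zero.

def cross_gradient(g1, g2):
    """Cross product of two 3-component model gradients (gx, gy, gz)."""
    return (
        g1[1] * g2[2] - g1[2] * g2[1],
        g1[2] * g2[0] - g1[0] * g2[2],
        g1[0] * g2[1] - g1[1] * g2[0],
    )

# 2D/3D: gradients at an angle give a nonzero cross gradient
print(cross_gradient((1.0, 0.0, 0.0), (0.0, 1.0, 0.0)))  # (0.0, 0.0, 1.0)

# 1D: both gradients are purely vertical, so every component is zero,
# regardless of the magnitudes or signs
print(cross_gradient((0.0, 0.0, 2.0), (0.0, 0.0, -5.0)))
```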
E
Is it possible to contort that, just to kind of ask for two models which have gradients in the same location? It just seems that, in principle, these things should be able to work in 1D, 2D or 3D. It might be implemented somewhat differently, but the idea of trying to find models that have gradients at the same location seems like it should be applicable across all of them. Or am I missing something?
H
So, Doug, in 1D, one thing we can do is modify the standard cross-gradient code a little bit to encourage the gradients to change in the same direction. For example, we might want the conductivity and density models to have a gradient in the downward direction, meaning they both increase as we go deeper. That is something we can do. Is that what you said? No?
D
I think it's a little bit of a misnomer of what the cross gradient is trying to do, because the cross gradient is not necessarily encouraging gradients to be at the same location; it's encouraging gradients to be in the same direction, or anti-parallel to each other. It's not saying they have to be at the same spot; it just says the gradients of the two models should be in the same direction.
D
Only when you start thinking about it in terms of sparseness, like a sparse solution, does it get a little bit different. Even with a sparse solution, though, if both models are just L1 or L0 solutions, with positives and negatives everywhere, it's: okay, well, either the gradient is changing in the same location or it's not. So in the sparse norm...
L
I've just been playing around with some gravity inversions this last week, and I'm getting some kind of weird behavior when I put bound constraints on the inversion: just some really unstable behavior that I need to try to get to the bottom of. I'm not really sure why it's happening; I've even moved away from my field data.
L
So there are just some funny things that I'm going to have to investigate a little bit further and try to get a grip on, because they're not making a whole lot of sense right now. Did you...
I
Your bounds: sometimes, if you, say, log the conductivity, you need to make sure that you're in the same space for your bounds.
L
If I allow it to put... well, I don't know, I'm going to have to dig into it further, because it just really doesn't make any sense right now. I think I have to allow my upper bound at depth to permit positive density, and then it kind of seems to give me a more reasonable result; but it doesn't actually put that much positive-density material into the model at depth.
L
It just seems like it needs to have the option to do that, which I don't understand. Because if I just do the inversion without any bounds, if I just let it be positive and negative infinity, the model that comes out looks somewhat reasonable, and it doesn't necessarily have density values beyond what I would expect in the true model, or beyond what I'm trying to put in as the bounds.
D
Kind of a question: I've never really gone through and looked at the optimization code for the bounded problems. Are you using the projected conjugate gradient or the projected Gauss-Newton solver?
D
Ideally, the way you should do a projected gradient algorithm is: you calculate the gradient, then you project along that path, and then you do a conjugate gradient solve on that face of the bound. So I just need to go through and look at it and see.
D
It's kind of like what you can imagine it doing: it's like conjugate gradient on a gravity solution; it's always going to point to the same spot. It's like: okay, well, I'm going to go here, that's the minimum. So every time you go around, you just keep pointing back to that same minimum, and you end up...
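The projection behavior being described can be sketched with a toy one-variable problem. This is a minimal illustration of projected gradient descent with box constraints (step, then clip back onto the bounds), not SimPEG's projected Gauss-Newton solver; all names are hypothetical.

```python
# Hedged sketch: projected gradient descent with box constraints on a toy
# quadratic. When the unconstrained minimum lies outside the bounds, every
# step points back toward it and the projection pins the iterate to the
# active bound, as discussed above.

def project(m, lower, upper):
    """Clip a value back into the box [lower, upper]."""
    return min(max(m, lower), upper)

def projected_gradient_descent(grad, m0, lower, upper, step=0.1, iters=100):
    m = project(m0, lower, upper)
    for _ in range(iters):
        m = project(m - step * grad(m), lower, upper)
    return m

# Minimize f(m) = (m - 3)^2 subject to m <= 2: the unconstrained minimum
# is 3, but the projection pins the solution to the active bound at 2.
grad = lambda m: 2.0 * (m - 3.0)
m_star = projected_gradient_descent(grad, m0=0.0, lower=-10.0, upper=2.0)
print(m_star)  # 2.0
```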
L
I haven't looked at the code, yeah. I wouldn't be surprised if there's something kind of funny like that happening, because what I was trying to do is set one set of bounds for roughly the top six kilometers, and then a different one below that, so that I could allow for some low-density melt in that lower crustal area; and that's when it kind of goes off the rails.
L
It just has a tendency to take most of the stuff in the near surface, the top six kilometers, and push it all the way up to the maximum density bound, and then it tries to put a lot of really low-density material in the deeper cells. So it shoves those two to their opposite bounds, and then it's not fitting the data at all.
L
Okay, yeah. No, I've gone through that a few times just to try to sanity-check things, and I haven't...
C
Make sure your initial guess is strictly inside your bounds; I think that's usually the typical issue. Say you start from zero and your bound is zero: if that's the case, I think the inversion could have a hard time, so just use a slightly larger value, like 1e-4 or something like that. That's probably the case. Other than that, I didn't have much of a problem with the projected Gauss-Newton code; it kind of works well.
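The tip above (start strictly inside the bound interval, nudged by something like 1e-4) can be sketched as a small helper. The function name and the default offset are illustrative only.

```python
# Hedged sketch of the starting-model tip discussed: nudge the initial
# guess strictly inside the bound interval so the inversion does not start
# on (or outside) a bound. Name and eps are illustrative.

def nudge_inside_bounds(m0, lower, upper, eps=1e-4):
    """Return m0 clipped to [lower + eps, upper - eps]."""
    return min(max(m0, lower + eps), upper - eps)

# A starting model of 0 with a lower bound of 0 is moved just inside.
print(nudge_inside_bounds(0.0, lower=0.0, upper=1.0))  # 0.0001
```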
C
If that doesn't fix it, then I would remove the bounds, because sometimes what I see is: if you put on the bounds, that kind of reduces your domain. Your search step may want to go far away, because sometimes you need to go far away and then come back, and putting on bounds somewhat reduces that ability to wander around. So that's a first thing to try, because it seems like you're not even imposing positivity and such.
C
Yeah, I was curious how other people... in the UBC code, do we have a face constraint?
C
Is that either Dev or... do we have a face constraint, and if we have, what does that actually mean in the UBC code, when you apply face weights, like a face constraint? Do we have something like that? Well, we...
C
I see. So, let's say, suppose you have a seismic section and I want to put a jump there. Okay, I don't know the values, but I just want to put a jump. So then you put zero? Yeah, you put zero.
C
Then you have your hard edge, exactly. I see, so it's basically that the weights vector has one value per face, and then you can put in the values. Exactly, nice. And then it seems like we don't have that implementation in the SimPEG code, is that right? That's...
I
Correct. Right now the cell weights are just averaged to the faces; we're not making a distinction between cell weights and face weights.
I
In the regularization, those cell weights are passed to each of the sub-functions, so you would need to basically loop over all your sub-regularizations, you know, for the derivative terms, and just overwrite the cell weights there, basically.
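The interface-weighting idea under discussion can be sketched on a 1D cell grid: interior faces sit between adjacent cells, weights default to one, and faces on the known interface get weight zero, so the gradient penalty is switched off there and a jump is allowed. This is an illustration only, not SimPEG's regularization API; names are hypothetical.

```python
# Hedged sketch: build per-face weights on a 1D cell grid, zeroing the
# weight on any interior face that coincides with a known interface so the
# regularization allows a jump across it. Illustrative names only.

def face_weights_1d(cell_edges, interface_depth, tol=1e-8):
    """One weight per interior face; zero where the face hits the interface."""
    interior_faces = cell_edges[1:-1]  # faces between adjacent cells
    return [
        0.0 if abs(z - interface_depth) < tol else 1.0
        for z in interior_faces
    ]

# Five cells between depths 0 and 50 m, with a known interface at 30 m:
# the face at 30 m gets weight zero, all others get one.
edges = [0.0, 10.0, 20.0, 30.0, 40.0, 50.0]
print(face_weights_1d(edges, interface_depth=30.0))  # [1.0, 1.0, 0.0, 1.0]
```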
C
Yeah, I kind of did implement it, but I was just curious how other people are using it. I had a similar situation: I knew the interface, so I wanted to force the inversion to put a jump there. So I was just curious what the typical way is to implement that.
I
We can discuss how easy it is, right? Because if we start having to put weights on the actual faces, then you need to know the indices of those faces, which can be a little bit hard. I don't know; maybe it's just better to give it the model and then compute the weighting from the model. I'm not sure.
C
I had a specific kind of interest, so I wasn't even thinking of that: my face weights were basically ones and zeros. The specific interest that I had: okay, I know the interface, I've got a surface. In such a case, finding whichever faces are closest to that interface seems not too hard. So you basically just put zeros and ones, and then what the inversion does is allow a jump on that interface. That was my major use.
I
That's what I did previously, something like that, I think. So you'd have to go from a triangulated surface to the mesh faces, basically.