From YouTube: SimPEG meeting October 28, 2020
A
Perfect, so now we're all live. What we usually do is a quick round where everyone gets a chance to say what they've been up to and what they're interested in, and to raise any questions or longer discussion items. Maybe today we could start off with the new faces in the crowd: if you want to, just share what you're interested in connecting on, what you'll be working on, how SimPEG is used in your work, and how you might want to contribute down the line. Jiajia, would you like to kick us off?
B
Oh yeah, sure. Hi everyone, I'm Jiajia, currently an assistant professor at the University of Houston. I did my PhD at the Colorado School of Mines with Yaoguo. Currently I'm working with my graduate students mostly on joint inversion, so that is my biggest interest and also the focus of my research here at UH. Both Xiaolong and Xin Yin are my PhD students. They are working on joint inversion, but we want to do more than just joint inversion: we want to extend geophysical inversion to the construction of 3D quasi-geology models.
B
So basically we want to go from geophysics all the way to interpretation and classification, all the way to a final product that shows the 3D spatial distribution of geological units. That's a big focus of our work, and we have been using SimPEG. We have benefited so much from it — Xin Yin, Xiaolong, Jay, who already graduated, and also Kenneth, another of my master's students, are all using SimPEG.
B
So thank you so much for such a wonderful framework upon which we can build our work. It has greatly accelerated our work and also made it much more reproducible.
B
We are here because we have developed some joint inversion codes, and we have tested them on quite a few data sets, so we wanted to see how we can contribute our piece of code to SimPEG. Also, of course, at the same time we want to learn more about SimPEG.
A
Excellent, thanks Jiajia. And correct me if I mispronounce your names — Xiaolong or Xin Yin, would either of you like to jump in and add a bit about what you're working on and maybe how you're using SimPEG?
C
Yeah, sure, I can go first. My name is Xin Yin, and I'm working with Dr. Jiajia Sun. I'm also using SimPEG. One of our colleagues — sorry, Kim — already developed part of the coupling method, the cross-gradient, in SimPEG, so I'm using his coupling term to do a lot of tests right now: synthetic modeling for the joint inversions.

I'm also using the mixed Lp norms — the L1-L2 norm a lot — for the joint inversions. At the current stage I'm testing synthetic models, different model pairs and scenarios, to test the recoverability of the physical property relationships based on the scatter plot of the inverted values. At the next stage we hope to apply this method, along with SimPEG, to field data — specifically the QUEST project in Canada, using the airborne magnetic data and also the gravity data set to do the joint mixed L1-L2 inversions.
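For context on the cross-gradient coupling mentioned here, a minimal NumPy sketch (illustrative only, not the SimPEG implementation — the function name and grid are made up): the coupling term is the cross product of the two model gradients, which vanishes wherever the two models are structurally similar.

```python
import numpy as np

def cross_gradient(m1, m2, dx=1.0, dy=1.0):
    """Cross-gradient coupling t = grad(m1) x grad(m2) on a 2D grid.

    t is zero wherever the two model gradients are parallel (or either
    vanishes), which is the structural-similarity measure used to couple
    two physical property models in joint inversion.
    """
    g1y, g1x = np.gradient(m1, dy, dx)
    g2y, g2x = np.gradient(m2, dy, dx)
    # z-component of the cross product of the two in-plane gradients
    return g1x * g2y - g1y * g2x

# Two structurally identical models: the cross-gradient is ~zero everywhere.
y, x = np.mgrid[0:20, 0:20]
m1 = np.exp(-((x - 10) ** 2 + (y - 10) ** 2) / 20.0)
m2 = 3.0 * m1 + 1.0  # same structure, different physical-property scale
t = cross_gradient(m1, m2)
```

In a joint inversion this term (squared and summed over cells) is added to the objective function, penalizing structural disagreement between the two models.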
D
Hi, my name is Xiaolong Wei, from the University of Houston. I'm a PhD student working with Dr. Jiajia Sun. The scope of my research is joint inversion and uncertainty analysis. Currently I do joint inversion and analysis strategies in the deterministic framework, and in the future — maybe starting next year — I'd like to implement joint inversion and uncertainty analysis in the Bayesian framework. I've benefited a lot from SimPEG.
E
I guess I'm just starting to get back into SimPEG stuff after a while's hiatus. I just started a new project with the U.S. Geological Survey and am hopefully going to be using it.
E
Yeah, okay — I don't know what I changed, but I guess I just started a new position with the U.S. Geological Survey, and we're going to be using MT, gravity, and some passive seismic data to try to study a volcanic field at the north end of the Napa Valley here in California.
A
Perfect, okay. Before we move on to everyone else: do any of the folks who are rejoining, or joining for the first time, have any specific questions or things you want to talk about today? Or just tune in, and we can bring questions next time.
A
No? Okay, that's fine. Then let's go around the room, and folks can provide updates — maybe with a bit of context on what you're working on too, for all the new faces. I see Joe's got some notes in there; do you want to kick us off?
G
Sure. As I was playing around this last week, I noticed that NumPy is switching their documentation — the way it looks when you go to the documentation page — they're updating it to this PyData Sphinx theme; the written style is still the same. I posted about it on Slack really quick, but the theme looks really nice and clean, and I'll share a little bit of it here really quick.
G
It's a theme that comes from the PyData community, which is a big open-source community focusing on using Python for data science. So I think it's probably a good thing that they're trying to push people toward a common interface. It could be a good thing, and I think we should use it.
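For reference, switching an existing Sphinx project to that theme is typically a one-line change in `conf.py`, assuming the `pydata-sphinx-theme` package is pip-installed; the options dict below is an optional extra, and the GitHub URL is just an example value.

```python
# conf.py -- minimal sketch of adopting the PyData Sphinx theme
# (requires `pip install pydata-sphinx-theme`).
html_theme = "pydata_sphinx_theme"

# Optional: the theme reads navigation and icon-link settings from
# html_theme_options; the URL here is only an example.
html_theme_options = {
    "github_url": "https://github.com/simpeg/simpeg",
}
```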
A
So
it's
basically
like,
in
a
sense,
a
collection
of
like
projects
and
developers
who
work
broadly
in
the
under
the
umbrella
of
data
science,
and
so
projects
like
numpy,
scipy,
scikit,
learn
matplotlib
all
sort
of
fall
under
the
pi
data
umbrella.
I
don't,
I
think
I
don't
know
if
there's
like
an
official
project
listing,
but
there's
also
been
sort
of
like
community
meetups
and
things
like
that.
A
Excellent. How much of a lift do you think it will be to move over to that?
H
One thing to look at further is that those themes sometimes don't support some features. For instance, I got stuck trying to put the embedded Plotly stuff in our pages, and it would only work on Alabaster, those kinds of things. But I don't think we were doing anything too fancy with our docs, are we? Apart from the tiles, I think it's pretty stripped down.
G
I think we can test it out, but it's just a much cleaner interface — it looks much more 2020 instead of 2010 or 2005, something like that. Let's see, what else was I going to say... I put up a PR on geoana that adds the kernel functions that we've used in simpeg EM1D, as well as a really simple layered EM solution, so it's actually using the kernels inside geoana.
G
I was also asked about the empymod filters and stuff, because I can't figure out how to get the empymod solutions to match the theoretical stuff from Ward and Hohmann. I'm not quite sure why. I've got the Ward and Hohmann stuff in there — I copied most of it over from what was there before and double-checked some things on the components. I think there's a little bit of a sign off on a couple of them, but all the components now check out x, y, z for everything.
G
Obviously the z component is not too much of an issue, because it's polarized in whatever direction the dipole is polarized, but I feel like maybe the z component from the x polarization might be backwards or something. That's what I was hoping to double-check — whether it was something external to Ward and Hohmann, because their coordinate system is z positive down.
I
Then would it be better to test with the PDE code rather than empymod, because we don't know that empymod and SimPEG agree?
G
Probably — I'll test it with the SimPEG code just to double-check, and if we get the sign conventions right with that compared to the layered solution, then I'll feel much more comfortable. What I'm concerned about is whether it's got the right sign; I'm not necessarily concerned about its accuracy at this point. I just want to double-check that it's got the right sign.
L
Okay, I wouldn't be 100 percent confident in things that are associated with signs. I seem to recollect that there were one or two places within Ward and Hohmann where maybe the signs were not right. So I think Seogi's suggestion is perfect — let's just do that.
I
That's the cleanest. There are quite a few places in the 1D that can also go wrong, because especially in the time domain the transform depends on what your waveform is — the sign could change. So yeah, I think that's right.
G
I'm sure it's right now — I'm sure that they're matching each other: the half-space solution from Ward and Hohmann matches the layered solution in all the cases that I have right now. I just want to double-check that the signs are right.
G
I've
gotten
so
anyway,
the
the
fast
the
faster
kernel
functions
are
on
there,
as
well
as
the
faster
derivatives.
The
way
I
implemented
it
was
the
there
are
numpy,
there's
a
numpy
version
of
the
same
function
and
it's
tested
against
that.
The
numpy
and
the
scython
sequels
plus
thing
produce
the
exact
same
result.
It's
just
the
way
it
works
is
that
okay,
you
would
have
to
specifically
tell
it
to
build
those
external
libraries
before
you
ever
called
the
setup
of
the
python
setup
file.
G
That
way
we
could
still
distribute
the
source
code
on
pip
and
people
can
just
pip
install
it
without
worrying
anything
about
it
would
still
be
a
pure
python
code
and
actually,
at
this
point,
those
two
I
rewrote
it
in
numpy
and
essentially
it's
the
same
speed
as
the
psython
staples
plus
one
that
I
wrote,
but
it
obviously
it
uses
more
memory
to
do
that.
G
Here,
but
that
you
mentioned
the
sensitivity,
calculation
was
actually
faster.
Oh
yeah,
the
sensitivity
calculation
is
much
faster.
It's
able
to
calculate
the
sensitivity
with
respect
to
the
you
know,
susceptibility
the
connectivity
and
the
thicknesses
all
at
the
same
time,
for
the
cost
of
about
two
forward
models.
I
That's
a
that's
a
that's
interesting
because
it
means
like
spectralization
in
numpy
sort
of
enough
to
replace
lower
level
code.
Is
that
the
kind
of
conclusion.
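As a toy illustration of that point (not geoana code — the kernel shape here is made up), the same exponential kernel evaluated with a Python double loop and with NumPy broadcasting gives identical results, with the broadcast version trading extra memory for speed:

```python
import numpy as np

def kernel_loop(offsets, lambdas):
    # Straightforward double loop -- the shape a C/Cython kernel often takes.
    out = np.empty((len(offsets), len(lambdas)))
    for i, r in enumerate(offsets):
        for j, lam in enumerate(lambdas):
            out[i, j] = np.exp(-lam * r) / r
    return out

def kernel_vectorized(offsets, lambdas):
    # Same computation via broadcasting: one temporary (n_off, n_lam)
    # array, no Python-level loops.
    r = offsets[:, None]
    lam = lambdas[None, :]
    return np.exp(-lam * r) / r

offsets = np.linspace(1.0, 100.0, 50)
lambdas = np.logspace(-3, 1, 120)
```

The memory cost is the full broadcast temporary, which matches the observation above that the NumPy rewrite runs at about the same speed but uses more memory.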
G
So the way the testing works is that for a PR it essentially tests everything but doesn't deploy anything, like normal. Then, when it gets merged into master and tagged, it tests everything, and if it all passes and it's a tagged release, it does the extra steps of building a source code distribution, uploading the documentation, and building Windows wheels for each of Python 3.6, 3.7, and 3.8.
G
Yeah
I
figured
out
I
found
out
where
it's
going
wrong
on
the
simpeg
em
tests,
but
I
I
still
don't
know
why
I'm
still
trying
to
double
check
on
how
to
fix
it.
It's
a
really
weird
memory
problem.
That's
happening
inside
a
function
in
geoana
that
should
never
like
it.
There's
no
reason
for
it
to
happen
all
of
the
matrix,
all
the
I
don't
know.
What's
going
on
with
it,
it's
happening.
A
Nicely done, very nicely done. For your PR on geoana, what do you need? Do you want folks to test some things, review? What would you like from the group?
G
You basically just set an environment variable — it's detailed in there — and that'll tell it to include the Cython source files. I'm pretty sure I'll be able to have the conda-forge installation do it automatically.
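A minimal sketch of that opt-in build pattern (the variable name `BUILD_EXT` and the file path are hypothetical, not geoana's actual names): the extension list stays empty unless the user exported the variable before installing, so a plain pip install remains pure Python and falls back to the equivalent NumPy implementations at runtime.

```python
# setup.py fragment: compiled extensions are opt-in, pure Python by default.
import os

def get_extensions():
    """Return the Cython/C++ extension list only when the user opted in
    by setting BUILD_EXT=1 (hypothetical variable name) before install.
    """
    if os.environ.get("BUILD_EXT", "0") != "1":
        return []  # pure-Python install: nothing to compile
    # Cython is only required when the user explicitly opted in.
    from Cython.Build import cythonize
    return cythonize(["geoana/kernels/*.pyx"])  # illustrative path
```

A conda-forge recipe can then set the variable in its build script so conda users always get the compiled kernels.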
J
Yeah, so I think maybe the first and most important thing is that there's a PR hanging out there for improved stability of the mag problem. There is something with the Travis tests that seems to be failing on the frequency-domain EM — I'm noticing it in all the PRs — but basically this one for mag has been vetted and approved, and it's going to fail on something completely unrelated to what's being added. So I'm hoping that this issue can get addressed so we can bring it in.
H
The gradiometry that Joe asked about — did you check if the same problem happened on the gravity gradiometry, or magnetic gradiometry?
J
Okay,
no
I'll
I'll
do
that
test.
So
I
guess
there
is
one
more
thing
that
I
could
do,
but
yeah
I'd.
G
The other stuff — the error happens in a call to some geoana function, so the error is happening inside geoana, but there's still no reason it should.
J
All right, well.
G
I can show you — I'll write out the exact line that it's failing on, and if you go to...
I
Yes — I wasn't sure whether using a tensor-mesh FDEM is a relevant use case for...
J
Yeah, but we should definitely implement the octree capability.
I
Yeah, that requires a few other things: setting up that boundary condition is not implemented on the tree mesh, but I think Joe got most of the pieces. All we need is the ability to grab the boundary faces and the boundary cells and such — I think we've already got that.
J
Yeah, so if that's not too challenging, and I have the pieces to do it, then I'll try to implement that and validate it with the tests that we have. Then one of the final things would be to come up with a couple of good tutorial examples, for maybe the forward problem and the inverse problem.
J
Progress is being made on the SP stuff, and the last thing I've been working on is the DCIP. I don't have it all up now, but last week I was running validations of our code against other code packages. UBC has a DCIP octree code in which the electric potential is discretized at cell centers, and it uses kind of a primary-secondary approach: you use an analytic solution to give yourself the electric potential, then you use a discretized operator to get your right-hand side, and then you put in your true, your actual conductivity model and solve that system to get the solution that you want.
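The analytic primary solution referred to here — the potential of a point current electrode on the surface of a homogeneous half-space — is V = I / (2πσr); a small sketch (illustrative, not the UBC or SimPEG code):

```python
import numpy as np

def halfspace_potential(I, sigma, src, obs):
    """Analytic surface potential of a point current electrode on a
    homogeneous half-space: V = I / (2 * pi * sigma * r).

    This is the primary solution a primary/secondary DC formulation
    starts from before solving for the secondary potential numerically.
    """
    r = np.linalg.norm(np.asarray(obs, float) - np.asarray(src, float), axis=-1)
    return I / (2.0 * np.pi * sigma * r)

# 1 A electrode at the origin, 0.01 S/m half-space, receiver 100 m away.
V = halfspace_potential(1.0, 0.01, [0.0, 0.0, 0.0], [[100.0, 0.0, 0.0]])
```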
J
So
the
way
that
they've
implemented,
it
seems
to
do
very
well,
and
I've
noticed
that
there's
maybe
some
questions
we
have
with
how
we
are
discretizing
our
source
term
and
potentially
interpolation,
and
whether
or
not
something's
being
different.
So
I'm
I'm
kind
of
looking
at
the
way
that
simpeg
is
solving
this
problem,
either
with
the
electric
potential
on
nodes
and
at
cell
centers,
looking
at
how
it's
creating
the
right
hand,
side
and
just
kind
of
playing
around
with
that
to
figure
out
the
what's.
J
What's
best
so
as
of
right
now,
if
you,
if
you're
putting
the
sources
at
arbitrary
locations
and
the
receivers
at
arbitrary
locations-
and
you
simulate
the
half
space
response,
you're
you're
going
to
get
some
errors
that
are
they're,
not
massive,
but
depending
on
your
your
cell
size.
I
think
we
can
do
better
and
I
think
we
should
do
better.
J
So
I've
just
been
playing
around
with
that.
I
J
10
error
and
I'm
just
trying
to
figure
out,
what's
what's
the
biggest
source
of
it.
So
one
thing
I
played
around
with
at
least
for
simpeg:
the
nodal
formulation
does
better
and
I've
tried
playing
around
with
some
things
to
take
the
the
source
term
and
discretize
it
at
the
nodes,
and
so
I
think
right
now
it
does
a
very
basic
averaging
to
adjacent
nodes,
so
you're
somewhere
in
the
cell,
and
then
it
just
averages
very
rudimentary
to
the
nodes
and
I've
started
playing
around
with
kind
of
a
awaiting
it.
J
So
it's
the
you
know
the
one
over
the
distance
or
one
over
the
distance
squared
and
got
a
little
bit
of
improvement.
Now,
I'm
wondering
if
the
interpolation
matrix
from
the
nodes
to
the
receivers
is
being
done
differently
in
that
fortran
code.
J
So I still have to do some tests where I put electrodes at specific locations and really try to investigate what's happening at each step. But as of right now, I had a mesh that I thought was pretty reasonable, and the UBC code will give me errors that are maybe one or two percent compared to an analytic solution, and SimPEG right now is more than that.
J
I
J
So
these
are
the
tests
that
I'm
just
about
to
go
and
go
and
do
right
now,
I'm
just
doing
it
in
comparison
to
a
the
manual
example
that
we
used
for
dcip
auction,
so
the
the
ubc
code,
so
that
sort
of
motivated
this.
But
now
we
actually
get
to
okay.
Let's,
let's
do
this
for
real,
let's
really
figure
out
where
the
differences
are
and
really
make
a
a
good
test
and
start
to
to
dive
into
it.
But
I
think
it
would
be
pretty
useful
and
one
thing.
I
Just to be fair, it would be nice to test against the layered solution, Dev, because what the UBC code does is actually use that half-space solution and force the simulation to match it — so at least for the half space you're going to get absolutely good results.
J
So that's about it for me. Joe and I, maybe at some point in the future, will continue on the documentation for discretize, but those are the three things I've been working on.
E
Devin, one more thing about the DCIP octree stuff: just keep in mind that there are some assumptions being made when they create the right-hand side that didn't work at all when I was trying to do mine underground simulations.
J
Yeah, right now this is for a surface survey, and as I recall, once you voiced those concerns, we fixed them, recompiled, and released a new version of that code.
E
It could be. I just remember that when I talked with Roman at the time I was having the problems, he didn't really know how to fix it, and it didn't seem like there was a lot of impetus to fix it either, so I didn't know whether he was planning on doing that or not. You don't need to do anything — just keep in mind that there may be a limitation there; you may run into weird things.
H
Yeah, sure.
A
Maybe one we can regroup on next time if we need to. Okay, cool, thank you. All right, John, I see you've got some notes in there.
M
Yeah — I got the derivatives passing for the resistivity and the phase, but now it's the adjoint, and I'm at a loss. I've tried everything: I've pulled it apart, reformulated it, and it's just not passing. What comes out of it is on the same order of magnitude, but I just don't know — I've tried.
M
I
need
some
material
or
something
to
be
looking
at
the
point
of
how
how
this
was
done.
I
guess
in
the
first
place
with
just
like
the
impedance
like
I,
I
see
how
it
was
formed,
but
it's
like.
Where
did
this
come
from,
and
how
can
I
use
this
idea
because
just
having
it
broke,
we're
having
to
break
the
derivative
into
the
real
and
the
imaginary?
M
For the phase, I'm testing with an MT problem that I made up, and then I'm also —
M
— using the one in the test suite as well, and both of them are giving me the same thing. It looks like it's almost there: it runs, all the matrices are the right sizes, but the values just aren't right.
A
I can jump in and take a look, and we can think through this together — adjoints can be such a pain.
M
Yeah,
I
think
it's
just
I'm
missing
like
the
whole
idea
of
adjoint,
like
I
I'm
just
not
too
familiar
with
how
that
is
formulated,
so
yeah,
maybe
something
like
that
will
help
but
yeah
and
then,
after
that
I
think,
yeah.
We
need
the
two
and
a
half
d
solution.
I'd
really
like
to
work
on
that,
and
someone
could
point
me
at
like
like:
would
it
just
be
warden
holmen?
I
There's no two-and-a-half-D in MT — it's 2D, so you don't have to do two-and-a-half-D. I actually played with it a little bit; I can share my notebook later.
M
I think the 3D problem isn't going to be too accessible until we get this tiled version going and we can invert over multiple frequencies, because it just takes a long time to invert, and compared to some of these other codes — they've figured out how to do this fast. So maybe we go from 2D, and then for 3D we'll go right into the tiling over frequencies and see how that works.
I
Yeah, that's great. I think I have a little bit of a write-up and example code that I wrote — I haven't tested it yet, but I can point to that. It seems like you've got pretty big motivation to develop it.
M
So
that's
great
yeah!
It's
we
just
got
into
a
like
we're,
starting
to
get
a
couple
jobs
and
what
was
working
people,
of
course,
when
you
test
everything
before
it's
working
and
then
once
you
put
into
production,
it's
all
falling
apart,
so
yeah!
So
definitely
I
I
can
put
some
time
into
this
and
I
need
to
get
back
to
the
dc
tiling
problem
so
we'll
just
after
I've
done
this
mt,
adjoint
stuff
I'll,
just
go
in
and
do
dcn
frequency
all
right
dc
and
mt
into
the
tiling.
A
Thanks, Sean. Yeah, we can definitely iterate on the adjoint, and having a 2D solution I think will be super useful. I don't know if you watched — on MTNet there's now a series of webinars. I missed the first one, but they're recorded, which is great if you want to catch up afterwards; they're now up on YouTube.
A
Oh
awesome,
yeah,
but
so
martin
nunsworth
gave
the
first
talk,
and
it
did
sort
of
underscore
to
me
that
you
know
having
a
2d
code
would
be
really
valuable
because
he
showed
you
know,
results
where
they
did
both
2d
and
3d
inversions,
and
it's
just
nice
to
be
able
to
compare-
and
you
know,
see
that
you're
getting
similar
structures
and
builds
a
little
more
faith
in
your
results,
and
things
like
that.
So.
A
Excellent, thanks John. Dom?
H
Elbow deep in Dask again, learning. I thought I knew what I was doing, and then it turns out that I don't, but I think I'm getting closer — just learning how to deal with futures, passing futures and scattering data and all that stuff. But I think I'm getting closer, and that's about it.
A
If you do have notebooks as you learn stuff, or even code snippets you want to share around, I'm sure that would be useful — but also no pressure if you're still in the figuring-things-out phase. Cool — Seogi?
I
I have nothing really particular these days for developments, but I just want to appreciate what Joe did. I'm mostly using EM1D these days, so Joe's efficiency improvements will make a huge difference on my end, because I'm running lots of airborne inversions. Joe also implemented the sensitivity for thickness as well as the susceptibility, so that gives us the ability to do more inversions: we can put the susceptibility in when we're inverting, or we can invert for both the resistivity and the thickness values.
I
So
you
could
do
like,
let's
say
five
layers
and
then
try
to
find
thickness
and
resistivity
values.
So
that
would
be
really
nice
and
one
thing
and
kind
of
connected
to
that.
We
just
submitted
the
proposal
here
back
in
central
valley,
basically
flying
over
entire
california
and
we're
not
sure
where
whether
we're
gonna
win
or
not.
But
if
we
win
we're
going
to
use
simpek
extensively
to
the
entire
airborne
data
set
and
a
few
things,
we
suggest
a
few
things
about
handling
the
magnetic
data
as
well
as
the
3d
effect.
H
Not MVI — sorry, they're computing the secondary fields; they can do the self-demag, but not full vector. What do you have in mind for a layer, though? Because the code is not actually dealing with layers, right — they're just prisms that are shifting up or down. Or do you want to parameterize it?
I
I think I would do a similar sort of idea. With a tensor mesh it's actually simple: you've got a tensor mesh, you've got a 2D plane, and each cell has a number of layers. So you can parameterize that as a layering, and we can put some sort of smoothness constraint on each point of the layer top and bottom. It's a similar idea to the airborne code, but using the 3D mesh to parameterize it.
I
I was going to say — I actually got the mapping function, but I think at that point I used the DC code to do it. It's basically done, but this is a better kind of application.
I
So
what
we're
interested
in
this
thin
sort
of
lava
flow
in
california?
And
that's
like
it's
called
lip
joy
basalt
and
it
is
actually
randomly
magnetized.
So
you
can
see
this
random
signal
on
just
looking
at
the
data,
but
it'll
be
nice
to
map
of
that
thin
layer
and
as
well
as
the
random
direction,
and
I
think
it
could
make
a
pretty
big
impact
on
their
hydrogeologic
interpretation.
So
yeah,
that's
a
detail,
but
that's
sort
of
where
I'm.
I
I think in general, where there is volcanic material there is flow — it's not a permeable pathway. The hard part is that using resistivity alone you have no chance to distinguish that; but if you have another data set that can distinguish this sort of lava flow, or whatever volcanic material —
I
— that kind of gives you an idea of where this impermeable boundary is that can block your hydraulic pathway.
H
The bottleneck with those methods is always just generating your 3D model, right — going from a surface. If it's a layer, it's not so bad, but if things are complicated — if they're non-convex layers — that's always the tough part to start. But maybe, Jennifer, I can help you.
L
If you've got a single interface, and you represent it as some kind of variable sinusoidal spline function, that should be quite doable.
I
One thing that's nice about that layered inversion is that for each pixel you've got a different magnetic permeability value — I'm not inverting for a single value for the entire layer; you've got lots of values, so within the layer it can change, it can vary. So even if you've got a complicated structure — the assumption is that you've got a general layering where the lava flow happened — I want to see that kind of variation within that layer.
K
So I'm able to rerun most of my examples, but I'm obviously running into a Travis issue at the moment, and I also had a lot of things on my branch, so I'm trying to split out all the things I've changed that are not related to PGI in particular. I've created a bunch of small pull requests that will maybe be easier to merge, and might also help in fixing the Travis problems later on the main PGI branch.
K
So hopefully we can move forward with that quickly — Joe already helped me a bit with some stuff. I'm also trying to reduce the PR to only what I know works well, and to remove all the experimental stuff that I tried at some point that just clutters the code, but it will still be a fairly big PR. So I will need some help and reviewers of the code for that, and some help with the Travis.
L
We have a whole new SimPEG community in Houston that's interested in geologic inversion, so there might be some input from the new members.
B
We cross-plot the inverted values — at this point mostly in 2D, because in 3D the cluster classification is a little bit more tricky, more complicated. Jay's work, Xiaolong's work, and Xin Yin's work are mostly concentrated on 2D, just two properties: mag and grav. What we do is write a very small piece of Python code that lets you interactively draw a polygon, and then all the points within that polygon can be mapped into 3D.
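The polygon-selection trick described here can be done in a few lines with Matplotlib's `Path` (a sketch, not the group's actual script): cross-plot points falling inside a hand-drawn polygon yield cell indices that map straight back to the 3D model.

```python
import numpy as np
from matplotlib.path import Path

def cells_in_polygon(density, susceptibility, polygon):
    """Indices of model cells whose (density, susceptibility) pair falls
    inside a polygon drawn on the cross-plot. Because both arrays are
    ordered by cell, the indices map directly back to 3D locations for
    building the quasi-geology model.
    """
    pts = np.column_stack([density, susceptibility])
    return np.flatnonzero(Path(polygon).contains_points(pts))

# Toy cross-plot: the unit-square polygon selects the first two cells only.
rho = np.array([0.2, 0.8, 1.5, -0.3])
chi = np.array([0.5, 0.1, 0.4, 0.9])
poly = [(0, 0), (1, 0), (1, 1), (0, 1)]
idx = cells_in_polygon(rho, chi, poly)
```

Interactivity can be layered on top with `matplotlib.widgets.PolygonSelector`, which hands its vertices to a function like this one.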
B
So that's how I do it, but I'm always wondering if there's a better way, because right now it's mostly manually selecting the polygon, and it takes many iterations of trial and error, because at the beginning you probably don't know where to cut. Partly because we use the smoothness inversion, there's a lot of scatter, so you see a lot of overlap between the different units, especially around the background region.
B
Where
you
have
the
density
around
zero
celebrity
around
zero.
You
have
a
lot
of
black
dots
there
right,
so
yeah
next
step
would
be
to
see
if
some
kind
of
unsupervised,
machine
learning,
maybe
gaussian
mixture
models,
self-organizing
map
or
maybe
clustering-
can
help
automate.
That
and
with
the
help
of
the
awesome
inversion
algorithm
divided
by
developed
by
dom
mix
rp
right
now,
we
are
focusing
on
l
one
two,
because
we
found
that
our
one
two
does
cs
work.
B
Most
current
work
is
that
we
found
that
in
all
the
seven
or
six
scenarios
we
tested
our
one
two
nominee
version
joining
version
always
produce
the
best
physical
properties.
With
much
less
amount
of
scattering,
so
you
can,
you
can
recover
the
linear
features,
pretty
clean
and
much
much
cleaner
and
so
well
separated.
B
C
Oh, this is actually a figure that I'm going to use for the Sheriff lecture — I haven't finished it yet — but the figure Dr. Sun was just talking about, I think, is this one over here. We simulated six different model pairs with different depth ranges and also different values. The true model properties' scatter plots are in the last column over here, and we did separate inversions in both the L2 norm and the mixed L1-L2 norm, and we did the joint inversions as well.
C
So
we
put
all
the
inverted
results,
scatter
plot
together
and
in
comparison
with
the
true
scatter
plot,
we
can
say
that
the
error
one
two
norm
joining
version
can
give
us
the
best
recovered
model
values
in
comparison
with
the
true
scatter
plot.
So
all
the
linear
features
are
very
much
differentiated
from
it
from
each
other.
C
Even
though
some
of
the
anomalous
object
is
very
deep
and
has
very
small
susceptibility
values,
but
with
the
help
of
the
our
one
two
norm
joint
inversion,
we
can
still
differentiate
the
two
linear
features
that
cannot
be
differentiable
in
the
l2
norm,
either
separate
inversion
or
20
version.
L
Oh,
I
was
just
going
to
say:
that's
that
looks
really
nice.
I
think
my
earlier
comment
was
that
tebow
was
sitting
at
a
place
where
he's
putting
up
the
the
pgi
inversions,
trying
to
put
some
pull
requests
and
and
get
people
to
review
it
and
pass
it,
and
you
know,
you've
got
a
whole
houston
group
here
that
is
kind
of
working
on.
You
know
the
same
kind
of
problem,
but
in
a
different
way.
So
you
actually
have
you
know
some.
L
You
know
mutual
goal
here
to
kind
of
work
together
and
to
you
know,
perhaps
develop
the
codes
within
simpeg
and,
ultimately,
maybe
even
you
know,
result
in
a
joint
collaboration
here
where
we
look
at
different
techniques,
for
you
doing
geological
inversion
or
quasi
geologic,
getting
out
quasi
geologic
models.
So
just
just
kind
of
kind
of
saying
that
there's
I
think
this
is
kind
of
one
of
those
those
ideal
opportunities.
L
You've
got
two
different
groups
of
people
working
on
the
same
problem:
everybody's
trying
to
use
simpag
it's
a
nice
format
on
which
to
yeah
kind
of
work
together
and
try
to
develop
something.
B
Yeah, sure, definitely — I agree. So Thibaut, maybe we can talk later on about the pull request, that PR, with more details. Sure, yeah.
I
No, no, not particularly. I was curious about that: what is the connection between the physical properties? Is that the cross-gradient, or fuzzy c-means, or what?
B
So right now we just purely focus on the structure-based cross-gradient, yeah.
I
Got it.
B
So that just goes back to the comment Doug made earlier, I think. Right now we are just working on structure-based joint inversion, and Thibaut is focusing on the petrophysical PGI, so I think our work is just perfectly complementary. So yeah, I would like to work with Thibaut, and our group would like to work with Thibaut too.
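The structure-based coupling referred to here is typically the cross-gradient measure of Gallardo and Meju, which vanishes wherever the two models' spatial gradients are parallel (i.e. their structures align). A minimal 2D sketch using simple finite-difference gradients, illustrative only and not any group's actual code:

```python
import numpy as np

def cross_gradient(m1, m2, dx=1.0, dy=1.0):
    """Cross-gradient coupling t = grad(m1) x grad(m2) for two 2D models.

    |t| is zero where the models' spatial structures are parallel,
    so minimizing it encourages structurally similar models.
    Generic sketch of the Gallardo-Meju measure.
    """
    g1y, g1x = np.gradient(m1, dy, dx)  # gradients along axis 0 (y) and axis 1 (x)
    g2y, g2x = np.gradient(m2, dy, dx)
    # z-component of the cross product of the two in-plane gradients
    return g1x * g2y - g1y * g2x
```

In a joint inversion, the sum of `cross_gradient(m1, m2)**2` over the mesh is added to the objective function as a coupling term, so structure is shared without assuming any petrophysical relationship between the two properties.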
K
Yeah, no, that's alright. There might be things in the PR that we can discuss, but it's not urgent or very important at the moment. In particular, I want to improve the beta estimate directive, and for people that are interested, I actually included a notebook in the PR to show the problem. Just to explain it very quickly: with the beta estimate right now, we're doing this Rayleigh quotient and we're doing one iteration.
K
But actually, when you do one iteration, you don't know the order of magnitude of the ratio. I ran the test, and I ran that one-iteration thing several times, and basically the range of beta values you can get from it covers five orders of magnitude, and it can still miss the real ratio by two orders of magnitude. So it's basically randomly generated at the moment.
K
So I basically propose that we have an option to choose the number of iterations of the Rayleigh quotient ratio we want to do, and actually just doing three gets you in the ballpark. So it's not that much of a computational cost to get a proper ratio estimation, and that also makes the directive compliant with joint inversion and with many different forms of objective functions.
K
So I think that's one that could be good to have looked at and merged, but we might have to look through it, because.
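The proposal above, estimating beta from Rayleigh quotients with a configurable number of power iterations instead of a single one, can be sketched generically as follows. This is an illustrative sketch, not the SimPEG directive itself; `J` and `W` stand in for the sensitivity and regularization operators, and the function name is an assumption:

```python
import numpy as np

def estimate_beta(J, W, n_iter=3, seed=0):
    """Estimate the trade-off parameter beta as the ratio of the largest
    eigenvalues of the data-misfit Hessian (J.T @ J) and the regularization
    Hessian (W.T @ W), using n_iter power iterations on a random vector.

    With n_iter=1 the Rayleigh quotients depend strongly on the random
    start vector; a few iterations converge toward the top eigenvalues.
    """
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(J.shape[1])
    x /= np.linalg.norm(x)
    y = x.copy()
    for _ in range(n_iter):
        x = J.T @ (J @ x)
        x /= np.linalg.norm(x)
        y = W.T @ (W @ y)
        y /= np.linalg.norm(y)
    # Rayleigh quotients after n_iter power iterations
    lam_d = x @ (J.T @ (J @ x))
    lam_r = y @ (W.T @ (W @ y))
    return lam_d / lam_r
```

Because each extra iteration only costs a couple of matrix-vector products, `n_iter=3` is cheap relative to the inversion itself, which matches the point made above.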
N
Hello, yes. I learned today that Canada switches back to winter time one week later than Europe, apparently, so here I am. Yeah, not much. Our joint paper is out. Well, it's not out, that's not true. It's submitted, so it's out to be read, not peer-reviewed, but it's there for anyone interested.
N
Other than that, I'm currently working on automating the meshing, like frequency-dependent, so that the joint inversion framework can throw any model at it and it will create the appropriate computational meshes, but give back the gradients or the model on the inversion mesh, so kind of making that a bit easier for people. In the joint inversion framework, most of them are used to either seismics or gravity and not really to EM, so it's kind of a hard task to do the meshing, yeah.
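One common way to automate frequency-dependent meshing for EM is to tie the smallest cell size to the skin depth. A minimal sketch, assuming the usual half-space approximation delta ≈ 503 · sqrt(rho / f) in SI units; the function name and the cells-per-skin-depth factor are illustrative choices, not part of any specific framework:

```python
import numpy as np

def min_cell_size(rho, freq, cells_per_skin_depth=4.0):
    """Frequency-dependent minimum cell size from the EM skin depth.

    rho  : background resistivity in ohm-m
    freq : frequency in Hz
    Uses the half-space approximation delta = 503 * sqrt(rho / freq) meters,
    then requires cells_per_skin_depth cells per skin depth.
    """
    skin_depth = 503.0 * np.sqrt(rho / freq)
    return skin_depth / cells_per_skin_depth
```

A meshing routine built on this can regenerate an appropriately refined computational mesh per frequency, then interpolate the model from (and the gradients back to) the fixed inversion mesh, as described above.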
A
Doug, do you have any updates you want to share?
L
No, not particularly. I just want to say, Jiajia, I'm just so pleased that you and your students are joining us, and I hope that we can see you on a regular basis. I think there are just so many things that we actually have in common as far as goals, both scientifically as well as numerically, and yeah, it would just be really fun to have you guys as regular participants.
B
Yeah, thank you, Doug. We are very glad to be here and to keep our connection with all of you guys, the SimPEG team, and we hope that we can contribute to SimPEG. It's a really great tool, and we want to promote the use of SimPEG in our community. As I mentioned, it has greatly accelerated my group's research and made it much more reproducible.
A
Excellent, well, thanks everyone. I don't have any really concrete updates other than the paper that I wanted to share, so feel free to take a look. It's in review right now, so if you see things that we should improve, we can take the informal review comments too. Yeah, other than that, thanks everyone. One thing that I think I want to think about a bit, and I'll try and get some thoughts together.
A
Maybe by next week. As I mentioned, it's sort of thinking through a bit of SimPEG governance. What kind of things do we want to have in place for that? What kind of questions do we need to be able to answer as a community? So if you have thoughts on that, we can maybe start to kick off that conversation a bit too. Yeah, other than that, thanks everyone, have a good afternoon, and welcome again, Jiajia and team, and Mike. It's great, thank you.