From YouTube: SimPEG meeting July 2, 2019
Description
Seogi Kang provides an overview of inversions that consist of a parametric and a voxel model
Lindsey Heagy gives a demo of the simulation-refactor
A
Cool, so I believe that is now recording. Thanks everyone for being game to try this. If this works well for us, then we can continue doing this instead of Google Hangouts, and hopefully it'll be a bit more stable.
B
I'm not sure who asked last meeting, but someone mentioned they were working on some inversion, like a DC inversion, with a very sharp boundary. I think I was working on a similar problem, so I wanted to share what's possible. Let me see if I can share my screen.
B
So how do we invert for both an interface and isolated structures? The usual way is to do a voxel inversion. When I run that inversion, this is what you would get: we're still fitting the data pretty well, and you can see the large-scale structure, but we're kind of losing where the interface is.
B
Okay, this is the data, that's what I'm getting, and now what I could do is just a parametric inversion. But what's going to happen is I'm going to find an interface and the two values, only in the wrong place. That's just not enough; your model is more complicated. So what we can do is combine the two. Let's say we want to invert for this interface and the two values, plus some structures. Here's how I'm building things up.
B
Rather than using a polynomial, I can think of it this way: each of the cells here has a set value. Let's say I've got an interface here, and let's say I've got 10 cells in the x direction. Each cell has a set value; I can invert for that value and then put a smoothness constraint on it, for instance, to make an interface.
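As a sketch of the idea Seogi describes here (illustrative numpy, not SimPEG code; all names and values are made up), each of the 10 columns carries one interface-depth value, and a first-difference smoothness penalty ties neighbouring columns together:

```python
import numpy as np

n_cells_x = 10  # one interface-depth value per column, as in the example

# noisy "observed" interface depths (toy stand-in for values recovered per column)
rng = np.random.default_rng(0)
true_depth = np.linspace(20.0, 25.0, n_cells_x)
observed = true_depth + rng.normal(0.0, 1.0, n_cells_x)

# first-difference operator: penalizing ||D m||^2 encourages a smooth interface
D = np.diff(np.eye(n_cells_x), axis=0)

# damped least squares: fit the observations while keeping neighbours close
beta = 5.0  # trade-off between data fit and smoothness
A = np.vstack([np.eye(n_cells_x), np.sqrt(beta) * D])
b = np.concatenate([observed, np.zeros(n_cells_x - 1)])
m_smooth, *_ = np.linalg.lstsq(A, b, rcond=None)

# the recovered interface varies less between neighbouring columns than the noisy one
```

The same smoothness term is what a regularization on the interface parameters would contribute inside a full inversion.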
B
That's what you get here, and it doesn't look great: there's variability, so it's a non-unique problem. This example is not great, but I actually had a better example. So that's one idea, but if I'm using the polynomial and fitting all the time channels, that's what I would get. You need some sort of fine-tuning.
B
Right, so there's a whole bunch of models that could fit the data; that's what we're up against. Yes, I like the plan. If the real goal was actually finding the water table, I think I would get rid of some of the late time channels, which are really contaminated by these sand channels. So yeah, you need a little bit of fine-tuning.
G
A question, Seogi: is there a possibility you could invert data that are sensitive to this big interface and do a parametric inversion? Or maybe you do a voxel inversion and just recover this large-scale geological feature correctly, then use that as a starting and reference model in a voxel inversion to then try and recover the rest?
B
Yes, that's exactly what Mike McMillan did, what he calls a hybrid inversion. One of my colleagues here actually tried that for seawater: she used some DC data at a lab scale, and she tried the same thing. I think there's a little bit of a trick there, because your data actually includes everything; it's not sensitive to just the parametric surface.
B
You might set the order of your polynomial pretty low, because you don't want to put a lot of structure on your interface. Then, once you've got that, you can use it as a reference model later for your voxel inversion. So that's another possibility.
C
That's kind of better, right, because you're doing both at the same time. You could do a joint inversion, but the only point of connection is that parametric model between the two, and then you can let them both go and recover their own physical properties, you know what I mean?
B
There's a little bit of detail there: we need to be a little careful, because we're thinking of our properties as log resistivity or log conductivity. You need to think about where you want to sum, because if you're summing in log space you're actually multiplying. That's probably not what you want, so you may want to convert to resistivity or conductivity space and sum there.
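The pitfall Seogi mentions can be seen in a couple of lines of numpy (a generic illustration with made-up values, not SimPEG code): adding models in log-conductivity space multiplies the conductivities, so the combination should be done in linear space before taking the log.

```python
import numpy as np

sigma_background = 0.01   # S/m, parametric part (hypothetical value)
sigma_anomaly = 0.10      # S/m, voxel part (hypothetical value)

# WRONG: summing log-conductivities multiplies the conductivities
log_sum = np.log(sigma_background) + np.log(sigma_anomaly)
# np.exp(log_sum) is 0.01 * 0.10 = 0.001, not the combined conductivity

# RIGHT: sum in linear conductivity space, then take the log if needed
sigma_total = sigma_background + sigma_anomaly   # 0.11 S/m
log_total = np.log(sigma_total)
```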
B
So there are a few details to make things work, and there's a bit of a disadvantage here: I think it only goes one way. I can always put in the conductor, but in this case, for the voxel part, I can't really put in the resistor. If you actually write it down you can figure out why, but that's a bit of a disadvantage of this methodology.
B
Put it on Slack? Yeah, I'll put it online, right.
B
Next week I might have a look; I was using Lp-norm inversion quite a bit in the context of hydrogeological problems, so I can update that part next week. The problem that I have here, we've got a lot of well data as well. In a consulting company, people don't really put the well data into the inversion itself; what actually happens is they really try to fit the data well, run a whole bunch of inversions, and find the model.
B
One that kind of matches well with the well data. So that's sort of the current framework, which is a step back, right? You're still doing an inversion to find the best match, so you're going to get an optimal model; that's sort of the workflow here. But then my point is: why don't you put that information into your inversion? Even with the conventional methodology you can put quite a bit of information into your inversion, like a reference model or bounds, but that's not really happening.
B
It just looks a little bit cumbersome in a commercial package, or it's actually impossible to put that into a commercial package. So that's sort of what I'm working on: maybe in that space we can use Lp norms, integrate everything we have as prior knowledge into the inversion, and then use whatever functionality we have in the inversion to estimate some sort of uncertainty. Here we're getting more samples to estimate some uncertainty in your model space.
A
To address the handful of lingering comments: there are some conversations that are not yet resolved, and I just left a comment saying to ping me when those are done. So if you don't mind taking a look at that, feel free to ping once you're happy and those little lingering comments are resolved. Yeah.
A
Okay then, let's maybe move on to the simulation refactor, if folks are good with that. Just a quick status update as to where we're at with progressing through SimPEG.
A
Basically, as of this morning, the whole electromagnetics stack of SimPEG is on the simulation. I haven't necessarily pushed it all; right now the spectral induced polarization is the last one that I did. But if you look at frequency domain, time domain, natural source EM, DC, and induced polarization, all of those should now be sitting on the simulation class.
A
I'll make sure that SIP is up after the meeting. What I wanted to do is this: I've got a notebook running through an EM inversion, and we can use it as a way to chat about some of the changes, then work through any clarifications and go from there. So I'm just going to share my screen.
A
One of the first things I tried to do, at least internally within SimPEG, is to now always import discretize instead of importing Mesh. Down the line I think this is easier to debug, because it's exactly the true thing that we're calling. That said, if you want to do "from SimPEG import Mesh", that will still work.
A
The other thing to note is that we're renaming things to be a little more in line with Python standards. All of the module names are lowercase, and if there are two words we use an underscore: data misfit is now data_misfit, inverse problem is inverse_problem. Then, if you look at the module names for the physics, instead of using shorthand in all caps we're now spelling them out.
A
Throughout all the testing right now, and eventually in the examples, there's a pattern I think is nice to follow: we spell everything out in full within the package, but when you're importing things into your examples or into your own code, I think using shorthand is totally fine. If you want to stick with spelling it out in your own code, that's also fine, but I think there's enough of a standard there, and shorthand keeps it a little more concise.
A
One thing that I will do with this is also allow "from SimPEG old_api import *", and this will import the old namespace, but sitting on the new framework. So with all of the changes that I discuss, if you were to import Problem here, spelled properly, it would still be importing the simulation class, but as Problem. It's still going to behave like the simulation class, so everything is sort of moving onto this.
A
I'll walk through this example and highlight where things are changing, and then we can go in and look at code as it is helpful. So first we're just going to create a mesh like usual. One of the reasons I like using discretize at the top level for creating meshes is that I like calling my mesh "mesh", and if you import Mesh from SimPEG as mesh, you're now overloading names and it's confusing. So this is a pattern I would try and encourage folks to follow. It's still a mesh; nothing here has changed.
A
That's all discretize, that's great. I'm going to make a 1D model because we're going to do a 1D inversion. This is all just numpy, basically; I'm just arranging a vector, so nothing special here. When we look at the maps now, maps is lowercase; that's the only thing that's changed there. So I can go ahead and plot my model, that's great: we've got a conductive layer in a resistive half-space, and that's what we're going to work with. So nothing there has really changed, other than uppercase Maps going to lowercase maps, okay.
A
This still exists, but previously we used to call SRC, and that's how you found your sources. That still exists in the namespace, but eventually we'll deprecate it, so try to move things over to sources; we don't want it to be too painful.
Okay, here, this is similar to what we had before. One thing that changed: before, we used to always use rx.locs. That's now going to throw a warning, and so instead we're going to try and use locations.
A
These are the same as before: component and orientation. One thing that's kind of nice with properties is that it lets you cast things. So, for example, if instead of spelling out "real" I used "re", that's fine; it'll accept that and cast it to "real" under the hood. That's kind of nice, because different people use different shorthand, or if instead of "imag" I actually spelled out "imaginary", it'll just cast it to the same thing.
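A toy sketch of the convenience being described (this mimics the behaviour with a plain Python property; the class name, alias table, and canonical names here are assumptions, not the SimPEG implementation): the component setter accepts shorthand like "re" and casts it to the canonical name.

```python
# Hypothetical receiver-like class: the setter normalizes shorthand
# component names ("re", "imaginary", ...) to a canonical value.
class Receiver:
    _aliases = {
        "re": "real", "real": "real",
        "im": "imag", "imag": "imag", "imaginary": "imag",
    }

    def __init__(self, component="real"):
        self.component = component

    @property
    def component(self):
        return self._component

    @component.setter
    def component(self, value):
        key = str(value).lower()
        if key not in self._aliases:
            raise ValueError(f"Unknown component {value!r}")
        self._component = self._aliases[key]  # cast shorthand to canonical

rx = Receiver(component="re")
# rx.component is now the canonical "real"
```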
A
If you want to see that, I can show you the code afterwards, but that's just a minor convenience thing. Same as before, we have an orientation, and then here we construct a source list: we've got FDEM sources, MagDipole, so that API is the same. Instead of using freq we're using frequency; same thing again, it'll still let you set it using freq, but it'll throw a warning.
A
Okay, so classes should be uppercase and camel case, so MagDipole is a class; that's okay. This one I don't actually like, and I agree we should be consistent: for classes you don't typically use underscores. For the initial refactor I was sort of going to let this level of stuff continue, and then we can hit it in another refactor, in the next iteration.
A
But the survey now no longer has the dpred method. There's also no longer any pairing that you do between the problem, or the simulation now, and the survey. So we instantiate a simulation, and right now this is a bit confusing and probably worth changing before we merge this: the use of the term "problem" here is simply because I wanted the API, at least for now, to be the same, but we can perhaps call this "simulation".
A
But that's okay, we'll ignore that detail for now. You can see here one important thing that's changed: the simulation now takes a survey when you instantiate it. If we had not passed it, it will still let you pair them afterwards, but if I did not have some warnings suppressed you'd see what all of this does. Actually, let me show you; I'm just going to stop sharing for a second and reshare my entire screen.
A
So here, if we actually look in the code at what that does, in the base simulation, if you actually call pair, what happens is all it's doing is saying "don't do that, but okay, I'll let you", and then it just sets the survey: self.survey = survey. That's all it's doing. So it's not necessary anymore in your code, but if that's the way you're used to setting up a simulation object, it's still going to let you do that.
A
Okay, so we've constructed our survey and our simulation object. There are a few warnings popping up because I haven't gone through the code rigorously and gotten rid of all of the old-style naming; for example, here it's trying to use source_list instead of srcList. It's still going to let me use it, but it's going to throw warnings. So now I'll show you there are a few different things going on. If we look at the simulation, it now has the dpred method.
A
It also has fields, like it used to before, and it also has a make_synthetic_data method. Previously the survey had the ability to make synthetic data; that's no longer true. The survey is a very lightweight object: all it does is store sources and receivers. For example, in the frequency domain it knows how many unique frequencies it has in it, and it knows how many data it has, but other than that it can't really do anything useful for you in terms of computing things. It just stores the survey.
A
It also does not have any data attached to it anymore. Previously, when you created a survey and ran makeSyntheticData, it stuck data onto the survey, so you used to be able to call survey.dobs; that's no longer the case. So what happens here when I call simulation.make_synthetic_data: I give it my model, and in this case I've told it please don't add noise to this, and let's add a noise floor. There's also a default standard deviation of
A
0.05, so 5%. What this hands me back is actually a synthetic data object. Like before, it'll still let you index this by source and receiver, and so it'll still give you the datum for a given source and receiver pair. But all the data object is doing when you do that is looking it up: it's got a dictionary, called index_dict right now (we can debate the naming), that, given a source and a receiver, gives you back the indices that correspond to that data.
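A toy version of such an index dictionary (hypothetical structure; the real SimPEG implementation may differ): looping over the survey in order, each (source, receiver) pair maps to the slice of the data vector that belongs to it.

```python
# Each source with its receivers and the number of data per receiver
# (made-up labels, standing in for source/receiver objects).
survey = [
    ("src0", [("rx0", 3), ("rx1", 2)]),
    ("src1", [("rx0", 3)]),
]

index_dict = {}
start = 0
for src, receivers in survey:
    index_dict[src] = {}
    for rx, n_data in receivers:
        # the data vector is ordered source by source, receiver by receiver
        index_dict[src][rx] = slice(start, start + n_data)
        start += n_data

# look up the datum indices for a given source/receiver pair
indices = index_dict["src1"]["rx0"]  # slice(5, 8)
```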
A
It also has uncertainty, and the uncertainty is just computed by multiplying the standard deviation by the absolute value of dobs and adding the noise floor. You can set any of these things. So let's say I wanted to set my standard deviation to 0.1: it's using properties under the hood, so it's smart enough to realize what you actually mean, and it assigns 0.1 to all my data.
A
You
could
equally
pass
this
a
vector
if
you
wanted
to
use
a
different
standard
deviation
for
each
datum.
So
if,
for
example,
I
wanted,
you
change
the
standard
deviation
by
frequency,
you
could
do
that
and
provide
a
vector
of
the
correct
length
here
and
what's
nice
is,
is
if
you
wanted
to
do
that?
With
that
index
dictionary
you
can
you
can
find
the
correct
indices
based
on
your
source,
to
help
to
help
do
that?
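A generic numpy sketch of the rule just described (the function name and values here are assumptions, not the SimPEG API): uncertainty = standard_deviation * |dobs| + noise_floor, where the standard deviation may be a scalar (for example 0.05 for 5%) or a per-datum vector.

```python
import numpy as np

dobs = np.array([10.0, -2.0, 0.5, 40.0])  # toy observed data

def uncertainty(dobs, standard_deviation=0.05, noise_floor=0.0):
    # broadcasting lets a scalar or a per-datum vector be used interchangeably
    return np.asarray(standard_deviation) * np.abs(dobs) + noise_floor

u_default = uncertainty(dobs)                        # 5% of |dobs|
u_floor = uncertainty(dobs, 0.1, noise_floor=0.2)    # 10% plus a floor
# per-datum standard deviations, e.g. varying by frequency
u_vector = uncertainty(dobs, np.array([0.05, 0.05, 0.1, 0.1]))
```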
B
Is it smart enough that, let's say, I have my own way to generate uncertainties, so I generated my uncertainty as a vector and passed it into, say, make_synthetic_data or into a data object; is it going to keep that fixed? I wasn't sure, because it seems like if you set that standard deviation from 0.1 to 0.2, then it would re-evaluate the uncertainty, right? Yeah.
A
So what happens is, let me show you the code. Here, if we look at the data object, the only difference between the data object and the synthetic data object is that the synthetic data object will keep track of your clean data, just so that if you wanted to add noise but then sanity-check what's going on, you have both of them. A normal data object just has the dobs.
A
So yes, we have standard deviation, noise floor, and a survey as the core things. Sorry, the basic things that you need to create a valid data object are a survey and dobs. From there you then have the option to add a standard deviation and a noise floor, and those will be necessary when we need a way to calculate uncertainties, if you want to use it in the data misfit.
A
If you then went in and added a standard deviation, that would now be a bit funky, because you would have the uncertainty that you set as a noise floor, and then you would have added back a new standard deviation. When we go in and ask for the uncertainty, all it's doing is saying: set my uncertainties to zero; if I have a standard deviation, add the standard deviation times the absolute value of the observed data; and then, if I have a noise floor, add that on top.
H
Maybe I have missed something easy, but let's say I have a UBC data file, gravity or whatever, that I want to load. You said that the survey doesn't have data anymore, but if I want to load it the usual way, I'm unlikely to have a simulation at that point; I just want to load the data file. So how do you do that? Do you create the data object along with the survey?
A
Oh yeah. So actually, what I would do is pass back a data object, and this is currently what's in the DC utils; that's the way I've been migrating it. Previously you handed back only a survey; in this case we now hand back only data, because the data knows about the survey, and it knows about the values of the data as well as uncertainties. So in this case, if you were to read a UBC data file that has all the survey information in it, the simplest thing to create is a data object.
E
Excuse me, Lindsey, just a thought: suppose that you've assigned standard deviations, you've assigned uncertainties to your data, and then, as is often the case, you go ahead, you do the inversion, and then you might want to change what you had assigned for uncertainties. Where is that done in here? Is that just done in some separate module outside?
A
Sorry, I'm just going to show you: you would set that on the data.
A
Cool, okay. So then, in this example, we created some synthetic data and we can look at them: we've got real and imaginary parts as a function of frequency. This is what I was saying before: you can still index the data by source and receiver, and all that's doing, let me show you, is, in the getitem, saying: okay, look at my index dictionary; key zero is the source and key one is the receiver.
A
And so the index dictionary: all it's doing is looping through your survey and saying, okay, for each source, here are the receivers that it has, and here are the indices that map to that data. Does that make sense? Okay, so that's helpful, Doug, if, for example, you wanted to find all your sources at a frequency, or find all your data with a frequency of one Hertz, and set the uncertainties to something specific. Perfect.
B
Kyle, it depends, but sorting by source and receiver is kind of helpful; sometimes, though, actually just having one vector is what you want, you know what I mean? Let's say I've got a thousand data for DC, numbered from zero to a thousand, and I have some functionality where I can pass in indices and say, okay, for these, set the standard deviation to three percent, or set those indices. Sometimes, with lots of data, even having source and receiver as a classification is not enough. So it would be nice to have some flexibility beyond just receivers and sources.
A
Yeah, okay. So now coming to the data misfit. This has changed: the data misfit now takes data, and this needs to be a data object, and it takes a simulation, because it needs to know how to compute predicted data. Those are the two things that it takes; previously it used to take a survey, and that survey had to be paired.
B
Before you move on: I took a look, and there are actually two ways to generate synthetic data, because you could just do problem.dpred, or you can do problem.make_synthetic_data, which gives you a data class; problem.dpred gives you a vector, yeah.
A
Right, because in any inversion we always just want to be working with a vector. So in this case here, just to clarify and make sure people are clear with what Seogi just said: I've created two different things, dobs and dobs2. dobs is what we showed before, the synthetic data object; dobs2 is just a vector, so it is just those data, and I can immediately start using that in the inversion. And that's the main reason for it.
A
The main reason is just that dpred is a simple method: all it knows how to do is compute a vector of predicted data, that's it, nothing more. Whereas if you actually call make_synthetic_data, what you're saying is: I actually want a data object that knows about its own uncertainties. When we're in an inversion, the dpred has no uncertainties associated with it, so it doesn't need the extra infrastructure of a data object.
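A toy sketch of the distinction being drawn here (illustrative classes with made-up forward modelling; not the SimPEG implementation): dpred returns a bare vector, while make_synthetic_data wraps that vector in a data object that also carries a standard deviation and a noise floor.

```python
import numpy as np

class Data:
    """Minimal stand-in for a data object: values plus uncertainty recipe."""
    def __init__(self, dobs, standard_deviation=0.05, noise_floor=0.0):
        self.dobs = np.asarray(dobs)
        self.standard_deviation = standard_deviation
        self.noise_floor = noise_floor

    @property
    def uncertainty(self):
        return self.standard_deviation * np.abs(self.dobs) + self.noise_floor

class Simulation:
    def dpred(self, model):
        # toy forward model: just returns a vector of predicted data
        return 2.0 * np.asarray(model)

    def make_synthetic_data(self, model, add_noise=False, **kwargs):
        d = self.dpred(model)
        if add_noise:
            d = d + np.random.default_rng(0).normal(0.0, 0.05 * np.abs(d))
        return Data(d, **kwargs)

sim = Simulation()
vec = sim.dpred([1.0, 2.0])                                # plain numpy vector
data = sim.make_synthetic_data([1.0, 2.0], noise_floor=0.1)  # data object
```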
A
Yeah, that's a good point. There is no sanity check under the hood right now that makes sure those are the same thing, but I agree, that's something we need to do. Okay, so then the data misfit behaves very similarly to before: it's going to first look to see if you have set uncertainties on your data, and that's how it'll compute the Wd matrix. Same as before, though, if you want to supply your own Wd matrix, you can do that; you can set your own.
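A minimal numpy sketch of the L2 data misfit just described (generic code with made-up values, not the SimPEG class): W is built from the reciprocal uncertainties unless the user supplies their own.

```python
import numpy as np

dobs = np.array([1.0, 2.0, 4.0])    # toy observed data
dpred = np.array([1.1, 1.8, 4.5])   # toy predicted data

# default: uncertainties from standard deviation and noise floor
uncertainty = 0.05 * np.abs(dobs) + 0.1

W = np.diag(1.0 / uncertainty)      # weighting from the data uncertainties
residual = W @ (dpred - dobs)
phi_d = residual @ residual         # phi_d = || W (dpred - dobs) ||^2
```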
A
So, the L2 data misfit, basically. Sorry, here, if you provide a W it'll let you set that, but if it's not there, then it's going to look at the data object and see: can I compute uncertainty? If I can, I will construct it from there. But if they're all zero or something like that, it's going to throw errors and say you can't set uncertainties of zero.
A
Then the regularization has not changed at all, so setting up the rest is the same.
A
So that's the gist of the main things that have changed. I've been trying to track the main items that are changing on this work-in-progress pull request. What I'll do is call people out specifically if I have questions on code that's changed, and I'll perhaps ask that people jump in and take a look at specific things, because you can see there are a fair number of changes. But this should hopefully summarize most of the main things that are going on.
A
Serialization, now: there's one thing that is currently missing, which is that the solver doesn't serialize, because it's a class right now, so we need to write a custom serializer for that. But other than that, it's serializing everything, so you should be able to, and this is some functionality that we still need to add, but you should be able to now save a simulation and then reload it, which is really nice. One of the things I would like to do is work on how to do that.
A
Cleverly
for
the
fields,
because
the
field
object
is
bigger
and
think
that
we
actually
want
to
be
looking
at
storing
the
fields
arrays
as
something
like
czar,
which
operates
with
x-ray
and
what's
nice
about
that
is
it's.
It's
meant
to
store
large
data
sets
and
you're
able
to
basically
index
into
it
without
loading
the
entire
thing
into
memory.
Yeah.
A
Yeah, so in that way you can start to slice through stuff, or just grab specific time indices, without loading the entire thing up into memory. So that's something on the radar. I don't think it'll be done specifically for this simulation refactor, but it'll be a small, easy pull request basically immediately afterwards, yeah.
A
Are you storing it, Tom, as zarr, or how are you storing it?
A
I don't have my screen set up to show more there right now, but I think, yeah, that's something we should be looking at. I think it should still live outside.
A
I think those are layers that sit on top of things, because we don't want to start bogging down very simple objects that should just stay simple. So I think it's sort of similar to how we have it now; we might be able to simplify some of the things on the IO classes and try to make them a little more uniform across the different methods, but yeah, those are basically layers.
A
Yeah, no, I think that would be good. And then, the same thing I'm noticing: the utils are sort of all over the place, and that's something I think we can deal with once this pull request is in, and then just start teasing some of those things apart. But I wanted to try and tackle one thing at a time, rather than changing too many things on people all at once. 100%.
A
Yeah, and then what's kind of cool too is I think we can start thinking about creating something like a SimPEG viz package, and being able to pass a data object to it and get back plottable things. We can start to sketch out what that looks like, because I know Seogi has done a bunch of work on getting pseudosections and stuff like that.
A
I've also had some really good chats with Fernando about how to start thinking about when we actually want to be scaling up and working with large data, and potentially about different plotting APIs; Plotly is actually pretty slick for interactive plots, like if you want to drag things around. It would be nice if we could readily interchange which plot style we want to use. But those are things we can work with as we iterate forward.
A
Well, if you want to start playing around with this branch and you're sort of on the EM side of things, I would say go for it; just patiently realize that things might break or still slightly change. If there are things that break or are not working as expected, just ping on Slack and we'll get it sorted. And then hopefully in the next week or so I'll have a pass done on some of the potential fields stuff and get some of your feedback at that point.
A
We can also sort through that asynchronously too, yeah.
A
Sure, if you want to give it a try, you're welcome to. One thing that you can do is start a pull request with the current examples, and then we can branch off and upgrade them, whatever you think is the easiest way forward. Yeah, totally up to you. Okay.
A
Sounds good. Do you think it would make sense to start an issue, just have a GitHub issue, and then people can add random thoughts and ideas as they come up? Would that work?
E
It's just the same as what you were talking about with respect to examples for using the different kinds of surveys, you know: a potential field one, an airborne one, an MT one, just to kind of consolidate them, both the forward simulation and hopefully an inversion. It'll be the example section, and also, by the time you get through that, you've got a lot of tick marks on various boxes, so okay, we could actually do this.
G
I've
actually
I've
actually
started
doing
this
already.
I've
got
tutorials
for
gravity
and
mag
and
started
on
frequency
domain
a.m.
it's
like
good,
a
tutorial
based
set
of
exercises
that
will
show
you.
You
know
how
to
make
the
the
survey,
how
to
define
different
types
of
receivers,
to
predict
different
kinds
of
data,
how
you
might
want
to
plot
some
stuff
afterwards
and
it
goes
through
slowly
and
it
yeah
it
acts
as
a
tutorial.
It
provides
examples
and
then
it
also
would
be
a
test
for
making
sure
that
the
programming
is
working.
A
Yes,
so
that
would
be
a
great
starting
point
and
then
judge
point
two
is
like:
if
we
can
all
bring
examples
that
we
have
together,
this
will
also
I
think
help
outline
like
what
we
need
to
do
with
respect
to
IO
and
what
we
need
to
do
with
respect
to
having
like
simple
drivers
or
something
like
something
like
that.
That
is
basically
a
quick,
quick
wrapper
to
inversion
so
for
when
people
want
to
run.
You
know
on
vanilla,
l/2
inversion
that
it's
easy
for
them:
cool,
yeah,
so
Tebow.