From YouTube: SimPEG meeting July 15
D
Yeah, hi Peter. So I'm Devin. I've worked for the UBC-GIF group for the last six years. I did a master's with Doug, and for the last three years I've been doing some development with the GIF tools.
D
I've got my hands in SimPEG, doing a lot of documentation and support for the Fortran codes, so I kind of have my hand in a lot of different things. Yeah, that's more or less my story; I'm still working at UBC under Doug.
A
Good plan.

B
Yeah, so since 2009 I've been working with Colin Farquharson at Memorial University, and through those years we've developed quite a large suite of Fortran code, which has some parallels with what's being done in some of the Python packages. So we felt there are definitely some lessons we could learn, and I would also like to help in any way I can. That's basically it, but I don't really have the first clue about where to start with that. I'm not a Python person, but there's code that could be connected, so I wanted to start feeling out how that could be done.
B
Sure, yeah, that's a good idea. So right now we have support for rectilinear meshes, but most of the focus has been on unstructured meshes in 2D or 3D, so triangles or tetrahedra.
B
And then, as far as the types of data we can support: magnetics, gravity, gravity gradiometry. We've done some of the simpler seismic, so just a fast-marching solution to do first-arrival travel times; we have tomography capabilities.
B
Oh, what else do we have in there? Electromagnetics, controlled-source electromagnetics. The only thing really missing is any of the electrical methods. We have something for MT. And then, as far as the inverse problem is concerned:
B
A whole bunch of different regularization options, so not just L2, but also a couple of fundamentally different ways to solve the inverse problem, two or three I suppose. One is the kind of standard voxel approach, where each cell has continuous physical properties in it.
B
We can also do the same sort of thing but with discrete property values, so more of a rock-type inversion. And then we also have this other approach where we're working with surfaces: it's a surface-based model, and those surfaces represent the contacts between two discrete rock units, and we move the contacts in space.
F
What's the difference there between the surface inversion and the geological-model inversion? One is not discretized at all, only by its boundaries; is that the difference?
B
Yeah, the geological or lithological inversion type of thing is just: you have a mesh and it stays where it is, but you're putting in lithology IDs, which are integer values rather than continuous physical properties, and then those lithologies get connected to the physical properties through some mapping. Yeah.
B
There's a whole bunch of joint inversion options in there too; we've looked at a whole bunch of different types of coupling options for general joint inverse problems.
B
I suspect that parts of it will and can become open source, yeah. We just need to decide how to make those decisions.
B
Fair enough. I gather there's still the question of how to sell open source to industry, but maybe some of those questions are being answered, and so once...
B
A drive for me, anyway, to help the open source community with the code in ways that I feel comfortable with.
A
Yeah, what would be a helpful overview? Are there specific questions you have in mind right now, or are you just kind of getting plugged in? Because we could spend some time either talking a bit about discretize, like where the meshing is, or going through a few examples. Yeah.
B
Well, it's a bit of a new space for me. I mean, Doug has been encouraging me to get involved with the SimPEG group for a long time, and I haven't, because I haven't really felt a need. But I would like to start working with electrical data, DC resistivity and whatnot, and I talked with Florian and the pyGIMLi group, and he said they don't really have a great set of potential-field...
B
...forward solvers for unstructured meshes, and the SimPEG group has mentioned that you'd like to start having support for unstructured meshes. So I feel like it's time to connect everyone. Those are, I guess, the three goals driving my interaction.
B
And yeah, Lindsey and I talked a bit about what the first steps might be, and we thought maybe the first step is something simple, like: let's try and get SimPEG to the point where it can read an unstructured mesh from a file, something that's created elsewhere, and then...
A
Joe, what do you think is a good first step for sketching that out? Because we don't have that support right now. Do you think maybe starting from an example from pyGIMLi, or should we basically just sketch out the architecture first? Because most of the properties and methods we need are probably reasonably well defined.
C
Yeah, I think we should just go through and identify what we need to be able to get from this mesh. We need to have things like a face divergence operator: the things that we need from the mesh for it to work well with the discretize ecosystem, at least. Just list the things that we need from it, and then we can see where we can pull things in from other sources to construct those, without having to do too much of it ourselves. Yeah.
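As a rough illustration of the kind of "what we need from the mesh" list being described, the sketch below (not the actual discretize API; the mesh here is a hypothetical 1D example) shows the minimal ingredients behind a face divergence operator: cell volumes, face areas, and signed face/cell connectivity.

```python
# Illustrative sketch (not the real discretize API) of the quantities a
# mesh must expose to build a face divergence: cell volumes, face areas,
# and signed face/cell connectivity.
import numpy as np
import scipy.sparse as sp

h = np.array([1.0, 1.0, 2.0])    # hypothetical 1D cell widths
n_cells = h.size
n_faces = n_cells + 1

cell_volumes = h                  # in 1D the cell "volume" is its width
face_areas = np.ones(n_faces)     # unit cross-section in 1D

# Signed incidence: each cell sees -1 on its left face, +1 on its right.
G = sp.diags([-np.ones(n_cells), np.ones(n_cells)],
             offsets=[0, 1], shape=(n_cells, n_faces), format="csr")

# Face divergence: (1 / V_cell) * signed sum over faces of A_face * u_face.
D = sp.diags(1.0 / cell_volumes) @ G @ sp.diags(face_areas)

# Sanity check: the divergence of a uniform flux field is zero.
u = np.ones(n_faces)
print(D @ u)   # -> [0. 0. 0.]
```

On an unstructured tetrahedral mesh the same pieces exist; only the connectivity and geometry arrays are more work to assemble.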
F
Well, you're using TetGen, right, Peter, to do it? Yeah, that's right. And so this is already being used, so I'm sure they already figured out the hooks.
A
Yeah, and so for a first problem, because that maybe even prioritizes which operators we implement first: are you thinking potential fields or DC resistivity? We could do either as an initial focus; it doesn't really matter. Yeah.
B
I think we start with potential fields; I'm just trying to pick the simplest, fastest thing.
B
Yeah, yeah, so it's all Fortran code that calculates all that. So, okay, I would just need to know the observation locations and then...
B
So that's a bit like the, what is it called, the cut-cell mesh approach that is used in something.
B
Yeah, yes; we've never looked at that. I think ultimately we wanted to work with purely unstructured meshes because they're just so generally flexible, because something we have been doing a lot is getting geological models from people to use as constraints, and once you have those it really just makes sense to have a fully unstructured tetrahedral volume.
A
Okay, well, maybe we can get updates from other folks, and then, if there are other questions that come up, we can go through them. Maybe, John, would you mind kicking us off? You've been organizing lots of things with the SEG and making things happen there. I don't know if you have other things.
F
Yeah, that was my great pain of last week. Basically, just to get everybody up to speed: I'm for sure not going to Texas in October, and neither is anyone else here; basically, no one here is going to SEG this year, as far as I know. And their hybrid approach seemed super annoying: basically, you cannot have an online presentation. You can only pre-record your video, and at the end of it questions can be asked by calling in, basically with a phone, which is insane, but anyway.
F
So in the end I kind of accepted: okay, we'll do the video. Basically, the idea would be that we'll split it into something like three segments. Lindsey, if you want to do the first one, that'd be great. Sure, just do a general intro, right. Sorry, you were saying.
F
We do one long video which has three segments in it, but then we have like a 15- or 20-minute still frame where nothing is happening, and then we let people just get out of SEG and bring them to either a Zoom call or a chat session on Slack, so that people can ask questions and you can interact with them. And then the video keeps rolling until the next thing comes on, and then we do this three times.
A
Yeah, and we could even do what Transform did: basically floating a Slack link on all the slides, or on the entire presentation, and just have that there and be able to answer questions and things like that. Because without the interaction, you're just watching a recorded tutorial.
F
Yeah, it's not really a workshop to me; this is more like, basically, a YouTube video, which technically we shouldn't have to pay the SEG to present. But what made me change my mind is that people are already registered for it; they already paid for the workshop. So I was like, well, okay: if we have people that are interested in SimPEG enough to pay for it, we should probably do something. So let's do it, but...
F
Yeah, I don't see why not, because technically what's going to come out of the SEG is only going to be a sound feed. So as long as you don't have to share a mic or a camera, you could technically have the YouTube, or whatever platform they're going to use, and then just have a Zoom going on at the same time.
F
I'm hoping so, because otherwise it's just going to be a video. What can I say? That's all they let us do.
F
Because, you know, learning about SimPEG and how to use it doesn't work if you're not actually doing it, and I'm sure the SEG is not going to let people re-watch the video afterwards, because it's all going to be under lockdown, right? So it's a little bit useless, in my mind, to do that kind of exercise.
F
But whatever.
F
Yeah, exactly, it's like an advertisement, right? We're basically doing a spot and then seeing what comes out after. Yeah.
F
...so that people aren't tuning out, and have question periods in between. And then, because the team is taking the open-source part, I'll be focusing on maybe a little bit of 2D data processing with scikit-learn, and then the apps that I'm working on at Mira for the potential fields and the EM-1D, so having people run those apps, because I can put time on this and charge it to Mira.
F
So that's a double win for me: I don't have to work over the weekend; I can just do it there.
C
It could definitely use a little bit of test-driving. The only thing that I've done with it is the simple examples I've written. It seems a pretty robust example, in that it creates two random meshes and interpolates between them.
F
I'm going to take a look at it this weekend. Did you see... we can do both directions, right, or just one way?
C
It's good, yep; and then tree-to-tree, so all of them. What else...
C
What I do, just to double-check that the volume averaging is doing its job correctly: if the meshes are on the same extent, when you volume-average one to the other, you can multiply the first by its cell volumes and sum them up, and see if that's the same as the second times its cell volumes summed up; conservation.
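The conservation check described here can be written out in a few lines. The function name and the tiny 1D numbers below are illustrative; the only assumption is that the two meshes cover the same extent.

```python
# Sketch of the conservation check: volume-averaging a model between two
# meshes covering the same extent should preserve the integrated total.
import numpy as np

def totals_match(vol_in, model_in, vol_out, model_out, rtol=1e-10):
    """Compare sum(V_i * m_i) on the input and output meshes."""
    total_in = np.sum(vol_in * model_in)
    total_out = np.sum(vol_out * model_out)
    return np.isclose(total_in, total_out, rtol=rtol)

# Tiny 1D illustration: two unit cells averaged onto one cell of width 2;
# the coarse value is the volume-weighted mean of the fine values.
vol_in, model_in = np.array([1.0, 1.0]), np.array([3.0, 5.0])
vol_out = np.array([2.0])
model_out = np.array([(1.0 * 3.0 + 1.0 * 5.0) / 2.0])

print(totals_match(vol_in, model_in, vol_out, model_out))   # -> True
```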
C
If you call it with the two meshes and then an input model, it'll just do it without forming the matrix; and then you can also supply an output vector for it, so if you want to use something that's pre-allocated, it'll use that, fill it with the output, and then return it again. So three different ways of calling it. The tests test all three of them at the same time, so, like I said, it seems pretty robust.
C
I think the tests are doing a good job, but it could use a little test-driving to see how fast it is, whether there are any spots that might need a little bit of attention and improvement, although I'm not really quite sure where that would go right now. But yeah, mostly just make sure it works for your examples.
C
So I did think about that a little bit, and right now I didn't put it in. However, if you want to remove cells from the input mesh, what you can do is just remove columns of the matrix and then rescale it so that all the rows sum to one.
C
If you want to redo it that way, that would be the easiest way; and then, obviously, if you have topography in the output, just remove the rows, and it doesn't matter.
F
...operation: you just put zeros in the other mesh if you're outside, or, you know, if you use...
C
So, I mean, if you have the interpolation matrix, it returns a sparse matrix; that's just what it is. So if you're interpolating from a mesh and you want to discard the values of certain cells in it, and you don't want them to contribute... like what Seogi's talking about, what I understood from previous talks, is that you don't necessarily want to include the no-data value of the empty cells.
C
In that case, like I said, each row has the contribution of every overlapping cell in the input mesh. So what you do, then, if you want to discard a cell in the input mesh: you remove that column, and then you rescale all the rows to sum to one again, because that would mean that, okay, you only have these three cells contributing to the volume instead of four, say.
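That column-drop-and-renormalize step can be sketched with SciPy sparse matrices. The small matrix below is hypothetical: rows are output cells, columns are input cells, and each row initially sums to one.

```python
# Discard one input cell (a column) of a volume-averaging matrix, then
# rescale every remaining row so it sums to one again.
import numpy as np
import scipy.sparse as sp

A = sp.csr_matrix(np.array([[0.5, 0.25, 0.25],
                            [0.0, 0.50, 0.50]]))   # rows sum to one

keep = np.array([True, False, True])                # drop input cell 1
A_kept = A[:, np.flatnonzero(keep)]

row_sums = np.asarray(A_kept.sum(axis=1)).ravel()
row_sums[row_sums == 0] = 1.0                       # guard rows left empty
A_rescaled = sp.diags(1.0 / row_sums) @ A_kept

print(A_rescaled.toarray())                         # rows sum to one again
```

Dropping output rows (for topography) needs no rescaling, matching the point made just above.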
F
Or, Seogi, I think what we would do is just do a mapping, just the active map: after you do the interpolation, then you apply the active map, so then you remove the air.
G
The problem is, say you've got 1e-8 for an air cell and you've got, I don't know, one ohm-meter or one siemens per meter; then you may not want to average those points. So yeah, I think at some point we need to omit that when we're interpolating.
G
And then you generate that sparse matrix that you can move one to the other with, yeah. I think you cannot go back, because one is sort of upscaled, but we can have a transpose; we can have an adjoint.
C
I've also been playing with Azure Pipelines on discretize, on my own repository, so that we could have one spot to do all the testing on all three platforms, as well as deploy all of the Windows wheels that get deployed right now, and the source distribution, build the documentation, things like that, and have it all in one spot, which would be nice. The way I had been testing it was on my own discretize repository.
C
Whatever virtual machine they're using on Azure runs faster than the Travis one, so it actually completes faster, and, like I said, it's just kind of nice to be able to have it all in one spot. Right now AppVeyor only does one test at a time: it does four tests, and it takes forever to complete any of them. But I've just been playing with it on my own repository and trying to make sure we can do everything we need to do.
C
The install and build time is a lot faster on the Azure Pipelines virtual machines; it runs fast enough that I can get it to build the complete discretize documentation without stalling out, because the Travis run on the complete documentation stalls out. So we wouldn't need to do any of the weird uploading and downloading stuff that we've been messing around with. But the thought on that is that I wouldn't have it build the entire documentation...
C
When testing, I would have it run subsets of the gallery examples and tutorials and make sure they all run, but they don't have to run all at the same time; set up blocks of them to run, and then also test the documentation build without generating gallery examples. And then, once everything passes, whenever it deploys, that would be when we build the entire documentation; it doesn't have to happen every time.
C
Yeah, we have a suite of tests that get run to double-check that everything is functioning the way we think it should, and they get run in the cloud every time somebody commits to GitHub or creates a pull request, just to double-check that things are working correctly. Can we ask, when people contribute new things, that they also contribute tests for those new things? Sure.
C
There's a really nice pull request right now on SimPEG that does exactly that. I didn't check the documentation side of it; I think it was okay.
C
So now, when you're writing documentation, instead of having to use that plot directive in reStructuredText or whatever, you can just use NumPy-style examples, like the way the NumPy docs work: if you look at them, you see the triple right arrows (>>>) and then things going on like that.
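As a small illustration of that style, here is a hypothetical function whose docstring carries a ">>>" example in the NumPy-docs format; the embedded example doubles as a test via the standard doctest module.

```python
# Hypothetical docstring in the NumPy style with a ">>>" example; the
# example is executed and checked by the standard doctest module.
import doctest

def face_count(n_cells):
    """Number of faces on a 1D mesh with ``n_cells`` cells.

    Examples
    --------
    >>> face_count(3)
    4
    """
    return n_cells + 1

# Run the embedded example programmatically:
runner = doctest.DocTestRunner()
for test in doctest.DocTestFinder().find(face_count):
    runner.run(test)
print(runner.failures)   # -> 0
```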
E
I guess so; little bits everywhere, but all the things I've been working on in the last weeks are slowly trickling into master. So now the survey class is in, and I hope soon the simulation and the optimization, and then finally the command-line stuff, will be documented and tested enough to go in.
E
And that actually brought up this issue I raised, which I was wondering what you think about. In discretize, I tried to subclass a TensorMesh and I'm getting an issue, because I would like to ensure, if I use a discretize mesh, that it is a 3D mesh, so I don't have to take care of that later; and when I subclass discretize, I also want to add two or three things that are specific to my code.
E
I don't know if you should there use a discretize TensorMesh instead of a self-defined class, but I'm not actually sure if that would resolve the problem. But yeah, I think discretize could be interesting for other folks down the road to subclass as well, for their particular use case, so it might be something we could think about how to improve. Yeah.
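The pattern being described (enforce 3D at construction, add code-specific attributes) can be sketched as below. A stand-in base class is used so the snippet stays self-contained; the real discretize.TensorMesh has a richer, property-validated constructor, and the class and attribute names here are hypothetical.

```python
# Sketch of subclassing a tensor mesh to enforce 3D and carry extras.
class TensorMeshStandIn:
    """Stand-in for discretize.TensorMesh (illustrative only)."""
    def __init__(self, h):
        self.h = [list(widths) for widths in h]   # cell widths per axis

    @property
    def dim(self):
        return len(self.h)

class My3DMesh(TensorMeshStandIn):
    """Enforce 3D at construction and add code-specific attributes."""
    def __init__(self, h, survey_name=None):
        super().__init__(h)
        if self.dim != 3:
            raise ValueError("My3DMesh requires three cell-width arrays.")
        self.survey_name = survey_name            # hypothetical extra attribute

mesh = My3DMesh([[1.0] * 4, [1.0] * 4, [1.0] * 2], survey_name="block")
print(mesh.dim)   # -> 3
```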
A
Yeah, we didn't talk about that further; I don't know. What do folks think? And I guess, what do you also think is a reasonable amount of time people could commit? Maybe...
A
Yeah, okay! Well, we can maybe start: would somebody be willing to start a poll of sorts, and we can try and get something in, maybe at the end of the month or early August, kind of thing?
A
Fair enough; they all blur together. Cool, okay. Well, Dieter, would you mind putting together just a little poll for, let's say, the last week of July and the first week of August? We can try something in those two weeks. Yep.
E
And a quick question on EM data: what do you usually use as a standard error if you don't have anything better for observed data? Do you use five percent, three percent, twenty percent? What is...
D
Okay, so actually, I could send you something; it might have a bit of a similar flavor to the off-diagonal components of MT data, and I can kind of show you what I mean. But yeah, usually we plot it up that way: we take a look at the floor, or we choose a floor and a percent based on that.
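The "floor and percent" recipe mentioned here can be written as a one-liner; the percent and floor values below are purely illustrative.

```python
# Minimal sketch of "percent plus floor" standard-error assignment:
# eps_i = floor + pct * |d_i| for each observed datum d_i.
import numpy as np

def assign_uncertainty(data, pct=0.05, floor=1e-14):
    """Relative percentage plus an absolute floor (illustrative values)."""
    return floor + pct * np.abs(data)

d = np.array([2e-12, -5e-13, 3e-14])
eps = assign_uncertainty(d)
print(eps / np.abs(d))   # relative error is largest for the small data
```

The floor keeps small-amplitude (late-time or far-offset) data from being over-weighted in the misfit.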
A
Thanks, thanks, Dieter. Devin, would you like to go next?
D
Yeah, I really don't have too much. Dieter was talking to me about having some errors when he tried to import the stuff and work with the EM-1D programming that I've been working on, so I need to go in there and clean that up. I haven't really gotten around to it, so sorry about that, Dieter, but that needs to be done. I have no import errors.
D
Things seem to work fine on my computer, but Dieter's been having some issues, so whatever is set up in those __init__.py files is not really working universally.
A
Okay, thanks, Devin. Seogi, do you have any updates?
G
Yeah, and one question from, I think, an Italian girl, I'm not sure of her name, but she was actually trying to do a surface-wave inversion; her simulation code is in MATLAB, and she was trying to generate the problem class in SimPEG to run an inversion, but having a hard time. I didn't have the follow-up answer, but that seems like an interesting use case that we are hoping to support.
A
We should at some point revive the example that we started with empymod, because that should now actually, hopefully, be pretty straightforward. And if we have one example of a different forward simulator being plugged into the inversion machinery, hopefully one can lead to more. So yeah.
H
Not really, though; in the short term, I'm still making my way through merging the simulation class with my own branch, so making slow progress on that, but hopefully I will get something working soon. Cool.
I
We provide academic licenses for a lot of the UBC codes, in particular the potential fields as well as the 1D inversions; those simpler codes are mature, they're being used. But when I was looking at the most recent request, I didn't have a really nice, consolidated way of saying: hey, you guys have been using the UBC codes...
I
...that's great, and we're happy to keep providing those; but at the same time, we've been developing a lot of stuff in SimPEG: we've got potential fields, and now we've got better 1D inversion codes than were previously distributed at UBC. So why don't you take a look at this, and here's...
I
...you know, a website. And I could see how somebody who's just not sure about anything would struggle to make that connection.
I
I have a feeling there's a potential here for raising the visibility of SimPEG just by offering some suggestions, like: okay, if you were used to this, then here's a code that does the same kind of thing, it's open source, and here's an example.
I
Yeah, so the thing that happens a lot, and just happened the other day: somebody saying they'd just like to have an academic license for GRAV3D and MAG3D and for the frequency-domain and time-domain codes. Sure, we can provide that. But I think we've moved ahead from that technology, and it's available, and we just kind of need to have something that would point somebody who's totally unfamiliar with SimPEG, with Python, but just like, okay...
I
As time goes on, the whole vision is that the codes in SimPEG will eventually be enhanced versions of anything that was done at UBC, and everything will somehow be transferred to the open source. That might be an incremental process as things proceed, but each time there's a tick in the box for people who've been using a UBC code or another code.
E
Maybe it might even be worth, I don't know if it says something on the website, but saying something like: SimPEG is the successor to the UBC codes, so that if somebody comes across that website, they might realize that it comes from UBC, in a way. But I don't know if everything that was in the UBC codes is now in SimPEG, or if there are still differences.
I
...kind of loose connections there, but not formalize it too much or make too big a thing out of it. Just that: okay, if you were using a UBC code, we now have one in SimPEG that is available. In fact, that's kind of what, Dom, you're probably talking about from a Mira perspective as you're talking to clients; is that correct?
F
This is sort of what I'm working on with the apps that I'm building on the Mira side; but at the same time, you probably don't want to tie SimPEG too closely to a corporation. So we kind of need to have something similar to this, to be able to direct people towards SimPEG, you know, just the end users: people who just want to run jobs, explore data, and not do programming. I guess that's what we're missing.
I
But the academic license requests are coming from academia, and sure, some of the research is just inverting the data, but they are also the kind of target group that we might think about from the point of view of actually interacting with the code, and maybe tailoring it a bit to their research as well as mostly using it. So I think there's another segment in there, aside from just some end user who might just want to buy a commercial code that has an input and output and all the bells and whistles.
F
Yeah, from my short experience, and obviously I haven't talked to as many people as you have in your career, I would say that 90 percent are end users. Very few people are actually interested in the mechanics; most emails that I get on a daily basis are from people who just want to process their data. They don't really care about what solver they're using or whatnot. And those end users, they need...
F
They need a visual; they need a way to interact with their data, to be able to look at the model and everything. And this is kind of beyond what SimPEG is there for. It's a bit of a catch-22, right: we want these users, but we don't want to spend the time taking care of them either.
A
I think, at least for me, what I'm seeing is that there are sort of two levels of conversation coming out of this. One is communication: is the website actually answering the questions that a new user would have? Like, what does SimPEG actually do, and how do you get started? I don't actually know if our website fully answers that. We should take a critical look and actually see if those things are answered. But the other thing...
A
If you go to simpeg.xyz, which is the starting point, that's what you're going to see; so yeah, that's something we should take another critical look at.
F
Maybe we just need to put a YouTube video together, like a YouTube video for UBC users on how to get started, and then just have a quick video that shows how to go from data to inversion, even if it's a written notebook that has a bit of code.
F
Because we have all the loaders for the UBC formats already, right? We have the mesh and the data readers, at least for potential fields and DC. So we could, in theory, offer that; it's low cost for us to put together.
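As a rough illustration of what such a loader handles, the parser below reads a minimal UBC-style tensor mesh layout. The real readers live in discretize/SimPEG (e.g. TensorMesh's UBC readers); this sketch is simplified and skips the "n*width" repetition syntax the actual format allows.

```python
# Parse a minimal UBC-style tensor mesh: cell counts, origin, and cell
# widths per axis. Simplified and illustrative, not the real loader.
import io

mesh_file = io.StringIO(
    "2 2 2\n"      # number of cells in x, y, z
    "0 0 0\n"      # origin
    "50 50\n"      # cell widths in x
    "50 50\n"      # cell widths in y
    "25 25\n"      # cell widths in z
)

rows = [line.split() for line in mesh_file.read().splitlines()]
n_cells = [int(v) for v in rows[0]]
origin = [float(v) for v in rows[1]]
h = [[float(v) for v in rows[2 + axis]] for axis in range(3)]

print(n_cells, [sum(widths) for widths in h])
```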
A
I think the other thing this brings up, and I know it's something we've talked a bit about, but it might be timely to start thinking through because I think it feeds into where we need to refactor next, is actually thinking about a command-line interface for SimPEG, and what that looks like, because for a lot of people that's just how they run codes, and it's not a hard thing to do with Python.
A
I think the hard part is how we design an appropriate input file and those wrappers. I know each of the modules right now has its own flavor of wrappers, and they're actually very unique: DC has an IO wrapper, and I know, Dom, you've done the potential-fields wrappers, and those have enabled other people to get up and running; but we don't actually have an opinionated approach to this right now that's uniform across SimPEG. So it might be...
A
I mean, it might be easiest to start there with the forward simulations, because I feel like the forward simulations are where we're now reasonably happy with the structure. I think the inversion and the optimization classes need a bit of work; but, that said, some of the wrapping is probably independent of what we fix under the hood. So it might be worth thinking about how we define a command-line interface for simple forward simulations.
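One possible shape for such an entry point is sketched below with argparse. The program name, flags, and the JSON input format are all invented for illustration; this is not an existing SimPEG interface.

```python
# Hypothetical argparse-based entry point for a forward simulation.
import argparse
import json

def build_parser():
    parser = argparse.ArgumentParser(
        prog="simpeg-forward",
        description="Run a forward simulation from a JSON input file.")
    parser.add_argument("input_file",
                        help="JSON file describing mesh, model, and survey")
    parser.add_argument("--output", default="dpred.txt",
                        help="file to write predicted data to")
    return parser

def main(argv=None):
    args = build_parser().parse_args(argv)
    with open(args.input_file) as f:
        config = json.load(f)
    # ... a real implementation would build and run the simulation here ...
    return config, args.output
```

Called as, for example, `main(["input.json", "--output", "pred.txt"])`, or wired up as a console-script entry point in setup.py.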
E
...from the start, because I think it becomes very important, whether it's the logging module or another Python package, to store all that stuff. But yeah, I agree that the terminal-based stuff is very easy with Python; but define carefully how you do it, because I feel like it's even harder to change afterwards than Python code, because people build on it and install it on a server for long-term use. So you really don't want to annoy them by changing your interface all the time. So yeah.
A
I guess I can give a couple of quick updates before we're on the hour. I've been working again on a little bit of the stuff with the paper with Dieter, so doing some EM modeling, and I actually decided to get up and running on Google Compute Engine. Mostly I wanted to play with it and see how challenging that was, but also they've got high-memory nodes, and not having to deal with...
A
...you know, HPC centers and me stealing too much memory; it's actually pretty easy. So if anyone wants to play around with Google and needs any tips or things like that: what I ended up doing is I was able to install sort of the SimPEG stack and get a Jupyter notebook running, and then you can basically attach your browser to a persistent IP address...
A
...that is basically your machine. And if you sign up with a new account, you get $300 of compute credits, which, if you're just running moderate-sized examples, will actually go pretty far. So I've been playing around with that, which has been kind of fun.
G
How is the cost structure working, Lindsey? I actually got a thousand-dollar credit and applied that to research, something like that; yes, you just need to write a paragraph to ask for it. I haven't tried yet, but I was curious how their costing works. Is it that, once you turn on your SSH kernel, they start to charge?
A
Yes, so you can turn it off. I mean, it depends on what kind of nodes you get. But yes, if you use sort of the standard virtual machine, as soon as you turn it on you're paying for it, because you've basically reserved that machine and nobody else can use it.
A
But you can turn it off. For example, if I'm working on this and I'm not going to be working on it tomorrow, I can turn off that machine and my data is still there; I still pay for storage, and then I can turn it back on the next day and get up and running again. So the storage fees are decoupled from the compute fees, and the storage fees, as long as you're not generating terabytes of data, are pretty minimal.
A
Yeah, a single node, high memory, and I think it has eight CPUs or something like that. But for the costing structure, I can send you a website where they do a pretty good job of breaking it down.
A
Perfect, that's all.

I
Thanks, thanks for coming; good seeing you, and yeah, we hope to see a lot more of you.