From YouTube: SimPEG Meeting March 20
Description
Johnathan Kuttai presented on “Time-Series Processing — From Raw Signal to IP Decay”
You can throw on a percent, recycle it at whatever time, but yeah.
It's done now, and I've implemented your guys' mapping technique, so that you're not actually just calling all these functions one by one, you know, stack this, then window this. If you look down at 57 here, that one line will take your time series and transform it.
So yeah, I kind of set it up here so that if you have a whole bunch of data and need to batch it, you can, and if you want to play with your filter a little bit and see what your ideal settings are, you have access to all that data, so you can see it.
So we start off with your raw time series. Unfortunately, this is a really good signal again.
Base frequency? Oh yeah, so this pretty much outlines the steps of what people were doing: taking the time series and filtering it using the filter kernel here, which is the green dotted plot, and that's essentially what it looks like. Beside it is the frequency response, so you can see that when you use the tapered window, it's more like a low-pass filter.
The filter kernel is its weights; it's like a big moving average, right. Each one of these points, I guess I call them taps, is kind of like a weight when you do the multiplication through. So your beginning and end cycles are going to be weighted a lot less, because they're at the edges of the taper.
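The tapered-versus-flat-kernel point can be sketched numerically. This is a minimal illustration, not the presenter's actual code; the Hann window and tap count are assumptions, but it shows why a tapered kernel behaves more like a clean low-pass filter than a flat moving average does:

```python
import numpy as np

def freq_response(taps, n=512):
    # Frequency response = magnitude of the kernel's (zero-padded) DFT.
    return np.abs(np.fft.rfft(taps, n=n))

n_taps = 9

# Flat moving average: every tap weighted equally.
flat = np.ones(n_taps) / n_taps

# Tapered kernel: Hann weights, so the first and last cycles
# contribute much less, as described above.
hann = np.hanning(n_taps)
tapered = hann / hann.sum()

flat_resp = freq_response(flat)
tapered_resp = freq_response(tapered)
# Both pass DC (response 1 at zero frequency), but the tapered
# kernel's high-frequency sidelobes are far lower.
```

Plotting `flat_resp` and `tapered_resp` reproduces the kind of comparison shown on the slide: the flat kernel's sidelobes leak high frequencies, while the taper suppresses them.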
It takes that raw signal, uses all the data, and breaks it into each of its half cycles, or however you want to say it, and forms a matrix so that every column is one of the half cycles. The filter kernel is just a vector with all these values, so that when you do the multiplication it's essentially doing a moving-average filter, and then having the alternating positives and negatives helps remove the linear drift.
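The stack-into-a-matrix idea can be sketched as follows. This is a hedged reconstruction, not the presenter's code; the [1, 3, 4, ..., 4, 3, 1] alternating weights are one classic choice of tapered stacking weights that both flips the half-cycle polarity and exactly cancels a linear drift:

```python
import numpy as np

def stack_half_cycles(raw, n_half):
    """Stack a raw IP time series by half cycles.

    Each column of the matrix is one half cycle; the alternating,
    tapered weights recover the signal amplitude and exactly cancel
    a linear drift. Requires n_half >= 4."""
    samples = raw.size // n_half
    M = raw.reshape(n_half, samples).T            # one half cycle per column

    v = np.full(n_half, 4.0)
    v[[0, -1]] = 1.0
    v[[1, -2]] = 3.0
    v /= 4.0 * (n_half - 2)                        # weights sum to 1
    w = v * (-1.0) ** np.arange(n_half)            # alternate the signs
    return M @ w

# Synthetic check: a +/-5 square wave plus a strong linear drift.
n_half, samples = 8, 100
t = np.arange(n_half * samples, dtype=float)
polarity = np.tile(np.repeat([1.0, -1.0], samples), n_half // 2)
raw = 5.0 * polarity + 0.01 * t
stacked = stack_half_cycles(raw, n_half)           # recovers 5.0 exactly
```

The alternating signs flip every negative half cycle back to positive, and because the tapered weights sum to zero with alternating signs (and their first moment also cancels), any linear drift drops out of the stack.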
Definitely not enough, I think, depending on where you are, but it's been seeming that you've got to have a little bit of a trade-off in acquiring the data. Once you start going with four seconds, you're starting to add a lot of time to the survey, and clients sometimes don't like to pay for that.
So you've got your turn-off time. I mean, this is kind of the equivalent: there's a little bit of a ramp-off, so it goes to about two seconds and then it stops, and then we just have a little twenty milliseconds before we start calculating the decay.
You've got so much more time to work with here, yeah. So you're just using those windows and marching down, and it's kind of logarithmically spaced, so your late windows get larger as you go, and you do a trapezoid rule for the time integration. Well, for the integration, if you just want to look at the code here, I do the window weighting, so it's not quite trapezoidal; I use something similar to what you'd get with the taper.
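A minimal sketch of the windowing scheme being described: log-spaced window edges after the 20 ms delay, with each window value computed as a trapezoid integral normalized by the window width. The function names and the synthetic decay are assumptions for illustration, not the actual code:

```python
import numpy as np

def trapezoid(y, x):
    # Plain trapezoidal rule (avoiding version-specific numpy names).
    return np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x))

def decay_windows(t_delay, t_end, n_windows):
    """Logarithmically spaced window edges: late windows get wider."""
    return np.logspace(np.log10(t_delay), np.log10(t_end), n_windows + 1)

def window_value(times, vs, t0, t1):
    """Average decay over one window: trapezoid integral / width."""
    m = (times >= t0) & (times <= t1)
    return trapezoid(vs[m], times[m]) / (times[m][-1] - times[m][0])

edges = decay_windows(0.02, 2.0, 10)      # 20 ms delay, 2 s off time
times = np.arange(0.02, 2.0, 0.001)       # decay sampled at 1 kHz
vs = 100.0 * times ** -0.5                # synthetic power-law-ish decay
decay = np.array([window_value(times, vs, a, b)
                  for a, b in zip(edges[:-1], edges[1:])])
```

With log spacing, every window covers the same multiplicative span, so the late-time windows contain many more samples and average down the noise where the signal is weakest.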
Well, because if you think of it, if you use a trapezoid or a rectangle, that's essentially kind of like a high-pass filter: you're passing everything. Whereas if you start tapering the sides and the weights, it's almost more like a low-pass filter. So say let's just change this down to 101 for its attenuation; that's kind of bringing everything up on the sides.
If that's the case, what would you do? Would you still want to use a tapered window, or, if you have one already? Usually, I think the reason why you need that very smooth window, like this smooth function at the end, is that once you've attenuated the ends, it changes the way you're going to integrate it.
Something like that: you improved the decay, you fixed the weights a bit more and you get a smoother decay, but how much does your chargeability actually change? And then, I guess I've also written here a Cole-Cole fitting tool, and you can see that. Okay, so with 101 we have, yeah.
So if you're looking to extract the parameter, maybe it isn't so much necessary. But again, this is kind of a bad example, because the decays can get wild, and by putting in a little bit of effort you can start recovering a nice natural-looking curve, and there I can see the benefits of getting a better chargeability out of it.
For that data I have written just a little function. I don't have it printed, but it's just a little method, a get-primary-voltage; you just give it the stack and it'll return it. However, for anything else, like if you want to start sampling the on time, it's easily broken, but yeah, it should be all right. What are you looking for in the on time, question?
The shape there, that's your window, so you make a shape, and then the name here is short for half kernel. You just simply make your filter kernel object, and then, when you do the multiplication like you guys showed me, I fit it in right here. So when we make this object, it immediately makes the kernel.
And one thing we could play around with in the structure a little bit is that it uses inheritance to deal with that. Like in the maps, you'll notice, if you look at any of the parametric maps, we don't actually explicitly override that; we inherit it from the identity map.
Yeah, and then the decay kernel is built to mirror it, so it makes up everything, and then its multiplication, of course, is a little bit different, because it's actually using that function that I have down here; it just calls that for everything. That's pretty much what that mapping is doing: it goes through all the windows and makes up the filter kernel of the proper size.
But it seems that's what they have. It is our engineer, you know; people are asking for higher frequency and he comes up with 200, and then it's like, oh, okay. But again, these are strictly different ideas, and I guess there really hasn't been any hard evidence that a higher-sampled IP signal is more beneficial. I guess you would think it would be nicer for frequency-domain work; then you could look at a lot finer sampled frequencies.
And yeah, the final thing here is that I've been working on writing the Cole-Cole fitting, and that just shows what came of it. Right now I'm just doing a simple kind of look-up method: you give it an initial guess and then it just iterates through those parameters within the search radius. So you can give it a big span or a small span, and then you can kind of refine it.
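The look-up/refinement fitting described might look roughly like this. One caveat: the true time-domain Cole-Cole response requires a series expansion, so this sketch uses a stretched exponential as a stand-in decay model; all names and grid sizes are hypothetical:

```python
import numpy as np

def decay_model(t, m, tau, c):
    # Stretched-exponential stand-in for the Cole-Cole decay shape.
    return m * np.exp(-(t / tau) ** c)

def grid_fit(t, d, tau0, c0, span=4.0, n=21, n_refine=3):
    """Look-up style fit: scan (tau, c) around an initial guess,
    keep the best point, shrink the search radius, and repeat."""
    tau_best, c_best = tau0, c0
    m_best = d.max()
    for _ in range(n_refine):
        taus = tau_best * np.logspace(-np.log10(span), np.log10(span), n)
        cs = np.clip(np.linspace(c_best / span, c_best * span, n), 0.05, 1.0)
        best = np.inf
        for tau in taus:
            for c in cs:
                g = np.exp(-(t / tau) ** c)
                m = (g @ d) / (g @ g)          # amplitude in closed form
                r = np.sum((d - m * g) ** 2)
                if r < best:
                    best, m_best, tau_best, c_best = r, m, tau, c
        span = span ** 0.5                     # refine the search span
    return m_best, tau_best, c_best

t = np.logspace(-2, 0.3, 40)                   # 10 ms to 2 s
d = decay_model(t, 80.0, 0.5, 0.4)             # noise-free synthetic decay
m, tau, c = grid_fit(t, d, tau0=1.0, c0=0.6)
```

Giving a big span does a coarse global look-up; shrinking the span on each pass refines the answer around the current best point, matching the "define it or refine it" behaviour described.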
For errors, I'm trying to get some real errors into the inversion, just to see what it does and whether we do better in the inversion, instead of just using a blanket 5% or something like that. So far, things have been working out a little bit better. I think the errors are a bit high, because even this one here, it's saying 17%.
With this Cole-Cole, you can give it a weight as well. So say you just want to fit the end, like the late time: you can just give it a weight that's zero everywhere and one at the end. I guess in some IP software you can kind of give it your window like that, so I was thinking you could do the same thing. I don't know if that's, yeah.
Yeah, I do have a version of this that can be a bit faster. I just noticed that NumPy does have the tiling functions, and that's what I used in my C++ code that made it really fast. So this one's still just using the for loop, but I know I can improve that. But again, I've just kind of been making it to see what comes of it.
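The for-loop-to-tiling speedup mentioned here can be illustrated with NumPy's `np.tile`. This is a generic sketch, not the actual code: the loop weights and accumulates half cycles one at a time, while the tiled version does the same thing in one vectorized expression:

```python
import numpy as np

rng = np.random.default_rng(0)
samples, n_half = 250, 16
M = rng.standard_normal((samples, n_half))   # one half cycle per column
kernel = np.hanning(n_half)
kernel /= kernel.sum()

# For-loop version: weight and accumulate the half cycles one by one.
loop_out = np.zeros(samples)
for i in range(n_half):
    loop_out += kernel[i] * M[:, i]

# Tiled version: np.tile expands the weights to the matrix shape, then
# one elementwise multiply and sum replaces the whole Python loop.
W = np.tile(kernel, (samples, 1))
tiled_out = (M * W).sum(axis=1)
```

Both produce identical output; the tiled (or equivalently, a single `M @ kernel` matrix-vector product) version pushes the loop into compiled code, which is where the C++-like speed comes from.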
With our system, because it's a mesh network, sometimes the nodes start later or sooner than the others, so we can't quite treat it with the same filter kernel all the time. We can force it, but if one of the nodes is late by half a cycle relative to the signal, we just force it to pick up its own start instead, right. Ideally, everything would start at once, like in a system where they're all interconnected by wire.
Just little things like that, because with this system you're processing on the order of a thousand dipoles, and it's like you're trying to find that one that's throwing off everything else. I guess the hard part is automating finding all these problems, and we're slowly making our way there.
Right, and then after that, if you want, you can make more data. Okay, let's say you did this and that; yep, we can do that, and then go using that to generate those values.
We just use the nodes; you just change which nodes give the signals. So one node records one signal, and then you subtract it from another signal to get that dipole, so we can do it however we want. You do that prior to processing, yeah. So what I've done here is I've just used the tool that I was building, and there's the Cole-Cole, or the step, or the RMS.
Well, the thing is, I have been really pushed into improving what we put into the inversion, because now, with all this data, you just can't go through every single data point one by one. So, you know, maybe automating this a little bit: better data goes into the inversion, better data comes out. So how can we make that happen with all our data?
It's too much data to look at, so we've kind of just parsed it into pieces, and we're finding that's a bit more beneficial than just, oh, they want the 200 to 600 metre dipoles here, throw it in an inversion, something's not right here, but everything looks all right. It's just finding a way to make this data-to-inversion step better: better data, better error assignment.
The most effective one right now is that one. We've just got basic little rejections, like giving a range of chargeabilities or resistivities to go by, or thresholds on the on time, or how stable your signal is. But yeah, so far the Cole-Cole one is giving us the best automated results. It doesn't give you exact errors, but at least it will catch a decay that's way off, like this one.
I'm not sure that what you're suggesting is particularly useful, or why you'd do it, because if you take the same datum, let's talk about a linear problem, right, you've got a specific kernel function, and now you're basically going to repeat that kernel function twice, and then you're assigning a particular uncertainty to it. So if it's Gaussian, and you're handling twice the number of data, you should still get misfits that are consistent with your Gaussian.
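The point about repeated kernels can be checked numerically: duplicating every row of a linear problem with the same Gaussian uncertainty leaves the recovered model unchanged, and the total misfit simply doubles along with the number of data, staying consistent per datum. A small sketch with made-up numbers:

```python
import numpy as np

rng = np.random.default_rng(1)
G = rng.standard_normal((30, 5))         # kernel functions of a linear problem
m_true = rng.standard_normal(5)
sd = 0.1                                  # Gaussian uncertainty per datum
d = G @ m_true + sd * rng.standard_normal(30)

def solve(G, d):
    # Least-squares solution of G m = d.
    return np.linalg.lstsq(G, d, rcond=None)[0]

m1 = solve(G, d)

# Repeat every kernel row and datum twice, same uncertainty.
G2 = np.vstack([G, G])
d2 = np.concatenate([d, d])
m2 = solve(G2, d2)

chi2_single = np.sum(((d - G @ m1) / sd) ** 2)
chi2_double = np.sum(((d2 - G2 @ m2) / sd) ** 2)
# m1 == m2, and chi2_double == 2 * chi2_single: twice the data,
# same model, misfit still consistent with the Gaussian.
```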
It doesn't sound like it's a big deal. I mean, I was just saying, it's kind of like you don't even touch the data.
I would still be careful about just putting extra raw data into the inversion, because you can encode outliers that attack the inversion. Let's imagine one of the data is an outlier, like the datum is basically zero. That outlier is going to have a huge impact on your fit: your fit is going to land somewhere in between the good points and this outlier, so this outlier is going to carry massive weight.
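A tiny demonstration of that outlier concern: one near-zero datum drags a whole least-squares line fit, and the contaminated fit ends up between the good data and the outlier, misfitting everything a little to split the difference. The numbers here are illustrative only:

```python
import numpy as np

# All points sit exactly on y = 2x + 1, except one knocked to zero.
x = np.arange(10, dtype=float)
y = 2.0 * x + 1.0
y_bad = y.copy()
y_bad[5] = 0.0                            # the outlier (true value was 11)

A = np.vstack([x, np.ones_like(x)]).T     # design matrix for a line fit
slope_clean, icpt_clean = np.linalg.lstsq(A, y, rcond=None)[0]
slope_bad, icpt_bad = np.linalg.lstsq(A, y_bad, rcond=None)[0]
# The clean fit recovers slope 2, intercept 1 exactly; the
# contaminated fit is pulled down toward the outlier.
```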
There's another thing: an outlier in the data can be translated into the model update, which is kind of really weird. For example, most of the data require this number to go up, but because of the outlier, one realization wants this number to go down. Maybe it's easier for us to put some sort of constraint in the model domain and say this number should go up, it shouldn't be below a certain value; that basically flags this outlier.
Sometimes it's hard to tell whether a datum is really an outlier or not, but if you're working in the model domain it may be easier to tell: okay, this number should go up, and this model parameter should be, for example, below certain points, and whatever data is requiring this model update is an outlier.
Anyway, it's not something you have to do, but it's maybe just a different approach. Yeah.