From YouTube: SunPy Community Meeting (13/07)
B: Okay, cool. Anyway, other than that, I've been extracting a lot of the assert statements that I'd just been putting in the code and moving them into tests, so that I can try to test the code now that it's mostly written and not changing as much, at least half of it. And I've been moving on to the emissivity, which uses ChiantiPy.
B: Essentially, I'm trying to model the emissivity to get some kind of idea of temperature, using ionization equilibrium, or just the temperature of the ions. And I'm finding that ChiantiPy has ways to get the emissivity, and it also has the contribution function, which is the main thing I need for the temperature response, so I feel really good.
B: Moving forward, I'm going to use the contribution function and multiply it by the wavelength response that I just made, and we're going to find that per wavelength. I'm working on getting that code up and running so I have something by next week, so we can adjust it for the general case, general cases and any solar features, I mean.
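The per-wavelength step described above can be sketched numerically. This is a toy illustration only, not the actual code: every grid and value below is made up, and in practice the contribution function would come from ChiantiPy rather than a hand-written table.

```python
# Toy sketch of a temperature response:
#   R(T_j) = sum_i G(lambda_i, T_j) * S(lambda_i) * d_lambda
# where G is the contribution function and S is the wavelength response.

wavelengths = [94.0, 94.5, 95.0]   # angstroms (made-up grid)
d_lambda = 0.5                     # wavelength bin width
temperatures = [1e6, 2e6]          # kelvin (made-up grid)

# Made-up contribution function G[i][j]: rows are wavelengths, columns are T.
G = [[1.0, 2.0],
     [2.0, 4.0],
     [1.0, 2.0]]
S = [0.5, 1.0, 0.5]                # made-up instrument wavelength response

# Multiply G by the wavelength response and sum over wavelength, per T.
response = []
for j, _ in enumerate(temperatures):
    r = sum(G[i][j] * S[i] * d_lambda for i, _ in enumerate(wavelengths))
    response.append(r)

print(response)  # one response value per temperature
```

The same sum per temperature gives the instrument's temperature response curve once real G and S arrays are substituted in.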
B: Yeah, it's coming together now. I'm really starting to get used to those arrays, the one-dimensional ones from the GENX files, because you have to navigate them to kind of understand what SSW is doing, but that's actually helping me figure out what we can add to it. Currently their model just uses constant pressure.
E: Okay, yep. So this week was more a matter of refinement of the metadata class that I'd previously created. It was a matter of trying to fix a couple of bugs in it, trying to make it slightly more usable, and actually putting it into the TimeSeries class. For those of you who don't know the problem that I keep mentioning: essentially, the issue with TimeSeries is that, doing it generally, you'll want to make a time series out of multiple separate files. Normally those will be from the same instrument.
E: So, for instance, you might have ten days of GOES data you want to deal with, but of course those will be stored in ten individual files, one for each day, and if you want to combine them into something, then you've essentially got ten different metadata objects to deal with. Obviously you can try to streamline that and get it into one metadata object, but that can unfortunately be a bit confusing, and you can, in the end, lose important data.
E: So, for instance, the calibration data of the instruments might have changed between those days. So I created a metadata object which stores multiple individual metadata objects, and essentially it then organizes them via the time range and the columns they relate to. I described this a little bit last week; there was a bit of discussion on some of the features people wanted, and it essentially has all of those features now, I hope, so I thought I'd do a little demonstration.
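A minimal sketch of the container just described (purely illustrative, not the actual SunPy implementation): each source file contributes one entry pairing a time range and the columns it applies to with that file's metadata, and the entries are kept in chronological order.

```python
from datetime import datetime


class TimeSeriesMetaDataSketch:
    """Toy stand-in for the metadata container described in the meeting."""

    def __init__(self):
        # Each entry: ((start, end), set_of_columns, dict_of_metadata).
        self.entries = []

    def append(self, timerange, columns, meta):
        self.entries.append((timerange, set(columns), dict(meta)))
        # Keep entries sorted by start time so later operations
        # (truncation, concatenation) can assume chronological order.
        self.entries.sort(key=lambda entry: entry[0][0])


tmd = TimeSeriesMetaDataSketch()
tmd.append((datetime(2015, 1, 2), datetime(2015, 1, 3)),
           ["xrsa", "xrsb"], {"telescop": "GOES 15"})
tmd.append((datetime(2015, 1, 1), datetime(2015, 1, 2)),
           ["xrsa", "xrsb"], {"telescop": "GOES 14"})

# Entries come back in chronological order regardless of insertion order.
print([entry[2]["telescop"] for entry in tmd.entries])
```

The column names and keys above are assumptions for illustration; the real object stores one such entry per input file.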
E: Okay, so essentially, if I want to create a nice TimeSeries object, or even a less great, not-so-nice one; let's make something quite nasty. I'll bring in my general imports, and this is going to be a time series that contains a number of GOES files, a total of six GOES files. So, if I wanted, I should be able to view that time series as a graph, nice and simply, so I can go ahead and plot it.
E: And there we are: that is a time series of six days of combined data from the GOES satellite. The important thing that I've been working on primarily is this metadata class. It's now got the ability to print itself nice and neatly. So here it is; it's a bit long-winded, but essentially you've got six time series entries.
E: You've got GOES 14 and GOES 15. I can make this a little bit more legible with to_string by reducing the depth of information, and you can hopefully see now that it's just confined and concatenated it a little bit, and obviously it tells you when there's more data to be viewed. So essentially it tells you what columns it's associated with, what time range it's associated with, and then it actually just stores a metadata dictionary, which is like an ordered dictionary but case-insensitive, with all the individual data.
E: So maybe, if I do that, it will include the satellite itself; no, maybe not. Yes, so this is the primary object, and from this particular metadata object you're actually able to start getting values. Say, for instance, I wanted to find out what telescopes are used: like a normal dictionary, I'd use the get method. All you need to supply it with is the actual key that you're looking for, so let's say we're looking for the 'telescop' value.
E: In this case, because obviously there are multiple metadata objects and therefore multiple metadata dictionaries, it will return a list of results. It de-duplicates the list because, in the end, you don't necessarily want to have 'GOES 14' three times and 'GOES 15' three times. This is designed to be almost a drop-in replacement for the get from, for instance, the Map metadata; obviously the Map metadata is only ever going to return one entry, whereas this has to return a list because it might return multiple entries.
E: Hoping that that's right; yeah, so this then brings back a list that is essentially similar in structure to the original metadata list, but you can see the time range that it relates to, it gives you the column names that it relates to, and then it gives you the actual result. Obviously this one is not de-duplicated, so you can see all of the metadata, all the time ranges that that particular satellite is active for. Annoyingly, time ranges don't print themselves very neatly if you just want a concise beginning and end.
E: Now, let's get rid of that and add a datetime, so it now only brings back one result because, of course, I've now explicitly defined what the time is going to be. So this is a time just within the middle of one time range, and a time within this time range is obviously only going to come back with one result. This is the single-source case, which is more of an ideal case; generally, most time series, I expect, will be using a single source.
E: Obviously, if I only put in the filter for the datetime but I had multiple sources, say I had GOES and I had EVE, then I would expect (and I have tested this) that this filter will come back with more than one result again, simply because, assuming the data for those two different sources overlaps in time, you're going to get two results.
E: But you can then also do a filter with colname equals, and you could filter, for instance, on this particular column, and then that will pretty much always give you a unique value. So essentially you've got optional filters, and in fact there's a new filter which is going to be created in addition to just the date-time. Another filter you could do is row: you could just give it an index. And this might work; no, it didn't, but that's fine.
E: That's still a work in progress, but essentially you could also just give it a row index. So if you know, within your actual data (remembering the data is a pandas DataFrame), what row you're looking at, say row 5, then it should be able to return the datetime for that row and then get the metadata from the index. That was a feature that was only suggested yesterday, so yeah.
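The row filter just described could work roughly like this. This is a hypothetical sketch: the hourly index and the entries below are made-up assumptions, standing in for the pandas DataFrame index mentioned above. The row number is translated into its timestamp, and the time-based lookup does the rest.

```python
from datetime import datetime, timedelta

# Made-up data index: one row per hour, starting 2015-01-01.
index = [datetime(2015, 1, 1) + timedelta(hours=h) for h in range(48)]

# Made-up metadata entries: ((start, end), metadata_dict).
entries = [
    ((datetime(2015, 1, 1), datetime(2015, 1, 2)), {"telescop": "GOES 14"}),
    ((datetime(2015, 1, 2), datetime(2015, 1, 3)), {"telescop": "GOES 15"}),
]


def get_by_row(entries, index, key, row):
    time = index[row]  # translate the row number into its timestamp
    return [meta[key] for (start, end), meta in entries
            if start <= time <= end and key in meta]


print(get_by_row(entries, index, "telescop", 5))  # row 5 is 05:00 on day one
```

Once the row is mapped to a datetime, the row filter reduces to the existing time filter, which is presumably why it was a quick feature to suggest.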
E: So, with the update function, it has the ability to update all of them. So essentially now I should be able to view the metadata again and it should have, yeah, so this key has been added to all of them. Obviously the whole point is: if you're doing any sweeping changes to all of the data, then you have the ability to update the metadata for all of them. It's actually got protection against overwriting a key that's already there.
E: If I use it in capitals, in theory that should now edit the instrument too; yeah, so that's now changed all of the instruments. It's just a layer of protection, because obviously we ideally don't want to encourage people to modify the original metadata; otherwise there's the possibility of them damaging it. So yes, the whole point is you can update, and obviously this is a sweeping change. Let's say you wanted...
E: In this case, once again, we're activating one of the filters. So again, you'll be able to use a datetime as a filter; you'll be able to use, at the same time or on its own, a column name; or, instead of using a datetime, you could use a row again, and in theory this should only target one of them.
E: So if I once again bring up the metadata, the only one with that new key is, obviously, the top one, which has got the right date range; none of the others should have that new key added to them. So again, that just gives you the option of adding details to either one or more of the metadata entries, and obviously, if you had multiple sources again, you'd want to add a column filter.
E: Or whatever; you'd want to add another column filter, so that then, essentially, you'll only be targeting one metadata entry within the overall metadata object. So there's the ability to update data in there, and the ability to get data in there, where you can filter according to whatever you have: if you only care about what column you're looking at, then you can filter according to that, and if you care about a specific time, you can filter on that, so you get the idea.
E: It gives you an awful lot of functionality there, and essentially that's the primary functionality of the metadata object. It has other functions as well: it has a truncate function, essentially similar to the time series, where you can truncate according to any time, and what will happen is the metadata will then truncate according to that time as well. So, for example, if I truncate it to the midpoint of this: it's worth me saying these are all in order, in chronological order; the metadata object stores all these entries in chronological order.
E: So essentially, if you add another entry, it will look at the start date and try to put it in order of that start date. So if, for instance, I wanted to truncate the time series object to halfway through this particular metadata entry, so essentially halfway through the fifth day rather than the sixth day, the metadata object will remove this last metadata entry, because it's outside of the date range.
E: It will then also truncate this time range specifically to the end time that you've truncated your data to, because obviously it's just saying this metadata only applies to data within this time range, and obviously, if you take out half of this time range's data, then of course it only relates to the data up to that point, so it means you've got a little bit of traceability there. Obviously it's the same story on the other side:
E: if I truncated at the beginning, it will either remove now-redundant metadata or, again, truncate the time range. It won't modify the actual metadata values in there, because those relate to the file, and obviously, ideally, you don't want to modify, for instance, its start time, because then you might not necessarily know which file you got the data from. It only modifies this time range, which basically says where this metadata is relevant to. So that's truncate.
E: The concatenate function in here is basically just taking two metadata objects and combining their entries, again in chronological order; it's just there to mirror the time series concatenate function. I'm trying to think whether there are any other important functions: you can rename columns, though that's hardly an important function of the object. Other than that, that's all of the main functionality.
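Concatenation, as described, amounts to merging the two entry lists and restoring chronological order. A toy sketch with made-up entries (not the real implementation):

```python
from datetime import datetime

# Two made-up metadata containers, each a list of ((start, end), meta) entries.
a = [((datetime(2015, 1, 3), datetime(2015, 1, 4)), {"telescop": "GOES 15"})]
b = [((datetime(2015, 1, 1), datetime(2015, 1, 2)), {"telescop": "GOES 14"})]


def concatenate(first, second):
    # Combine both entry lists, then restore chronological order
    # by each entry's start time.
    return sorted(first + second, key=lambda entry: entry[0][0])


merged = concatenate(a, b)
print([meta["telescop"] for _, meta in merged])
```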
E: So I don't know whether anybody has any questions on it, but essentially that is the new metadata object, the new time series metadata object, and hopefully it will now mean that we don't have to worry. Before, we were always worried, when we concatenated a time series, about how the metadata was going to be combined, and suchlike; now it obviously means there aren't going to be any issues with losing metadata or nesting metadata.
A: All right, yes, maybe reload, right.
D: Hey, is it clear right now? Oh yeah, I can hear you now. Okay, so this whole week was entirely dedicated to parsing the SRS tables and, like I mentioned on GitHub, there was a small issue with having null values in certain columns, in that for a certain day we weren't recording certain values, so I had to initialize them to null values. I did find the docs on that in astropy.
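The null-initialization step described above can be sketched generically (an illustration with made-up columns, not the actual parsing code): any column missing from a given day's row is filled with None, so every parsed row ends up with the same set of columns.

```python
# Made-up parsed rows: some days recorded a column, some did not.
rows = [
    {"region": 12345, "area": 10},  # a day that recorded "area"
    {"region": 12346},              # a day that did not
]

columns = ["region", "area"]        # the full expected column set


def normalise(rows, columns):
    # Fill missing columns with None (null) so the table is rectangular.
    return [{col: row.get(col) for col in columns} for row in rows]


print(normalise(rows, columns))
```

In practice the same idea applies whether the rows end up in an astropy Table (as masked values) or a plain dict-of-columns; the point is that every row carries every column.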
D: Yeah, that's about it, and my other internship finished, so I can dedicate all my time to SunPy now and hopefully speed up the development process.
A: Do you have any comments on this? No? It sounds great, cool. Quick thing to point out: Steve suggested on IRC earlier that you'd want an SEP or something for it, but I do think it's been flexible anyway. The main objective of that website is that we drop support by 2020, which I don't think anyone will object to, but...
C: I have a... I haven't really been paying attention, so maybe it's easy, but what's the status of the move to Python 3? Also, if you need it: while working on this yesterday, I nuked my Anaconda install, reinstalled it fresh, and re-figured out how to get Python working. Yes, eventually it was working. I don't want to do it a third time.