From YouTube: SJAA Imaging SIG (7/19/22), Alex Woronow, Extracting Emission Lines and Creating RGB/SHO Images
Description
Alex Woronow, Creating RGB/SHO images
Topics to be Addressed
What objects have emission lines worth capturing?
How do emission lines arise?
Separating the Emission Lines from the Continuum
My Script: Narrowband Assistant
How about multi-narrowband imaging?
A
Hello everybody, and welcome to the SJAA July Imaging Special Interest Group meeting. I am Paolo Barrettoni and, as you can see from my name on the screen, I am filling in for our normal, gracious host. Good for him, he is on vacation.
A
So tonight we will again have Alex Woronow. He is a new member of the SJAA, based in Tucson, Arizona, and also a member of other groups, but he is so nice to share his knowledge about processing images with us. Tonight he will talk about a very interesting aspect of narrowband processing: extracting the precise emission band from the continuum that normally passes through the window of the filter. I'm sure he will describe everything better than I can.
B
Okay, you should be seeing my screen. As Paolo said, I'm going to talk about extracting the emission lines from RGB plus SHO image data. This is my second talk in a row, and I'm really pleased that somebody came back from the last one to see this one. I hope there'll be something in here for everyone; I'm going to start out fairly elementary and work my way up. Let's see... it doesn't seem to be forwarding my slide, though.
B
There we go. Let me see, I missed one slide here; I'm going to go back. There we go. This is a ground-up talk: we're going to start with some easy stuff and work our way up to a couple of topics that are a little more advanced toward the end. So we're going to talk about which emission lines we can capture and which are worth capturing, some of the motivations for capturing those lines and what we can do with them, and we're going to talk about what emission lines are and how they arise.
B
That's some pretty elementary physics that probably most of you already know, but we'll make sure everybody's on the same page. Then we're going to look at my equations for extracting the emission lines from the continuum background, which is always present in any image you capture, and the assumptions and requirements for making that line extraction. First, extracting one line from a broadband image, for instance extracting the O III line in the blue from the blue RGB image. Then there's the harder problem of extracting two lines at once from a broadband image, such as hydrogen and sulfur, which both reside in the red broadband image; I'll look at the equations for those two cases. Then I'll quickly show you why the only published thing I could find on extracting lines, published by Paris and Lozano this year, has equations that I think are demonstrably inadequate or incorrect, and I'll show you why. I wrote a script that does my narrowband extractions, a PixInsight script, and it also puts together color images using a true-color mapping, or something close to a true-color mapping.
B
I'm a big advocate of not using the Hubble palette, which I find annoying and shocking to the eye, but that's a personal thing. Anyway, I'll show you the script and maybe work a little example, and then I have a suggestion for something interesting we might be able to do in the future, in terms of extracting the emission lines a little better than what I'm going to show. Okay. So where do we find the emission lines, at least the ones we can observe?
B
Emission lines come from common elements in the universe, so we don't find many emission lines, if any, from uranium or things like that which are very rare in the universe. Instead, we find them largely from chemical elements we're familiar with, and those in the human-visible spectrum are largely hydrogen, sulfur, and oxygen.
B
Those are the three strong ones we can see. For hydrogen there are actually two lines, one called hydrogen alpha and the other hydrogen beta; then there's the sulfur S II line and the oxygen O III line, and those Roman numerals indicate the ionization state.
B
What is it, the number of electrons that have been knocked out of orbit? It's one less than that, so O III has lost two electrons. Emission lines, at least for the stuff we can image, generally arise in turbulent interstellar clouds and at high temperatures. These kinds of environments are found in star-forming regions within those clouds, sometimes where interstellar clouds are colliding, and sometimes the collisions are actually driven by the star formation itself.
B
We can also find these emission lines, particularly the hydrogen and oxygen, in supernova remnants, like the Veil Nebula and things like that. Some stars have emission lines; most of them have absorption lines, though some have a few emission lines as well, and solar flares have emission lines. But I'm not going to talk about either the stars or the flares this evening, just about the clouds.
B
For this image, I'll explain the true-color part a little bit later.
B
Not all clouds are emission nebulae, and not all bright nebulae are emission nebulae. This is the Pleiades, and this is a reflection nebula; it's actually just like the scattering in our own sky. It's blue because of the light being scattered: the blue light, with its shorter wavelength, is scattered, while the red light comes through more cleanly. In this case it's kind of interesting: those stars are a considerable distance behind the cloud, and they're illuminating it from behind as the two pass in outer space.
B
Okay, how do atomic emission lines arise? An electron in an atom is kicked from a low-energy orbital up to a higher-energy orbital, or kicked totally out of the atom. The atom is then said to be in an excited state, and that's unstable: it wants to have that electron back, and eventually an electron wants to be in the atom. So eventually the electron resumes its ground state.
B
It moves from an excited state, perhaps free around the atom, to the ground state of the atom, and as it does so it releases energy in the form of a photon. The energy, or wavelength, of the photon depends on how large the transition was: was it all the way down to the ground state of the atom, or only part way down? We'll show a little bit of that in the next slide.
B
There are two ways, the prominent ways anyway, that we generally think of for stimulating an electron in an atom. The first is by having a photon hit the electron: if the photon's energy, its wavelength, matches the energy difference between this inner orbital of the atom (this is the nucleus) and an upper orbital...
B
...then this electron will be knocked into the upper orbital, just like this, and that's an excited state of the atom. Well, that leaves the lower orbital empty, and it's lower energy, and everything seeks the lower energy, it seems. So this electron is going to try to get back to the lower state, and we'll show that in a minute. There are also collisions: take these two atoms here and slam them together.
B
There's a chance that the energy of that collision is enough to kick at least one of those electrons to a higher energy state, and in either of these cases we end up with an excited atom. Now that electron wants to get back, and when it does so, it emits a photon equal to the energy difference between these two states.
B
And if we look at this diagram down here, it shows what an electron around a hydrogen nucleus can do. These are a bunch of series, which are based on where the electron ends up after it cascades or jumps to a lower energy state.
B
So it's this process of exciting an atom, through a collision, perhaps in a shock wave, or through heat which jars the atoms together so they rattle and the electron jumps up, or through absorbing a photon. Okay, those are the emission lines, and they're shown down here in this diagram. Here's hydrogen alpha, and this line here, this set of observations in the spectrum, is the continuum, and you see it doesn't sit at zero.
B
So when we observe a hydrogen alpha line, we observe everything between a wavelength here and a wavelength here. That includes not only the line but all of this continuum, which isn't very interesting to us: it has nothing to do with the physical processes in the cloud directly, at least not as directly as the emission lines do. So that's hydrogen alpha; there's hydrogen beta; there's oxygen III.
B
Interestingly, there are many things that cause this background here, this continuum of radiation. It's very often blackbody radiation, which maybe you studied in school.
B
It's a mix of transitions going on, emitting photons: free-free transitions, where electrons are free in the gas matrix and pass by atoms, changing their energy as they interact with the atoms around them. Molecular vibrations and molecular rotations also cause the background radiation.
B
Interestingly, the quanta of molecular vibration are called phonons, not photons, which was new to me. And in researching this talk, I found out that the quanta associated with the energy differences in molecular rotations that spawn light are called rotons. Isn't that interesting? Crossword-puzzle material.
B
Anyhow, here's a spectrum, the one I showed before, with the continuum down here, and here's the radiation of a star, whatever it is, V1150 Tauri. You see the background radiation looks like this, and this is a typical blackbody continuum of radiation.
B
These spectra were taken by an amateur with a spectrograph. Now, what's a narrowband versus a broadband image? We're going to use both of them to extract our emission line. Broadband images, as you probably know, come from filters that are about 100 nanometers in width, and the common ones we use are L, R, G, B: luminance, red, green, and blue.
B
So here are some astronomical filters, just to show what we're talking about. Each of these filters is somewhere around 100 nanometers in width, and the red, green, and blue ones, in this case the green ones, overlap with the blue a little bit. Unfortunately, the O III emission line lies right in here somewhere. Narrowband filters are, as you can see, much narrower; these are on the same scale.
B
A bit of a side story, I guess, is this diagram. It shows the color range of various things. Here's the human eye, this horseshoe-shaped region: this is what we can perceive. If you have good eyes and good color perception, you should be able to see this range of colors.
B
Your monitor, on the other hand, follows this little triangle right in here, the sRGB triangle. When you're processing your images, you can choose to process them in something like Adobe RGB or ProPhoto RGB, which are much larger triangles and therefore have more colors available to you. The problem is, when you display them on your monitor, or anybody else displays them on their monitor, they're going to see sRGB. I don't know of any monitors that cover all of Adobe RGB, and none that cover ProPhoto.
B
So if you're making images to share with somebody, eventually you're going to end up in sRGB, whether by your own hand, or by that of the manufacturer of your display screen, or by AstroBin, or whatever. Notice also that we're rather depleted in green compared to our depletion of red and blue.
B
If we use sRGB, that's fine for astro imaging, because there's just not that much out there that's truly green in color. But if you're doing trees and forests and things like that, it's kind of a shame you don't get more of the greens. But we don't. Okay, the utility of emission lines: if I'm going to extract these things, what am I going to do with them? Well, one thing is make pretty pictures, and I guess that's covered down here.
B
It makes the images more complex, detailed, interesting, and to some degree informative. It adds textures, shapes, and details, or at least it might: you're liable to extract all the narrowband emission lines and still come up with an image that's just kind of a flat white or flat red thing, for instance. And it allows you to better adjust dynamic ranges, tones, and saturations.
B
It reveals the proportions of hydrogen, oxygen, and sulfur in the regions, to some degree. If you have red areas and blue areas, or, if you're using a Hubble palette, red, green, and blue areas, then this shows you the distribution of these chemical elements in space.
B
However, I suspect that a professional who wants to know that would take a spectrogram of the area to find it out, rather than taking a set of narrowband images. I just found out today that the hydrogen alpha to O III ratio in planetary nebulae is heightened in areas of bow shocks. So tomorrow I will start on an image to see if I can see bow shocks in a planetary nebula; I just learned that this afternoon. Okay, here's another emission line.
B
By the way, the guy who wrote this was one of the writers for Futurama, who has a PhD in applied mathematics, and there's actually now a mathematical theorem out there based on this episode here, "The Prisoner of Benda."
B
So who knows where math is going these days? Anyway, here's my method of extracting the emission lines. Say we want to extract the green portion of the O III line from broadband and narrowband images: the G broadband and the O III narrowband images. We can write two simultaneous equations for the flux recorded by those two images. If we take B as the broadband and N as the narrowband, let's start by writing the equation for the broadband.
B
Very simply, it's the line, the emission line, plus the continuum, equals the total flux that we record in B. Whereas in the narrowband equation, what we want to do is scale it to the same thing: the narrowband records the same emission line, but scaled by the exposure ratio, E sub N divided by E sub B, the narrowband exposure divided by the broadband exposure.
B
So now we have two simultaneous equations. We know all the E's, we know the W's; the two unknowns are the line and the continuum, and we very simply solve them for the line. We get that the line is equal to the narrowband minus a times B, where B is the broadband image and a is this ratio built from the set of constants...
B
This is what happens when you get up in front of people: all of a sudden things go out of your mind. That whole quantity is divided by E sub N over E sub B minus, again, this constant a.
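As a sanity check, the two-equation solve he describes can be sketched in a few lines of Python. The flux model below is my reading of the talk's equations, with E the exposure times and W the filter widths; it is not the script's actual code:

```python
import numpy as np

def extract_line(broad, narrow, e_b, e_n, w_b, w_n):
    """Per-pixel extraction of one emission line from a broadband and a
    narrowband image, assuming (as in the talk) a flat continuum and
    equal filter efficiencies.

    Assumed flux model:
        broad  = line + cont
        narrow = (e_n / e_b) * (line + cont * w_n / w_b)
    Solving the two equations for `line` gives the expression below.
    """
    s = e_n / e_b   # exposure-time scaling
    r = w_n / w_b   # filter-width ratio: fraction of the continuum seen
    return (narrow - s * r * broad) / (s * (1.0 - r))
```

With a flat continuum, the continuum terms cancel exactly in the subtraction, which is why only the line survives; this form is algebraically the same as "narrowband minus a times broadband, divided by E_N/E_B minus a" with a = (E_N/E_B)(W_N/W_B).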
B
For the two-line case we have to write three equations, and they look just like the equations we just wrote, except there are much more complicated contortions of all the constants being multiplied and divided. So instead, I simply write it in simple matrix form, linear, and solve these linear equations for N1 and N2, where N1 might be the hydrogen alpha line and N2 the sulfur II line. This is very easy to solve, and you get both the hydrogen and the sulfur in one pass.
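A minimal sketch of that one-pass matrix solve, under the same flat-continuum and equal-efficiency assumptions; the coefficient matrix is my reconstruction, not the script's actual constants:

```python
import numpy as np

def extract_two_lines(r_broad, ha_narrow, s2_narrow, e_b, e_n, w_b, w_n):
    """Solve, per pixel, a 3x3 system for two emission lines (e.g. Ha and
    S II) that both fall in one broadband (red) filter, plus the continuum.

    Assumed flux model, mirroring the single-line case:
        r_broad   = l1 + l2 + cont
        ha_narrow = (e_n/e_b) * (l1 + cont * w_n/w_b)
        s2_narrow = (e_n/e_b) * (l2 + cont * w_n/w_b)
    """
    s = e_n / e_b
    r = w_n / w_b
    a = np.array([[1.0, 1.0, 1.0],    # broadband sees both lines + continuum
                  [s,   0.0, s * r],  # Ha narrowband: line 1 + some continuum
                  [0.0, s,   s * r]]) # S2 narrowband: line 2 + some continuum
    b = np.stack([np.ravel(r_broad), np.ravel(ha_narrow), np.ravel(s2_narrow)])
    l1, l2, cont = np.linalg.solve(a, b)
    shape = np.shape(r_broad)
    return l1.reshape(shape), l2.reshape(shape), cont.reshape(shape)
```

Because the same 3x3 matrix applies at every pixel, one `np.linalg.solve` call with all pixels stacked as right-hand sides recovers both lines and the continuum at once.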
B
If you remember back several slides, where I showed the emission spectrum of a nebula and pointed out the continuum, it was fairly flat, and so over a 100- versus a 10-nanometer section, the broadband versus the narrowband, that assumption is probably fairly good. Not perfect, but probably closer than we would intuitively expect. The other assumption, number two, is that the filter transmission efficiencies are the same for the broadband and the narrowband image.
B
They are almost always above 95% efficient, and often above 98%. Finally, when processing, you have to check and make sure that your narrowband and broadband images don't have a pedestal. If they do, the equations don't hold, so simply remove that pedestal before you do the calculations.
B
Okay. Now, Paris and Lozano this year published an equation which they called something like continuum subtraction, which I took, and I think everybody else did, to mean that it's going to give you the emission line. This is their equation. They don't put "emission line" on the left side of the equation; they just give you the right-hand side.
B
So this is what I presume they're solving for. The equation is: take the Ha narrowband filter and subtract the quantity, the broadband red filter minus the median value of that red. Remember, the median is the midpoint, the 50th percentile: half the values in the red filter are below it and half are above. Then you hunt around, make some tests, and find a value of k that makes the image look the way you think it should, and you multiply the thing in the parentheses by that k.
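For concreteness, the formula as he describes it would look something like this; a hypothetical re-implementation for illustration, not the authors' code:

```python
import numpy as np

def continuum_subtraction(ha, red, k):
    """The published formula as described in the talk:
    subtract k * (R - median(R)) from the Ha narrowband image."""
    return ha - k * (red - np.median(red))

# Note: wherever red < median(red), the "subtracted" term is negative,
# so the formula actually ADDS signal to Ha -- the speaker's objection.
```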
B
So I'd like to point out something. Half the time, when R is less than the median, this quantity is a negative value; multiplied by a positive k it's still a negative value, and when you subtract it from Ha, you're adding something to the hydrogen alpha.
B
If this is continuum subtraction, what is it you're adding to the hydrogen alpha? It can't be anything physical. On the other hand, if R is greater than the median, then this is a positive number; multiplied by a positive number it's a positive number, and you subtract it from hydrogen alpha, so it seems like you're subtracting something. Perhaps it's the continuum.
B
I don't know any physics that says this thing here is the physics of the continuum. The equations I showed just a little while ago are based on simple physical principles; this one is just ad hoc. All it's doing is emphasizing the areas that already have more red and de-emphasizing the areas that have less red. And besides which, if you have a nebula that only makes up part of the image...
B
Here's my Narrowband Assistant, as I call it, my script in PixInsight for doing those calculations. It's pretty simple to use: you just select your broadband images for these spots and your narrowband images for these. There's an Ha boost factor: of that red Ha found in and subtracted out as the emission line, how much emission line do you want to add back into the red part of your image?
B
How much red from the S II true color do you want to add in, and how much do you want to boost the blue and green from the O III? You can adjust these, leave them at one, or play with them any way you want. It also calculates an Hb, hydrogen beta, emission: where you have hydrogen alpha in a nebula, you virtually always have a proportion of hydrogen beta that goes with it, around 35%, the default here. So that gives you a little bit of blue too.
B
If you push this next value around: all of these narrowband extractions, the hydrogen, sulfur, and oxygen lines, will tend to produce images with black holes in them, areas where there was no hydrogen or sulfur recorded, etc. If you push this value up, you can plug those areas by warping the equations a bit. Whether you want to warp the equations or not is up to you.
B
If you leave it at one, there's no warping. The images that come out, as I said, are the three narrowband images. It also makes an SHO-RGB image according to the mixtures you specified up here, and it makes an SHO image with the true-color mapping. These are what I call true-color mappings.
B
Okay, now for most narrowband images, if you're a devotee of AstroBin, you'll find that most SHO images use the Hubble palette. In the Hubble palette, sulfur is red, hydrogen is green, and oxygen is blue. This is called an ordered coloring, or ordered palette, because it runs from the longest wavelength to the shortest.
B
I am not looking at the distribution of the chemical species, and I don't expect any professional scientist working with Hubble images to go out on AstroBin and search down my images to do their science; they're going to use telescopes several orders of magnitude more powerful. So I do mine for art, for satisfying visual images, and I do not find that the Hubble palette creates satisfying
B
color images. Furthermore, the star colors are distorted in the Hubble palette; they're usually magenta by the time you're done with it. Some people take RGB and paste the RGB stars in place of the magenta stars, and some others simply use a star mask and remove the color totally from those stars.
B
Other observatories, by the way, use other one-to-one SHO color-mapping palettes besides the Hubble palette, and those seem not to get very much play time on AstroBin. So here's my mapping, and I remind you where things sit in the spectrum: the longest wavelength of the three is sulfur, so I make sulfur red. I say the intensity of the sulfur image goes entirely into the red.
B
So I'm mixing RGB: that one's pure red. The Ha is a somewhat shorter wavelength than that, so it's off toward the green just a bit. It's rather arbitrary, but I make my Ha 17% green and 83% red; that puts it in more or less the right place relative to the sulfur. O III lies out here in the blue-green, and I mix it at 39%
B
green and 61% blue. Then I make up this hydrogen beta, which I told you about earlier, because where there's hydrogen alpha there's hydrogen beta, and so I assign it a flux at 35% of the hydrogen alpha flux. You can change that if you want and make it zero or any other number. I set it at 20% green and 80% blue. Eventually, probably in the next edition of my script, these equations will be adjustable to user expression; right now they're hardwired in.
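Gathering those percentages, the whole true-color mixing step can be sketched as follows; the function is illustrative, and only the weights and the 35% Hb default come from the talk:

```python
import numpy as np

def true_color_mix(s2, ha, o3, hb_fraction=0.35):
    """Mix S2/Ha/O3 line images (plus a synthetic Hbeta, scaled from Ha
    per the talk's 35% default) into RGB channels using the talk's
    channel weights."""
    hb = hb_fraction * ha
    red   = 1.00 * s2 + 0.83 * ha          # sulfur pure red, Ha mostly red
    green = 0.17 * ha + 0.39 * o3 + 0.20 * hb
    blue  = 0.61 * o3 + 0.80 * hb
    return np.stack([red, green, blue], axis=-1)
```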
B
Okay, now the advantage of this color mapping is you can use Photometric Color Calibration from PixInsight on it, and when you do that, the stars aren't shifted a great deal. They're usually shifted somewhat, just like they are in your RGB images, and if you don't like that, then you don't have to go through with the color calibration, or you can isolate the stars and color them yourself.
B
Whatever was available to you with RGB is available here too. You can display your images at home or in public without having to try to explain what all the funny green things are, the green areas and what have you. As I said, eventually these will be adjustable. Okay.
B
Now, the images that you can output depend, right now, on the images that you input. For instance, if you input just Ha, S, and O (and maybe these aren't the lines; maybe these are just the raw stacks that you have), it will produce an SHO true-color image for you. If you put in Ha and RGB, or O and RGB, or S and RGB, it'll output an HaRGB image and the Ha line, or an SRGB and the S line, or whatever you have.
B
If you put in the whole kit and caboodle here, hydrogen, sulfur, oxygen, and RGB, you get out an SHO-RGB and an HSO.
B
Okay, I'd like to go to a live example. Let's see if I can make that work. Let's go there.
B
I have here my program, Narrowband Assistant, and when I open it up, because I prepared already, we have the R, G, and B filled in, and the H, S, and O here are the individual images. Take that down for a bit. If you look at, for instance, the Ha... oops, that's the wrong one... there's the Ha, that's what the image looks like. These are all black-and-white standard stacks of images.
B
There was a 900-second exposure for the broadband images and 1800 for the O exposures; I'm not going to mess with that value. I'm going to make all the images possible, all three lines and the two combined images. So all you do is take a deep breath, hope, and push Run, and it starts. Most of these images will be deleted; they're intermediate images used for scaling.
B
And here is the RGB-SHO image, and this long name here tells you that, for instance, sulfur is in there at a slider value of one, hydrogen at one, oxygen at one, and then there's RGB. So again, this is true color. Obviously, there needs to be some cropping done here on the side, but the star colors are pretty reasonable.
B
It's a little harder to see the star colors in this one, unfortunately, because stars don't come out real well in narrowband images, but they're also close. The intensity of this one needs to be boosted; the saturation clearly needs to be boosted. Okay, in addition to those, we have four lines, believe it or not.
B
Here's the hydrogen alpha line that was extracted; this is just the line, without the continuum in it. Here's the O III extracted from the blue broadband image, so this is the blue component of the oxygen, and here's the green component of the oxygen.
B
The two oxygen images are not totally identical, especially in intensity, which is interesting. I don't think anybody has taken the effort to extract them from both, as you should, and then combine them back into one line when you ultimately use it. And this is the sulfur line. What I didn't show is the hydrogen beta line, but the hydrogen beta would be just this Ha image with every pixel multiplied by 0.35, and then added into the blue rather than into the red.
B
Okay, some possible additions. Color mapping, as I said, so that the user can adjust the color mapping to true color or anything else they want to do. And increased flexibility of the inputs, such that you could input just R and Ha and get out the Ha line; right now you have to put in all of the RGB to get it out. It's just kind of a limitation that's easily remedied.
B
This would really help in making sure that the continuum values are common between those two images, or extremely close to being common between them. I've never seen this done, but if you happen to have a couple of extra sets of narrowband filters lying around, perhaps because you started out with 10 nanometers and decided to go to five or three...
B
I might give that a try and see if it works, using the same equations I have. Another advantage of this is that it would isolate just the hydrogen, and you wouldn't have to simultaneously extract the hydrogen and the sulfur, which must add a little bit of uncertainty to the images that come out.
B
Okay, I think that's all I have today. Here's an image; that's the true-line version, and here's the RGB version. This has been processed, and the pre-processing was all done in PixInsight. Most of the post-processing was done in various Topaz tools and in Aurora HDR.
B
A
Thank you, thank you, Alex. Let me start with a question that I noted here. I noted that you always use very good subs; I mean, the data sets normally come from a really good telescope and location.
A
What, let's say, is the influence on that approach of a less-than-stellar, pun intended, data set, the kind of data set that commoners like us can produce in our backyards?
B
I think one of the keys is always going to be suppressing noise, and one thing that can be done is to operate in some way to suppress noise in your unstretched images. All of this, of course, the mathematics, is for unstretched images, so you have to suppress the noise in the images as best you can before you apply these mathematical tools.
B
One thing, of course, we have in our command is more exposures, and with these telescopes I'm working with, that's about three or four hours of SHO under very good skies.
B
I don't have any problem with that; under the skies here in Tucson, a metropolitan area, it would be much more difficult.
B
I guess, if you really think about it, you're carrying the noise through, so some noise suppression after the fact is important too. I usually suppress noise pretty hard. There's now a new AI noise tool available within PixInsight, what is it called, it costs a few bucks: NoiseXTerminator.
B
That's the best I can do, I guess: try it, see, and be aware. Maybe the noise suppression called Murray, or MURA, would work; I never use it. That one's done on subs, right after stacking.
A
One other thought about your last note, about the possibility of combining different windows in narrowband filters, so different window sizes. I was thinking of the approach that Francesco had, with multiple locations and people contributing to the same image. Can that approach be used in this case? I mean, you would probably have to scale from one image to the other, since the pixel sizes would be different.
B
I think that's not a huge problem because, of course, whether you're downscaling the pixels or aligning subs, you're interpolating all the time, and if you dithered your subs, you're interpolating; that's what you would be doing if you were taking images of different sizes from different scopes anyway. And of course the collaboration, as you suggest: maybe one person has one bandwidth of narrowband filters and another person has the other, and trading the data would be less expensive than buying two sets of filters.
A
Okay, so if there are no further questions, we can first of all thank Alex for a lot of food for thought, and thank everybody for participating. Actually, I have one last question. Alex, wonderful presentation, I really enjoyed it. Is it possible to share the presentation itself? There was a lot of stuff that was quite intricate, and it would be helpful for newbies like me to go over it, probably multiple times, to understand.