Description
Presentation on "Instrument Packages" for the different types of astro imaging, plus general image sharing and discussion.
A
Well, hello everybody, welcome to the November 2020 edition of the San Jose Astronomical Association Imaging Special Interest Group meeting. Tonight is posted as a general discussion, but we have some stuff, like what Glenn put together on spectroscopy and, well, actually more on equipment variations, and anything that anybody wants to talk about, we're open to it. I think we'll have time. So why don't we get started? Glenn, you want to take it from the top?
B
Sure.
Okay, so I put this together partly for you guys and partly for me, just to document the different, I call them "instrument packages," that I'm using with my scopes. Because I don't want to forget, you know, how many spacers it took to come to focus with one of these things, and then have to spend time figuring that out again. So I thought I'd put it all down in PowerPoint, and that way it'll save me some time, and hopefully it'll be interesting for some of you as well.
B
So the point is that the different types of astrophotography have different requirements. You know, I'm talking about deep space, and you could divide that into kind of wider-field versus deeper targets, smaller FOVs. Then there's planetary, where there are some common requirements across the whole thing, but you could divide it down into the planets themselves versus solar. And then another one that I've gotten motivated about again is spectroscopy.
B
So I don't know if you guys saw it, but the last speaker after the board meeting was Tom Field, who writes the RSpec software that a lot of people use for spectroscopy. There was a good talk on that, and it got me geared up to get back into that part of the hobby. And I had a piece of gear, that we'll talk about here, called a grism, that I had bought back in 2018 and hadn't actually deployed or tried until now.
B
I can send you a link later or something, but if you just go into YouTube and put in SJAA, you should find the channel. And yeah, back in 2018 I had given a talk at one of these SIG meetings, but it was before we were recording things, and at that time I had just the Star Analyzer grating, which we'll talk about as well.
B
So anyway, some stuff about spectroscopy, and I'll bring you up to speed on what I did in the last week on that as well.
B
So, diving into deep space. This is my workhorse rig here. All of these, by the way, plug into MoonLite two-and-a-half-inch focusers, either on my William Optics 80 millimeter triplet, the Megrez II or whatever refractor, and also on my 12-inch truss RC. So I've kind of standardized on the focusers, and I actually have three of them now for some reason, so everything threads into one or the other of the focusers.
B
So, just in review, you know what you're usually trying to do for deep space: you want faster optics. You're trying to have a smaller focal ratio than maybe your native scope has, and that's to limit exposure time. It gives you a little wider field, and maybe you're trying to utilize as much of the flat image circle that your telescope can produce as possible, in terms of getting all of that flat area onto your camera sensor.
B
So this, let me put my mouse over here. This focal reducer goes down inside the focuser, up to these threads here, and then we have the ONAG for guiding. So the infrared light goes straight through to the guide camera, and the visible light bounces off a cold mirror, goes through the filter wheel, and on to the ASI 1600.
B
I have since moved the electronics for the rotator off of this, onto the RC OTA itself, because when I'm spinning things on and off of here, this keeps running into stuff. So that's one change that's been made since then; I think everything else is pretty current. Oh, and also, you know, I've talked before about my somewhat-3D-printed rotator.
B
The MoonLite focusers have the ability to add a rotator to them, and rather than buy the one that they make, I made my own, using mostly a design from somebody on the net. It's an Arduino and some 3D printed parts and stuff. So that's nice, and it's especially nice anytime you're doing off-axis or on-axis guiding versus a separate guide scope.
B
You know, every time you rotate the camera you'd have to recalibrate PHD2, but if you have a rotator with an ASCOM driver, then PHD2 can watch that, and it can do the math, so you don't have to recalibrate. So that was one reason for me adding a rotator, versus just doing the manual rotation with the manual rotator driver in SGP to prompt you: three more degrees, three more degrees, or whatever.
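For the curious, what "PHD2 can do the math" boils down to is a plain 2D rotation of the stored guide calibration. A minimal sketch, assuming the calibration is expressed as RA and Dec direction vectors in camera coordinates; the function and variable names here are mine, not PHD2's:

```python
# When an ASCOM rotator reports a move by delta_deg, the guide
# calibration vectors just rotate by that angle instead of needing
# a fresh calibration run. Illustrative sketch only.
import math

def rotate_calibration(ra_vec, dec_vec, delta_deg):
    t = math.radians(delta_deg)
    c, s = math.cos(t), math.sin(t)
    rot = lambda v: (c * v[0] - s * v[1], s * v[0] + c * v[1])
    return rot(ra_vec), rot(dec_vec)

ra, dec = (1.0, 0.0), (0.0, 1.0)          # calibration at rotator angle 0
ra2, dec2 = rotate_calibration(ra, dec, 90.0)
print([round(v, 6) for v in ra2], [round(v, 6) for v in dec2])
# [0.0, 1.0] [-1.0, 0.0]
```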
B
Okay, let's see, yeah. So this gives me, on this rig anyway, it takes it down to f/6.3. There's my FOV in arcminutes, and yes, it's still oversampled, but I can't have any more focal reduction, because I'll run out of back focus.
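As a rough sketch of the sampling arithmetic behind "f/6.3 and still oversampled": the 3.8 µm pixels and 4656-pixel width are the ASI 1600's published figures, but the 12-inch working at an even f/6.3 is an assumed round number, not the rig's exact spec.

```python
# Image scale and field of view from pixel size and focal length.

def image_scale(pixel_um, focal_mm):
    # arcsec/pixel = 206.265 * pixel size (um) / focal length (mm)
    return 206.265 * pixel_um / focal_mm

def fov_arcmin(pixels, scale_arcsec):
    return pixels * scale_arcsec / 60.0

aperture_mm = 12 * 25.4            # 304.8 mm
focal_mm = aperture_mm * 6.3       # ~1920 mm at an assumed f/6.3
scale = image_scale(3.8, focal_mm)
print(round(scale, 2))                     # 0.41 arcsec/px -> oversampled
print(round(fov_arcmin(4656, scale), 1))   # ~31.7 arcmin across the long axis
```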
B
So, unfortunately, that's one downside of the ONAG, and if you remember my talk on the ONAG, using it with the focal reducer is quite painful. The distance through the ONAG is so far that, you know, it creates the amount of focal reduction, and as you increase that, have that bigger distance, it eats back focus.
B
Your, you know, OTA-to-sensor distance keeps shrinking, right? So that involved even taking all the extension tubes off the scope, the ones that would go between the scope and the focuser, and I still had to mechanically shave threads off the focal reducer to get the extra 10 millimeters I needed to come to focus.
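The back-focus squeeze he's describing follows from the standard thin-lens approximation for focal reducers: more reducer-to-sensor spacing means more reduction, but that spacing comes straight out of your back focus. A hedged sketch; the 300 mm reducer focal length is an assumed round number, not the CCDT67's actual spec.

```python
# Effective reduction factor of a focal reducer of focal length f_r
# placed d millimeters in front of the sensor: R = 1 - d / f_r.
# The long optical path through an on-axis guider forces d up.

def reduction(d_mm, f_r_mm=300.0):
    return 1.0 - d_mm / f_r_mm

for d in (85, 105, 125):    # reducer-to-sensor spacing in mm
    print(d, "mm ->", round(reduction(d), 2), "x")
# 85 mm -> 0.72 x, 105 mm -> 0.65 x, 125 mm -> 0.58 x
```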
B
Yeah, what's the name of those silent fans, Bruce? The Noc... Nocturna... Noctua? No? Okay, thank you. So, yeah, my camera is outdoors 24/7, 365, and I've also been lazy about leaving the power on, which, I think, sometimes it seems like the fan runs and sometimes it doesn't; I'm not sure what that is. It's not cooling, but the fan still runs. But anyway, after however many years I've had this camera, now two or three years, the fan died, and I think Hai just went through that as well.
B
So, you know, I was looking to Cloudy Nights for help on that, since spare parts are not available for ZWO in any kind of convenient way. The thinking there was, rather than replace it with as close to the exact same fan as possible, people were putting these other fans in there, and that had to do with people who were suffering from vibration.
B
Vibration problems, more than anything else, but I just decided to go that route. And the way that particular fan mounts, rather than having screws (and this helps with the vibration control), it has these silicone plugs that you pull through: you pull them through the fan, and then you pull the fan against the metal by pulling these four things through the screw holes.
B
So that's what that is.
B
I could cut them down, but, you know, whatever, they're not hurting anything; they're soft. Okay, okay, all right. So then here's what it looks like on the Megrez II, the William Optics scope. So it's the same instrument package, and you get kind of another view of the rotator here: here's the solenoid for it, and the belt.
B
You know, three times the field of view or so there, from the RC: 1.76 arcseconds per pixel. So that's a nice wide-field rig, and it just rides on top of the RC. Okay.
B
So we'll talk about the non-solar side of planetary first. You know, one of the things is, planets are so small, and you're usually magging up so much, that your field of view is ridiculously small, and there's no ability to plate solve off that smaller FOV.
B
So one of your challenges is: how are you going to get onto the target? That's a common thing, and one of my approaches was a flip mirror, and it was sort of semi-successful.
B
For both the Mars work and, part of this same rig is used not for solar but for spectroscopy as well, and I ended up deleting the flip mirror for that. But so, that's one way, and then the other way, which I've ended up using instead of the flip mirror, is I just use the other scope.
B
If I'm on the RC with the planetary rig, I put a camera on the refractor and plate solve onto something from there, and then just use a spiral search in SharpCap to get onto the target with the smaller field of view.
B
I do also have a set of dovetails that are adjustable, so that you could align those two scopes perfectly, but that adds a lot of weight and a lot of angular...
B
Yeah, it adds quite a bit, so I ended up deleting that as well. And now I just know that, you know, if the object is centered on the refractor, in the center of the frame, then it's actually going to be, you know, down and over here or something (I can't look at my hands and get them right). But anyway, I know where it's going to show up, where I need to move the object to be off-center so that it will be centered in the other scope.
B
So anyway, that's what I'm doing there. So, back to the requirements for planetary. There's a rule of thumb that says that, depending on the seeing, you want to have some multiplier of your pixel size in microns, which is, you know, kind of a unitless number, but that's the focal ratio that you want. So, you know, if your pixel size was 10 microns, then you'd want 30 to 50 as a focal ratio.
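That rule of thumb is easy to turn into arithmetic; here is a small sketch, where k between roughly 3 and 5 is the multiplier he mentions, and the example values (an f/8 scope, the ASI 174's 5.86 µm pixels) are mine:

```python
# Planetary sampling rule of thumb: target focal ratio = k * pixel
# size in microns, with k ~ 3..5 depending on seeing. From that you
# get the Barlow factor a given scope would need.

def target_f_ratio(pixel_um, k=4.0):
    return k * pixel_um

def barlow_needed(native_f, pixel_um, k=4.0):
    return target_f_ratio(pixel_um, k) / native_f

# ASI 174 (5.86 um pixels) on an f/8 scope:
print(round(target_f_ratio(5.86, 3), 1), "to",
      round(target_f_ratio(5.86, 5), 1))          # 17.6 to 29.3
print(round(barlow_needed(8, 5.86, 3), 1), "x")   # ~2.2x at the low end
```

That f/17.6 low end matches the f/17 he quotes for this camera later on.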
B
And yes, those are huge numbers, and I always tell the story that I learned when visiting Lick Observatory. The Nickel, I believe it's the Nickel telescope, is a Cassegrain, and they have three different configurations that they can do, which produce different f-stops.
B
So, you know, they can put it where the secondary mirror would normally be, for fast things; then they can put it at the bottom, in a full Cassegrain configuration; and then they can take the camera off and shine it on a mirror that goes down the hallway to the basement, and it's f/39. F/39, and that's what they do for planetary.
B
And those are like f/12 or f/13 native, so those are good for planetary, but they have a focuser like on an SCT; it's kind of hard to automate. But I did do that as well, with, you know, 3D printed parts and the Arduino and all that. But when it was all said and done, I didn't really have enough resolution, so I went back to the RC, even though RCs are not supposed to be super great for planetary because of their large central obstruction.
B
It reduces the contrast. But anyway, so I'm on the 12-inch RC. I've got the atmospheric dispersion corrector here, which I have not yet learned to use, but it's there; that replaced the flip mirror. And then there's an adjustable Barlow. This is the Astro-Physics one, they have an acronym for it, I think it's like BARADV or something, the Advanced Barlow, and it looks exactly like a CCDT67 focal reducer, but it's a Barlow, and so you control the amount of magnification.
B
So I wanted to add that to my filter wheel, but my filter wheel was 31 millimeters and this is an inch and a quarter, so I did 3D print a little adapter that has the eyepiece threads on it, for the inch-and-a-quarter. So that went in there; it took out my... I normally have a light pollution filter as luminance there. Okay, well, let's see: we talked about the Barlow with spacers, the filter wheel, right, and then you want a camera.
B
That's going to be as fast as possible in terms of frames per second. So, just in the ZWO line, the fastest monochrome that they have right now is the ASI 174.
B
So that's the one I was using, and then I picked up, this is a newer color camera that's even faster. I had gotten rid of, or sold, my ASI 1600, that was the color version, but I picked this up, which is even faster than the 174, one-shot color.
B
So we'll try that and see how it works compared to it. So, I had to remove the IR filter in that case, for that camera, but with the monochrome camera, of course, I was using luminance, and then the RGB was actually just R and B, and the G was synthetic. That was recommended by the gentleman that I chose as my expert; The London Astronomer is his site.
B
Okay, let's see, yeah. So, either a pass-through filter or the one-shot color camera. So this rig puts me at f/17, or 0.15 arcseconds per pixel, and that is in that rule of thumb, for amateur seeing, of three to five times the pixel size, at least for the 174.
B
Let's do some solar. This is my latest solar image. I managed to catch this sunspot, in a moment of clear sky, just before it went around the edge. You know, I saw it on Spaceweather, and we're in a low spot of the 11-year cycle, so I really didn't want to miss this, but it was cloudy most days. Finally, I got a clear morning. So, requirements for solar: if you've got a larger-objective scope...
B
So you could either, you know, just cut down on the opening by putting some kind of a mask on it, or you could get some kind of a filter. You can just use a yellow or red photographic filter, which is what I did; I got a yellow one. But, you know, if you're going all out, for the 12-inch SCT as a solar scope or something, then you've got a problem, and people do sell...
B
SCTs completely customized for solar, but it's quite expensive; just the energy rejection filter for that, in itself, is like a thousand bucks or something, I think, so it's crazy. So anyway, that's one thing you need. And then, you know, I'm using this Camera Quark, so I've got a separate slide on it, I think, but it's made to go on Canon lenses.
B
So, I don't know, you can kind of tell there's a Canon lens adapter there, and another one here, and then I'm adapting it back to threaded connections for the scope. And I'm moving my mouse on the wrong picture here; so there's one here, and a Canon EOS mount here. And then, you know, I have other adapters added to convert that back to threads for astrophotography purposes, but it's designed for a Canon lens and a DSLR.
B
This is the etalon filter, which is the H-alpha filter, and that's an oven with the optical components in it. You adjust the center frequency of that filter by adjusting the heat, or the temperature, of the oven; so there's a knob here for that, and it's got, I think, nine detents.
B
So we talked about the filter. Oh, and so then another problem that you have with solar scopes is Newton's rings, and it has to do with an interference pattern that sets up on the sensor. The solution is to either tilt the camera ever so slightly, you know, maybe two degrees, so that the sensor...
B
It kills off the interference pattern. Or, what I did was, I bought a tilt prism, which does essentially the same thing, but optically rather than mechanically, and put that inside one of these. This is a ZWO Canon-lens-to-T-threads adapter, and there are two-inch filter threads inside there.
B
So you could put a two-inch filter in there, and so I made, again, a 3D printed part where I could glue this tilt prism into that part and then thread it into the two-inch filter threads. So that got rid of my Newton's rings. The Newton's rings look like a Fresnel pattern, you know, just a series of concentric rings, so you don't want that. So that's how I got rid of that.
D
Yeah, I'm not getting why we don't get it normally, because I got it when I was taking, what do you call that, northern lights pictures, you know, and they're a very specific frequency. But anyway, I had a question. Another question is: why a Barlow? Isn't the sun big enough?
B
Yeah, so that image, let me back up here, if I click in here, and then, yeah, whoops. So this was 30,000 frames, but I took the best one percent, so 300 frames. So this is a stack of 300 frames, out of 30,000 frames, and that took, you know, I forget exactly how long it took, but it was much less than a minute.
B
It was really fast, also because the area is so small, right? That's the other thing: you're using a region of interest on your camera sensor, you're not using the whole thing, because, again, you're trying to go as fast as you can. So reducing the amount of data that you have to transfer, by reducing the region of interest, lets you go faster and faster, right?
B
And what I'd learned, maybe it's obvious in hindsight, but when I was working on Mars for a couple weeks there, you know, at first I'm like: why is this guy, and I picked this guy as my expert, right, and followed his formula, why is he taking 50 percent of his, in that case, 3,000 frames?
B
Wouldn't, you know, one percent be better, be more in focus, than the top 50 percent, right? Because that's what you're doing: picking out the ones that are more in focus, the sharper ones, right? But then what I learned is: okay, yes, right out of the box, after you stack, the one-percent stack looks better, but when you go to sharpen it, you can't sharpen it nearly as much without artifacts.
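The "best N percent" selection step he's describing can be sketched in a few lines. This is just the selection idea, not a real pipeline: real tools (AutoStakkert! and friends) also align the frames, and the sharpness metric here (variance of simple pixel differences) is one common proxy among several.

```python
# Lucky-imaging frame selection: score frames by a sharpness proxy,
# keep the top fraction, then average the survivors into a stack.
import random

def sharpness(frame):
    # Variance of horizontal pixel differences: sharper frames score higher.
    diffs = [row[i + 1] - row[i] for row in frame for i in range(len(row) - 1)]
    mean = sum(diffs) / len(diffs)
    return sum((d - mean) ** 2 for d in diffs) / len(diffs)

def best_fraction(frames, frac):
    keep = max(1, int(len(frames) * frac))
    return sorted(frames, key=sharpness, reverse=True)[:keep]

def stack(frames):
    n, rows, cols = len(frames), len(frames[0]), len(frames[0][0])
    return [[sum(f[r][c] for f in frames) / n for c in range(cols)]
            for r in range(rows)]

random.seed(0)
frames = [[[random.random() for _ in range(16)] for _ in range(16)]
          for _ in range(200)]
best = best_fraction(frames, 0.01)    # top 1% of 200 -> 2 frames
print(len(best), len(stack(best)))    # 2 16
```

The trade-off he just described is exactly the `frac` knob: a tiny fraction gives the sharpest raw stack, a larger one gives more signal to survive aggressive sharpening.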
D
Okay, so one question. I think I understand what's going on, and please correct me if I'm wrong: what's happening is the atmosphere is sort of moving things around randomly, at a fast rate, and that's what's causing the blurring. (Yep.) So if you could shoot at a fast enough shutter speed...
D
One picture would be better. So if you could shoot at a thousandth of a second, or a five-thousandth of a second, one picture should be sharper than a stack of a hundred, where you're trying your best to align, you know, to get lucky and find the hundred that are sort of aligned.
B
That would be, better said, analogous to what I was just saying: my finding was that when you did that, when you stacked fewer pictures, you know, I was talking about, in that case, Mars, 3,000 frames, and looking at one percent of them versus 50 percent of them in a stack.
B
The initial result of the stacking looked better at one percent, but you could hardly sharpen it without artifacts, whereas going back to 50 percent, now you've got much more in the stack, so much more detail hidden there in the data, and it was able to be sharpened quite a bit more. So that's...
F
There's one point you're missing here: if you take a single picture of the sun, you are not only taking a picture of the sun, you're also taking a picture of the atmosphere, which has all these variations of waves across it. So, in other words, a portion might be a little bit in focus, and a very short distance away...
F
If you're looking through the atmosphere, it's like you're looking through water that's got ripples in it, depending on how much rippling and how much turbulence is going on. That's why you have to take a lot of shots: so that you can average that, and then from that be able to pull out, with the sharpening algorithms, the detail that is real, and be able to sharpen it. So you can take a picture, but it's not going to have much to do with the reality of the actual sun, with just a single shot.
E
Yeah, a single picture will have a very low signal-to-noise ratio, so basically, when we stack, we're trying to increase the signal-to-noise ratio so much, right? Otherwise, yes, you can take one image, but it will be very dark, right, and your signal will be very low, and you'll still have details, but they will be faint.
D
It doesn't have to be dark if it's the sun; I mean, you have as much light as you want, as long as you're not burning up your camera. So my assumption was you could control it such that, you know, with a thousandth-of-a-second exposure, or a five-thousandth, you could freeze it, and you shouldn't have any signal-to-noise problems. Now, the atmospheric distortion across the image argument, okay, maybe I'll buy that, but in terms of SNR, no, there's no reason; this thing is so bright.
B
So, two things. So we're taking as short an exposure as we can, such that we can do it in a stream, right, do it continuously. Let me, I just had the folder open, so I can tell you. Whoops, wrong one.
B
I forget what the actual exposure time was, but it was, you know, milliseconds; it's 36 frames a second. (You sure it's not a 30th of a second?) I'm sure, yeah, because the limit here is due to the region of interest I chose, to get more of the surrounding detail of the sunspot.
B
If this was, you know, Mars or something, the region of interest would only be about this big, only like a quarter of it, and this camera can go at 150 frames per second in that case. But this is just limited by the amount of data and the speed of the interface between the camera and the computer.
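The frame-rate limit he's describing is just interface bandwidth divided by bytes per frame, which is why shrinking the region of interest buys speed. A hedged sketch: the 300 MB/s figure is a rough USB 3.0 number I picked for illustration, not a camera spec, and real cameras have additional readout limits.

```python
# Max achievable frame rate from a bandwidth budget and frame size.

def max_fps(width, height, bytes_per_px=2, bandwidth_mb_s=300):
    frame_bytes = width * height * bytes_per_px
    return bandwidth_mb_s * 1e6 / frame_bytes

print(round(max_fps(1936, 1216)))   # full ASI 174 frame: ~64 fps here
print(round(max_fps(640, 480)))     # small planetary ROI: ~488 fps
```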
But anyway, you can see here all the roiling atmosphere, and the other point I wanted to make...
B
is, it's not necessarily about signal to noise; it's about the detail that you can pull out with the sharpening. And again, that's why I wanted to show you this. It's not very bright, but this is what the 300 stacked frames looked like, and then I went from that to this, with the wavelet sharpening, basically, and other Photoshop nonsense.
B
Okay, let's see, where were we?
B
Okay, yeah, so that's f/30, 0.32 arcseconds per pixel, on the 12-inch, or no, I'm sorry, on the 80 millimeter refractor. So, we talked about the energy rejection filter. This is, I forget, whatever it is, it's not 80 millimeters, but whatever the biggest yellow filter I could find was, and there are no filter threads on the refractor.
B
So I made this 3D printed cover, which friction-slides over the dew shield, and it has the filter threads that match this filter. And then it also was an opportunity to glue this little bracket on here, for the, I have a solar scintillation monitor, which I didn't use for that particular image.
B
But the idea is, there's a plug-in for FireCapture, so this is essentially measuring the seeing on the sun in real time, and then you could just set it all up and let it run, and say: start your image when the seeing is this good, right? So you could let it run for a couple hours as the sun moves over your lawn, which is the best...
B
...you know, situation, unless you're up a couple floors. And it'll just pick the best moments of seeing to do, you know, a 30,000-frame capture, and then you'd take the best of those, those kinds of things. But I didn't have time to get that all going, and I just captured the data and went with it. So anyway, that's the energy rejection filter.
B
Okay, so here's what the solar rig looks like on the scope. You can see that energy rejection filter slid over the dew shield, and the solar scintillation monitor is there; there's also a solar finder. And, when you use a Barlow, it moves your focal plane out, as opposed to in, like a focal reducer does.
B
So, you know, there's more space here between the scope and the focuser, and then the focuser and all the stuff we talked about. That is the Camera Quark. This picture has the ASI 1600, but since then I've moved to the 174, which is much faster.
B
Let's see, did we talk about everything? Yeah, the Camera Quark comes in two models. Mine is for the chromosphere, versus the, I always want to say flares, but that's not technically correct, right, the prominences. But I think most people buy the chromosphere model, because it's somewhat okay for prominences, and the prominence model is not as good for the surface detail of the sun.
B
So, yet another thing you can do with the light from objects in the sky. This is just a slide from the talk I gave back in 2018, and I was trying to see if I could measure the redshift of a galaxy, I think it was, and this slide was about doing that measurement. Yeah, this is a galaxy, UGC 545, and we're comparing...
B
...this particular wavelength, which should be the same element, between Vega as a reference and what we saw in that galaxy, and then that gives you your redshift. So that's one of the things that you can do with spectroscopy.
B
So, as I mentioned, I think, or maybe it was before we started recording: in 2018 I had a Star Analyzer, which is a grating spectrograph, and since then I've purchased, but not used until this last weekend, a grism spectrograph, which is a combination of a prism and a grating, and we'll talk about that in a minute.
B
So, going back to the format of requirements and the instrument package. Again, since you may not be able to see the actual stars once you've got your spectrograph, or spectroscope, in place, you're going to need some way to get your target on the image. And also, I guess, if you wanted to guide, then you'd have to have a way to guide as well. Similar to planetary, you're going to want a higher focal ratio, and that gives you a less critical focus zone.
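The "less critical focus zone" claim can be quantified with a common amateur approximation, CFZ ≈ 4·λ·N² (N is the focal ratio), which works out to about 2.2·N² microns at 550 nm. Treat this as the usual rule of thumb rather than an exact diffraction result; the two f-ratios below are example values from earlier in the talk.

```python
# Critical focus zone (total depth of focus) versus focal ratio.

def cfz_um(f_ratio, wavelength_um=0.55):
    # CFZ ~ 4 * lambda * N^2, in the same length units as lambda
    return 4 * wavelength_um * f_ratio ** 2

print(round(cfz_um(6.3)))   # ~87 um at f/6.3 -- touchy to focus
print(round(cfz_um(17)))    # ~636 um at f/17 -- much more forgiving
```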
B
So you can think of it like depth of field in a terrestrial camera, right: you have a higher f-stop, you have more depth of field, and so that makes it easier to focus and get the critical, sharp detail you need to see those lines in the spectrum. So there are various types of spectroscopes: the grating, the prism, the slit, and then there are combinations of all of the above.
B
So, this particular instrument package, and you see I dropped all the way back to a Canon T3 here, although I'll be changing this to an ASI 1600. I wasn't sure that the spectrum would fit, because this particular rig was, excuse me, designed specifically for a DSLR crop sensor size. So, yeah, I'm doing this on the William Optics refractor.
B
So again, it's that adjustable Barlow from the planetary rig, and this is why I wanted to take pictures and document all these different things: so that if I'm moving things around, I can get back to where I was, if I want to go back and do planetary and take this apart, right? So there's a grism here; inside this there's a T-ring and a nosepiece, and the grism is in the nosepiece. It's fixed in terms of its optical distance from the camera sensor, and then there's a way...
B
There's a thumbscrew here, so that you can adjust the angle of the spectrum, so that it's along the horizontal axis of your sensor. Yeah, so: the adjustable Barlow, the grism, and then the guy that sold the grism included this star diagonal with an eyepiece, and the eyepiece is fixed in here with grub screws, at a particular setting. So the idea is that this exactly matches...
B
...you know, the optical distance here between the grism and the sensor. So now you've got the optical distance between the grism and the focus point of the eyepiece.
B
So that when you look in this thing, you know, you remove the camera, and actually the grism too, and then put this eyepiece and diagonal in there. Then you can use that to focus, then take that out and put the grism and camera back in, and you should be precisely in focus.
B
Okay, and so here are the two types of spectroscopes that I have. This is the Star Analyzer grating, it's about two hundred dollars, and then this grism thing was about three and a quarter, off of Cloudy Nights. And then the next step up, unless you're going to make your own, is like three grand, I think, for a spectroscope.
B
So there's a big gap there. I did see one guy that's got a nice 3D printed design, which would be on par with the three-thousand-dollar spectroscopes, but he thinks you're going to spend about 600 euros in optical parts and all the printing and stuff to get to that level. So I don't know if I'll go there, but we'll see, since I'm 3D printing; we'll see.
B
So the grating is actually on the diagonal face of the prism here, and that's mounted inside this T-adapter nosepiece, and then, you know, it goes and puts the spectrum on the sensor. The reason for this, over just the grating by itself, is that without the prism, the focal plane of the spectrum is actually curved, right? So maybe, oops, sorry, maybe it's perfectly in focus in the red...
B
One thing I've noticed is that, with the grating, you normally have the actual image of the star, and then its spectrum off to the side. When you go to calibrate your spectrum, it's convenient to know that the star is the quote-unquote zero point, and then you can count pixels over to, you know, an H-beta line or something, and calibrate that way.
B
But with this particular setup, it looks like you don't get a star image at all, so it's going to be a little bit more complicated to calibrate: you'll have to identify two lines in the Balmer series, or something, you know, where you have dips in your spectrum, to get calibrated.
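The two-line calibration he describes comes down to fitting a linear dispersion, λ = a·px + b, through two identified lines. A sketch, assuming a linear dispersion (good enough for these low-resolution instruments); the pixel positions below are invented examples, while the H-beta and H-alpha wavelengths are the standard laboratory values.

```python
# Wavelength calibration with no zero-order star image: identify two
# known lines at pixel positions px1, px2 and fit lambda = a*px + b.

def two_line_calibration(px1, lam1, px2, lam2):
    a = (lam2 - lam1) / (px2 - px1)   # dispersion, angstroms per pixel
    b = lam1 - a * px1
    return lambda px: a * px + b

cal = two_line_calibration(412, 4861.3, 938, 6562.8)   # H-beta, H-alpha
print(round(cal(412), 1), round(cal(938), 1))  # recovers 4861.3 6562.8
print(round(cal(700), 1))                      # ~5792.9 A at pixel 700
```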
B
Yeah, so, what I found. So, previously, with the grating, right, it just depends on the star field you're looking at. But, you know, you get the star that you want centered, right, and so then you can identify that star and just look at the spectrum from that star, and if there are other stars that are causing overlapping spectra, then you can create some kind of mask or something. In the case of the grism...
B
What I found was, maybe just because of the targets that I chose, and I tried to take data from eight or nine different things on Saturday, or Sunday, I think it was, but...
B
It seemed like there was only one bright spectrum. And again, you remember when I said that, if I plate solved on one scope, I knew where in the other scope the image would be, you know, where the actual image would come up, right? So the spectrum that I wanted was the one starting at that location on the sensor; that said, you know, that's the object that I centered with plate solving in the other scope. I hope that makes sense.
B
Yeah, there are some, yeah. I wasn't prepared to do a whole spectacular spectroscopy talk tonight, but I can recommend: watch the YouTube that Tom Field did, the speaker talk after the board meeting last time. And he recommended this book, let's see if I can... Astronomical Spectroscopy for Amateurs, by Ken Harrison, who I've been emailing with all week.
B
So there's a lot of good information in that book, and he talks in great detail about, you know, how you would go about getting absorption lines from a nebula versus a point source like a star or a planet. One of the chapters is, you know, here are some beginning projects, right? So it talks about looking at the methane lines from the atmospheres of Uranus and Neptune, and I forget what the other one was.
B
But anyway, I went and took data on Uranus, and I haven't analyzed it yet, but this is...
B
This is the first data that I got from the grism. So you always want to start with Vega as your example, and then you bring up these hydrogen Balmer lines and try to line them up with the dips in your spectra. And here, there's, like, this spike here that I think might represent the zero point, or the star. But basically, I need to go back; I learned a lot since I took this data, and Ken has been helping me, as I mentioned, the author of this book.
B
So, basically, I underexposed the data; I was being too cautious to make sure I didn't overexpose it, and as a result, I underexposed it. So I need to go back and retake the data, because, he was explaining to me, I think this dip right here is actually an artifact of the Canon Bayer matrix, on basically all their cameras, and so it's not really a valid dip in the spectrum there.
B
So I probably didn't do these lines correctly, even though it looks like it's correct. So anyway, I'm doing some learning there. But let's compare the two.
B
With the grism you have a resolution of about 14.8 angstroms, and with the Star Analyser it's 33 angstroms, so the Star Analyser is not quite as precise. This is data from the Star Analyser back in 2018, and the reason there's this big hole here is that I took it through a light pollution filter, so there's a big hole cut out where all the man-made light would be. And this is a synthetic spectrum.
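The 14.8 versus 33 angstrom figures can be turned into the usual figure of merit, resolving power R = lambda / delta-lambda, evaluated at some reference wavelength. A quick back-of-envelope comparison at H-alpha:

```python
# Resolving power R = lambda / delta-lambda at H-alpha for the two dispersers.
H_ALPHA = 6562.8  # angstroms

def resolving_power(wavelength_angstrom, delta_lambda_angstrom):
    return wavelength_angstrom / delta_lambda_angstrom

r_grism = resolving_power(H_ALPHA, 14.8)          # about 443
r_star_analyser = resolving_power(H_ALPHA, 33.0)  # about 199
```

So at these figures the grism resolves a bit better than twice as finely as the Star Analyser at the same wavelength.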
B
Okay, other bits.
B
So, as long as I was documenting everything, let me turn off my other camera here for a second for the recording. So, yeah, the flip mirror, and I threw in an off-axis guider that I have as well. You guys know what those are.
B
So here's my astrophotographer joke. I put all this crap on the scope, which was really long, and then I went in the house and had it slew to target, and then I came back out and it was like, oh, I almost hit the fence. So yeah, you might want to think about that if you're putting all these spacers on things.
F
Now that the fires have gone away, they finally admitted that the Creek Fire has stopped growing today. Since I've had about two months of not being able to take any pictures, I had to kind of remember how to do things, so I've been tuning things back up with my 11-inch RASA and the mount and the guiding and all that stuff.
F
Oh, there we go. Okay, so I took the data for this the night before last, and this is 80 minutes of data with the RASA, 80 exposures of one minute apiece. This is in Perseus, and this is one of the most massive galaxy clusters that we know about. It's 240 million light years away, give or take. And, if I can get my cursor here.
F
Now this is kind of an interesting thing that I've observed over the year.
So this is what you call a one-to-one view and, as you can see, many of these galaxies are ellipticals, some lenticulars, not too many spiral galaxies; I'm afraid they've probably all gotten eaten. And there's a lot of absorption from our galaxy and, of course, in intergalactic space, so all these galaxies have been quite reddened. So I was able to bring out the stars and give them a little bit of blue color, just to give it something.
F
So this is the way it works out with the RASA and the 268: these are 1.77 arc seconds per pixel when I bin them one and a half to one. The seeing wasn't that great, but a lot better than it has been. This is kind of what I was able to do with just 80 minutes of work.
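The arcseconds-per-pixel figure comes straight from pixel size and focal length: scale = 206.265 * pixel size (microns) / focal length (mm). A tiny sketch, with purely illustrative numbers (a 3.76-micron pixel behind a 620 mm focal length, not necessarily this exact rig):

```python
def pixel_scale_arcsec(pixel_size_um, focal_length_mm):
    """Image scale in arcseconds per pixel: 206.265 * pixel size / focal length."""
    return 206.265 * pixel_size_um / focal_length_mm

# Illustrative numbers only, not a claim about the speaker's setup:
scale = pixel_scale_arcsec(3.76, 620.0)  # roughly 1.25 "/px before binning
```

Binning or drizzling then scales this figure up or down by the binning/drizzle factor.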
F
You know, what can I say, I'm a bit of a carpenter and I can't figure out this computer programming stuff that well.
C
So why did you throw out the ones that were duplicates of the drizzle position? Well, because the star was doubled.
F
You know, it dithers during the middle of an exposure, so each star gets doubled, or trailed, or whatever you want to call it. So then, out of that, I rejected about another 10 of the exposures, just using the analysis in PixInsight.
F
So it would be nice if I could tame SharpCap; at this point it doesn't seem to be happening too much. The program, frankly, is a little bit complicated.
F
It's a common issue; I can't figure things out, but I make do. Any questions on this picture here?

A
I think that you can use a dithering setting in SharpCap when you're doing live stacking. You just have to connect it to PHD2 through SharpCap, yeah.
F
I don't really prefer to do live stacking. In a way, I'm sort of a manual, hands-on kind of guy, and I want to be able to inspect every photo that I take.
A
Of course, PJ, but if you do do a live stack, you can discard the live stack, you don't have to keep it, and you can have it keep each of the individual FITS files that you save. At the same time, if you do it using the live stack option, or subroutine, whatever you'd call it in SharpCap, then you can do dithering using SharpCap, and it will dither between frames instead of during frames. So you won't waste as many frames and you should have a lot more data.
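The point about dithering between frames rather than during them can be sketched in a few lines. This is not SharpCap's or PHD2's actual mechanism, just an illustration of the idea: one random guider nudge is generated per frame boundary, so no exposure is smeared mid-frame:

```python
import random

def dither_offsets(n_frames, max_pixels=12.0, seed=1):
    """One random X/Y guider nudge per frame boundary, applied between
    exposures rather than in the middle of one (the numbers are made up)."""
    rng = random.Random(seed)
    return [(rng.uniform(-max_pixels, max_pixels),
             rng.uniform(-max_pixels, max_pixels)) for _ in range(n_frames)]

offsets = dither_offsets(80)  # one nudge before each of 80 one-minute subs
```

The capture loop would command each offset, wait for guiding to settle, and only then start the next exposure, which is why no subs get doubled stars.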
F
You're cutting out, Bruce, I don't know why. Yeah, well, maybe I'll figure it out someday, and then, when this COVID stuff is over, maybe I can have somebody come up here and show me how it's done. Okay, that's kind of an invitation to come up through the stairs.
F
Well, that's about all I've got: new data that I can now run through and see how I can do on it. So that's what I get to do when the moon gets into the sky.
A
Does anybody have any images that they want to share? Francesco, you've done some stuff recently down in Pinnacles, I understand. We're looking forward to it.
I
Now this is the picture I captured last Saturday from Pinnacles West. It's IC 342, the Hidden Galaxy. Very much like the Perseus cluster that PJ was showing, this one also sits, in perspective, relatively close to the Milky Way, so there's lots of absorption and reddening of the light from the dust in our galaxy, and the galaxy also sits in the middle of a dense star field.
I
So it's hidden, and that's where the name comes from: hidden by a curtain of stars, which I tried to render here without doing my usual star reduction, so that you can see the density of the star field and the colors. If I zoom in, the galaxy still has a very nice spiral shape. You can see some interesting dust lanes, some HII regions in the galaxy. And for me this image is also interesting because it's the first one that I captured using automation.
I
If you know how I image, I always kept a very low-tech approach, keep it simple, basically without using a computer for acquisition. I have a standalone autoguider, I have a standalone focusing system, and I use a DSLR, so I just programmed the sequence in the guider and let it run. The disadvantage I have is that it's not really unattended.
I
I have to do meridian flips myself, manually. I don't have any plate solving, so framing is always a little bit complicated. But this time I decided to try something completely new for me, which is to use a program called NINA, Nighttime Imaging 'N' Astronomy, to orchestrate the entire capture sequence.
So I programmed NINA in such a way that it waited for night to fall.
I
It started taking pictures, and before taking a picture it slewed to the object, plate solved, gave me an indication of how to rotate, started the guiding, and took pictures, dithering and doing a meridian flip. It's more complicated, of course, than my keep-it-simple approach, and so maybe I wouldn't have.
I
I wouldn't have started my astro imaging journey if I had had to overcome those obstacles right off the bat. But at this point I see the advantage, not so much at a dark site at Pinnacles, but from home: this allows me to capture all night while I'm sleeping, which I know is nothing new for you, but to me this was a big improvement in my workflow. And the interesting part is the application itself.
I
One of the reasons why I was so stubborn about not embracing automation is that some of my hardware, in particular my guider, is not supported by most automation software, because it's not really a camera. It's a guiding system; it's the equivalent of a PHD2 plus a camera, not a camera in itself. NINA had some form of rudimentary support for that and, given it's open source, I could modify the code myself to add the full support, and yeah, it works. I'm rather impressed.
Yeah, so this is the UI of the program. This is the sequence that I used to capture IC 342.
tell
the
program.
Wait
until
night
falls
then
salute
to
the
object
right
now.
It
gives
me
this
red
sign
because
it's
not
the
telescope
is
not
connected
center
and
rotate
start
guiding,
run
autofocus
and
then
start
taking
it
honoring.
A
meridian
flip
is
needed,
refocusing
if
the
hfr
increases
above
above
five
percent
and
then
capture
45
pictures,
each
of
them
480
seconds
with
my
well.
I
It's very much a work in progress. This would require much more integration time; this is only three hours. This nebula here is NGC 2182, and this is in the constellation Monoceros.
I found this nebula particularly intriguing because of all these dark dust lanes that you can see here. It's called the Angel, right? Somebody called it the Angel Nebula, but after Rogelio imaged, I can't remember the name, a region which is part of the IFN that he calls the Angel Nebula, I think that name stuck, and so I think Rogelio's Angel has superseded this one. But yeah.
I agree with you; I've heard it called by his name.
It is extremely faint. It requires way more integration time than I gave it, probably 10 to 12 hours; this is only three. And it has some nasty reflections from Gamma Monocerotis and from another star that I couldn't identify yet. But yeah, if the weather cooperates and COVID lets us go out of the county in December and January, maybe I'll be able to integrate more data.
Yeah, it is. It's a big improvement.
Yeah, the plate solving is a huge improvement. The other one, well, meridian flip, of course, if you want to do unattended imaging. And the other thing that really impressed me is the autofocusing. Basically, I have a motorized focusing system that does a form of dead reckoning.
It measures the temperature and has a calibration curve that knows how many steps the focuser needs to be moved inwards per each Celsius degree of dropping temperature. But of course it's an open-loop system: you never know until the sequence is done and you're looking at the images. By using NINA, or other software that can support focusing using the main camera as the detector, all the guesswork is gone.
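The dead-reckoning scheme described above is simple enough to sketch. This is just the idea, with a made-up calibration figure, not the actual firmware of any particular focuser:

```python
def focuser_correction_steps(temp_start_c, temp_now_c, steps_per_degree):
    """Open-loop focus correction: move N steps inward for every degree C
    the temperature has dropped since the last focus run."""
    return round((temp_start_c - temp_now_c) * steps_per_degree)

# 25 steps per degree is an invented calibration figure for illustration:
move_in = focuser_correction_steps(18.0, 15.0, 25)  # 75 steps inward
```

The catch is exactly what the speaker says: nothing checks the result, so if the calibration curve is off, every frame of the night inherits the error.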
I
You have a very nice V-curve, and it determines the best focus point, and it constantly monitors the HFR during the night. So if something changes, it can trigger refocusing. In the case of my imaging experience at Pinnacles, when the HFR degraded it was unfortunately because of dew and not because of focus, so that spelled the end of the night for me. But yeah, that was a very fun experience.
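The V-curve idea reduces to fitting a curve to HFR samples taken at several focuser positions and reading off the minimum. A common minimal approach, fitting a parabola near the bottom of the V, looks like this (synthetic, illustrative numbers, not NINA's actual algorithm):

```python
import numpy as np

def best_focus_position(positions, hfr_values):
    """Fit a parabola to HFR-vs-position samples; the vertex is best focus."""
    a, b, _c = np.polyfit(positions, hfr_values, 2)
    return -b / (2.0 * a)

# Synthetic, symmetric curve centered on step 5000 (illustrative numbers):
positions = [4700, 4850, 5000, 5150, 5300]
hfrs = [3.1, 2.2, 1.8, 2.2, 3.1]
focus = best_focus_position(positions, hfrs)  # about 5000
```

Because the HFR is measured on the main imaging camera, this is a closed loop: the same metric that judges the final images drives the focuser.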
I
Yeah, so let me go back to my Windows terminal so I can show you. This is what they call the advanced scheduler. You can have conditions like wait for time, wait until the object has reached a certain altitude, and stop when the altitude drops below a certain threshold. You can schedule filters; of course I'm using an OSC, but if you have a monochrome camera with filters, you can decide to do only certain filters depending on the night.
I
I'm not sure. Unfortunately I don't have any experience with things like SGPro, so I don't exactly know what to look for, but if you can give me examples, I can find out what's available. This is what they call the advanced scheduler; there's also an easier scheduler that has fewer options. So, for instance, there you cannot set those conditions like wait until the object has cleared a certain altitude and so on.
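The altitude condition boils down to the standard altitude formula: sin(alt) = sin(dec)sin(lat) + cos(dec)cos(lat)cos(HA). A self-contained sketch of such a gate, not NINA's own code:

```python
import math

def altitude_deg(dec_deg, lat_deg, hour_angle_deg):
    """Object altitude from declination, site latitude and hour angle:
    sin(alt) = sin(dec)sin(lat) + cos(dec)cos(lat)cos(HA)."""
    d, l, h = (math.radians(x) for x in (dec_deg, lat_deg, hour_angle_deg))
    s = math.sin(d) * math.sin(l) + math.cos(d) * math.cos(l) * math.cos(h)
    s = max(-1.0, min(1.0, s))  # clamp against float round-off
    return math.degrees(math.asin(s))

def clears_altitude(dec_deg, lat_deg, hour_angle_deg, min_alt_deg=30.0):
    """The 'wait until the object has cleared X degrees' condition."""
    return altitude_deg(dec_deg, lat_deg, hour_angle_deg) >= min_alt_deg
```

A sequencer just re-evaluates this as the hour angle advances and starts (or stops) the imaging job when the threshold is crossed.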
I
Okay, and it's all drag and drop. These are the various verbs that you can add to the program, so to speak. For instance, if you wanted to program it in such a way that, at the end of the sequence, the scope is parked, you just drag this to where in the sequence you want it, and it's there, and it will be executed after the sequence in this case.
A
Voyager, which is another sequencing program, or, you know, a general astrophotography capture program, has something called DragScript, and I believe it works in a similar way, dragging things in order for each thing that you want to do. I haven't used that aspect of it, but it's a pretty powerful way of doing things, I understand.
I
Somebody was suggesting, implying, that he would like to write a plugin that lets you sample the local horizon and lets NINA decide: I can only image this object after it has cleared the obstructions in the local horizon. And so it takes care of that for you. Wow.

A
Anybody else have any recent images or adventures they want to share?
D
I wanted to ask: I have an image, not a star image, but related. Before I do that, I saw an old, I don't know, probably August email from PixInsight that mentioned they had introduced a feature: integration with the Gaia DR2 star database.
D
I don't know if you folks saw that. If you're not familiar with it, that's a star survey that a satellite did, where they got, I don't know, I think like half a billion stars in detail: the positions, the proper motions, the colors, luminosities, etc.
D
It would be interesting, right, to take your image and play with it and point at a particular star and get details about it. But anyway, as far as I could tell, and I did send an email to Rob, who seemed to indicate that it wasn't doing that yet, they allowed the ability to do that for people who write plugins for PixInsight. And who knows, perhaps Juan, whatever his last name is, will do that, right? So that's interesting.
D
But anyway, that got me interested in this Gaia database, which one can access anyway, you know, via Python. So I did this one little thing I'll show you. Since I'm taking these astronomy classes, everybody in every class talks about HR diagrams, and this and that about the HR diagram.
D
So I plate solved one of my images and dumped, into a text file, the RA and Dec positions and radii of all the stars that I detected in the image. And then, here, let me share the screen for a second.
This is a first step, but what this is is the start. I took an image of mine of M13. The reason I chose M13 is that I didn't want any nebulosity, you know, coloring the image, changing the colors of things. Most of my images are big red things all over the place, and I figured that would sort of change the HR diagram.
D
So I took something that didn't have that. PJ's image clearly would have worked too, although those would have been galaxies, so that's a whole different story. But anyway, I took the detections, and then all I did was, given my RAs and Decs, query Gaia for the colors and for the absolute luminosities, and that's what an HR diagram is, right?
It's a log-log plot of temperature and absolute luminosity. So anyway, here you do see a nice HR diagram: there's your red giant branch and there's your main sequence, and it's cut off here, maybe because... you know, I don't know why, either, because it really is M13.
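The conversion underlying the vertical axis of such a plot is the distance modulus: Gaia's apparent G magnitude plus its parallax gives an absolute magnitude, which (with the BP-RP color on the horizontal axis) is one point on the HR diagram. A minimal sketch of that arithmetic, separate from however the speaker's actual script does it:

```python
import math

def absolute_mag(apparent_mag, parallax_mas):
    """M = m - 5*log10(d / 10 pc), with distance d_pc = 1000 / parallax_mas."""
    d_pc = 1000.0 / parallax_mas
    return apparent_mag - 5.0 * math.log10(d_pc / 10.0)

def hr_point(bp_minus_rp, g_mag, parallax_mas):
    """(color, absolute G magnitude) pair to scatter on an HR diagram."""
    return bp_minus_rp, absolute_mag(g_mag, parallax_mas)
```

For example, a star with parallax 100 mas sits at 10 pc, so its absolute magnitude equals its apparent one; scattering `hr_point` results for every matched star, with the magnitude axis inverted, reproduces the familiar diagram.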
D
The picture was much broader than M13, and maybe M13 does stop there because of its age, or who knows why I'm missing brighter stars. Maybe they're all saturated or something like that.
D
Who knows. But I'm going to stop sharing and give you a little more detail about my plan. Did I stop sharing? Yes, okay. So anyway, I cheated, right, because I used Gaia data. So the next plan is to just sample the pixels off of my image, the one I did the star detection on, and get the color from, you know, blue minus red, and try to get the luminosities from the pixel values.
B
I was going to say, that's sort of the long way around to do what you were talking about, where you could click in your image and get data. You know, you can put your image in Stellarium and it overlays, accurately, the stars that are there, so you could then click on a star and Stellarium would tell you all the details about the star.
D
That's the important piece of the project, ultimately. I mean, I'm sort of starting with the HR diagram off of Gaia, yeah, to see where I screw up in the data processing, but my goal is to be able to get it off of my telescope. But I'll always need, I believe, the Gaia parallax.
D
You know, how far the star is, right? Because if you use apparent luminosity, that's a different HR diagram; it's not quite the same. You really want the star's absolute luminosity, how bright it really is, and for that you need the distance. So I'll always be bugging Gaia for the distance. But okay, yeah.
I
Okay, thank you. Also, PixInsight has a script for aperture photometry, yeah; it integrates the fluxes for you.
D
Okay, so it would give me the fluxes, yeah. And KStars does that too, so I can do that, but KStars doesn't do it in a way that the end user can get at. Since I can get at it, and I'm writing this code anyway... but okay, thank you.
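The aperture photometry being discussed reduces to summing pixels in a small circle around the star and subtracting a sky background estimated from a surrounding annulus. A bare-bones NumPy sketch of the idea, not PixInsight's or KStars' implementation:

```python
import numpy as np

def aperture_flux(img, x, y, r_ap=4.0, r_in=8.0, r_out=12.0):
    """Sum of pixels within r_ap of (x, y), minus the median background
    measured in an annulus between r_in and r_out (radii in pixels)."""
    yy, xx = np.indices(img.shape)
    rr = np.hypot(xx - x, yy - y)
    aperture = rr <= r_ap
    annulus = (rr >= r_in) & (rr <= r_out)
    sky = np.median(img[annulus])
    return float(img[aperture].sum() - sky * aperture.sum())

# Synthetic frame: flat sky of 100 counts with one 500-count "star" pixel.
frame = np.full((64, 64), 100.0)
frame[32, 32] += 500.0
flux = aperture_flux(frame, 32, 32)  # recovers ~500
```

Real tools refine this with partial-pixel weighting and sigma-clipped sky estimates, but the flux = aperture sum minus sky-per-pixel times aperture area structure is the same.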
E
Because I use AstroImageJ to detect things, but it's very manual. Real astronomers, they don't use AstroImageJ for this, because when they try to detect something like a supernova, they scan really big parts of the sky per night and it's fully automated. So basically they have full automation to detect what is new on the images immediately, right, and send it to their email.
B
Yeah, and there's a lot of stuff in Astropy, and probably some other Python packages as well, to access all these databases and such. I'm using parts of it to do the Stellarium image stuff in my rudimentary Python scripting that I've done.
D
Well, anyway, if anybody wants my script for getting the Gaia DR2 data... and I guess DR3, which I guess is just an improvement on that, is coming out in December. It does give you a wide variety of information pretty easily, given an RA and Dec.
B
By the way, again on Stellarium, sorry, I'm stuck on that planetarium versus others, but I recently learned that you can bring in sky surveys. So you can turn off the nebula images that we've put in and show, instead, a DSS survey or something. Even if there's not a nebula image there, you can get an actual picture of that area of the sky, to kind of see what framing you might want to use and such. So it's kind of handy.
D
Yeah. What's the name of that... there's a website that I use that, like me, you guys probably know about: WikiSky.

B
Yeah, the framing and mosaic wizard in SGP is nice that way. You can put in the name of an object and it'll go get a DSS survey for whatever degree size you tell it, or you can give it an AstroBin URL and it'll figure out where that is in the sky and pull in that sky survey, or you can put in an RA and Dec. And then you can draw, if you're doing a mosaic.
B
I mostly just use it for single frames, to figure out how I want to rotate the camera and how I want to position the framing, basically. But you can also just draw a rectangle and it'll say, well, that's a 4x4 or 5x8 or whatever mosaic, so you can compose your mosaic that way too. So that's a pretty cool feature.
A
Yeah, it's included in the registered, you know, the paid version, as a standard thing for Sequence Generator Pro.
I was actually wondering if maybe Francesco has an idea. You probably haven't upgraded to the new macOS yet?
I
I guess... oh, I'm lagging behind! I'm still on Mojave.

A
I don't know if it would work with the new Big Sur version; that's what I was wondering. I mean, we have had issues in the past where things changed and then PixInsight didn't work with the newer version. So I was just wondering if anybody had anything. I'm not planning on upgrading anytime soon; I think there are probably a lot of growing pains that have to be worked out with the new macOS. But it looks interesting.
I
Yeah, I haven't even looked at the forums to see if anybody has tested it. I'm doing it right now.

E
It was actually from Deep Sky West; there's another online telescope farm, in Arizona. Deep Sky West? Yes. And I didn't know until yesterday that the image is actually a top pick.
Yeah, so, sort of an advertisement: you see on the right a bunch of advertisements. I saw Deep Sky West; they have four sets of data.
And what I like most about this image is that I was able to see this kind of white area here of the nebula, surrounding the dark patches of dust, right?
There were actually more people trying to process this data, but so far this image is one of the few where you can actually see this halo kind of structure around it.

B
They didn't have that exposed before; now they're exposing it. Yeah, but still beautiful.
A
I think that's known as the Boogeyman Nebula, yep.
B
Yes. So, say a few words about Telescope Live.
B
I mean, you'd have to read through all the stuff, but they're trying to monetize, and in some ways I'm a little upset. It's like they're trying to take over from AstroBin: they're trying to be the place where people post their images, with the social interaction and all that. So maybe competition is good, but I feel a little bit like it's going to dilute the community some. And then they're offering what they're calling pro data sets.
B
So they will, you know, take maybe four images in a month and give you the calibrated subs as part of your subscription, and I think it's like 50 bucks a month at the level at which you have to subscribe to get that, and you get a certain amount of telescope time. And they have a very different approach, if you've used something like iTelescope.net.
B
They have a very different approach; it's more automated. You just put in a request, like, well, yeah, I want to use this particular telescope in Chile and I want this many... whatever. But you don't go and pick a date and time in a schedule, right? They just manage all that for you, and then they'll come back and say, okay, well, you wanted that stuff and we think it'll be done on November 30th, and, you know, check back with us and see if it's done or not, kind of thing.
B
So it's a little more removed, and you don't get to log in and see the actual thing in progress like you do with the modified ACP on iTelescope.net.
B
But it's interesting. They have a very slick interface; it's very app-like and modern and all that. So we'll see. Some interesting pricing structures too, but something to check out. There's also, I think, for nine bucks a month...
B
You get a certain amount of telescope time, and they send you...
Photometry, anyway. You know, we've had talks in the past: the high school did the radio telescope project, and, going back a couple years, we had the high school twins that did the exoplanet search and wrote software for NASA and all that stuff. So, you know, I don't want to do it myself, but I'd really like it.
B
If somebody would launch a program around citizen science, because I think there's... well, another example: I have this camera that tracks fireballs, and SETI consumes the data and all that. So there are lots of different elements of astronomy, or corners of astronomy, that you can get involved in, and do sort of pro-am work, do real science, right, and help the pro astronomers do their job and stuff.
E
Well, I'm interested to some extent. Cool, yeah. But we can actually discuss it, okay, because I have been in touch with professional astronomers, and I provided some data to them back in time, and there are different aspects. We can actually talk about what exactly can be presented.
E
It actually is very nice for everybody, because sometimes we take subs, and FITS files, and data... and actually I found an asteroid in the image, and I didn't know until I actually did the stack. I found something moving, and then I sent it to astronomers and they said, oh, you know this asteroid, don't worry.
B
All right, I saw something when I was taking my spectroscopy data on Saturday or Sunday, and I went back to look for it the next night and it wasn't there. So I don't know, I'm trying to bring it up here, if it was just a cloud, but it looked like a nebula where there shouldn't be a nebula.
B
Let me try another way here. Yeah, it looked kind of like the Veil Nebula, kind of like a shock wave. But then, like I said, I went back the next night and hit it with a full-on deep space run, versus just a plate solve frame.
B
You
know,
h,
a
and
s2
and
all
that
o3,
and
there
was
nothing
so
I
don't
know
what
the
heck
that
thing.
That
thing
was
man.
Where
did
I
stick
that.
B
B
B
I
E
B
B
B
Paulo, you mentioned doing something with NINA too. Are you hacking NINA, or are you just working on the OCS software to make it talk to NINA?
G
No, at the moment I'm using KStars, okay.
G
I tried NINA a few months ago, but I stopped for the lack of scheduling, because, you know, with my small keyhole of sky I need to have more jobs on the same night in order to have a decent amount of data collected.
G
Otherwise I have only two hours, or an hour and a half, for each target. But now I see from Francesco's presentation that NINA is starting to have a scheduler, to have the possibility of running more jobs one after the other, so maybe I will look at that. You know, I decided to run away from SGP, so I'm in search of an alternative.
G
Probably I will. Another problem with NINA was that the Astromechanics Canon focuser, the one from Russia, didn't have the INDI driver, and so that was precluding both NINA and KStars. But since a couple of months ago they have that, so that's another good option. I'm using that focuser from Astromechanics and it is working very well, good.
B
Yeah, well, the whole observatory control system is a topic that I think you and I should, or one of us, I can volunteer you, could present on. People are probably tired of hearing from me. But that's another thing that we've been working on during the COVID thing. So even though we don't have observatories, it's kind of nice to have all the weather sensors telling you what's going on.
G
Yeah, and that was, and is, my next project, because I connected the weather, let's say, and the SQM sensor, that more or less we made, in parallel to SGP. But now I'm moving all the hardware to an INDI environment.
G
I'm trying to mimic existing hardware, oh okay, for the weather, in the Arduino part, so.
B
Yeah, so these... I don't know if I've mentioned this before, this observatory control system: there's a whole forum and everything, and this guy has written some nice Arduino software for monitoring. I guess I could bring mine up here to show you what it looks like. The idea is, if you've got an observatory, it can manage the heating and cooling, and the dome, and the safety and observing conditions, all that stuff.
B
Let me jump on the telescope computer here.
Small, but I guess I could... maybe not... yeah, I'm trying Ctrl-plus... here we go. Does that help? Yeah. So this is the observatory control system. It's all Arduino based, with various weather sensors connected to it.
B
This is the overview, and it's all modular. There's a config file where you say, you know, I've got a rain sensor and I've got a sky quality sensor, and so you just add all that stuff in, and what your limits are for safe and unsafe and so on. See, here it's telling me it's unsafe to observe, because it's very cloudy right now.
B
So this is just the main status panel, and I added in the DC power supply voltage there. And then I've got a thermostat. You know, again, I don't have an observatory, but the box that I put all this stuff in is too small and it gets hot, so I put a fan in there, and the fan turns on at a certain temperature. And then you've got weather, and I've added these weather sensors.
A
Glenn, why is your power supply reading 11.4 or 11.6?
B
I wonder if I ran out of disk space or something. I haven't looked at the graphs in a few weeks, but anyway, you get graphs of everything. For some reason mine's not showing up; I'll have to troubleshoot that. But there is an ASCOM driver. That part's not open source; the rest of this is open source that you can hack and slash, but you pay, I think, 10 bucks or something for the ASCOM driver, and so SGP can consume this safe and unsafe.
G
And, if I'm not wrong, that was, let's say, the wired version, because you have your Ethernet connected to the unit, correct?
B
Yeah, it's an Arduino Mega with the Ethernet board on top. And then, if I was going to get serious about it, if I had an observatory, there are the hardened versions of these boards that you can pay a lot more money for, that are supposed to be idiot-proof, which I need.
B
Other than that, the only thing that's not working is the wind sensor. I mean, I'm getting the outside temperature via the wind sensor, but for some reason the wind sensor itself always reads zero, and I'm on my second one. So I still need to troubleshoot that; I'm sure it's a connection problem, but the damn...
B
Well, no, the connection between the sensor and the Arduino, yeah. Okay, it was one of these projects where you've learned enough to go back and start over, right, and you do it right the second time.
B
Oh, by the way, this is sort of out in the weeds, but one of the things in this astronomical spectroscopy book is a teardown of that flip mirror, so I can finally figure out how to get the damn knob off the thing, so that I can make a better mechanical connection to the knob. And in here they teach you how to tear it apart and make it into a beam splitter.
A
Wow. Maybe we should have a t-shirt made for that, yeah. Okay, anybody else? All right.
A
Well, if any of you guys have anything that you want to present next month, we're wide open. I'm going to continue looking for a presenter, and, the way things are ramping up with COVID, I doubt we're going to have any outings right now. Glenn, do you have any input on that?
B
Yeah, no. There have been a few times where I've thought, you know, I should just go do a workshop and just tell people to stand six feet apart. But every time it's gotten close, there's been another, you know, checking the Santa Clara County website, or the OSP, or there's high fire danger and the OSP has said no. So right now we just don't have permission to take anybody but docents to an OSP site. So I guess that could be...
B
Maybe some small group of people doing some observing, but not members of the public.
B
And I guess now that it's wet, the fire danger is not really an issue, maybe, but that was the other problem.
A
Thanks for coming, everybody. Stay safe, and I hope to see you next month. All right, take care, bye.