From YouTube: 31. #EveryoneCanContribute cafe: Machine Learning for Software Developers (...and Knitters)
Description
Blog: https://everyonecancontribute.com/post/2021-05-26-cafe-31-machine-learning/
Kris Howard: https://twitter.com/web_goddess
Twitter thread: https://twitter.com/dnsmichi/status/1397584420534726665
Michael: We are live on YouTube now, at least that's what Zoom is telling me, and welcome back everyone. We have the 31st edition of our EveryoneCanContribute cafe, and today we have a new guest. Kris joins us to do something around machine learning, at least that's what the title says, and I'm looking forward to learning something new today. That's basically it from my side, so I would just say: Kris, kick us off, do something magic, and enjoy.
Kris: That's... yeah, yeah.
Kris: He's very good at pressuring people into doing this, but thank you, thank you for having me, thank you for inviting me to the group. My name is Kris, as you heard. If you want to tweet at me, my username is web_goddess on Twitter. I'm new to Europe, if you can't tell from the accent. I'm in Munich; I moved here about eight months ago from Sydney, Australia, where I lived for the last 20 years.
For some reason, my husband and I decided that moving internationally in the middle of a pandemic was a great idea, but that's a story for maybe afterwards. I've been at AWS for three years, going back to when I was in Sydney.
I managed a team of solutions architects there, and then, when I moved to Munich, I joined the developer relations team. Over my career, I started out, I'm old, two decades ago as a web developer in the dot-com boom, building e-commerce websites that are all dust in the wind now. Then I was a business analyst, and obviously a tech evangelist, and eventually I sort of moved into being a people leader. So I only write code for fun these days.
But what I have not been, my friends, is a data scientist or a machine learning expert. It's always been a topic that seemed kind of interesting and cool to me, though, and I went to a lot of talks over the years. I remember a few years ago in Sydney I went to a developer conference, which shall remain nameless, and sat in a session called "Machine Learning for Developers", thinking: great, this is going to be exciting.
I'm going to get some inspiration and get going. And I swear to God, within five minutes it was nothing but calculus. It was slides of calculus, and I was looking around at the other people in the room, and they were looking at me, and we were all thinking: this was a bait and switch, you tricked us into this room. Calculus is cool, I did it at uni, but I hadn't done it since.
So I promise you, tonight there will be no calculus. This is not a trick or a bait and switch. Instead, I thought we would just have some fun. I thought I would show you a very silly project that I have been working on, which does have some potential actual useful applications, and maybe inspire you to click around and try some things on your own.
This is my little agenda, but any time you have questions, just interject and interrupt me, I'm happy to take them. I have a couple of slides about some of the cool things in AI/ML that interest me. I think the things that interest me about machine learning are not necessarily the things that interest other people, so I thought I'd show some stuff I think is cool.
Then: why I think even software developers who don't think they need to know about this should probably know something about it; a little bit on the AWS machine learning stack, but not too much; and then I'm going to go into my project. I'm a knitter, that's what I do. I have my knitting sitting right here next to me for when I'm in boring meetings; I'm working on knitting a vest for my husband, "mein Mann". And I'm trying to combine knitting and machine learning to reverse engineer knitting patterns.
But don't worry, you do not need to know anything about knitting for that part, I promise. Okay, so the cool stuff that I like about AI/ML: there are new breakthroughs all the time. There are chatbots, there are self-driving cars, there are dancing robots and things like that. One that I think is cool, especially right now, is some of what's being done around coronavirus.
This was a bit I saw in an internal talk at Amazon and just thought it was really neat. There's this competition that happens every couple of years, the CASP competition, which is the Critical Assessment of techniques for protein Structure Prediction. The latest one was just last November, and the idea is that they try to predict the 3D structure of a particular protein, what it will look like, from very long chains of amino acids.
When you know what a protein looks like, it unlocks all of these medical and scientific breakthroughs. In this one here, the real molecule is in green and the predicted one is in blue, and of course this one belongs to coronavirus and is implicated in immune evasion. So it's pretty neat how some of this is actually being used. I just got my shot on the weekend too.
So it's particularly relevant to me. But the thing that really gets me excited is using machine learning to create art. I actually curated a whole day at linux.conf.au 2018 about art and tech. I had a guy, for example, talk about procedurally generating landscapes from music that he DJ'd. I had a guy who every year participates in NaNoGenMo, National Novel Generation Month, where you try to write code that can generate a 50,000-word novel.
Then there's Deep Dream, which people are probably familiar with. The idea is you have a convolutional neural network, which I've represented very simply here, that takes in an input image, and you've trained it to recognize different types of objects: dog, cat, mouse, bear, wolf. So when you feed in the picture of the cute puppy, the "dog" label lights up. Now, the Deep Dream thing kind of flips that and says: okay, what if we give it an input that we haven't trained it on, some animal it doesn't recognize, like a jellyfish? Nothing lights up. But what we want to do is try to trick the dog label into lighting up anyway. So you start making modifications to the input image, to try to get the dog neuron to fire, and over several iterations you actually start getting recognizable dog parts, like snouts and legs.

And so artists figured out they could use this to create some really surreal, dreamlike imagery, which I think is nightmare fuel. Then you've got style transfer. With neural style transfer, in 2016 that Prisma AI app went really viral, and this is technology that allows you to sort of copy-paste the style of an image separately from its underlying content. This example was actually from a TensorFlow workshop I did back in 2017 in Australia: it was a photo of my husband and a horse, but I've applied different styles to it. It's pretty fun.
There are apps that do that now. Another one you may have heard of is generative adversarial networks, GANs. People are very worried about GANs creating fake human beings that never existed, but this is the internet; I think we should talk more about cats.
There is an entire website called thesecatsdonotexist.com, and these are all cats that have been created using StyleGAN on Amazon SageMaker. Some of them look really realistic and others are frightening; that one in the upper right, I don't know what's going on there. You don't have to use GANs to create photorealistic output, either. This is a piece a friend of mine back in Australia created, Jay Rosenbaum. Jay is an artist and researcher working with 3D modeling, AI and extended reality tech. Melbourne had a really long COVID lockdown last year, you might have seen that, and this piece is called "The Isolation of Being". Jay did these sort of drawings of faces and bodies and then had a GAN turn them into landscapes, which I think is really beautiful.
And lastly, who doesn't like getting pizza faster? This is one of the coolest actual business use cases I've seen. Domino's Pizza Enterprises Limited is the biggest Domino's franchise holder in the world. They cover nine countries, including both Germany, where most of us are, and Australia, where I used to be. Domino's was already getting around 70% of their orders online even before the pandemic, and of course the biggest thing that makes people happy is getting their pizza faster.
The sooner you get it to them, the happier they are. So in 2019 they launched this project called Project 3/10, with the goal of having a pizza ready for pickup within three minutes of ordering, or safely delivered within ten. They had a bunch of different ways they tried to do this. Obviously they invented ways of cooking the pizza faster, and they built more stores closer to where people lived, but it also involved predictive technologies.
They have this heads-up display where at any given time it's predicting which pizzas people in that area are going to order, so they can start making them ahead of time, and by the time Michael calls up to order one, it's already coming out of the oven. One of the restaurants actually managed to get average delivery times down to just five minutes, which is pretty cool. So, how is this relevant? Which I don't think anyone is still asking, because I told you about pizza, which is relevant to all of us. But I know there are some traditional software devs who think: I'm not working on machine learning apps...
Therefore, I don't really need to know about this stuff. But truly, at AWS, and it's not just marketing spin, we believe that in the fullness of time pretty much every application is going to make use of machine learning in some way. All of the customers we're talking to are interested in this area, not just the ones you'd think of, like fintechs. We've got startups, SaaS providers, media, non-profits, retailers; everybody's looking at it, and we're just at the beginning of this revolution. I read just today that the demand for machine learning engineers is far outstripping the supply right now. So for your own career, you should probably want to know something about this stuff.
There's also this idea, and I won't really go into it, this is a quote from Chris Bishop, who's a researcher in the UK, that in the future we're not going to be coding, programming computers, in the way we have in the past.
Just to give you an example of that, this is a little bit of Python that you can use to invoke Amazon Textract. This is one of our services: you give it an image with text in it, like a scan or a photo, and it will analyze it and get the text out for you. It's already pre-trained; you don't have to train it.
You just call it like an API, but instead of getting back a single answer, you get back a range of confidence levels, and you have to branch your logic based on those confidence levels. So it's sort of like exception handling, where you do one thing on success and another on failure, except it's not binary: you're going to have to deal with more than one outcome there.
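The pattern Kris describes, calling a pre-trained service like an API and then branching on confidence scores rather than a single answer, can be sketched roughly like this. This is a minimal illustration, not the code from her slide; the thresholds are arbitrary choices, and only the Textract call itself (`detect_document_text`) is a real API.

```python
def route_lines_by_confidence(blocks, accept_at=90.0, review_at=50.0):
    """Bucket detected text lines by Textract's confidence score.

    Instead of a binary success/failure branch, we end up with three
    outcomes: accept automatically, queue for human review, or reject.
    """
    routed = {"accept": [], "review": [], "reject": []}
    for block in blocks:
        if block.get("BlockType") != "LINE":
            continue  # Textract also returns PAGE and WORD blocks
        bucket = ("accept" if block["Confidence"] >= accept_at
                  else "review" if block["Confidence"] >= review_at
                  else "reject")
        routed[bucket].append(block["Text"])
    return routed


def detect_text(image_bytes):
    """Send an image (scan or photo) to Amazon Textract and route the result."""
    import boto3  # deferred import: only needed when actually calling AWS
    textract = boto3.client("textract")
    response = textract.detect_document_text(Document={"Bytes": image_bytes})
    return route_lines_by_confidence(response["Blocks"])
```

The point is the shape of the calling code: the service hands you confidences, and your logic has to fan out over them.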
I don't want to hit it too hard, but I think everybody needs to know something about machine learning, and if you want to get started with it, just reach out to me. We've got resources like the Ramp-Up Guide. We also ran our AWS Innovate conference back in February, and all the sessions are still online: if you google "AWS Innovate", we have something like 50 sessions on machine learning, and you can sign up and watch them.
I don't want to go too hard on that, but I'm going to talk about our machine learning stack just a little bit, because we have a bunch of stuff you may not be familiar with. Our goal, our mission, is to put machine learning in the hands of every developer, and they mean that, trust me, because it involves building many different services that cater to many different skill levels, and there's a ton on here.
I know it looks daunting, but it basically breaks down into three levels: frameworks and infrastructure at the bottom, machine learning services in the middle, and AI services at the top. I'll explain a little bit about what they mean, and I'm going to show off a couple of them as part of my project, so I thought it would be useful to put them in context. The top layer, the AI services, those are the ones you don't really have to do anything with; you just use them.
For example, we launched a suite of anomaly detection tools last year under the Amazon Lookout brand. These you just use; you don't have to train them or anything like that. In the middle layer, what we call machine learning services, for the most part that means Amazon SageMaker. We've got an IDE called SageMaker Studio, which has all the tools you need to develop, train, debug, deploy and monitor your machine learning models.
I used it for the first iteration of my project, so we're going to go into that a fair bit. They keep bringing out features faster than I can figure out how to use them, to be honest with you. And then at the bottom: if you are an experienced practitioner and you want to roll your own, sure, launch a Deep Learning AMI on whatever hardware you want and manage that infrastructure however you like. I'm not going to be demoing that part, because, again, not an expert. Okay: my project.
As I mentioned, when I'm not messing around with computers, I am often knitting. I learned to knit when I first moved to Australia; I grew up in the States, but when I moved to Australia I started knitting, and I've been doing it off and on ever since. I usually leave it here so that if I'm in a meeting I can surreptitiously knit a couple of rows. And because people in tech know me as a knitter, I get tagged on tweets like this one. This was back in April; somebody tagged me on it.
It turns out this poor guy's mother turned one of his favorite pullovers into a cushion on the sofa. He was devastated, and he wanted to know if someone could re-knit his pullover for him from a photo of it. Of course, in the replies somebody said: well, there has to be an app for that. It's a joke, but actually, for an experienced knitter this is a really common and easy thing to do. To give you an example, this is a photo I took back in Sydney before I left. This is my colleague Mega in the kitchen at AWS Sydney. She's sitting there in this beautiful blue jumper, and I'd never seen that stitch before, so I took a picture, because I'm a weird nerdy knitter and that's what I do. If you look at it, I can tell you, just by looking at that photo, that the background texture is reverse stockinette stitch; I can tell from the texture of it.
I can tell that those zigzaggy lines are made up of three knit stitches, with a combination of increases on one side and decreases on the other, and that's what makes them zigzag. Some of that I get because I know the texture, but if I have a good enough image, I can actually zoom in and read it like handwriting, you know, stitch by stitch.
So what is this? This is reverse engineering. We do this in software all the time: taking something that a human being has made and studying it to deconstruct its design and its architecture. And it turns out, guess what, knitters do this a lot too, all the time. That's what got me thinking: if I can do this in my head, if I can look at a picture of a sweater, I can reverse engineer it.
Why can't I train a neural network to do it too? So I started researching this a couple of years back, and it turns out I'm not the only one who has had this idea. This is a scientific white paper; I'm guessing you didn't expect to get one of those in a talk about knitting, but here it is. It's from the International Conference on Machine Learning, June 2019, just two years ago.
It's by a bunch of researchers at MIT, okay, MIT geeks. Their goal is to take a picture of some piece of knitting, a swatch we'll call it, like a piece of fabric, and feed it into a neural network to reverse engineer the pattern, then take that and feed it into a knitting machine. Sidebar: knitting machines are a thing. You can buy smaller ones for your house, and the clothes you buy in the shops are usually knitted on big industrial knitting machines.
They want to feed the pattern back into the knitting machine, have it knit, and get the identical swatch back out. So they basically want to build a knitting duplicator; how Star Trek is that, right? So there are papers on this, and their real breakthrough is that they realized taking pictures of knitting actually kind of sucks for computer vision, because wool is fuzzy; you can't always get a really crisp photo of it.
Also, knitted fabric is floppy and stretchy, so it's not great for this. A big breakthrough they had was augmenting their training data of real photos of knitting with simulated knitting, because those big industrial knitting machines I mentioned come with very specialized, domain-specific software that can actually simulate and model what knitted fabric is going to look like.
So they used that to generate a bunch of samples too, and they found they got the best results by combining both the real photos and the simulated stuff. And I promised you no calculus: this paper has a ton of calculus in it. You can read it; I didn't understand any of it except the results. The results were that they haven't quite cracked it yet, but they're getting close. They got around 80% accuracy, which is impressive, using both convolutional networks and GANs.
So I decided I would have a go myself, but rather than focusing on reading a pattern stitch by stitch like they were trying to do, I would just treat it as an image classification problem. Similar to that thing with the puppy and the labels, I want to take an image as input and output a class label for it; just build a machine learning model that can do that.
I got a piece of paper and wrote out my little plan, not knowing how difficult this was going to be. I thought: I'll just get some training data, right? I'll label it, I'll train a model, I'll deploy it, maybe build a little web front end to make it look nice, and then, you know, dance on the grave of the knitting industry, happy times. And so, naively, I began. Everybody with me so far? Cool, all right.
Getting training data. Everybody who told me, when I said I was going to do this machine learning project, that I was going to spend 80 percent of my time cleaning data: you were right. No, it's a hundred percent true. Getting clean data is the hardest part; it's where you spend all your time. A lot of machine learning projects have you start with easily available data sets.
Guess what: there's not a lot of knitting data out there. But I researched, and I did find some. The people who wrote that paper, the MIT researchers, do make their data set available, so just to show you their methodology:
They used one of those knitting machines to knit about a thousand pattern samples in five-by-five grids, and the grid is really ingenious: they inserted steel rods into it to hold it flat and stretch it out, because remember what I said about knitting being floppy. They took all the pictures, realized pretty quickly that this was not going to scale, and that's when they moved to using the simulated samples as well. But you can get access to their photograph data set.
I downloaded it and played around with it, and you can see some at the bottom there, but they do kind of different styles of designs, and I decided it was overkill. I just wanted to focus on whether I could identify five basic stitches: stockinette, which is pretty much what every jumper uses, you know that stitch pattern; reverse stockinette, which is the inside-out version of that; garter; moss; and seed.
These are just basic stitches made of knits and purls. And because I'm an idiot, I decided to crowdsource my data from knitters. There are a lot of knitters out there; the Venn diagram of knitters and geeks has a lot of overlap in the middle, there are a lot of us, and people put pictures of their knitting online. But I didn't want to spend hours scraping and individually cleaning up images, so I wanted to automate as much as I could.
So I decided to set up an email pipeline, and I won't go into this in detail, but the idea was: I knew I was going to need all my training data in an S3 bucket. So I set up an email address in Amazon SES where you could send in an email, and the email would land in an S3 bucket. I had an AWS Lambda set up so that when an email landed in the bucket, it would trigger and save the attachment.
It would check for viruses and stuff, save the attachment in another bucket, and then log the metadata in a DynamoDB table, so I could keep track of who sent me images. So I tested this out, and it didn't take very long to get set up. It would save me from having to download images from Twitter or all over the place. And once I had it working, I put out a tweet. You may have seen this, I don't know, because it kind of went viral.
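That pipeline's Lambda side might look roughly like the sketch below. The attachment parsing uses Python's standard `email` module; the bucket name, table name, and the virus-check step are placeholders standing in for what Kris describes, not her actual code.

```python
import email
from email import policy

IMAGE_TYPES = {"image/jpeg", "image/png"}


def extract_image_attachments(raw_email_bytes):
    """Pull (filename, bytes) image attachments out of a raw MIME email."""
    message = email.message_from_bytes(raw_email_bytes, policy=policy.default)
    images = []
    for part in message.walk():
        if part.get_content_type() in IMAGE_TYPES:
            images.append((part.get_filename() or "unnamed", part.get_content()))
    return images


def handler(event, context):
    """Lambda entry point, triggered when SES drops an incoming email into S3."""
    import boto3  # deferred import: only needed when actually running in AWS
    s3 = boto3.client("s3")
    table = boto3.resource("dynamodb").Table("knitting-submissions")  # placeholder name
    record = event["Records"][0]["s3"]
    raw = s3.get_object(Bucket=record["bucket"]["name"],
                        Key=record["object"]["key"])["Body"].read()
    for filename, data in extract_image_attachments(raw):
        # (a real pipeline would virus-scan `data` here, as Kris mentions)
        s3.put_object(Bucket="knitting-training-data",  # placeholder bucket
                      Key=filename, Body=data)
        table.put_item(Item={"email_key": record["object"]["key"],
                             "filename": filename})
```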
This was back in October 2019. I wrote a blog post where I talked about the type of images I wanted and what I was looking for, and I shared the email address. I put it on social media, and I also shared it on Ravelry, which is basically GitHub for knitters; if any of you are on Ravelry, you can friend me. And the good news is: yeah, it got retweeted like 350 times, and people started tagging their knitting friends and saying, hey, you should contribute to this.
If you've ever read the blog AI Weirdness: its author, Janelle Shane, messaged me and we had a back and forth, which was my rock star AI moment, and she pointed me to some other scientific papers. She's really cool. And the awesome thing was that images started to turn up in the bucket; real people were helping me out with this ridiculous project.
They were sending me pictures and being so generous and so kind. I looked at them and it was all great, and then I looked at some more and realized I was going to have a problem. This is just some of them. Look, people get excited and don't read too carefully. Some of them sent me patterns I wasn't looking for, or they sent pictures of their whole garment, which was lovely but didn't have the resolution I needed, or they attached the wrong thing entirely. One poor woman was like...
"I think I sent you Halloween candy." I think you did. So look, even with the best of intentions, crowdsourcing is a huge risk, and you will spend just as much time cleaning it up, so it didn't actually save me any time. I don't want to shame these folks; they were being super nice and trying to help me out, and I really do thank them. But yeah, crowdsourcing is not the magic wand that I thought it was going to be.
So I got rid of all the dodgy images, cropped the rest, and then it was time to get the images labeled. I had nearly 400 images, I think, and I didn't want to sit and do them all by hand myself.
Again, I recruited a bunch of knitters to help me with this. One of the cool services in that middle layer, within SageMaker, is called Ground Truth. It's basically a UI where you can invite either a public or a private workforce to go through and label your data, creating your training data set for machine learning.
You can put your jobs out to Mechanical Turk and pay people a couple of cents to do this, but I figured this is a very specific domain, identifying knitting stitches, so I didn't trust it to complete randoms; I needed actual knitting experts. So I put out a call, got my volunteer task force, and set the job up, and I will show you what Ground Truth looks like now. How about that? So I have the console here.
Hopefully you can see that. I'm in SageMaker, and under Ground Truth here on the side you can see labeling jobs, so I'll open one up and view the labeling tool.
It sets up user pools and stuff, and you can set up logins for people. It allows you to go in and customize; you can see I created, on the left here, some custom instructions on what the different samples are and what the labelers needed to do. Basically, one of my volunteer force logs in, it gives them a random image, and they click and identify it.
The choices are the five stitches I'm looking for, plus an "other", just in case I missed any when cleaning up that actually weren't any of the five. And what you end up with is labeled output. You can see here: here's an image, label stockinette. This one is actually crochet, so: label other. I had some stuff that is a cable stitch, so that's label other too. And you end up with this manifest.
Ground Truth puts it into an S3 bucket, and you can then feed that in to start training your model, which is, of course, the next step.
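For context, the manifest Ground Truth writes out is a JSON Lines file, one labeled example per line, with a `source-ref` key pointing at the image in S3 plus an attribute named after the labeling job. A sketch of reading one (the attribute name here is an illustrative stand-in, not Kris's actual job name):

```python
import json


def parse_manifest(manifest_text, label_attribute="stitch-labels"):
    """Parse a Ground Truth output manifest into (image_uri, label) pairs.

    `label_attribute` must match the labeling job's attribute name;
    "stitch-labels" is just a made-up example.
    """
    examples = []
    for line in manifest_text.splitlines():
        if not line.strip():
            continue  # tolerate blank lines
        record = json.loads(line)
        examples.append((record["source-ref"], record[label_attribute]))
    return examples
```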
Am I going to switch back? Yes. So, before I trained, I realized I had another problem. Just one more problem; I told you there were going to be lots of fails and errors in this.
One problem I realized I had: I told you two of the five stitches I was doing were garter stitch and reverse stockinette stitch. Garter stitch, there on the left, is what you get when you knit every row, and reverse stockinette stitch is when you knit one row and purl one row, seen from one of the sides. They're really hard to tell apart in that picture. In real life it's super easy, because one of them is the same on both sides and one isn't.
But I have a hard time telling those two apart in a photo, and, oh crap, that means I'm not going to be able to trust any of the labels on these, because I don't know which is which, and the labelers aren't going to know either. So yeah, I realized those labels were not necessarily going to be 100% accurate.
So I'm sitting here going: oh god, I've spent all these hours and hours, and I've got completely unreliable labels. What am I going to do? It's time to de-scope even further, right? We've all seen Silicon Valley. I thought: what is the knitting equivalent of "hot dog or not hot dog"? It's "stockinette stitch or not". Is it stockinette stitch, or is it not? That's all I'm going to care about.
So then it's time to train the model, finally. This was back in 2019, as I mentioned, when I was doing this project, and I used SageMaker. SageMaker is Amazon's fully managed service; it covers the whole machine learning workflow.
It's fairly advanced; if you know what you're doing, it's massively configurable. I didn't, so I was going from a mix of blog posts and tutorials and things other people had done. Rather than using Jupyter notebooks, I just did everything directly in the console; not a machine learning expert. I ran my training jobs on a single P3 instance, if you're curious, which has GPUs to accelerate things. Hey, I work for Amazon; I figure if I can throw some resources at the thing, why not?
So I'll go back to the console now and show you. Right, so, SageMaker.
I think, yeah, the image classification algorithm was the one I used, but it has a whole bunch built in; I don't even know what a lot of those are. There's a lot you can do. You've got to pick the resource that you want to use; that's where I picked that p3.2xlarge. Hyperparameters: I'm skipping everything that isn't a default, basically. Then you've got to point it at your training data, so this is where I pointed it at the output manifest from my Ground Truth labeling job in the S3 bucket, and you give it data both for training and for validation. I'm sure some of you know this: the idea is that when you have your data set, you carve off 80% for training and save 20%, which you use to test the model at the end, to see how accurate it is. So I pointed it at that. You tell it where to put the output, another S3 bucket, and you can use spot training if you want to save money.
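The 80/20 carve-off Kris describes is easy to do yourself before uploading anything; a minimal, deterministic sketch:

```python
import random


def train_validation_split(items, train_fraction=0.8, seed=42):
    """Shuffle once (with a fixed seed, so the split is reproducible),
    then carve off the training set and keep the rest for validation."""
    items = list(items)
    random.Random(seed).shuffle(items)
    cut = int(len(items) * train_fraction)
    return items[:cut], items[cut:]
```

The validation set is only touched at the end, to measure accuracy on examples the model has never seen.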
I didn't bother with that, but it's a good option, and we'll talk about it later. You kick off the training jobs; like I said, it took me eight or so goes to get one that actually completed, and when I did... you know what, I'm just going to go ahead and show you the actual output. Oh, then you've got to deploy it, of course, once you've got a model that works.
So let me get to the one I just had, the first model that completed, "stockinette or not, take seven". You can create a model from it.
If I go here to Inference, I've got a couple of models here in SageMaker, just from the different versions I did. Then, once you have a model, you have to set up an endpoint configuration; this is where I'm going to host the model, so that I can then make an API call, send images to it, and get the labels back. So you set up an endpoint configuration, and then you set up an endpoint. It's not too tricky.
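Once the endpoint is up, calling it from code looks roughly like this. `invoke_endpoint` is the real SageMaker runtime API; the endpoint name and label order are placeholders I've made up, on the assumption that the deployed image classifier returns a JSON list of per-class probabilities (which the built-in image classification algorithm does):

```python
def top_label(probabilities, labels):
    """Pick the highest-probability class; note it's confidences again,
    not a hard yes/no answer."""
    best = max(range(len(probabilities)), key=probabilities.__getitem__)
    return labels[best], probabilities[best]


def classify_swatch(image_bytes, endpoint_name="stockinette-or-not"):  # placeholder name
    """Send an image to a hosted SageMaker endpoint and interpret the result."""
    import json
    import boto3  # deferred import: only needed when actually calling AWS
    runtime = boto3.client("sagemaker-runtime")
    response = runtime.invoke_endpoint(
        EndpointName=endpoint_name,
        ContentType="application/x-image",
        Body=image_bytes,
    )
    probabilities = json.loads(response["Body"].read())
    return top_label(probabilities, ["stockinette", "not-stockinette"])
```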
Actually, there's a blog post for it, and it's probably the easiest part of the whole thing, and then you can actually test it out. So I did all this, going back to my slides here. I trained it, and it's really easy using SageMaker to deploy it. It takes a few minutes for the hosted endpoint to come up, but then you can start passing images to it. And then I went to build a nice little web front end. I mentioned that my colleague Gabe Hollombe had some code.
He shared a little React app he wrote, which I've got running here locally, that you hook up to a SageMaker endpoint. The only tricky bit, if you want to do it yourself, is that SageMaker endpoints aren't exposed to the public, so you've got to hook up API Gateway and a Lambda function in the middle, and there's an AWS blog post about this.
If you google it, it will walk you through it; it's not too hard. And once I had it done, I fired up my app, so excited. I pointed it at my endpoint, I dragged in an image of stockinette stitch, and, ladies and gentlemen: stockinette stitch. I was like: oh my god, first try, I cracked it. Who needs calculus? I'm clearly a genius, give me my PhD. But maybe I should test more than one image, just to be sure. So then I dragged in some garter stitch.
Oh. It thinks that's stockinette stitch too. That's... that's not great. All right, let me try another one. Okay, that's not even knitting, that's crochet; that's done with a hook, that's not knitting at all. By now I've got this really sick feeling in my stomach. One last test: that's a cat. That's not knitting at all, and it thinks it's stockinette stitch. So... cancel the PhD.
I learned that nobody cracks it on their first try. Somebody described it to me as being like making pancakes: the first one is always crap, you've got to throw it out. That's often the case with machine learning; you will not hit the jackpot on your first go. So I went digging into my logs to look at the accuracy of the model: 45%, worse than a coin flip. So: back to data cleaning. This is just the cycle that you do, and I ran, as you saw, lots of different training jobs.

I downloaded an open data set of airplanes and dinosaurs and accordions and chucked that in, so that the model learned that not everything in the world is knitting. And then I ran a hyperparameter tuning job, which makes me sound super fancy, but basically, rather than you trying out thousands of combinations of these hyperparameters, which are exactly what they sound like, tiny little things you can tweak in your training job, SageMaker has a feature where you can just kick off an automatic process, and it will do that for you.
C
It will try out hundreds of different combinations, keep track of the accuracy, and tell you which one wins.
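Conceptually, what the tuning job automates is a search like the following toy version. This is pure Python with random sampling standing in for SageMaker's smarter (Bayesian) strategy, and `fake_training_job` is a stand-in for a real training run; it is an illustration of the idea, not her actual setup.

```python
import random

def tune(train_and_score, ranges, n_trials=20, seed=0):
    """Toy hyperparameter search: sample combinations from `ranges`,
    score each with `train_and_score`, and return the best found.
    A SageMaker tuning job does this for you, with smarter sampling."""
    rng = random.Random(seed)
    best_params, best_score = None, float("-inf")
    for _ in range(n_trials):
        params = {name: rng.uniform(lo, hi) for name, (lo, hi) in ranges.items()}
        score = train_and_score(params)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

# Toy objective: "accuracy" peaks when learning_rate is near 0.01.
def fake_training_job(params):
    return 1.0 - abs(params["learning_rate"] - 0.01)

best, score = tune(fake_training_job, {"learning_rate": (0.001, 0.1)})
```

The real service swaps `fake_training_job` for an actual training job and tracks the objective metric (here, validation accuracy) across all runs.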
So I did that, and I will show you the best results I managed to get back in 2019. So I have here my little React app, running locally, pointed at my endpoint, and I will drag in some test images. We'll start off with some stocking stitch, as always, and hopefully this is going to tell us... let me make it bigger.
C
Definite improvement on version one, right? And these are all black and white, but I don't have to do just black and white. I did this talk at a conference, and I actually ran around the conference and took pictures of people wearing sweaters, so I then used some people's sweaters. This guy, this was a guy named Ben: that was his sweater, which is not stockinette stitch. That's correct. There was Jill: Jill's is stockinette stitch.
C
Yep, that's cool. Oznad's should be stockinette stitch. Yes, stockinette stitch. The one that it gets wrong (it's not perfect), the one that it gets wrong is this chick, Paulina.
C
It thinks that's stockinette stitch, and I can kind of see why. Stockinette stitch makes little V's; if you're looking just from a textural standpoint, little V's. I don't know what stitch her jumper was, but you can tell it has those V's, so I think that's why it's classifying it as stockinette stitch. But anyway, so back in 2019 I'm thinking, great, I'm really happy with this.
C
So, to show you my stats: with 15-plus training jobs, plus hyperparameter optimization and 900 images, I got it up to almost 95 percent, which is pretty good considering I am a complete newbie at this and I mucked it up a whole bunch, as you saw. So, 2019.
C
This is where I stopped, and I was like, cool, it's sort of possible if you really dumb it down. And then, literally a month after I went through all of this, AWS brought out Rekognition Custom Labels, which does all of this a hundred times easier. Rekognition is one of those services at the top of the stack; it's one of those pre-built, out-of-the-box ones.
C
It allows you to identify objects and celebrities and, you know, tree, house, basic stuff in images and videos. But it doesn't know custom domain things, like, obviously, knitting. So they brought out this functionality called Custom Labels, where you can build your own specialized image analysis on top of it, to look for images relevant to your use case. People are using this in industrial applications and such. You don't need to train a new model completely from scratch, and you can get started, they claim, with just a handful of images.
C
So I want to quickly switch over now. With Custom Labels, right away the user interface is way easier than SageMaker: you've just got projects and data sets. Let's just create a data set right now; I will call it "wednesday night". You point it at an image source, so I actually just pointed it at my old Ground...
C
...Truth output, so I was actually able to pull in the exact same data that I used in SageMaker. But you can also just point it at a bucket, or you can even just upload some images. So I'll click upload for now, and it opens up a little UI here. I'll click "add images", choose, and... actually, let me go up a level... just add in a bunch of, I don't even know what these are, just add a handful of images.
C
So you can do it manually like this. If you've got a small set of very particular images for your use case, you can just do it this way; it's very easy. You do not have to have a lot of AWS and machine learning expertise. Then I can set up my labels. I can import them, but I'll just create them: okay, so "stockinette" or "not", and now I can actually go through and start labeling them.
C
So I can just go in and assign stuff. You can also draw bounding boxes, and if you click that it actually opens up... it uses Ground Truth under the hood; you can tell this is the Ground Truth UI. So you can label your data that way. Like I said, you only have to have, I think, a minimum of 10 images per label in order to actually kick off a training job. And once you've got a data set in there... I think the first one I did, I had 50.
C
I did 10 per label, the bare minimum you can do. You click "train model", and that's it. You don't have to do all that other stuff that you do with SageMaker. It just goes off and does it, and it figures out for you how many instances it wants to use and which algorithms it's going to try. It works all of that out, and the output you get is here.
C
You've got a training job, it tells you how long it took, and you can then actually start using the model. That first one that I did was the 10:33 one, that one there.
C
Oh, "view test results", there it is. So I can see the test: it carved off some test data, and it's showing me true positives, false positives; it gives you some statistics on it. So I ended up feeding it... I tested it out with just the bare minimum, 50, but then I fed in the 900 from my original SageMaker job. Now, to actually use this...
C
It's so easy. Literally, this one here, let's say: I can just click "use model" and "start". I don't have to create an endpoint configuration, I don't have to create an endpoint. I just click start, and I tell it how many resources I want for doing inference, and that's mainly for cost; we'll talk about the pricing. But that's it, and then it's hosted and you can start using it. So, to make a little front end...
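The click-start-then-infer flow she walks through can also be driven from code. A hedged sketch with boto3: the project version ARN is a placeholder (you copy the real one from the model's "use model" tab), and the response parsing is split out so it can be sanity-checked on its own.

```python
# Placeholder ARN -- substitute the one shown on your model's "use model" tab.
MODEL_ARN = ("arn:aws:rekognition:eu-central-1:123456789012:"
             "project/knitting/version/knitting.2019/1")

def best_label(response):
    """Pick the highest-confidence label from a DetectCustomLabels response."""
    labels = response.get("CustomLabels", [])
    if not labels:
        return None
    top = max(labels, key=lambda l: l["Confidence"])
    return top["Name"], top["Confidence"]

def classify(image_path):
    """Start the model once (console or API), then call it per image."""
    import boto3  # lazy import; only needed for the real API calls
    rek = boto3.client("rekognition")
    # One-time, before inference (this is what the "start" button does):
    # rek.start_project_version(ProjectVersionArn=MODEL_ARN, MinInferenceUnits=1)
    # ...wait for the project version to reach RUNNING, then:
    with open(image_path, "rb") as f:
        response = rek.detect_custom_labels(
            ProjectVersionArn=MODEL_ARN,
            Image={"Bytes": f.read()},
            MinConfidence=50,
        )
    return best_label(response)
```

Remember to stop the project version afterwards, since the hosted model bills by the hour while it is running.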
C
I found out that we have this super cool thing on the AWS GitHub, a sample called "Rekognition Custom Labels Demo". It's got some CloudFormation stacks; basically, you just click it and it sets you up this full cool little web app, with a Cognito user to log in. I'll give you all these links later, Michi, and we can share them. So this has picked up all the models...
C
...I've got in Rekognition Custom Labels. I've got a couple of them running, so we can actually test them out. This middle one is the one that corresponds to my Rekognition model. I can't drag and drop with this one, which is slightly more annoying, but: stockinette, 99 percent, boom.
C
The cat is not stockinette: 99 percent, boom. Interestingly, I started going through my test set, and I'm seeing, wow, it's giving me pretty comparable accuracy. It even gets the same one wrong, that girl Paulina's: it thinks that's stockinette as well, same as the SageMaker one did, and again, you can tell it's probably because of those V's. I actually tweeted about this; this was a couple of months back when I did it, because this was mind-blowing to me.
C
After all the hours and the fighting with SageMaker, this I did in like an hour in the morning. It was nuts; it was so simple. And I even went and did a further version, because I wanted to see if I could go beyond just stockinette. I wanted to see if I could get it to identify, for example, moss stitch. So I cleaned up my data, and I've now actually got a version that can identify the five different stitches that I put in there.
C
So, same 900 images. Rekognition gives slightly different metrics than SageMaker: instead of accuracy, it gives you this F1 score, which is sort of a composite thing, but I believe that equates to about 98 percent. It's at least as good as the SageMaker one, if not better, keeping in mind that I did the SageMaker one 18 months ago. So maybe, if I did the SageMaker one again, I would even get a slightly higher accuracy than the 95 percent I got back then. So I was really, really surprised by this.
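For reference, the F1 score she mentions is the harmonic mean of precision and recall, which is simple enough to compute by hand:

```python
def f1_score(precision, recall):
    """Harmonic mean of precision and recall -- the composite metric
    Rekognition Custom Labels reports instead of plain accuracy."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# When precision and recall are both 0.98, F1 is also 0.98, which is
# roughly how a high F1 "equates to" a high accuracy on a balanced set.
```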
C
You pay a dollar an hour to train and four bucks an hour for inference, in USD. But this is not something that you're going to turn on and leave on; it's going to work best for batch processes, where you turn it on, process a bunch of stuff, and then turn it off again. That's where it's going to be most cost-effective. I've broken down the different pricing, and we can go through that, but I won't spend a lot of time on it.
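That turn-it-on, turn-it-off cost model is easy to sanity-check with the two rates she quotes (USD 1/hour training, USD 4/hour inference; current prices may differ, so treat the numbers as illustrative):

```python
TRAIN_RATE = 1.0      # USD per hour of training, as quoted in the talk
INFERENCE_RATE = 4.0  # USD per hour per inference unit, as quoted

def batch_cost(train_hours, inference_hours, inference_units=1):
    """Cost of one train-then-batch-infer cycle, assuming the hosted
    model is switched off again as soon as the batch finishes."""
    return train_hours * TRAIN_RATE + inference_hours * inference_units * INFERENCE_RATE

# One hour of training plus a two-hour batch run comes to 9 USD --
# versus roughly 4 * 24 * 30 USD a month if the model were left running.
```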
C
The accuracy is comparable. And look, if you're an expert, go SageMaker, because you're going to understand what all of those things are and you're going to be able to really finely tune it. But if you just want to do an image classification project quickly, you can have this up in a morning; I'm telling you, it's so fast. One caveat I should mention about Rekognition Custom Labels training jobs: you don't have a lot of control over them. You kick them off, and that's it.
C
They run, and it's a dollar an hour. Most of the ones I did finished in under an hour, so less than a dollar. There was one with color images that ran for four hours, because, like I said, Rekognition figures out how many resources to use and how much training to do; you don't get to set any of that. The maximum they can run is 72 hours, and then they'll be killed and you won't be charged for it. So you could theoretically kick off a model that costs you 70 bucks.
C
I've not seen that; like I said, all the ones I've tested have finished quickly, and we say ninety percent of them finish in less than a day. But that's just an aspect to be aware of. We do have a free tier for both SageMaker and for Rekognition Custom Labels, so when you're getting started, that can make things a bit cheaper.
C
If you just want to do a little proof of concept... And you don't have to just listen to me: we've got an AWS Hero in Italy, Luca Bianchi, who is in Milan and wrote up this really cool use case. So if you're like, "Jeez, Chris, knitting, who cares?", this is a really cool, legit use case: a monitoring system for air conditioning systems.
C
You know, where once a month you need to check them and see how filthy they are, whether they're going to make people sick. And this is a perfect example, because you could spin it up, run your images through, and then turn it off again. So Luca did a model on exactly what I did. He did it on SageMaker, with 2,300 images. He went through all the rigmarole of getting them labeled, used PyTorch and fastai, and his model had around 92 percent. And then Rekognition...
C
...Custom Labels comes out, and he's like, all right, I'll throw my data in there and see what it spits out. And it came out with the same accuracy. He got the same results with a fraction of the effort; in the blog post he calls it mind-blowing. So you don't need a data scientist if this is the kind of use case you're doing. And like I said, I think this kind of thing is really good. Some of the other use cases we talk about are things like...
C
Let's say your company wants to identify its logo being used in social media posts or uploads to a system. There was a great talk I saw in Australia from a site that sells cars: they need to identify if people accidentally upload an image with their number plate in it, because criminals print out people's license plates and use them to commit crimes. So you could, for example, have a batch job that checked all the user-generated uploads and flagged the ones that had a number plate in them.
C
Things like that. So yeah, I'm a big fan of Custom Labels. These are just my thank-yous to all the people who helped: I had 53 knitters who sent me images, I had a team of 14 who did labeling for me, and I also had some really amazing folks at AWS, Aparna and Gabe, who helped me out with expertise. And that's pretty much it. We talked about some of the stuff I think is cool in AI/ML. I hope everybody likes pizza.
C
I hope you thought that was cool too, and why I think it's important even if machine learning is not your day job. AWS has a bunch of different tools that you can use even if you're not an expert, like I'm not. And yeah, my little project, which is not done... of course not, I will continue with it. I want to get it to where I can feed it an arbitrary pattern and have it analyzed stitch by stitch, so I'll keep going with it.
C
That industrial knitting software costs like 50 grand... well, you get it with the machine; the machines are very expensive. I have a whole talk on YouTube you can look up, because I've done a whole talk about how knitting patterns are basically a programming language. It's a domain-specific language, and the best expression of it is in those industrial knitting patterns, because you can write a pattern in that DSL and it will simulate it. There used to be a cool open-source project on SourceForge...
C
...that someone did, and I was kind of looking at maybe trying to revive it, to see if I could simulate some samples. But it's written in Java from like 15 years ago, so I've got no clue how to actually get it working. I've had a few people over the years reach out and say they might help. It's an idea, and I think that would be the way, because there's no way I can knit enough samples, and crowdsourcing is not the way to go. So I think simulation is definitely the key.
C
Yeah, there are a whole... another version of this talk has a lot more white papers in it, because, as you say, video games are really where you start seeing a lot of the advances in this. We can create very realistic avatars in games and in computer-generated imagery, but knitting doesn't... you know when it looks wrong, because it's flexible and it's a mesh. And so there are a lot of researchers...
C
...who've been trying to create realistic-looking knitted textiles in video games and in CGI, and they've done some really cool stuff with various ways of modeling that, making it move realistically and stretch and respond. So yeah, there are some really cool papers on that I can share with you.
G
Always great to see people nerd out about the things they love. It's awesome to see that, to me personally.
C
So yeah, I'm happy to go in and show you anything else in the console if I flashed by anything too quickly. You know, most of this stuff takes a while, so you kind of have to pre-bake it, because it takes a while to actually do it live. So I can't really do much of a live demo other than showing you the end results.
C
Yeah, it's using transfer learning, which is where you take a model that's been trained for one thing, but you sort of feed it your own stuff. So you're right, I didn't have to give it any of the non-knitting stuff; it had that already, because of the underlying model that it was already using.
E
I had the same problem a year ago with Rekognition recognizing something, and it was too...
C
So here you can see... sorry, what am I thinking? No, I did look: there's a wild cat that's not stockinette stitch. So yeah, this has the same clutter in it as everything else, so I did include that. And I can't remember where I found it, but there's a sort of standard set of copyright-free images that you can download and chuck in: airplanes and animals.
E
Yeah, it would be interesting to see how high the accuracy would be if the non-stockinette images were not in there in this custom-built model, because...
C
Well, the one I did with only 50 images, where I did 10 per label... the accuracy is crap; it's not actually very good. I think 10, at least for this, is far too small.
C
I think one of the big industrial knitting machine makers is German, and then the Japanese one is called Shima Seiki. There are two of them, but they're super expensive, and I think the MIT researchers actually asked them if they could use the software. So they're not going to give it to me.
F
Yeah, one question that I had: you were talking about that one Paulina sample that you have, which is about the stitch, but it's mislabeled. I mean, sure, your hypothesis is...
C
Oh, I don't know, and that's one of the challenges with machine learning: explainability. I don't know why it thinks that is stockinette; I can conjecture, and that's one of the challenges, even more so with Rekognition Custom Labels, because that is a black box. You know, with SageMaker, it's possible that if I were an expert and went into the hyperparameters, I could tweak it until it got this right. But I mean, I think, to me, I can see why it would think that. There's another one...
F
As you were saying, Deep Dream exposes some of the features that the model has learned, so it might be interesting to experiment with what features this model has learned, and maybe understand...
C
Luckily, my colleague Antje Barth, whom some of you may know... she's based in Düsseldorf, she's in machine learning, and she's just written a book on machine learning at AWS. So I just need to get Antje for a day or two and have her work with me on this, because there's some really cool stuff I would like to do. I'm still learning; I should buy her book.
C
Yeah, I mentioned Janelle Shane earlier, the AI Weirdness lady. She's a knitter too, and she did this really cool project where she fed a neural network a whole bunch of knitting patterns as the training data and then had it try to create knitting patterns. And then she had knitters try to knit them, and some of them look like things from an alien civilization. She called it SkyKnit, which I thought was the most brilliant part of the whole project, like in The Terminator.
D
One more technical question I had from before, when you showed the Rekognition front end in AWS for Custom Labels: you have the possibility to configure an S3 bucket where you store your images. Does it necessarily have to be an AWS S3 bucket, or could anything else that understands the S3 protocol be used?
C
Your options are: S3 buckets, upload images, or import them from Ground Truth. So I think it has to be an S3 bucket underneath, to be honest with you, but I can double-check that.
C
But you don't have to manually upload them; you can already have them in an S3 bucket. You can have a whole separate process that puts them into the S3 bucket. I just showed you uploading from the computer because it's the easiest way for people who are not experts.
C
Yeah, and I think Luca, on that blog post, mentions that they used the command line, like s3 sync, so they just did them that way rather than through the console.
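The command-line route is just `aws s3 sync local-dir s3://bucket/prefix`. Scripted, the same idea looks roughly like this sketch: the key-mapping part is pure and checkable on its own, while the actual upload is a hedged boto3 loop (bucket and prefix are whatever your data set points at).

```python
import os

def s3_keys_for_dir(local_dir, prefix):
    """Map every file under local_dir to the S3 key it would get,
    mirroring what `aws s3 sync local_dir s3://bucket/prefix` does."""
    keys = []
    for root, _dirs, files in os.walk(local_dir):
        for name in sorted(files):
            path = os.path.join(root, name)
            rel = os.path.relpath(path, local_dir).replace(os.sep, "/")
            keys.append((path, f"{prefix}/{rel}"))
    return keys

def upload_dir(local_dir, bucket, prefix):
    """Upload a whole image folder; boto3 imported lazily for the sketch."""
    import boto3
    s3 = boto3.client("s3")
    for path, key in s3_keys_for_dir(local_dir, prefix):
        s3.upload_file(path, bucket, key)
```

Keeping one folder per label (e.g. `stockinette/`, `garter/`) also lets Custom Labels infer labels from the folder structure.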
D
Yeah, another thing I thought about: you said that AI/ML will be more important in the future for software development in general, and that everybody will have to deal with the topic as it grows in importance. So I heard about generating test data with the help of AI or ML, so like...
C
You could definitely do that. I mean, I guess it depends what kind of test data you're looking to create, whether you're just looking to create textual data or whether you want to create images, but yeah, you could definitely do that. Create fake user accounts, or realistic-looking user buying histories, or things like that, so that you've got realistic-looking data to run against and don't have to run against real production data. Yeah, you could definitely use a number of services to do that.
C
I was looking today, because I thought... you know, ThoughtWorks do that Radar every year of things that they're talking about, and that was where they had that bit about how they think machine learning model explainability is one of the big things: we need people who understand this so that they can explain why. Like, for me, to be honest with you, I'm a little bit dangerous here, because I don't know why this model does what I wanted it to do. So I'm using this for test things.
C
I'm not going to use this to decide if someone's going to get insurance or health care or not. You know, I want someone who knows a hell of a lot more about it than I do building the models that do that, and being able to explain why they're coming up with what they're doing. So this is the starting point; that's why I've built a toy. But there are definitely some really interesting applications, and test data... testing data is a good one, actually. That's a good one.
C
I hadn't thought of generating realistic test data. The other one is tools, and I know engineers get very nervous about this, and I do as well, but there's a bunch of startups right now working on no-code solutions where, instead of writing an SQL statement, you instead write out in natural language "select the last 10 orders in this set" and it will generate the code for you.
C
And that's where... I mean, that's one of the things with SageMaker: you can actually build out a full CI/CD pipeline, where you can have versions and kick off new training jobs and stuff. Rekognition Custom Labels is very much: you do it, and it's there. If I want to go back and add more images, I can, but it's a manual process; it's not very automated.
C
Someone asked me earlier today, when I talked about this: how do you know when to stop training, when it's good enough? And I'm like, you don't know. You go until you get accuracy that's good enough, or until you run out of money, either way. I did the costs for mine.
C
I had it on there, but, leaving out the Ground Truth labeling, that model cost me less than 50 bucks on SageMaker, and it only cost me a couple of bucks on Rekognition Custom Labels. But my hyperparameter job ran for 13 hours, so, you know, if you're doing that over a lot of different models...
A
Oh no, you can also click on the URL, but I can share my screen as well, just for a second, to find the right point. The thing is, we have, I think, 30k issues for the GitLab project itself.
A
Let's just check how many we have... yeah, okay, it's a little more than that. And the thing is, doing automatic tracking is also an interesting experiment, which Stan is currently doing: training a model and then classifying the new issues which are coming in, so that it's not a manual task every time someone needs to do it. I don't know how far this has gotten yet, but it's a really interesting effort.
C
That's cool. My husband and I were talking at lunch today about another use case we thought of. He was talking about how you go to Saturn or one of the electronics stores, and they will have the Apple booth or the Android booth set up, and whenever they set those up in a retailer, someone has to check that it's done correctly.
C
So imagine you could take a photo and use machine learning to actually flag: yep, all good, or no, a person needs to go and have a look at this. So you could perhaps check the setup of in-person events. You know, as evangelists we're always going and setting up the booth at conferences and things like that. So that was an idea we thought of.
A
Maybe also the pictures when you're receiving a mail package, say from DHL, and Amazon does that as well, so you're getting a picture. But this is a picture for you as the customer to verify. If it could be scanned automatically and say, hey, okay, this is the terrace, this is the window, this is the garden... this would also be an interesting approach, to say, hey, this is actually the right place, which matches what the customer told me in the order description, for example.
C
The car sales site in Australia... the guy from carsales, who's an AWS Hero, has done a number of talks about what they've done, which is really fascinating. So, you know, you're selling your car, you go and upload pictures, and what they can do is they've now built up a model that can identify that.
Yes,
you've
done
the
front
view
the
side
view,
but
you
haven't
done
the
review
in
the
trunk
view,
and
we
know
that
people
want
to
see
those,
so
it
can
actually
recommend
to
you
images
that
you
haven't
included
that
that
would
be
useful.
It
checks
for
the
license
plates.
That's
where
I
learned
about
clutter.
He
did
a
talk
on
the
license
plates,
one
where
they
trained
it
on
just
pictures
with
license
plates.
C
They didn't test it on pictures without license plates, so it would start identifying license plates regardless of whether there was one there or not. That's where I learned about the concept of clutter. But they had a couple of really interesting use cases: people will upload a car, and they could actually verify, without even seeing the little logo on the front, what type of car it was from the body, with the machine learning model that they built up.
C
Oh, is it able to just understand the S3 protocol? Oh, I see what you mean, Sven. I'll look into that. Sven, hit me up on Twitter or LinkedIn and I'll see if I can find the answer to that for you.
C
Oh gosh, yeah, I think what they ended...
E
But you can... when you put one white pixel into an image and train on it, your model is completely garbage.
C
So the next iteration I've been working on with Custom Labels is trying to count stitches in an image, because it's one of the problems I have as a knitter: I sit here and I forget to count how many rows I've knitted. It'd be so much easier if I could just take a picture with my phone and it would tell me, "you've knitted 50 rows", rather than have to sit there and actually count them.
C
So I've actually started playing around, because I showed you that you can draw bounding boxes around things in the image. I've been testing it out a little bit. I haven't got anything workable yet, but: taking a picture of some knitting and then drawing bounding boxes around each of the stitches (it takes forever), to try to teach it that each of these little V's is another stitch, and then to count how many of them are stacked.
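Once a model returns one bounding box per stitch, counting rows is mostly post-processing. A toy sketch, assuming boxes as dicts with a fractional `Top` coordinate (the shape Rekognition's DetectCustomLabels geometry uses); the tolerance value is a made-up starting point you would tune per image:

```python
def count_rows(boxes, tolerance=0.02):
    """Group stitch bounding boxes into knitted rows by vertical position.
    Boxes whose 'Top' values sit within `tolerance` of the current row's
    anchor are treated as the same row; a bigger jump starts a new row."""
    if not boxes:
        return 0
    tops = sorted(b["Top"] for b in boxes)
    rows = 1
    current = tops[0]
    for top in tops[1:]:
        if top - current > tolerance:
            rows += 1
            current = top
    return rows

# Three stitch boxes near Top=0.1 and two near Top=0.2 count as 2 rows.
sample = [{"Top": 0.10}, {"Top": 0.105}, {"Top": 0.11},
          {"Top": 0.20}, {"Top": 0.21}]
```

Real knitting photographed at an angle would need a perspective correction first, which is exactly the "not perfectly flat" problem she mentions next.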
C
That's what I think I'm going to have to do: try to get it down to just edges as much as possible, make it grayscale, really enhance the edges, and chuck everything else. But yeah, it's hard, because unless you're laying it flat and photographing it perfectly flat, it's going to have to account for things not being perfect.
F
And another idea: you have rather large swatches in your input photos, right? Would it make sense to take each of those and, say, quarter it, and then train on those smaller things?
C
So really it was only actually around 450 images, and then you cheat by doing the 90-degree rotation thing. But yeah, with the really big ones... and in fact I think some of them probably are like that; I think people would take a picture of one sweater and just take a few different photos, you know.
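The 90-degree "cheat" is plain data augmentation. On a tiny grayscale image represented as a nested list it looks like this (a real pipeline would use Pillow or NumPy; this toy avoids both so it runs anywhere):

```python
def rotate90(image):
    """Rotate a 2D grayscale image (list of rows) 90 degrees clockwise."""
    return [list(row) for row in zip(*image[::-1])]

def augment(image):
    """One image becomes four: the original plus three rotations --
    how roughly 450 photos stretch into roughly 1800 training images."""
    out = [image]
    for _ in range(3):
        out.append(rotate90(out[-1]))
    return out

img = [[1, 2],
       [3, 4]]
# rotate90(img) -> [[3, 1], [4, 2]]
```

Rotations are safe for texture classes like stitches because a rotated swatch is still the same stitch; the same is not true for every domain (digits, for instance).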
F
Yeah, I mean, that leads me to the next question of what operations make sense to apply to pictures that help with accuracy, or with the model, and which don't. Like, if I mirror it, does it help? Or, like Micah said, if I make it one pixel wide, does it...
C
That's where we would have to experiment. I mean, the biggest difference I noticed with grayscale versus color was in the training time: when I left color in, that job ran for four hours. It ran four times as long, and it didn't result in any better accuracy. So get rid of color; I think color is a no-brainer to get rid of, unless you're trying to train it to recognize particular colors.
C
But beyond that, yeah, I need to experiment. I need to actually see what makes a difference. Also, how many stitches do I need in the image? Is it reducing it to just a texture, or is it actually looking at the individual, cellular units of the stitches? I don't know. I need to do one where everything is really close up, and one where maybe they're a little bit further away, upside down...
A
A
No
that
wasn't
this
was
amazing
and
it
was
a
really
great
insight
into
like
the
the
learning
step
with
machine
learning.
This
sounds
odds
but
how
to
approach
this
correctly,
because
I
was
like
reading
articles
and
reading
the
news
and
was
like
yeah.
I
totally
want
to
like
find
a
use
case
for
myself
and
picking
something
which
is
not
the
other
rich
I.t
stuff,
which
makes
your
head
explode.
Sometimes,
I
think
knitting
is
a
great
experience,
and
now
I
know
everything
around
it.
I
probably
no.
A
I won't start knitting, but it's interesting to see how many different patterns there are and how the recognition works. So I'm really grateful that you joined us today, and thanks for your time. I think we had a little fun as well.
C
If there was only one lesson people could take away, I think that's the one for me: when I have a new tech, you see the tutorials, and they're always like, "build a to-do list" or something. I don't get excited about that. I need an actual use case that I'm interested in to motivate me. If I wasn't interested in the output of this, I would have given up very early. So I think that's it: if you want to learn something and be motivated to learn it...
C
You need to find a use case you're interested in. Will you get GitLab socks? Well, my hourly rate's really expensive, so, you know, you'll have to knit your own.
G
I got everything I need to knit here, like, not in my room, but I can just use my boyfriend's stuff, so, amazing.
C
Well, the talk about knitting being like a computer language is a really interesting one, because it has loops: it has for loops, while loops; it has a lot of the constructs. In fact, you can knit elementary cellular automata, which means I can knit something that is Turing complete.
C
So I have a whole other nerdy talk on just that aspect of it. I actually gave a version of that talk to a group of knitters, a group of older women at a knitting camp, where I taught them that they knew more about programming than they thought they did, because they were executing programs when they followed a knitting pattern. You know, "do this 15 times": that's a for loop. They have a counter, they increment it; they just didn't call it that.
G
Yes, in Bavaria there is actually a law that mandates that companies can't forbid you to have a beer for lunch.
B
But if you want the most vacation days, then you should go to Augsburg. Augsburg has the most vacation days that you can get in Germany, because Bavaria has the most among the states.
G
If we're nerding out about vacation days: I could just go wherever I like, because my position is 100 percent remote, and German labor law actually mandates that you follow the public holidays of the place you're staying in, not the place your company is. So if I, for example, visited my parents in Bad Woodenberg and there was a holiday in Bad Woodenberg, I would have to adhere to that holiday; I wouldn't be able to work.
A
A
G
C
C
A
A
C
G
Then some German accents, because there are just so damn many accents in Germany. Like, I am 100% sure that for some people from the south of Bavaria, who talk a totally different dialect than people from Austria, I couldn't tell if they're from Austria or from Bavaria. I would have no damn idea.
G
Oh wow. I grew up near Lake Constance, so at the very, very south of Germany, and my mother is from the southwestern part of Germany, from Baden, where they talk Alemannic, which is almost like Swiss German. So I understand Swiss German better than some people from Bavaria; that's not a problem for me, which is always confusing to Swiss people, because they don't expect Germans to understand them.
F
In Austria there are two brands of apple juice and orange juice, Obi and Cappy, and both companies produce both apple and orange juice, right? And so in one part of Austria it's common to call orange juice Cappy and apple juice Obi, and in another part of Austria it's exactly the other way around, right?
A
Is something you cannot get in Germany, because in Austria the water is for free, it's tap water. And when you try to order that in Germany, you need to pay for the water, and they don't understand it. It needs to be like the mineral water with the...
A
It needs to be like this "up for sure" thing, and I'm like, yeah, but I just want tap water, yeah. We don't have that. After, I think, one or two years of trying it, I just said, okay, then give me something else, give me a "Johannes special", something different.
G
That each place, like each restaurant, has to provide you with a Leitungswasser, tap water, for free at all times. It's like back from the days where you couldn't just hop in your car; it's from, I think, 1905 or even older, that law, where it was, probably...
B
F
B
C
G
A
C
Yeah, so, well, I had my first official German lesson last week. I studied German in high school in the U.S., and I lived in Krefeld for one summer, near Düsseldorf, but I'm doing new classes again, because with the lockdown I don't get to talk to anyone but the woman at the supermarket. And my teacher was very impressed that I knew Reinhardt's good box; I'm like, come on...
C
Well, that would be really interesting, because someone put me onto this music group called De Hundskrippen, who I think are a Bavarian rock band; they're on YouTube. It's a slang term, "Hundskrippel", I think, and they have a song called "Bayrischer Wald", and I understand like ten percent of it. The lyrics are obviously in Bairisch, and Google Translate does not handle Bairisch, so yeah, I need more machine learning models that handle regional slang. They're really good.
B
It's a German cover; it's a track from My Chemical Romance.
A
I was just trying to buy some time for Kris to get a beer and come back, but I just wanted to say thank you again and give a shout-out to Kris. And next week we will dive into continuous profiling with Frederic from Polar Signals; this is next week's topic, totally.
A
But maybe not, I don't know, let's see about this. And with that...
C
Sure. Our summit is two weeks from today, so if you don't know about it: it's our big event, it's online this year, it's free, it's on the 9th and 10th, the AWS Summit, and you can register, and you'll be able to watch all the sessions afterwards as well, on demand. So it's worth signing up just so you get access to all of them. Last year I think they had like 50 sessions; this year we've got like 250.
C
And we have an entire German-language track as well this year, which my colleague Dennis has been curating, so there's a whole 18 talks auf Deutsch. I don't know what accent the speakers will have, but...
A
Yeah, and we will share everything we talked about today in the blog post later on, and, yep, thanks for listening, and bye on YouTube.