KEYNOTE: Making Magic in the Cloud with Node.js at Google - Justin Beckwith, Google
How's it going, everybody? Thanks for coming today. My name is Justin Beckwith, and I'm going to be talking about Node.js, talking about Google, and a lot of the things that we're doing at Google to try to make it a better place for Node.js developers. And of course, along the way I'll talk about just a little bit of magic. Okay, cool, all right. So before we get into some of this stuff, I want to talk a little bit about why we at Google care about Node.js.
Another big thing we've been doing is a lot of contributions into V8 and into Node.js core, to make it a better platform for Node developers. But really, what this all comes down to (and it sounds a little corny) is that we believe in the web as much as any other company out there. Google was founded on the idea that we exist on the web; we're all there together, and ultimately we want to make it a better platform for everyone to come build JavaScript-based applications. And this means tooling at development time.
It means tooling for what you're running in production, and it means tooling to help you get into production. And really, when you look at what we've been doing for the last 10 or 15 years, across the number of front-end apps we have, we're all the way up to nearly two and a half billion lines of JavaScript in our repository. So all of the tooling that we needed to go build for ourselves, we want to make it so everyone else can go use as well.
The general idea is that you can pick up your application, write it in a particular way (typically we like to run things in Docker containers), and then run your app inside of Google's data centers, right alongside applications like Gmail, Search, and Maps, all the sorts of apps you've been using for some time. We isolate everything, and you can run it on our infrastructure. So this isn't new; we've had virtual machines for a long time.
Today, we're getting asked to do a lot more than we used to. We're getting asked to do more with DevOps when we're building our applications, we're getting asked to manage them in the cloud, and we're getting asked to do more in terms of the richness of our applications and how our users interact with them. We're getting asked to look at pictures like this and figure out: what is this a picture of? What kind of animal is it?
You know, we use our phones to talk to Siri or say "OK Google," and it's expected that I can talk to my device and have it understand what I was trying to say. After we have the audio, after we have those images, we're expected to be able to draw user intent and meaning from the data that we got out of them. This is a huge challenge for anyone that's tried to step up and build these kinds of applications, and it's a big problem we've been trying to solve at Google for a number of years.
Video, sound, not direct input with your thumbs, unless that's what you still want to do. So to help make this easier, part of what we started to do is expose what we call our Cloud Machine Learning APIs. Essentially, these are APIs that expose technology we've been building for a number of years for our own products, for things like Google Translate, Google Photos, and Inbox, and we're exposing those as APIs that you can pick up and start using, with pre-trained ML models.
A
If
you
want
to
go
down-
and
you
want
to
get
deep
into
machine
learning,
we
have
a
number
of
tools
that
make
that
easier
with
our
cloud
machine
learning
platform
tensorflow
is
a
good
thing
to
go
check
out,
but
the
idea
with
these
api's
is,
if
you
come
up
and
have
a
simple
task
where
you
just
want
to
look
at
that
picture
and
figure
out.
What's
in
it
is
there
text
you
want
to
analyze
speech
there,
Pat
they're
tailored
for
that
specific
type
of
workload?
The whole idea is that we want to make building machine learning stuff easier, because today it's just way too hard to pick up and get started with on your own. And if you think about it, it used to be that you'd have to spend years doing research to be able to do that kind of image analysis, or years doing research to do speech transcription. With these kinds of APIs, we can go out there and build the next set of apps that are going to go out and change the world.
I used them to build an app about my cat; that actually is my cat. Getting back a little bit to why Google cares: our core mission is to organize the world's information and make it universally accessible and useful. For me, this means asking questions, and they can be serious questions, or they can be kind of silly questions, or they can be arguments that you had with a co-worker who sits next to you. Every time I give this talk, my coworker John feels a little bit bad, because I prove him wrong with data.
A
So
my
very
very
important
question
to
go:
ask
is
which
animals
are
cuter
dogs
or
cats.
You
know
very
important,
and
where
can
we
find
the
kind
of
data
to
help
us
answer?
Those
sorts
of
questions?
Can
we
get
quantifiable?
You
know
concrete
data
to
help
back
up
my
assertion.
My
assertion
is
that
dogs
are
cuter
by
the
way
and,
of
course,
I
decided
when
you
want
to
go
to
a
place
where
you
know
the
great
thinkers
of
our
time
get
together
and
ask
the
great
questions.
Of course, I went to Reddit. In this specific case, I decided to go to r/aww, pretty much the home base for cute pictures of animals, right? And since I was on the job (this was during work time), I was trying to organize the world's information and make it accessible, so I figured it was okay to build this app at work. It's a microservice-style architecture running on App Engine. We have a front-end that's serving the front-end content.
A back-end is doing the processing: going out to Reddit's r/aww, downloading all the pictures of dogs and cats for a couple of days, and then processing them with the Cloud Vision API to figure out if each one was a dog or if it was a cat. So if there are more dogs, then we know that I was right; if there are more cats on the front page, then I know I'm wrong.
With the Vision API, we can also identify not-safe-for-work content, which was very important as I prepared for this talk, because I don't want to get fired. It can detect things like logos, landmarks, all that kind of cool stuff, and we're going to see some of this code in a bit.
So that's enough talk; let's actually go run the demo. This is Cloud Cats. Right now, what's happening is that we're going out to reddit.com, to the first three pages of r/aww.
If you want to check it out: we have dogs coming out to a bit of an early lead, but cats are catching up. We've got a cat with a funny thing on its head, a crazy-looking cat... oh no, I might be getting it wrong, there are more cats right now. I think we misclassified a handful of baby turtles; sometimes that will happen. It's not perfect, and we all make mistakes. There's a Christmas cat, we've got a sweater dog, and there we have it: dogs win.
So if you want to go try this out yourself, we make the Cloud Vision API accessible, and this and a few of the others that we're going to take a look at today use the google-cloud npm module. This is just one of the many npm modules we have out there that we've started to build over the last couple of years to expose our services. Using the API is actually pretty simple: you pretty much pass it a path to an image locally and say detectLabels. That's what I used in this case, because I didn't want to do all the other analysis; I wanted to keep it fairly fast. It will really just return a list of labels, and in my case it would return dog, cat, canine, golden retriever, that kind of information. It's pretty easy to pick up and start using.
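A minimal sketch of that flow, assuming the current @google-cloud/vision client (the module surface has changed since this talk; the older google-cloud module's detectLabels is now labelDetection). The labelsToVerdict helper is hypothetical, made up here for the dog-vs-cat tally, and the client require is deferred so the pure helper loads without the library installed:

```javascript
// Decide dog vs cat from a list of Vision API label descriptions.
// This helper is specific to this demo, not part of the API.
function labelsToVerdict(labels) {
  const names = labels.map((l) => l.toLowerCase());
  if (names.some((n) => n.includes('dog') || n.includes('canine'))) return 'dog';
  if (names.some((n) => n.includes('cat') || n.includes('feline'))) return 'cat';
  return 'unknown';
}

// Run label detection on a local image path and classify the result.
async function classifyImage(path) {
  const vision = require('@google-cloud/vision'); // deferred require
  const client = new vision.ImageAnnotatorClient();
  const [result] = await client.labelDetection(path);
  const labels = (result.labelAnnotations || []).map((a) => a.description);
  return labelsToVerdict(labels);
}
```

Calling `classifyImage('./dog.jpg')` would resolve to 'dog' when the returned labels include dog or canine, mirroring the tally the demo keeps.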
So we got our first answer: dogs. Let's move on to the next great question that we want to go answer: what is the happiest subreddit on the front page?
You know, I'm a real positive guy; I don't like negativity, and over the last few weeks on the internet there's been a little negativity and a few things I wanted to avoid. So I wanted to create a curated set of subreddits based on what was going to make me the happiest, and I decided this is a good thing for Google to go figure out; this is a good machine learning task. So I went out and took the top 50 subreddits.
A
These
are
all
the
ones
that,
on
the
front
page,
if
you
haven't
used
read
in
a
subreddit,
is
like
a
forum
where
you
can
go,
discuss
various
topics
and
I
took
all
of
them
downloaded
the
first,
the
first
four
pages
of
content.
All
of
the
comments
from
the
first
four
pages
went
down
into
various
levels
of
depth,
brought
it
all
locally
and
analyzed
170
thousand
comments.
This was data from about a week ago, so that we could do some analysis on it, and if you want to see the source code for how I did this analysis, the link to the GitHub page is right there; you can go check it out. So, to figure out if a comment was happy or not, I used the Google Cloud Natural Language API. Now, this API is capable of doing a lot of things.
It can extract meaning, structure, people, places. What I used it for specifically was the sentiment analysis API that was baked inside of it. Using the NLP API for sentiment is kind of similar to what you saw with the Vision API: you just import the module, use the language API, and then really just pass it a string and say detectSentiment.
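That call looks roughly like this with the current @google-cloud/language client (the older google-cloud module exposed it as detectSentiment and called the score "polarity"). The bucketSentiment helper is a hypothetical addition for this demo, and the require is deferred so the pure part loads without the library installed:

```javascript
// Crude bucketing of a sentiment score into a mood label; the 0.25
// thresholds are arbitrary choices for this demo.
function bucketSentiment(score) {
  if (score > 0.25) return 'happy';
  if (score < -0.25) return 'sad';
  return 'neutral';
}

// Analyze one comment's sentiment with the Natural Language API.
async function detectSentiment(text) {
  const language = require('@google-cloud/language'); // deferred require
  const client = new language.LanguageServiceClient();
  const [result] = await client.analyzeSentiment({
    document: { content: text, type: 'PLAIN_TEXT' },
  });
  // score (the older API's "polarity") is in [-1, 1]; magnitude is >= 0
  // and grows with the strength of the emotion.
  const { score, magnitude } = result.documentSentiment;
  return { score, magnitude, mood: bucketSentiment(score) };
}
```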
And so, after I processed all these records, I needed someplace to store them. 170,000 comments is a lot of processing, so I decided to store the data in BigQuery.
To do the analysis: BigQuery is Google's hosted data warehouse solution. The idea is that you can stream very large amounts of data into it and then use standard SQL to query over the top of it. If you're going to do any kind of large-scale data analysis, it's a great tool and one of my favorite things; you can find other talks that I've given about it in the past.
A
We
have
these
things
called
public
datasets,
so
we
actually
have
all
of
github
loaded
into
bigquery,
and
you
can
go,
do
really
interesting
queries
over
that
or
queries
over
all
the
content
of
hacker
news
or
other
more
useful
things
like
census,
data
or
disease
data,
lots
of
cool
stuff
out
there
to
go
play
with,
but
I'm
using
it
to
figure
out
the
happiest
place
on
the
internet.
So
once
I
had
all
this
content
again,
I
mentioned
a
standard
sequel
to
query
over
it
I'm
using
the
polarity
and
the
magnitude.
These are the two things that come back from the Natural Language API when you want to detect sentiment. Polarity tells you: was it happy, was it sad? Magnitude is the degree to which it was happy or sad, the extremeness of that emotion. A very crude way to combine those to figure out general happiness is to just multiply them together.
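That crude combination, plus the kind of standard SQL you'd run over a BigQuery table of processed comments, might look like this. The project, dataset, and column names are assumptions for illustration (the real schema is in the talk's GitHub repo), and the client require is deferred so the pure helper loads without the library installed:

```javascript
// Crude happiness: sign comes from polarity, weight from magnitude.
function happiness(polarity, magnitude) {
  return polarity * magnitude;
}

// Rank subreddits by average happiness using BigQuery standard SQL.
async function happiestSubreddits() {
  const { BigQuery } = require('@google-cloud/bigquery'); // deferred require
  const bigquery = new BigQuery();
  const [rows] = await bigquery.query({
    query: `
      SELECT subreddit, AVG(polarity * magnitude) AS happiness
      FROM \`my-project.reddit.comments\`  -- hypothetical table
      GROUP BY subreddit
      ORDER BY happiness DESC
      LIMIT 5`,
  });
  return rows; // e.g. [{ subreddit: 'listentothis', happiness: ... }, ...]
}
```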
So I grabbed all of those, ordered descending, trying to find the happiest subreddit, and the results: listentothis and Music, and of course photoshopbattles, because who doesn't like a good Photoshop battle? And so I found that the happiest thing I could do when I was on Reddit after the election, if I wanted to stay happy, was to focus completely on the music-focused subreddits, and I actually did trim mine down based on the results. So that was just the top five results.
What if we want to see a better visualization of this data? Because we did analyze fifty subreddits. To do that, I used Google Data Studio, and really this is just a nice dashboarding tool that lets me create reports from things like BigQuery. If you've got data in Google Sheets or MySQL, or analytics data if you're using Google Analytics for your front-end, you can dump it into this and create nice-looking reports. So I just did that real quickly, and I got this kind of interesting analysis of the happiness of the various subreddits, so you can see.
listentothis is actually twice as happy as the next subreddit down, and you can see pretty quickly, where it levels out, the places you want to stay away from. If you want to stay happy, stay away from announcements, explainlikeimfive, and worldnews. worldnews is a terrible place; don't go there. And todayilearned was, man, it was like nice and in the middle. So the answer that I got is: stick to the music-focused subreddits. Nobody who's listening to good music is angry with the world.
That's what I learned. All right, so now we're getting into perhaps the hardest question for us to go solve, and maybe a little bit of a harder one that we can't solve with machine learning: why isn't my code working? This was something I had to ask several times as I went through preparing for this talk and trying to build some of these demos, and this gets into one of the ways where we're trying to help make it easier to build and maintain Node.js applications, both locally and in the cloud.
So just to kind of show some of this off, what I'm going to do is show this application that I've got running locally that uses the Cloud Vision API. I'm running it on localhost, and I'm just going to take a quick picture. What I'm doing is feeding this through the Vision API and starting to try to figure out what I'm doing, and look, it detected that I'm giving a speech.
That's actually pretty cool; I haven't seen that one before. And then it found my face, and we can actually go deeper with that and figure out what's going on, like what parts of my facial features are available. But one of the cool things we started to do is take the development tools we have inside of Chrome for debugging and connect those directly with what you're able to do with your Node.js application.
So you can see here, it detected that I'm actually running a Node process, and I can now directly connect to that process and debug both my front-end and my back-end code together inside of Chrome. So this is the application using the Vision API. I'm just going to come in and, just like I would do normally for front-end development, set a breakpoint and take another picture. Let's get a water bottle; it may detect that, it may not, we'll see how it goes. And we're able to hit our breakpoint and introspect all the interesting things that are happening.
It found my face. It got five labels this time; it thought I had alcohol, so it's also a predictor of future things that are happening. In that case it's actually correct. And we can do local debugging using Chrome DevTools to dig into our back-end processes. So this is all well and good for local, but what if you have this application out running in production? Same kind of thing. What am I going to take a picture of now? Let's do this thing; I've no idea what it's going to say.
So this time you can see it's going to pick up a few of those. But what if I want to figure out, say, something's going wrong in prod? How do I debug this without affecting all of my users? One of the new things that we put out there is called Google Cloud Stackdriver Debugger, and this is a passive debugger that actually lets you capture the state of your app at runtime without affecting perf. So I'm just going to come down and find the exact same breakpoint.
We're going to go ahead and run this one more time, the last picture I'm going to take, and you can see this is going to complete; it's going to finish; it didn't affect performance. It still found my face, my facial hair, and the fact I'm giving a speech, but you can see we just hit the breakpoint, and it took a snapshot of the current state of all my variables in production. I could also do this with a conditional breakpoint, if I know there's some impossible condition that causes a blow-up in my app.
And then I can come in, look through the state of all my local variables, find my face, find landmarks and labels, even go as deep as figuring out: am I happy, am I sad, the position of my ears, my eyes, my forehead, all this cool stuff. But the bigger point is that I can now do this in prod without affecting performance. I can debug on the client and debug in production on the server. So everything I just talked about is there; most of it's open-source.
You can find the Docker images we use on App Engine for a runtime, the API clients, the npm modules; all this stuff is out on GitHub, so please go check it out. If you want to see the specific sources for any of these demos, or any more documentation on the ML APIs, they're all here. Go learn more, and thank you.