From YouTube: English Google SEO office-hours from April 23, 2021
Description
This is a recording of the Google SEO office-hours hangout from April 23, 2021. These sessions are open to anything webmaster-related, like crawling, indexing, mobile sites, internationalization, duplicate content, Sitemaps, Search Console, pagination, multi-lingual/multi-regional sites, etc.
Find out more at https://goo.gle/seo-oh-en
Feel free to join us - we welcome webmasters of all levels!
A: All right, welcome everyone to today's Google Search Central SEO office-hours hangout. My name is John Mueller. I'm a Search Advocate at Google here in Switzerland, and part of what we do are these office-hours hangouts, where people can jump in and ask their questions around search, and we'll try to find some answers.
A: As always, a bunch of stuff was submitted on YouTube. We can take a look at that, and maybe to get things started we can take some live questions as well. I see, Bruce, if I got your name right, you have your hand raised. Do you want to get...

A: Pretty good, let me just see. Bhushan, are you here somewhere?

B: Yes, John, can you give me one minute?

A: Okay.
C: Yes, hello again. I noticed that you rolled out a new feature in Search Console, some page experience tab, and I wonder: there's a counter of views, and I can't correlate it to any other metric. I mean, for us it shows like 12 million views, but then in the SERP, in the search tab, what was used in the search results is like three times as much. So what does this number correspond to in the page experience?
A: Okay, I don't know offhand without taking a look, but my understanding is the number of impressions that we show there is specifically for the pages that reach the good threshold, so the pages that we can kind of classify in that good bucket. And if a lot of your site is in the good bucket, then probably a lot of your pages will be in there. But if parts of the views do not map into that good bucket, then that's probably the difference that you're seeing there, yeah.
C: Thank you, yeah. It's a lot of work, but it's still like only a third of what we get in the SERP, so...
A: Yeah, I don't know if those are meant to map one to one. Yeah, I'd have to ask.
C: Okay, okay. On some days it's something like two times as much; on some days it's like three and a half times. It's not... I mean, it's not exact.
C: Yeah, but that's still, like, too much of a difference, I think. Okay, okay, anyway. Okay, thank you, thank you very much. And I have another question.

A: Sure.

C: Thank you. In the coverage tab... I said no, not coverage. Where is it?
C: Yes, mobile usability, I think it is, or something like that. Sorry, no: Core Web Vitals, I'm sorry, Core Web Vitals. And for the past two years, the way we've been trying to optimize for Google, the graph that's, you know, there to show you the good pages and the pages that need some attention...
C: It's been steady for those years, but in the past two months it just started, like, going up and down, and I don't know. I thought it was something to do with that crawl budget thing, but I don't know. We have zero bad pages, we have zero, you know, yellow ones; all of our pages are in green. But it went from 45,000 to like 27,000, and now it's like 30,000. So I wonder why that is.
A: Yeah, I think that's a bit confusing. So basically, what happens in the aggregate reports is that we show the information from a significant sample of the pages of the website, and depending on the website, that can be almost all pages, and sometimes it's less than all pages, and that sample can change over time.
A: So, for example, when you're looking at speed: if your pages do not change with regards to speed, you might still see the number of pages going up and down there, and that's just based on us using a slightly different sample.
A: So essentially, what you should watch out for there is the errors that are mentioned there; look at a sample of those errors that are mentioned, and work based on that. I wouldn't assume that the total number that is shown there in these aggregate reports maps one to one to the total number that you have in index coverage, for example.
A: Fantastic, cool. Rahul?
F: So I'm having multiple questions. The first question is about 301 redirects. My question is that I have implemented a 301 redirect for a page and it's working fine; the page is being redirected to the new URL. But the thing is, Google is not selecting the new one as the canonical URL, and it is indexing both pages as two different pages. So what can be the... should I do anything externally to signal that the new URL is the canonical one, or anything like that?
A: All of that can play in, and it can happen that you redirect one page to another page, but our systems say the canonical should remain the original URL. And the way that you can help us to switch that over is to just make sure that everything is aligned: that all of your internal links are pointing at the new URL, your external links are pointing at the new URL, and the same in the sitemap file and all of the annotations that you have.
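For reference, a minimal sketch of what that alignment can look like, assuming a Node/Express server; the paths and domain are hypothetical. The old URL 301s to the new one, and the new page declares itself as canonical, matching the internal links and the sitemap:

```ts
import express from "express";

const app = express();

// Permanent redirect from the old URL to the new one.
app.get("/old-page", (_req, res) => {
  res.redirect(301, "/new-page");
});

// The new page declares itself as canonical; internal links, the sitemap,
// and any other annotations should all point at this same URL.
app.get("/new-page", (_req, res) => {
  res.type("html").send(`<!doctype html>
<html>
<head>
  <link rel="canonical" href="https://example.com/new-page">
  <title>New page</title>
</head>
<body><a href="/new-page">Internal links also point at the new URL</a></body>
</html>`);
});

app.listen(3000);
```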
F: Okay. Another question is that I've come across a video of Matt Cutts about link building; that video was uploaded 11 years ago. So my question is that in the video he proposed that if we do things like answering questions on forums or communities like Quora, it can help build authority or, kind of, you know, backlinks for your site. So is it helpful in this period also?
A: I don't remember what all Matt said, so it's hard for me to say. In general, a lot of the sites that use user-generated content use a nofollow on all of their links, so it's rare that you would get any kind of advantage out of that.
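As an illustration of that nofollow point, this is the kind of rewriting most user-generated-content platforms apply to user-submitted links; a sketch, where the container selector is hypothetical, and rel="ugc" is the UGC-specific hint Google introduced alongside nofollow:

```ts
// Stamp rel="nofollow ugc" on every link inside user-submitted content, so
// the links carry no endorsement for ranking purposes.
function nofollowUserLinks(container: HTMLElement): void {
  for (const a of container.querySelectorAll<HTMLAnchorElement>("a[href]")) {
    a.relList.add("nofollow", "ugc");
  }
}

// Usage: nofollowUserLinks(document.querySelector("#comments")!);
```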
F: Okay, one more question. We have a product page, and in that we have multiple products in the same category. So what is happening is all the products...
A: No, that sounds essentially just like you have different parameters, and some parameters are not critical. And usually what happens is we learn which parameters are not critical, and we tend not to focus on them. You can also tell us about that in the URL parameter handling tool, if that's something where you have a clear map of which parameters are actually important and which ones aren't. But it's not thin content.
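As a worked example of a "non-critical" parameter (the URLs and parameter names here are hypothetical): the two addresses below serve the same product listing, so stripping the presentational parameter yields one canonical form:

```ts
// Two URLs that differ only in a presentational parameter show the same
// content; deleting `sort` folds them into one canonical form.
const withSort = new URL("https://shop.example/phones?id=42&sort=price");
const canonical = new URL("https://shop.example/phones?id=42");

withSort.searchParams.delete("sort");
console.log(withSort.href === canonical.href); // true
```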
F: Thank you.

A: Sure. Melissa?
H: Hello, hi. I am a director of engineering for The Atlantic, and I have a couple of questions about AMP and Google News for you.
H: No, I'm curious because we've got the JSON-LD for the NewsArticle rich results on all of our pages, and we're seeing some different things between the news.google.com results for articles and the default search. And so I was wondering if they use different algorithms, or, like, pull metadata from different places to populate the results.
A: That's... okay, it sounds like, based on what you've seen... So for the most part, we would crawl the same way for News and for web, but it is very possible that, depending on how we need to process that information, we have slightly different pipelines from there. So I could imagine that there might be subtle differences in that regard.
H: Cool, okay, that makes sense. And then I just have another question about mobile results right now, with AMP, and obviously ramping up on the Core Web Vitals. We've seen, and I've noticed this on other US news publications' results, that the AMP and, like, their core site, the results are mixed. So it's not all AMP articles, or it's not all kind of the mobile web experience.
H: So I was wondering, in this interim period between now and when Core Web Vitals become the main signal for results: where should we be focusing? Where should publishers be focusing on improving, like the AMP experience, or improving Core Web Vitals for ranking, currently?
A: Essentially, Core Web Vitals would apply to both. So when it comes to the Core Web Vitals, especially the speed aspect, what is it: Largest Contentful Paint, Cumulative Layout Shift and First Input Delay. For those metrics, we would use the pages that users actually see in the search results. So that's based on the field data, and if users are seeing the AMP pages in the search results, then we would base those metrics off of the AMP pages.
A
If
users
are
primarily
seeing
mobile
web
results,
then
that's
what
we
would
base
those
metrics
on
and
from
from
that
point
of
view,
if,
if
you're,
seeing
that
your
amp
pages,
for
example,
are
slow
on
mobile,
then
that's
what
I
would
focus
on
to
kind
of
improve
that
what
users
are
actually
seeing
it's
less
a
matter
of
like
we
will
not
show
amp
pages
later
on,
because
we'll
continue
to
have
amp
pages.
It's
it's
really.
Just
from
the
speed
point
of
view.
We
take
into
account
that
what
the
users
actually
see
got
it.
I: So I just wanted to know how important it is to write something unique which includes the same thing, or whether I should probably go with the manufacturer's descriptions. I came up with this question after reading the blog post on product reviews, where the last point says: describe the key choices in how the product has been designed and their effect on users, beyond what the manufacturer says.
A
So
from
that
point
of
view,
I
think,
looking
at
the
guidance
that
we
have
for
product
reviews
is
good,
but
it's
not
critical,
because
that's
not
really
your
primary
focus,
but
when
it
comes
to
the
description
on
a
page
or
on
for
for
products,
it's
something
that
we
do
take
into
account.
And
essentially
what
happens
is
we
will
index
the
different
pages
because
the
different
shops
that
have
the
same
product,
the
pages
themselves,
are
unique,
like
they
have
extra
other
information
around
the
product
description.
I: Okay, cool. So we are an online e-commerce site. So probably what I can... let me just continue my understanding. It's like, probably it's something like: it's great if you could make it unique. Is that what I can take from this, John?
A: Sure. Jeremy?
K: Hi John, just for some framing: I work for Publicis Health Media, so we're dealing with a lot of larger pharmaceutical and healthcare clients, and we've had a number of them come through and ask about personalization technologies that leverage JavaScript to kind of change the experience on a page based on the user's referring path. So, like, if they come through from a certain condition area...
K: If they land on a certain page, it might alter some of the kind of calls to action or some of the multimedia on the page. And, you know, that's kind of a half measure that we've come up with, because initially they wanted to change, like, everything on the page. But we were concerned about how Google might perceive that, both from a, you know, cloaking perspective, or from a perspective of, you know, JavaScript crawling and not seeing everything initially.
K: That kind of thing. So I'm just curious whether you can put kind of a stamp on that, to know if there's any sort of best practice around that, or what sort of guidance we need to point them at, whether it's the cloaking guidance or things like that.
A: Yeah, I think the cloaking guidance is definitely worthwhile to look at. We also, I believe, have some guidance on A/B testing in general, so I would take a look at that. In general, with regards to JavaScript-based A/B testing, we don't necessarily differentiate between JavaScript and other kinds of A/B testing.
A: We can render some JavaScript, so it's possible that we would fall into one of these A/B tests and then just see that version, and then we would use that version for indexing. The one thing I would watch out for with regards to A/B tests is that Googlebot has a fairly persistent view of the pages; it's not that it changes every time we crawl the page.
K: Okay. And just for clarity: it does look like A/B testing technology to me, but they're looking to use it in a permanent fashion. Is there a problem with that, over the A/B testing end of things?
A: Not necessarily. So if you're tweaking things, I don't see a problem with that. If you're, again, if you're significantly changing things, then I would just make sure that Googlebot sees the primary version of the page, not that 90% of the users see one version and Googlebot alone sees something different. But usually that kind of falls into place anyway.
A: I mean, technically that falls into cloaking, but essentially what you want to avoid is that Googlebot regularly jumps into different buckets when you're doing this kind of test. And from that point of view, some people do it by IP address or location or username or something like that, or user agent. But essentially the idea should be that you're not special-casing Googlebot, so it's not that Googlebot alone is going this way, but rather, like, there's a whole group of people going this way, and it includes Googlebot.
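A sketch of what that consistent bucketing can look like, under the assumption that the bucket is keyed on something stable (a cookie or visitor identifier) rather than on whether the visitor is Googlebot:

```ts
// Deterministic bucketing: the same stable key always lands in the same
// bucket, so Googlebot doesn't bounce between variants on each crawl, and
// no user agent is special-cased.
function bucketFor(stableKey: string, bucketCount = 2): number {
  let hash = 0;
  for (const ch of stableKey) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0;
  }
  return hash % bucketCount;
}

// Usage: const variant = bucketFor(visitorId) === 0 ? "control" : "test";
```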
A: Sure. GK, I think?

D: Hi, hi John.
A: So the page experience tab takes into account the Core Web Vitals plus all of the other factors, and that's why you might see a slight difference with regards to those two. And it might also be that the basis out of which the pages are counted is slightly different in those two reports.
A: So that's something where what I usually recommend doing with these aggregate reports is trying to recognize the issues, the problems, and focusing on those, and not using the total count as a metric.
A: Okay. Oh my gosh.
L: I have a blogging website, and it was ranking in the number one or two position for many keywords. But after the Google product reviews update, my website's ranking was pushed down to 10 or 11 for the same keywords. So my question is: how can I improve my content, according to the Google product reviews update, to regain my position?
A: What I would recommend doing is double-checking the blog post that we have with all of those tips in there, and also double-checking some of the external blog posts that were made around the product reviews update, where people bring in some more information from things like the quality rater guidelines and some of the other kinds of aspects that could also play a role with regards to review-based content, or content that might be perceived as being focused on reviews.
A: I can't go through all of the details here. I would really recommend taking a look at the blog post, because it's pretty comprehensive and has a lot of information in there.

L: Thank you so much. Thank you.

A: Sure.
M: Hi John, I am from New Delhi, India.

A: Hi.

M: So my question is related to the international targeting tool in Search Console. For the last few years I've been seeing that this tool comes under the legacy tools category in Search Console, and when we try to access this tool, it redirects us to the old version of Search Console. So, just to know better about this tool, just to know the future of the tool...
A: I don't know. It's still in Search Console, so it's still supported; I would continue to use it. It's not something where we say we're going to remove it completely, because then we would have removed it already. I don't know what the near-term future is, if we will update it, or if we will decide in the end that maybe we don't need it as much as we thought we would. I don't know, it's hard to say.
A: Yeah, I mean, it's like: if you make changes on your website, then in search we will try to reflect that. And if our systems, when they look at the new website, see that everything is better than before, we will try to rank it better. But if they see that things are maybe not as good as they were before, then it can happen that it goes down in ranking as well.
A: I don't know, yeah. It's hard to say what kind of changes you made and how that worked out. I am kind of worried when you say you just added more keywords, because that sounds almost like keyword stuffing, kind of going in that direction. But it's really hard to say what I would recommend doing.
J: Okay, thank you, John.

A: Sure. Let me run through some of the submitted questions for a bit, and then I'll get back to the live questions as well. Wow, so many people, so many questions.
A: Okay, let me see. "What contact information is it better to indicate on the author's page: social network accounts, email addresses, or both? If we can choose only one option, what is more preferable?" I thought this was actually a pretty interesting question.
A: I don't think we've run across this before. So essentially, what I see on our side is, when it comes to things like author pages, or information about the author, or information about entities in general behind a website, an article, or something: what happens there is our systems try to recognize who that is, what that entity is. And we do that based on a number of different factors, and that does include things like links to profile pages, for example, or visible information that we can find on these pages themselves.
A: So my recommendation here would be to at least link to a common, kind of like a central, place where you say everything comes together for this author, which could be something like a social network profile page, for example, and use that across the different author pages that you have when you're writing. So that when our systems look at an article, and they see an author page associated with that, they can recognize:
A: Oh, this is the same author as the person who wrote something else, and we can kind of group this by entity, and we do that based on maybe this common social networking profile that is there. I don't know... long ago we used to have the rel=author annotation, and all of the older SEOs will probably facepalm now, but that was essentially something where we tried to use structured data to explicitly apply this.
A: The rel=author annotations haven't been used at Google for quite a while now, but we do try to understand who the entity is behind an author page. And for many authors it's pretty clear: there's one name, and it's very obvious that this one name is associated with this one person. For other people...
A: It can be a little bit more complicated. Like me, for example, John Mueller: if you search for me, you'll find Wikipedia pages, barbecue restaurants, bands, all kinds of people who are called John Mueller. And if, on my site, I don't kind of specify who I actually am, then it could happen that our systems look at my pages and go: oh, this is that guy that runs a barbecue restaurant. And then suddenly I am associated with a barbecue restaurant, which, I don't know, might be a move up.
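One way to express that "central place" explicitly is schema.org structured data; a sketch with hypothetical names and URLs, where sameAs points at the one common profile used across all of the author's pages:

```ts
// Article markup whose author entity links to one consistent profile, so the
// same person can be recognized across different articles and author pages.
const articleJsonLd = {
  "@context": "https://schema.org",
  "@type": "Article",
  headline: "Example article",
  author: {
    "@type": "Person",
    name: "Jane Doe",
    url: "https://example.com/authors/jane-doe",
    sameAs: ["https://socialnetwork.example/jane.doe"],
  },
};

// Embedded in the page as:
// <script type="application/ld+json">${JSON.stringify(articleJsonLd)}</script>
```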
A: ...I don't know, making the title and description clickbaity, so that it's really something where our systems, when they look at it, say: oh, this is a good recommendation for this page, we will show that one. We still use the title in our ranking systems, even if we've modified it in the search results. We don't use the description at all when it comes to ranking; it's really just what we would show in the search results.
A: "How can I deal with missed redirects for moving from HTTP to HTTPS? Does it make sense to catch up now? I'm new at the company, and back then they missed doing this. Is it now too late, or should we still do it?" I would definitely still do that. So, especially when it comes to HTTP, having that redirect in place makes sure that everyone is going to your HTTPS version, and then you can also set up HSTS, which is a way of automatically sending people to the HTTPS version of your site. So even if you didn't do it back then, I think, just purely from a user best-practice point of view, it makes sense to set up those redirects. I don't think you would see any visible effect when it comes to SEO, but really just from a user's point of view, it's still a good practice, even if you didn't do it in the beginning.
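A minimal sketch of both pieces, assuming a Node/Express server sitting directly on the connection (behind a proxy you would check the forwarded protocol header instead):

```ts
import express from "express";

const app = express();

app.use((req, res, next) => {
  // 1. Permanent redirect: anyone arriving over HTTP goes to HTTPS.
  if (req.protocol === "http") {
    return res.redirect(301, `https://${req.headers.host}${req.originalUrl}`);
  }
  // 2. HSTS: browsers remember to use HTTPS directly for the next year.
  res.setHeader(
    "Strict-Transport-Security",
    "max-age=31536000; includeSubDomains"
  );
  next();
});
```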
A: "We've been working on improving our page speed scores. LCP lab data and field data are below 2.5 seconds, but Core Web Vitals still lists many pages as failing, and re-running validation doesn't help. Additionally, the number of failed pages changes erratically for no reason: it drops from 160 to 90 and then jumps up again a bit later." So the number of pages, as I mentioned in the beginning, could be just based on the sample that we look at for your site, and that sample size can vary over time.
A: So that might be something that's less relevant to worry about. I would focus more on the number of issues that are found, and in particular the kind of sample URLs that are shown there, the patterns that are shown for the individual issues. With regards to page speed scores versus Core Web Vitals: I assume this is in Search Console, because it's essentially using the same thing. In Search Console we use the Chrome User Experience Report, the field data, the real user metrics, and I think they take about 28 days to be updated.
A: So that might be something that you're seeing there. The other thing that commonly pops up is the difference between lab data and field data. Essentially, with lab data you're doing an approximation of what users might see, and that approximation might be wrong, because we don't know where your users are or what they would actually see. And because of that, it can happen that the approximation looks good but the user data looks bad, or similarly, the approximation looks bad but the actual user data is pretty good.
A: And that's why you do sometimes see those differences. In general, when it comes to optimizing things for speed, I recommend using the user data, the field data, as a way of recognizing broader areas of problems, and then using the lab data to incrementally improve your website in that regard. So if you're seeing, for example, that CLS is really bad in the field data, then I'd recommend looking at the lab data for CLS and trying to figure out how you can significantly improve that.
A: Yes, that can happen. So essentially we have kind of some heuristics in place to recognize when you're linking to images, and when we can tell that a link goes to an image, then we will say: oh, this is probably an image, and we will process it in image search, and we won't even look at it for web search. However, if we can't recognize that it's an image, then we'll try to crawl that URL for web search, and then kind of the crawler will say:
A
Actually
there
was
no
web
page
here
and
then
in
within
web
search,
we'll
say:
oh,
we
weren't
able
to
index
this
page
and
that's
not
necessarily
a
bad
sign.
It's
just
essentially
us
telling
you.
We
tried
to
crawl
these
urls
and
we
just
didn't
get
a
web
page
out
of
it
and
from
from
that
point
of
view,
it's
more
of
an
informational
thing
for
you
to
understand.
Well,
google
tried
to
crawl,
so
you'll
see
it
in
your
logs,
but
they
weren't
actually
indexed
as
web
pages.
A: Probably they ended up being indexed as images, which is something you'd have to kind of try to figure out almost indirectly, because we don't have the same index coverage report for images as well. But you could see, maybe in the performance report, which landing pages are showing images, and then you can double-check to see what is actually shown in Google Images for those pages. That's, I think, one...
A: "I saw that, according to the field data, the page also has a bad CLS score. Why is that not shown in Search Console as well?" I don't know; it's really hard to say on an individual-URL basis. What I would recommend doing is not worrying too much about individual pages, unless they are very critical and important pages for your site, and instead focusing on the broader pattern for your website.
A: Okay. On the other hand, if you have, kind of, for your main product, a very unique landing page, and that is not okay with regards to these Core Web Vitals, then that would be something that I would focus on. But especially when you're looking at individual pages across the different tools, you kind of need to keep in mind that the tools have different databases, in the sense that the report in Search Console is based on groups of URLs.
A: The reports in PageSpeed Insights are also kind of based on the information we have for one URL, plus for the origin overall, plus the lab test as well. So all of these reports could have slightly different bases.
A: Yeah, in general, I think it can get very tricky when you're looking at individual pages like this. My recommendation is really to kind of try to take a step back and look at the overall patterns, rather than focusing too much on individual pages, especially if those pages are part of a common theme that you have across your website.
A: Let's see, and there's another question about Core Web Vitals, wow. "We've done everything we could to get the lab data below the required numbers. However, the field data is still not changing." Again, there's the time difference that is there on the one hand, and on the other hand, the factor of the lab test being an approximation. Essentially, they're guessing at what the field data might be for an individual score, based on specific assumptions that they make.
A: So that's something where I would use the lab data to incrementally improve, but I would use the field data as kind of the basis of truth for the site.
A: What you can also do, if you don't have time to wait those 28 days or so for the field data to start updating, is annotate your pages with a specific JavaScript snippet that you can use to track the Core Web Vitals of those individual pages in something like your analytics system. And there are various snippets that we have on web.dev, our developer site for Chrome, where you can add some JavaScript to your pages and then track the CLS, or you can track the Largest Contentful Paint, all of those things. And you can track those in analytics, and then fairly quickly...
A: ...you start seeing: were these changes okay, or were the changes not okay? You can also compare different individual pages: if you have, like, A/B/C versions of a page where you're making individual changes, and you're not sure which one is the best, then you can track that a little bit faster as well.
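The snippets John refers to live on web.dev; with current versions of the open-source web-vitals library, they look roughly like this (the /analytics endpoint is hypothetical, and older releases named the handlers getCLS/getFID/getLCP instead):

```ts
import { onCLS, onFID, onLCP } from "web-vitals";

// Send each finished metric to your own analytics endpoint; sendBeacon
// survives page unloads, which matters for metrics like CLS that are only
// final when the user leaves.
function sendToAnalytics(metric: { name: string; value: number; id: string }) {
  navigator.sendBeacon("/analytics", JSON.stringify(metric));
}

onCLS(sendToAnalytics);
onFID(sendToAnalytics);
onLCP(sendToAnalytics);
```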
A: My general recommendation, when you're working on a website, is to try to build out one website and make it as strong as possible, and to kind of grow that over time, rather than to create a bunch of disconnected single-page websites. But sometimes you just want something completely separate, that maybe you use as an advertising landing page as well, and that can absolutely appear in the search results.
A: It can absolutely rank fairly well too. For a while it was very, I don't know, fashionable to have single-page websites, really long pages, kind of these longer experiences. But it's totally up to you; you can try it out.
A: One of the things to keep in mind is that we would fold those scores together. So, on the one hand, if you have something like a /amp page and a /web page, we would fold those scores together, and also the score that we would have from the AMP cache.
A: If we're serving your AMP page from the AMP cache, we would also fold that together and kind of make one score for that page, based on the canonical itself. So that kind of comes together there, and if we're always showing your AMP pages, always showing them from the AMP cache, then that's kind of what would be taken into account.
A: The other thing to keep in mind is that we track this data not based on what is shown in the search results and where people click from the search results, but essentially based on what people see when they are within your website.
A: So if they go to your AMP page on the AMP cache, and they click on a link, and all of the links lead to your traditional mobile page, then suddenly they're on the mobile page, and the experience that they're seeing is based on what they see on the mobile page.
A: That's something that you can also kind of control a little bit with regards to how you set up your internal linking. If you link your AMP pages to your AMP pages, then they'll probably stay within the AMP version, whereas if you link your AMP pages to your traditional mobile version, then it's very likely that if someone stays on your site for a little bit longer, which is usually a good thing, they end up on your traditional mobile version. Yeah, so we're kind of running low on time; maybe I'll switch to one or two more questions from you all.
N: Oh, I see, okay. Thank you.
N: So my question is about the Google product reviews update. I'm just wondering if the domain that we post the review on matters, because previously you mentioned the product review content is optimized for professional product reviews. So, for example, what I mean is that if I am apple.com and I write a review about a different computer, it will obviously be biased, right, compared to a more, like, seemingly neutral site like consumerreporter.com or something like that.
A: I don't know. My guess is there will be some kind of a mix, where we take the different kinds of reviews into account and then rank them like that. I don't think we would have something explicitly saying that this review was written by the company that actually made the product, therefore we should ignore it. But I really don't know, because it can also happen that the company that makes a product has a collection of fairly serious reviews by more neutral providers. But I don't know, it's...
O: Hi, I have another question about Core Web Vitals.
A: I think at the moment we don't... especially for the Core Web Vitals, we only differentiate between mobile and desktop, and we only, or we plan on only, using mobile, at least for the moment. So it's possible that can change over time, but at least the current plan is mobile only. Okay?
O: And I have another question, about e-commerce. If we create a master page for variant pages, like a mobile phone with 128GB or 256GB, and we create a master page for all of those variant pages and canonicalize the variant pages to the master page, right: if a user searches for the variant page, will Google show our canonicalized page?
O: Okay, can I ask another question? So when I use the Lighthouse tool, it suggests that we use noreferrer for outlinks, or external links that we are providing, outgoing links. And it says that it can affect security and performance, and that using noreferrer and noopener can help us with this. Can you help me understand what the use of noreferrer and noopener is, in the case of performance or SEO?
A: I kind of have a vague recollection of what those two attributes do, but I would really recommend checking out the documentation for them, and then deciding based on that what you do. I think noreferrer essentially just doesn't pass a referrer along, so that's usually less of a matter, and noopener is something that is more used for security reasons.
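For reference, a sketch of what Lighthouse is asking for; it can be done directly in the markup or, as below, by patching existing links in the DOM:

```ts
// External links opened in a new tab get rel="noopener noreferrer":
// noopener stops the new page from scripting window.opener (security),
// noreferrer additionally suppresses the Referer header (privacy).
for (const a of document.querySelectorAll<HTMLAnchorElement>('a[target="_blank"]')) {
  a.relList.add("noopener", "noreferrer");
}
```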
O: It will not affect our performance, like speed metrics?

A: Sure. John?
P: Hello John, hi. I have a question about responsive images and which one Google tries to rank. I've put a little piece of code in the chat, just to have some kind of sense of what I'm trying to say. I was wondering which images Google eventually indexes or crawls. I'm guessing you crawl all URLs, or all sources, but maybe you just index one. Is it the biggest one, or is it the one in the src attribute?
A
For
for
a
while,
it
used
to
be
just
the
one
in
the
source
attribute.
I
don't
know
if,
if
that
has
changed,
but
in
in
general,
it's
it's
very
common
for
us
to
recognize
multiple
sizes
of
images
and
to
index
all
of
them
and
to
recognize
that
they're
the
same
image
and
then
what
will
happen
in
the
search
results
in
the
image
search
results.
A
So
that's
something
where
it's
it's
usually
less,
a
matter
of
which
one
we
pick
it's
more,
that
we
can
at
least
definitely
index
the
one
with
the
source
tag,
because
that's
kind
of
the
the
standard
setup
and
if
we
can
recognize
the
other
sizes,
then
we
could
match
those
when
people
are
explicitly
searching
for
something
like
a
larger.
P: Yeah. And would you say, if I understood correctly, that it's very important to have the src attribute? Because I think for the browser, if you just use the srcset attribute, it would work just fine, I guess.
A
Yeah
yeah,
I
I
think
that
changed
in
in
our
systems
at
some
point
that
we
also
use
the
source
set
to
pick
up
the
different
variations,
but
at
least
in
in
the
beginning.
It
was
such
that
we
only
use
the
source,
and
I
could
imagine
that
other
search
engines
also
focus
on
only
the
source
attribute.
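A sketch of the markup under discussion (file names are hypothetical): keeping a plain src fallback alongside srcset means any crawler that only reads src still finds an image URL, while browsers pick the best size from srcset:

```ts
// Build a responsive <img> that keeps the src fallback.
const img = document.createElement("img");
img.src = "/images/photo-800.jpg"; // fallback every crawler can read
img.srcset = [
  "/images/photo-400.jpg 400w",
  "/images/photo-800.jpg 800w",
  "/images/photo-1600.jpg 1600w",
].join(", ");
img.sizes = "(max-width: 600px) 400px, 800px";
img.alt = "Example photo";
document.body.appendChild(img);
```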
P: Okay, thank you. I don't know if I have time for one more question, am I...

A: Sure, sure, go for it.

P: Regarding the Core Web Vitals, you mentioned the field data as being the one, and it's also, in regards to ranking, only focused on mobile. In the FAQs, you, or somebody from Google, also said that it's kind of on the cutting edge...
P
Or
how
do
you
say
it
when
it's
it's
two
equally
equally
relevant
pages,
striving
for
the
top,
then
it
might
matter
more
in
those
situations.
P
A: I don't think we would have that defined, so that could vary. It's something that, in practice, is less likely to happen, because usually you would be comparing different sites in the same niche, and if they've been relevant, kind of fairly similar, in the past, then they have kind of similar traffic from search already, so they would be pretty much in a similar grouping already.
A
So
that's
something
where
my
guess
is
we.
We
will
leave
that
undefined
for
the
moment
and
we'll
try
to
figure
out
how
how
to
best
deal
with
that.
It's
similar
with
with
other
factors
in
search
where
sometimes
we
just
don't
have
a
lot
of
information
about
a
site,
especially
if
a
site
is
completely
new
and
suddenly
there's
new
content
there.
Then
our
systems
have
to
make
assumptions,
and
sometimes
those
assumptions
are
more
more
positive
than
the
end
result
ends
up
being.
P: Okay, thanks.

A: Sure. Okay, let me pause the recording here. You're welcome to stick around a little bit longer, but it's always good to have a reasonable length for the recording. Thank you all for joining. If you're watching this on YouTube, I hope you found this useful, and maybe you'll join one of the future hangouts as well. All right, let's pause here.