From YouTube: English Google SEO office-hours from January 15, 2021
Description
This is a recording of the Google SEO office-hours hangout from January 15, 2021. These sessions are open to anything webmaster related like crawling, indexing, mobile sites, internationalization, duplicate content, Sitemaps, Search Console, pagination, multi-lingual/multi-regional sites, etc.
Find out more at https://developers.google.com/search/events/join-office-hours
Feel free to join us - we welcome webmasters of all levels!
A: All right, welcome everyone to today's Google SEO office hours. My name is John Mueller. I'm a Search Advocate at Google here in Switzerland, and part of what we do are these office-hours hangouts, where people can join in and ask their questions around their websites and SEO and Google Search and all of that. A bunch of things were already submitted but, as always, if any of you want to get started with a question, you're welcome to jump on in.
B: So my question is pretty quick: there's this website that keeps showing up in search results in very prominent positions, and the problem is that when you actually click on the links from that website, it takes you to something else, completely irrelevant to those queries. Obviously this is taking up valuable real estate from other websites that would be more relevant for those terms.
B: So my question is, first of all, why doesn't Google pick up on these situations and remove those websites from the index? Because, in this particular example, I've been seeing this publication for over a year. And if I may, I'll send you the URL in the chat box, perhaps, so you can investigate.
A: Sure, yeah, feel free to drop a link in the chat, and in particular also the queries where you're seeing this kind of an issue. It kind of depends a bit on what is actually happening there.
A: If it's more that they're returning content which isn't great, then that can happen. If it's a matter of being redirected to malware or to phishing content, then that's a little bit of a different situation. Usually that's something where, if you submit it through the appropriate forms, someone from that team will take a look at it.
C: Hi John, how are you? So my question is about the Core Web Vitals report in Search Console. We are working on web vitals, and Search Console says there are a few pages on the website that have a poor CLS score.
C: So when we check those URLs in PageSpeed Insights, or other tools like WebPageTest, the score is pretty good, like 0.06, but Search Console is reporting it at around 0.60 or more. So I'm not sure what we can do here, because Search Console is saying one thing and PSI something else. What do you suggest we do?
A: Yeah, I think what you're seeing there is the difference between lab data and field data. Lab data is when you do a theoretical test, kind of theoretically looking at the website using a standardized browser and a standardized connection, or what have you. Field data is the data that is actually collected from users when they're using Chrome and accessing the website. In many cases the lab data is kind of predictive of the field data that you would see there.
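The lab/field split John describes shows up directly in the PageSpeed Insights API response: the Lighthouse section carries the lab run, while the loading-experience section carries the CrUX field data. Below is a minimal sketch of pulling CLS out of both; the response shape is simplified (the real payload has many more fields), and the convention that the field CLS percentile is the metric scaled by 100 is an assumption you should verify against the API reference.

```python
def extract_cls(psi_response: dict) -> dict:
    """Pull lab and field CLS from a (simplified) PageSpeed Insights payload."""
    # Lab data: a one-off Lighthouse run under a standardized device/connection.
    lab_cls = psi_response["lighthouseResult"]["audits"][
        "cumulative-layout-shift"]["numericValue"]
    # Field data: aggregated from real Chrome users (CrUX). Assumed here:
    # the percentile is the CLS value scaled by 100, so 60 means 0.60.
    field_raw = psi_response["loadingExperience"]["metrics"][
        "CUMULATIVE_LAYOUT_SHIFT_SCORE"]["percentile"]
    return {"lab": lab_cls, "field": field_raw / 100}

# Hypothetical payload mirroring the mismatch described in the question.
sample = {
    "lighthouseResult": {
        "audits": {"cumulative-layout-shift": {"numericValue": 0.06}}},
    "loadingExperience": {
        "metrics": {"CUMULATIVE_LAYOUT_SHIFT_SCORE": {"percentile": 60}}},
}
print(extract_cls(sample))  # {'lab': 0.06, 'field': 0.6}
```

Comparing the two numbers side by side makes it obvious when a "good" score is only good in the lab.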
A: But there definitely are situations where what you see in your theoretical tests doesn't match exactly what your users are actually seeing. That can happen, for example, if a lot of your users are in locations where they have bad internet connectivity, or where they use devices that tend to be different from what you're testing with, and that's where that difference tends to come from.
C: Well, yeah, in our case Search Console is giving like 0.60 for that URL, but PageSpeed Insights, in both the field data and lab data sections, is showing around 0.04 or 0.06.
A: So it's not that Search Console is doing separate tests there. What might just be happening is that there's a time delay that you're seeing, but essentially they're based on the same data. The other effect you might be seeing is that we group things slightly differently in Search Console than when you test them individually, but that's also something which tends to settle down over time.
D: Is this okay? Okay, thanks. So John, I have a question about international SEO. We have multiple websites for different countries, and they have similar content in English. We have used, of course, the hreflang link for each website, and we are facing a major decision: whether we should keep it like this, with different ccTLDs, or just switch to a generic domain with subfolders. We're not sure yet which one to choose.
A: Yeah, from our side both of those approaches would be valid, so it's not something where we would say there's an inherent SEO advantage to ccTLDs or to a generic domain with subdirectories.
A: It will take a bit of time for everything to settle down again, so it's not so much that it'll be automatically better or automatically worse. It's just that during that transition time, which might be a month, might be two months, you will see fluctuations, and Search will have trouble understanding the full picture of your website.
A: So if you make a decision, make it based on what you want to have for the long run, and realize that you will see fluctuations. Maybe plan the timing in a way that it doesn't affect your business overall, so that you can say, well, people are not searching so much during this time, so I'll do the migration then.
A: Yeah, exactly. For SEO reasons both of these are valid. Sometimes there are policy reasons to change; sometimes there are practical reasons, in that the infrastructure is more expensive or takes more time to maintain. All of those, I think, are valid reasons. For SEO reasons you can do it either way.
D: Okay, and you've said something interesting about change. Actually, about our URLs: for a blog post, let's say the content path is currently url/post-name. Is it bad, for example, to change the structure to something like site.com/blog/post-name? Could this change harm our rankings? It's very important for us to know this.
A: You will also see fluctuations when you make that change, but afterwards it can be the same as before. So it's not that you automatically have an advantage or disadvantage from changing the URLs; it will just take time to settle down and be understood.
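For a restructuring like the one described (moving posts from the site root into a /blog/ folder), the usual way to let things settle is a one-to-one map of old URLs to new ones, served as 301 redirects. A minimal sketch, with hypothetical slugs:

```python
OLD_PREFIX = "/"        # posts currently live at the site root
NEW_PREFIX = "/blog/"   # target structure after the change

def redirect_map(slugs: list[str]) -> dict[str, str]:
    """Build an old-URL -> new-URL map to back 301 redirects."""
    return {OLD_PREFIX + slug: NEW_PREFIX + slug for slug in slugs}

# Hypothetical post slugs for illustration.
mapping = redirect_map(["camper-van-tips", "winter-packing-list"])
print(mapping["/camper-van-tips"])  # /blog/camper-van-tips
```

The server or CMS would then answer each old path with a 301 to its mapped target, so the value collected on the old URLs can be forwarded.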
A: So that's something where you could look at the queries that are going to your website. You could also do user studies with a broader batch of users, who maybe are not going to your website already, and based on that try to figure out what aspects you need to cover there.
A: It doesn't have to be the exact sentence, but if you're seeing that as a pattern of something significant with regards to traffic to your website, then you can assume people have this kind of a question and they're trying to find information on it. So it's less a matter of having to match that sentence one to one.
E: And BERT has nothing to do with that, or does it have something to do with that?
F: A small point of view here, since we generally try to figure out the same kind of thing: the search intent behind certain queries, in order to understand what kind of pages should be created or optimized. This is specifically for more general queries where, not even knowing the audience, you're not really sure what people are looking for.
F: So, for example, if somebody searches for something like progressive glasses, you're not sure if they're searching to buy something, you're not sure if they're searching to understand what they are, or searching for information. So I think a good way to go about it is basically to check the top 10 results and, so to speak, reverse-engineer what Google thinks the most relevant results should be, like maybe more informational results. Because if there are just informational results and you're trying to improve or optimize a commercial page, that page might not have much chance to rank, because it's not really relevant to the intent: the user is looking for more information about the subject, not necessarily to purchase anything. And that can change. I've noticed it, for example, around certain holidays like Black Friday and things like that.
F: The search intent might actually change a bit, and the search results might change to be a bit more commercial rather than informational, or something that's newsworthy might change that a bit as well. But that's the general idea: use what else is ranking in the top results to understand what kind of page you should create in order to make sure you're relevant for whatever the users are searching.
E: I also noticed that, depending on the search query, the tabs at the top of Google will change. Whether it's an image, a video, a map, whatever it is, that order will actually change based upon the query. So does that give me any insight into what information I should have, and in what hierarchy, on my page?
A: But you also need to keep in mind that you're optimizing for the current situation. Especially with a query like Black Friday: if in the summer you say, oh, I want to optimize for Black Friday, and you search for it, then you'll find informational content, whereas during the time that you actually care about, that informational content is probably not the right thing to focus on with your website.
G: John, a question about the SafeSearch filter, and content that I'm concerned is getting filtered by SafeSearch. I can see why, potentially, but I want to try and understand: is there a way to definitively tell if content is being filtered via SafeSearch? And if so, once we can determine that, how can I make changes to try and get around it?
A: The easy way to test if SafeSearch is triggering for that content is to just turn the SafeSearch filter on and off. So do something like a site: query, and then in the search URL at the top you can just add &safe=on, I think, or &safe=off, and based on that you'll see if the results are different. If there are no different results, then SafeSearch is essentially not affecting that site.
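That on/off check can be scripted: issue the same site: query twice with different values of the safe parameter and diff the URLs that come back. A sketch of the URL construction and the comparison; the parameter values follow John's (hedged) description and may differ from the current ones, and the two result lists here stand in for whatever your manual check or scraper returns:

```python
from urllib.parse import urlencode

def safesearch_query_urls(domain: str) -> tuple[str, str]:
    """Build the same site: query with SafeSearch forced on and off."""
    base = "https://www.google.com/search?"
    on = base + urlencode({"q": f"site:{domain}", "safe": "on"})
    off = base + urlencode({"q": f"site:{domain}", "safe": "off"})
    return on, off

def filtered_by_safesearch(results_on: list[str], results_off: list[str]) -> bool:
    # If turning the filter on removes URLs, SafeSearch is affecting the site.
    return set(results_off) - set(results_on) != set()

# Hypothetical result lists: one URL disappears when the filter is on.
print(filtered_by_safesearch(["a.html"], ["a.html", "b.html"]))  # True
```

Identical result sets under both settings mean SafeSearch is not the explanation for lost visibility.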
A: Yeah, sure. And with regards to changes that you make on your website, that is something where we have to reprocess the content, we have to re-understand it a little bit, so it takes a little bit longer than just recrawling.
A: Yeah, I mean, we associate the image with the landing page, and if there's something problematic in the image, then it's very likely that we'll just say, well, this pair of landing page and image is the thing that we have to filter for SafeSearch.
G: Got it. And then, I think earlier, at the end of last year, you had mentioned that if a site has been flagged as having adult content, it won't be eligible for rich results. Is that across the board, or on those pages specifically, or did I misinterpret that?
A: I think that's specific to the pages that are affected. With regards to SafeSearch, it's something where we try to do it as granularly as possible, but we can't always do that. So sometimes, if we have individual pages across the site that we think need to fall into the SafeSearch category, it's possible that our algorithms will just take the whole domain and say, we don't know for sure.
A: So that's something that plays in there as well. The individual pages are essentially what we try to focus on, but sometimes we don't know for sure whether those individual pages are okay or not. And I mean, it's not a quality assessment; it's just SafeSearch or not.
H: John, I think we talked about this in like 2014 or so, when all these female celebrities were being hacked. I had an entertainment site and we were doing legitimate reporting that so-and-so was hacked, so the words "naked photo" or "naked video" were there, and just by reporting on it, given the sheer number of those women who kept being hacked, we were getting bucketed. I think you said the systems were sort of accidentally bucketing us; not accidentally, but that's what they were seeing.
I: All right, hi John, I have a short and sweet question. Google earlier announced that the rollout date of the new page experience ranking signals, like LCP and FID and things like that, was May 2021.
A: My understanding is that this is a global rollout. It's something that we collect on a global scale, and there's no big reason not to do it on a global level if we can, so my understanding is it will be global. With regards to experiments before that, I don't know for sure. In particular, we want to do that badging of the search results when we see that a site is within the good page experience bucket, and usually badging is really hard to do, really hard to get right.
J: Hi there John, yeah, I work at a camper van rental company, and recently we have been facing a big problem that I've been losing a lot of time to over the last few months. Basically, in specific regions of the UK, we see that for specific queries that have kind of the same terms, the search intent is exactly the same. I'll give you an example: "camper van rental" and "campervan rental", written as one word.
J: We see that our competition has consistent results. They basically rank on the first page, and for the two queries they keep the same position, while our company, for "camper van rental", for example, is in the top 5-10 results, and for "campervan rental" we go to 60th or 80th position. I went deep into this problem and I checked how many times we use this word. I hate to go keyword by keyword, but I actually did it this time, and I see they don't use it.
A: Okay. I suspect what is happening there is that we don't see these words as being 100% synonyms, and that's something that our systems try to pick up automatically. It's not that we have, at least as far as I know, linguistic models that try to figure out the possible synonyms for these words. Rather, we see these as unique queries, and we try to understand what the entities behind them are and to match that as much as possible.
A: But probably we don't see them as 100% synonyms, otherwise the results would be exactly the same. That's something that usually just works itself out over time: when we have enough data to understand that people searching for the word written together mean exactly the same thing as people searching with separate words, then that's something we'll take into account and say, okay, this is more of a synonym rather than just very similar. With regards to optimizing for something like this, improving your rankings, I don't really have any super fancy tips there. It's more a matter of: if you're seeing that a lot of people are searching in one particular way, then maybe use that way on your website as well, so that we can find it appropriately across your site. And if you're seeing kind of a balance between those two ways...
A: It is kind of an awkward shift into the traditional SEO picture of, oh, they're just optimizing for all synonyms. Usually we figure that out fairly well, but sometimes it's more practical to say, well, Google will figure it out at some point, but I need to solve it for myself now, so I'll just do the awkward thing.
A: And there was a video, I think from fall 2019, from the Webmaster Conference in Mountain View, from Paul Haahr. Paul Haahr is one of the top quality engineers who works at Google on search quality, and he mentions quite a bit around synonyms and how we pick that up, how we understand which words belong together and which don't. It's not going to solve your problem, but it'll give you a little bit of background information on what might be happening there.
K: Hi John, our website has about 60,000 products and we have a rather robust layered navigation that uses a number of product attributes. We're preparing for a migration to a new system where our search parameters are going to change from the current system to the new one, and we were hoping to preserve some of our SEO value for many of those parameters, since they do appear in the search results pages.
A: Yeah, I think any time you do a migration on a URL level, ideally you would redirect the old URLs to the appropriate new ones. So if you're changing parameters, or the setup of how you use the parameters within your website, and you care about the old or the new URLs, then make sure that you have redirects set up, and with that we can pretty much pass all of the value along.
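For a parameter migration like this, the redirect layer typically rewrites each old parameter name to its new equivalent and 301s to the rebuilt URL. A minimal sketch, where the old and new parameter names are hypothetical stand-ins for whatever the two systems actually use:

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

# Hypothetical mapping from old filter parameters to the new system's names.
PARAM_MAP = {"colour": "color", "sz": "size"}

def migrate_url(old_url: str) -> str:
    """Rewrite old query parameters to the new scheme; use as the 301 target."""
    parts = urlsplit(old_url)
    # Keep parameter order and values; rename keys we know about.
    query = [(PARAM_MAP.get(k, k), v) for k, v in parse_qsl(parts.query)]
    return urlunsplit(parts._replace(query=urlencode(query)))

print(migrate_url("https://shop.example/boots?colour=red&sz=42"))
# https://shop.example/boots?color=red&size=42
```

Unknown parameters pass through unchanged, so the map only has to cover the attributes that are actually being renamed.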
A: No, I would not use... I mean, if it's something where you care about the traffic to the old URLs being forwarded to the new ones, then I would definitely make sure you have a redirect set up. Or, if you can't do a redirect, at least set the rel canonical to the appropriate new URL, so that we can forward any of the value collected with those old URLs.
A: I think some of them we might already have covered or gone through. The first one I have here is: our business is going through a rebrand and a system update, and we're planning on executing quite a hefty migration. Then they list a ton of different things that they're doing, like a domain migration, CMS migration, consolidating pages, updating content, updating the site structure, a template redesign, switching hosting providers. And the question is basically, how should we do this?
A: From my point of view, these are all pretty big changes, so it's something where you will definitely see fluctuations along the way. If you plan to do so many different changes because you think the end state will be important, then you could theoretically spread that out and do it one after another, but practically speaking that's probably not going to work, so practically you're probably going to have to bite the bullet and say, well...
A
Oh,
I
forgot
to
redirect
my
blog
and
now
everything
is
gone
and
with
regards
to
potentially
the
most
dangerous
thing
with
regards
to
the
rankings
there,
it's
important
to
keep
in
mind
that
you
will
see
changes
in
ranking,
at
least
during
the
migration.
You
will
see
different
rankings
afterwards
as
well.
A
The
individual
sites
got
before
and
assume
that
your
new
site
will
get
it
it's
possible
that
your
new
site
will
be
so
much
stronger
that
you'll
get
a
lot
more
traffic.
It's
also
possible
that
the
combination
that
you're
doing
essentially
results
in
one
site
rather
than
a
couple
of
sites,
and
that
you
might
lose
some
visibility
in
search.
A: Do you have any information regarding Web Stories, when they might be released in Australia? We want to be early adopters, so if we can get any estimation... I don't have any information on when the new features will launch, particularly Web Stories in Australia. With Web Stories it's important to keep in mind...
A: With regards to us turning on individual features in individual countries, sometimes that's a matter of working out the policies and the technical details. Sometimes it's also a matter of us being able to look into the data from that country and say, oh, lots of people are already implementing this structured data; maybe we should just turn it on there.
A: Please detail the difference between a doorway page and a landing page, and why doorway pages are frowned upon. So, I think we have a Help Center article about this, but usually doorway pages are individual pages where you're just funneling people to the actual part of your website that you care about. My assumption is that, for the most part, our systems are able to understand these doorway pages better nowadays anyway, so that when we look at these doorway pages we see: oh, this is kind of a low-quality page.
A: It mentions these keywords, but it's kind of a low-quality page, so we will probably not show it as visibly in search anyway. But essentially the idea is that, instead of setting up a number of different pages for just variations of different keywords, make one really strong page that covers that topic really well, so that our systems can recognize: this is a fantastic page. And appropriately we can understand: well, this is part of this bigger website, which is also really fantastic, and that helps us to better rank...
A: ...those pages overall. Will adding History API functionality to infinite scroll pages be enough, or do I need to add physical pagination links on top of this? Yes, if you're using infinite scroll, then having paginated links is very important for us, because then we can crawl and index those paginated pages separately.
A: The History API is something we can sometimes pick up on, but often it relies on the user doing something specific on a page, like scrolling to a specific location, and Googlebot in general, when rendering a page, does not trigger any of these actions; rather, it renders the page in kind of a really long viewport, and maybe that will trigger your History API or not. Whereas if you have links to the individual paginated versions of those pages, then for sure we can figure out that these are the different pages.
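In practice that means each chunk the infinite scroll loads should also exist at its own URL, with plain anchor links pointing at those URLs, so Googlebot can crawl page 2, page 3 and so on without ever scrolling. A sketch of generating those component-page URLs; the category path and the ?page= parameter name are hypothetical:

```python
from math import ceil

def paginated_urls(base: str, total_items: int, per_page: int) -> list[str]:
    """Component-page URLs backing an infinite-scroll listing."""
    pages = max(1, ceil(total_items / per_page))
    # Page 1 is the base URL itself; later chunks get an explicit parameter
    # that the server can render as a standalone, crawlable page.
    return [base] + [f"{base}?page={n}" for n in range(2, pages + 1)]

# Hypothetical blog category: 45 posts, 20 per chunk -> 3 component pages.
print(paginated_urls("https://example.com/blog/", 45, 20))
```

The infinite-scroll script can still use the History API to push these same URLs as the user scrolls, so the scrolled view and the crawlable pages stay in sync.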
A: The other thing to keep in mind is that sometimes we don't need to trigger infinite scroll on a website. In particular, if you have something like a blog with a category page and you use infinite scroll there: if we can already find all of those blog posts, then we don't really need to trigger the infinite scroll.
A: So it's not something where you'd have to say, well, if we use infinite scroll, we must make sure that it's crawlable. Rather, it's a good practice, and if you care about those pages being indexed, then sure, you should watch out to make sure that they're crawlable. If you don't care, that's totally up to you.
F
John,
since
you
mentioned
earlier
regarding
pagination,
I
know
there's
an
old
video
from
miley,
oh
hey,
regarding
what
kind
of
options
you
have
in
terms
of
pagination,
basically
the
view
all
page
or
how
it
was
then
run
next
real
prev,
and
she
mentions
in
the
video
that
if
google
does
find
the
view
all
page
on
the
site,
it
will
usually
prefer
to
show
that
page
in
the
search
results.
F: So obviously, if you don't want that, then you can use the normal pagination and Google will figure it out on its own. What I noticed, for a couple of our clients where we have a view-all page, is that it may simply be there because the platform allows you to see all of the results, and there's a parameter like page=0 or something like that.
F: Google will actually use that, index that and rank that, despite us not wanting it to. It has a canonical to the non-parameter version, but Google simply seems to be ignoring it, and I wonder whether it is because Google figured out: oh, this is a view-all page, and that's a strong signal for us to use it despite the canonical pointing somewhere else.
A: Yeah, I don't know what we do with the view-all pages at the moment. My guess is that you might be linking to that view-all page from all of the pages within the paginated set, and then we'll say: oh, this must be a really important page; we should focus on it instead of the individual paginated versions.
A: That might be where we're picking it up, but that then depends on: well, do you want to keep a view-all page or not? If you don't want to keep a view-all page, then obviously don't link to it, and then we'll work that out. If you do want to link to it, then maybe we will end up using it.
F: Yeah, and it looks like we're linking to it. I thought it was just a dropdown with no actual anchor links, like HTML tags, but it looks like they are HTML tags, so yeah, that seems to make sense. So removing it might lead Google to drop it and use the... okay.
A: No, yeah, cool. And then a question about backlink badges. I don't know, this feels like something a little bit older, but the question is: a few days ago I invited members of my website to use an HTML snippet that shows a badge on their websites, linking back to their profile page on my website.
A: That sounds essentially okay. I mean, that's something where, if there's really no value exchange associated with that link, in terms of: you must link to my website with this badge in order to gain access to these features, or whatever, then that's perfectly fine. Some sites do this in other ways; for example, they'll have, I don't know, a link on a page that says "link to this page", and you click on that.
A: That's essentially the important part. Then there's another question about CLS and PageSpeed Insights and the field data there; I think we talked about that a little bit beforehand as well. It also goes into the difference between the field data that you see in PageSpeed Insights and the data in Search Console, so I'll definitely pass this on to the Search Console team to take a look at. If you have any examples, feel free to drop them in the chat or send them my way.
A: So we would not see that there's a link associated with another URL within your website. In that regard, if you want to use something that looks like a button for internal navigation, then I would use normal HTML links and just style them with CSS to make them look like a button, rather than using button elements in HTML and then adding JavaScript.
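The difference can be sketched as two ways of generating the same-looking control; only the first leaves an href for a crawler to follow. A minimal sketch that renders both variants as strings (the class name and target path are hypothetical; the button styling itself would live in CSS):

```python
def crawlable_button(href: str, label: str) -> str:
    # A normal link styled as a button: crawlers see the href.
    return f'<a href="{href}" class="btn">{label}</a>'

def javascript_button(href: str, label: str) -> str:
    # Anti-pattern: a button plus script navigation leaves no <a href>
    # for Googlebot to pick up as an internal link.
    return f'<button onclick="location.href=\'{href}\'">{label}</button>'

print(crawlable_button("/categories/", "Browse categories"))
```

Both render as a clickable button for users; only the anchor version counts as an internal link.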
A: A question about internationalization: we have the same version of the website, in English, for different countries, and we use hreflang. I think this is similar to the question we had at the beginning. Currently we're using different ccTLDs for the individual countries and we'd like to move to a generic domain with subdirectories. And yeah, like I mentioned, we support both of those setups with regards to international websites, as long as the main domain is really a generic top-level domain. The question here mentions .io.
A: I believe that .io is theoretically a country-code top-level domain. It might be that we treat .io as a generic top-level domain; you can look that up in our help documentation. But I believe by default that's a country-code top-level domain, and for country-code top-level domains you would not be able to set geotargeting.
L: Okay, so I just wanted to understand: say there is one page on the main domain, and we have decided to go with ccTLDs, or to create separate country pages for the same page. In that case, obviously, the older page had a lot of backlinks, but the new pages won't have those backlinks.
L: In that case, how does Google deal with backlinks to the main page if we have now expanded to different country-level pages? So, for example, I have right now onepagent.com, and we have decided to go ahead with three separate countries, because we are getting a good number of leads from them. So we have planned to go ahead with separate country pages for those countries, like /pk for Pakistan and /in for India.
A: Yeah, so if you create new pages and they're not linked internally, then we would see them as kind of separate pages.
A: With regards to hreflang, what would happen there is: if we understand that these pages are part of a set, and we see a user from a different country doing a search where we would show that page in the search results, then with hreflang we would swap it out. But by default, if you expand your website to include more countries, then you're, in a sense, diluting the value of your pages by spreading it across multiple country versions.
A: So you really have to keep in mind that you should also focus on those country versions; you shouldn't just say, oh, I can make like 50 different country versions of my website now. And you shouldn't assume that copying everything onto 50 different versions will make it more visible; it might even be less visible, because you're diluting the value across all of those.
A: So if you're adding individual pages, then I would definitely make sure that you have normal HTML links between the existing pages and the individual country pages, so that we can understand how those new pages map within your website. And hreflang helps us a little bit, but hreflang is not the same as a normal link on a page.
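For a country set like the /pk and /in example, each version of a page lists every alternate (itself included) plus an x-default for users who match none of them. A sketch of generating those link elements for the subdirectory setup discussed here; the domain, locale codes and paths are hypothetical:

```python
def hreflang_tags(base: str, locales: dict[str, str]) -> list[str]:
    """Generate rel="alternate" hreflang links for one page across countries."""
    tags = [
        f'<link rel="alternate" hreflang="{code}" href="{base}{path}" />'
        for code, path in locales.items()
    ]
    # x-default catches searchers who match none of the listed locales.
    tags.append(f'<link rel="alternate" hreflang="x-default" href="{base}/" />')
    return tags

# Hypothetical English variants per country, mirroring the /pk and /in example.
tags = hreflang_tags("https://example.com", {"en-pk": "/pk/", "en-in": "/in/"})
print("\n".join(tags))
```

The same set of tags has to appear on every page in the group; as John notes, this annotation lets Google swap the right version into the results, but it is not a substitute for normal internal links.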
L: So, in that case, internal linking is fine, we will be doing it, and the hreflang tag we will also be doing. My only concern was that the older page was getting backlinks from India, and now there is a new India page targeted at the India market. In that case, will the hreflang tag swap all external backlinks to the new page?
A: It depends on what you do. If you take that existing page and you link to your new India page, then we can forward some of that value. It's not the same as redirecting, or as posting everything on the existing page, but it does spread some of that value. So it's not the case that you have to work to get individual links to all variations of all of your pages; rather, that value spreads naturally with the internal linking of your site.
A: So, I feel like this question is asked a little bit pointedly, and I guess the quick and easy answer is yes: if you're paying people to create content with links, then you're paying people for those links, right? And if you're paying for links, then that would be something that's against our Webmaster Guidelines.
A: So that's kind of the easy answer there. Of course, if these links do not pass PageRank, if they have nofollow attached to them, or rel=sponsored attached to them, then that can be fine; that's essentially a way of advertising your website.
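The qualifying attributes John mentions go on the link itself. A sketch of generating such an anchor (for a paid placement, or the earlier badge scenario) with rel values that tell Google the link is paid and should not pass PageRank; the URL and text are placeholders:

```python
def sponsored_link(href: str, text: str) -> str:
    # rel="sponsored" marks a paid or advertising placement;
    # "nofollow" is the older hint that the link should not pass PageRank.
    # Google accepts either, or both together.
    return f'<a href="{href}" rel="sponsored nofollow">{text}</a>'

print(sponsored_link("https://example.com/profile/123", "Member badge"))
```

With these rel values in place, paying for the placement is treated as advertising rather than link buying.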
A: So maybe that helps a little bit. Then a question (I think there are two similar ones like this) with regards to structured data types for tourism companies. On the schema.org website there are different suggestions for the travel sector, such as TouristTrip, TravelAgency or Trip. Are the schema types mentioned above supported or relevant for Google indexing and ranking? If not, what would be the most suitable structured data type for a trip landing page?
A: So in particular, we support a subset of the functionality, or of the different types of markup, from schema.org, and those are the things that we would show in the search results. Everything else is essentially, I don't know, more of a nice-to-have, something where you're adding a little bit of extra information to your pages, but I would not expect to see any change in search because of it. And I think that applies to all three of those types.
A: So what I would recommend doing is going to our Search developer documentation, looking at the rich result types that we have listed there, trying to work out which of those types map to the kind of content that you have, and then focusing on those types first, so that you have the visible effect nailed down.
A
And you know that if you spend time on that specific kind of markup, you will have a visible effect in search. Then, if you want to go further and provide additional information through other schema.org types, you're welcome to do that. I would just assume by default that you would not see any effect in search at all.
A
So there is a slight sense of, well, we understand the pages a little bit better with more types of structured data on the page, but I would not assume that that's something that would have a visible effect in terms of ranking or better visibility in search. So that's kind of the direction I would go there.
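The recommendation above points toward types that have a documented rich result. As a hedged illustration, a trip landing page that publishes editorial content could use Article markup, which Google's documentation does cover, rather than TouristTrip or Trip; all values below are placeholders, not a real page:

```python
import json

# Sketch: JSON-LD for the Article rich result type. Every value here is
# an invented placeholder for illustration only.
article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "A Weekend Trip Through the Alps",
    "datePublished": "2021-01-15T08:00:00+01:00",
    "dateModified": "2021-01-15T09:30:00+01:00",
    "author": {"@type": "Organization", "name": "Example Travel Co."},
}

# This JSON would be embedded in the page inside a script tag:
print('<script type="application/ld+json">')
print(json.dumps(article, indent=2))
print("</script>")
```

Other schema.org types could still be added alongside this, with the expectation (per the answer above) that they would not change anything visible in search.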
H
John, I saw some tweets this week with you and Gary about an old subject, but it's still a favorite: domain names. You know, with an old domain name, whether you're going to have problems with it down the road, because you bought one that had, you know, bad content, bad links directed to it. I mean, the only way I know of is to look through the Wayback Machine, and that will give you a sense, okay, maybe it was parked or maybe it had content. But could you ever really tell if it had links?
A
No, I mean, to do this evaluation you'd have to have the site verified in Search Console. If you have it verified, then you'll also see links in Search Console, so that helps a little bit. If you're still at the stage where you're unsure if you should buy that domain name or not, then maybe it makes sense to look at some of the third-party link checking tools that are out there.
A
I think there are a number of providers that essentially crawl the web similar to how we do it and provide some aggregate views of those domains, and usually you'll see that it matches a little bit what you would see in the Wayback Machine, where you see, like, okay, this is, I don't know, a small business, and the links that you would find are probably the kind of links that a small business might have; or maybe it was a casino or some crazy affiliate site.
M
Cool, hi John, I'll just jump in. So I do have a question about how Google is treating the date modified for certain articles. We've tested live blog posting and we've noticed that Google consistently either ignores or cannot correctly detect our last modified date.
M
So what we tend to see is, we've updated the post, let's say, five minutes ago, and Google is still showing in the search results that it was updated or published seven hours ago or four hours ago, something like that. So I think this puts us at a competitive disadvantage. We do have the proper markup in schema, our front end does reflect the correct last-updated timestamp, and it's also in our sitemaps. I'm curious if you have any insights as to how we can resolve that.
A
I'd almost need to see some examples where we get that wrong, so that I could pass that on to the team. So one thing that's important is that we're able to recognize the same date on the page as well.
A
So if you have a date set in the Article structured data for the modification date, then that should be something that's also visible on the page as well. So that's one thing to keep in mind. The other thing is to watch out for time zones. That's something that I've sometimes seen, where people will specify a date and a time on the site and we're not sure which time zone that refers to, and then we can't match that correctly to the structured data on the page.
A
So that's another aspect that sometimes plays in there. And I guess the third thing that I've sometimes seen is, sometimes we get it wrong in terms of, maybe we'll show the proper date in the normal search results, and then in the, what is it, I think the top stories, or whatever the different news blocks that we have are, sometimes we'll show a date that is several hours off. And I believe we fixed that.
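On the time-zone point, a quick sketch of the difference between an unambiguous and an ambiguous timestamp, using Python's datetime module; the +01:00 offset (e.g. Zurich in winter) and the date itself are just illustrative choices:

```python
from datetime import datetime, timedelta, timezone

# An unambiguous dateModified value: ISO 8601 with an explicit UTC offset.
tz = timezone(timedelta(hours=1))
modified = datetime(2021, 1, 15, 9, 30, tzinfo=tz)
print(modified.isoformat())  # 2021-01-15T09:30:00+01:00

# An ambiguous value, by contrast, omits the offset entirely, leaving the
# time zone open to interpretation:
naive = datetime(2021, 1, 15, 9, 30)
print(naive.isoformat())  # 2021-01-15T09:30:00
```

Publishing the first form in both the structured data and the visible page text removes the guesswork described above.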
M
Another question on the same topic: our live version also has this AMP component, right, but for some reason our AMP cache doesn't update with our current updates. We've tried purging manually, we've tried it over the API.
M
It should work alongside the updates on the non-AMP page, but for some reason this is not, at least, being accepted by Google. So I'm curious if you have any thoughts on that as well.
A
I don't know how that is connected. My general answer there would be to use the API to request those cache updates, but it sounds like you're already doing that, so I don't know what might need to be done there. Additionally, is that specific to pages where you're updating things on the page, kind of like the live blog situation?
M
To the live blog. So for other pages, we can re-request a purge and it will update. It may take some time, but it will update. But for this live blog, for some reason, it doesn't take it. It does give us a 200 response and everything, but we don't see any kind of changes on the AMP cache.
N
Hi John, if I may jump in?
A
Sure.
N
I have a problem with a site and submitting the sitemap in Google Search Console. When I try to submit the sitemap, Search Console gives me a generic HTTP error, and when I have a look at the server log files, I can see that Googlebot can fetch the sitemap with a 200 status code. I can also load the sitemap from a web browser, and using all the other tools the sitemap is structurally correct.
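As a reference point for what "structurally correct" means here, this is a minimal sketch of a sitemap of the kind being described, built with Python's standard library; the URLs and dates are placeholders:

```python
import xml.etree.ElementTree as ET

# Sketch: a minimal, structurally valid XML sitemap per the sitemaps.org
# protocol. All URLs and dates below are invented placeholders.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)

urlset = ET.Element(f"{{{NS}}}urlset")
for loc, lastmod in [
    ("https://example.com/", "2021-01-15"),
    ("https://example.com/blog/", "2021-01-14"),
]:
    url = ET.SubElement(urlset, f"{{{NS}}}url")
    ET.SubElement(url, f"{{{NS}}}loc").text = loc
    ET.SubElement(url, f"{{{NS}}}lastmod").text = lastmod

xml = ET.tostring(urlset, encoding="unicode")
print(xml)
```

If a file like this parses cleanly and Googlebot fetches it with a 200, the symptom described would point away from the sitemap itself and toward something on the Search Console submission side.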
A
Okay, let me take a break here and just pause the recording. You're welcome to hang around a little bit longer as well, if you'd like to, but just so that we have a reasonable length for the recording: thank you all for joining in, and thanks for all of the many questions that were submitted and that were asked here.
A
If there's something on your mind that you still need to find an answer for, I'd recommend going to the Search Central help forum and asking the folks there. The people there are pretty smart and have a lot of experience with websites, so they can generally help you along, and they can also escalate issues if something pops up that is completely weird or different.
A
So maybe that'll help. I'll set up more of these office-hours hangouts as well, so if you want to join in one of the future episodes, watch out for, I think, the YouTube community page, which is where we list them all, and they're also on the event calendar on the Search Central site.