From YouTube: English Google SEO office-hours from December 24, 2021
Description
This is a recording of the Google SEO office-hours hangout from December 24, 2021. These sessions are open to anything search & website-owner related, like crawling, indexing, mobile sites, internationalization, duplicate content, Sitemaps, Search Console, pagination, multi-lingual/multi-regional sites, etc.
Find out more at https://goo.gle/seo-oh-en
Feel free to join us - we welcome folks of all levels!
A
All right, welcome everyone to today's Google Search Central SEO office-hours hangout. My name is John Mueller. I'm a Search Advocate on the Search Relations team here at Google in Switzerland, and part of what we do are these office-hours sessions where people can jump in and ask their questions around SEO.
A
Even on Christmas Eve. I guess it's cool to have so many of you here, and we have a bunch of questions submitted as well, but people are already jumping in to raise their hands, so maybe we'll get started. Wow, it's like almost everyone's raising their hands, so crazy. Okay, let's see, Eve, I think you're first on my list.
B
Hi John, hi, thanks for your time. A question in regards to paywall data, or paywalled content.
B
Is it considered cloaking if, for example, I only show this when I check whether it's Googlebot, and only then show the structured data and the paywalled data, but then to the regular user, in order to really, like, properly hide it, I don't show the data at all and I don't show the structured data? Or is that fine?
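The answer to this question isn't captured in the recording (it is only referenced later as "I think we covered that"), but for reference, Google's documented pattern for paywalled content is to serve the same content and structured data to users and Googlebot, and to flag the gated section in the markup rather than showing it only to the bot. A minimal sketch, assuming a NewsArticle with a hypothetical `.paywalled-section` CSS class:

```json
{
  "@context": "https://schema.org",
  "@type": "NewsArticle",
  "headline": "Example paywalled article",
  "isAccessibleForFree": "False",
  "hasPart": {
    "@type": "WebPageElement",
    "isAccessibleForFree": "False",
    "cssSelector": ".paywalled-section"
  }
}
```

The `isAccessibleForFree` and `hasPart` properties tell Google which part of the page is behind the paywall, so the gating is declared openly instead of being varied by user agent.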
A
With... let's see, Vajid, I think.
C
Nice to meet you. I have two questions. First, I want to ask about the DMCA report, the Digital Millennium Copyright Act of the United States. I want to know: if someone copies text content from my site, can I report them with a DMCA to remove their site from Google search results?
A
Maybe. So the reason I say maybe is because this is essentially a legal process, and I can't give you legal advice. So it's something where, if you feel that this is the appropriate way to handle this, if you check with your lawyers that everything is okay in that regard, then feel free to do that. The DMCA process is such that it's tied to individual pages, though, so not the whole website.
C
That's great. Another question that I have is about indexing. Recently, me and a large community of webmasters that I know, almost all of us have problems with getting new pages indexed by Google. I'm gonna ask you: I published high-quality content, I submitted a sitemap, and I sometimes request indexing from Google Search Console, but I still have problems with getting new content indexed, or it's indexed with a delay. So I'm gonna ask you, have you ever heard of this case recently? Is it a bug from Google, or is it a new algorithm update?
A
I mean, these kinds of questions come up every now and then, so, in that regard...
A
Yes, I have heard about these questions, but as far as I know, there's no bug on our side in that regard. In the sense that, I think the tricky part is, we just don't index all content, and some websites generate a lot of content. And if we don't index everything, from our point of view that can be okay, but maybe you want everything indexed, and we can't do everything all the time.
A
The tricky part, I think, is that in the past it was such that a lot of websites were technically not that great, so it was a little bit clearer which kind of content did not get indexed. Nowadays websites are technically okay, and it's almost like the quality bar is a little bit higher, in the sense that anyone can publish something that theoretically could get indexed, but we still have to make sure that we're indexing the right things that are actually useful and relevant for users.
D
People speak very highly of you. I have an AMP question.
D
Our developer said that some pages are not being indexed because we're using unsupported HTML tags and attributes. But only our blog is in AMP, not our normal site, and so I think on the page, some of the normal menu bars, header, footer and that sort of stuff is non-AMP, or uses some stuff that isn't supported.
D
So can that mean that the whole page won't be indexed? Or is there a best practice to create a different header and footer and different surroundings, so that there are two versions of the page? In the same way that we do for React JavaScript, we create two versions of the page so that there's a kind of backup. Do you do that for AMP?
A
So I'd like to just say "it depends," but anyway, no. The thing is: if it's not a valid AMP page, then we can't process it with the AMP cache and all of that. Which means, if you have kind of a traditional HTML page and the AMP version of that page, what would happen is we would just index the traditional HTML version of the page and use that. So from kind of an indexing point of view, that would be fine.
A
So essentially it's an error in the AMP page, and we won't be able to give you kind of like those advantages of the AMP cache and all of that, but it wouldn't affect the rest of Search. So it's not a big problem.
A
Yes, I passed it on, and they're also kind of torn in terms of, like, is it working as intended or not. Because, I mean, essentially, I think it was like, what was it, Adam and Eve?
A
And the query, those are the kinds of queries where people probably wouldn't be expecting a sex toy website. So, from our point of view, it's something where we'd say, well, it can happen that our systems try to fix that and then wouldn't show that automatically.
A
So from our point of view, I would not assume that that would change immediately. It's possible that at some point over time that would, I don't know, shift back, but it's definitely not something where the team, when they looked at it, said: oh, this is a bug on our side.
H
Can I just add something? Because I had a look at that as well last week, and I had a chat with Axel about this. The funny thing is that what we see is, we do see a Canadian website with the same title, which is also kind of like 18-plus related, which does rank. So it's kind of confusing why, in France, a French website won't rank when a Canadian website does, which is, you know, kind of related as far as topics are concerned.
A
Yeah, I mean, if you point it out like that, I would assume the SafeSearch team would say, like, okay, then we need to fix the Canadian website too. But I don't think they would look at that and say: oh well, this other sex toy site is ranking, therefore we should just show all sex toy sites. I don't see that happening. And so, I think, I don't know, at some point in the past there was a similar website.
A
What was it, I think, Jack and Jill, something like that, which is a nursery rhyme, not like a Bible story. But they ran into similar issues where people are searching for "jack and jill" and they kind of expect a nursery rhyme, or, I don't know, almost like kids' information, and running across an adult website, that's kind of unexpected, and something that, from our point of view, we would almost see as a bug in our search quality.
A
So, from my feeling, I don't see that changing. It's possible that at some point we would try to figure out, like, oh, this is someone who's explicitly looking for sex toys, and we can recognize that somehow, and then maybe we would be able to show that. But otherwise I don't see that being treated as a bug on our side.
A
I mean, it is, from our point of view, it's not that it's tied to any specific site. It's not that the webspam team went in and manually said, your site should not rank for this query. It's really essentially our SafeSearch algorithms trying to figure out what is relevant to people, what is kind of like not surprising, in a negative way, for people who are not expecting those kinds of things.
A
All right, Christian.
I
Hi, good morning again. I was a little bit surprised to see that the webmaster hangout is taking place today, so thanks a lot for that, and for all the other hangouts as well, John. Thanks, cool. And yeah, I figured out a question, and it's about the product reviews update. I was seeing that, even though the update only affects English-speaking websites, I was seeing some movements in German search as well. So I was wondering if there could also be an effect on websites in other languages from this product reviews update, or of any kind, because we had lots of, yeah, movement and volatility in the last weeks. And so, yeah, my question is: is it possible that the product reviews update affects other sites as well?
A
I don't know, like, other languages. My assumption was this was global and across all languages, but I don't know what we announced in the blog post specifically. Usually we try to push the engineering team to make a decision on that, so that we can document it properly in the blog post. I don't know if that happened with the product reviews update; I don't recall the complete blog post. But it's, from my point of view...
I
Yeah, I was just checking to be sure, and yeah, the blog post says it's only English websites, English-language websites. That's why I was so surprised. But yeah, good to know anyway, and even if it's only targeted at English-speaking websites, you say it could affect other sites as well, sooner or later. Okay.
J
I got one question about localization: do you know any other ways to localize the same set of pages for different English-speaking countries? Like, right now we have several subdomains on a generic top-level domain, like maybe Australia and New Zealand subdomains, and we have set the country in the GSC backend and also use hreflang on the page level. But other than that, we couldn't figure out some other ways to help us localize these subdomains.
A
Yeah, so I think you covered the main ones: that's geotargeting in Search Console and the hreflang settings. Geotargeting works on kind of a subdirectory or subdomain level, so it's all pages in there. Hreflang is on a per-page basis. So if you have kind of like a home page for one country, and different product pages for the same country, then each of those pages would need to be cross-linked with hreflang.
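As a sketch of that per-page cross-linking (the subdomain names and URLs here are hypothetical, chosen to match the Australia/New Zealand example): each country version lists itself and all of its alternates in the head of the page.

```html
<!-- On https://au.example.com/product (hypothetical URLs for illustration) -->
<link rel="alternate" hreflang="en-au" href="https://au.example.com/product" />
<link rel="alternate" hreflang="en-nz" href="https://nz.example.com/product" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/product" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/product" />
```

The same set of annotations has to appear on every version listed, each pointing back at the others; a one-way annotation is ignored.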
A
Usually that's something like a JavaScript-based banner that you can show when you recognize that the user is on the wrong version of a site. So, for example, if a user from Australia ends up on the page from, I don't know, England, then you could show a JavaScript banner saying, like, hey, we have an Australian version of this page, here you can go there directly.
A
The advantage of a JavaScript-based banner is that you can block it with robots.txt so that, from an indexing point of view, it doesn't show up. And if you don't automatically redirect, then essentially all of the search engine side of things, they will be able to process those two versions independently.
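A minimal sketch of that setup, assuming the banner logic lives in a standalone script (the path is hypothetical): disallowing the script keeps crawlers from rendering the banner, while regular browsers still load it.

```
# robots.txt at the site root; /assets/geo-banner.js is a hypothetical path
User-agent: *
Disallow: /assets/geo-banner.js
```

Users get the country-suggestion banner, while a crawler that can't fetch the blocked script simply indexes each country version as-is, with no redirect in the way.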
A
So, for example, if you have a page for New Zealand and Australia, and the whole content is the same, and the only thing that's slightly different is the currency on the page, then it can happen that we fold those pages together and we pick one of them as the canonical, and we use that as the basis for Search.
A
I don't think you can, and you generally don't need to, because we will still process the hreflang across those versions and we will show the right URL in the search results. It's just that the content that's indexed is from that one canonical version, and if the content is the same, then we just need that content once. So the only really confusing part in that situation is that Search Console reports on the canonical URL, and that makes it really hard to recognize when the Australian page is shown.
A
I don't think it's optimal in all cases, and in particular for things like the home page of the site, or important pages that are site-specific, I would try to make sure that you have different content there, so that they're really indexed independently and they can rank for the independent versions of their content. So kind of like those pages, I think, are really critical to be clearly separate. But for products and categories and things like that, I don't really think it matters that much if the...
E
Hi John, good morning. Thanks for the Google search forum. Actually, I have a few questions; I also asked last week. My first question is that my website has millions of pages, like category, subcategory and product e-commerce kinds of pages. So we have added dynamic content, because with millions of pages it's very difficult to add separate content, or maybe unique content, on each page.
A
I don't know; it's hard to say just based on what you described. To me it feels a little bit like you're artificially just adding keywords and content to pages, and doing that artificially in the sense of, oh, we hope search engines pick up these keywords that we drop. I don't think that's a good idea. But dynamically adding relevant content to a page, that can make sense, because dynamically adding content is essentially just doing, I don't know, a database lookup and adding content based on that, so that's usually normal.
A
So, I don't know, it really kind of depends on how you have that set up, and the main thing I would really avoid is that you run into a situation where you're artificially adding content to a page just in the hope that this page ranks better for the keywords that you artificially add. Because then the page itself will contain this artificial content, and when users go there, they'll be like: why are these random keywords on this page?
A
It doesn't make any sense. So from that point of view, making sure that you actually have good, relevant content for those keywords, that's more what I would focus on, and not so much that you just find a way to inject a bunch of keywords into a page.
A
I mean, it's not that you have to write a book for every page, but it should be something on the page that is relevant. And if it's a category page, then the products that you have listed there are very relevant, for example, and usually you have a description of that category; that description is very relevant. But it's not that you have to write a Wikipedia article at the bottom about all of these products and where they come from and everything like that.
A
It shouldn't be, no, because for Search we do rendering and we process the JavaScript on the pages. So if it's visible in a normal browser and you're not doing anything particularly bad, then we will be able to index those pages normally. You can double-check with the Inspect URL tool in Search Console to see if the content is actually visible when Googlebot tries to render the page, and if the content is visible, then you should be all set.
A
I think, like, a contact-us page always makes sense, even for a global business. The Google My Business, or Google Business Profiles, that's something that's usually more tied to a local business location where people can go. But if you have a local business location where people can go, then I would definitely set that up. I mean, it's free to set up, and it's helpful for, I don't know, getting people to come in person, at least.
A
Usually not. I think there are two main reasons for that. On the one hand, it's very easy to end up in a situation where you have another million URLs that are just different searches, which doesn't provide any value to you. So kind of this, we call it an infinite space.
A
That's kind of a problem, that's something you want to avoid. And the other thing you want to avoid is that people do spammy things in the search box and try to get those things indexed. Which could be something like: they're searching for their phone number and, I don't know, the business type that they have, and then suddenly your website's search page ranks for that kind of business and shows their phone number.
A
Cool. Let me go through some of the submitted questions, and I'll get back to the others here who have their hands raised as well towards the end, just to make sure that we cover some of the submitted stuff too. The first one I have on my list is: would an SEO firm be classified as a "your money or your life" website, or is that only related to medical and financial advice websites?
A
It's an interesting question, but I don't think SEO websites are that critical to people's lives. Obviously, if you work at an SEO company, then you're kind of tied to that, but it's not that the website itself is a "your money or your life" type website. So from that point of view, I personally wouldn't see it as fitting, and not every website that sells something falls into this category.
A
But what I would recommend here, rather than kind of blindly trying to see whether this type of website falls into this specific category, is to read up on where this category came from, namely the quality rater guidelines, and to kind of understand a bit more what Google is trying to do with understanding these different types of websites. Because usually that will give you a little bit more background information on what is actually happening.
A
Then: at the beginning of December we migrated a CMS site. The site averaged 50,000 sessions per day. It didn't have AMP, and it had a lot of performance and SEO problems. The migration consisted of making the respective 301 redirects, going to a full AMP format. And it goes on, and basically says that since the migration, the website has basically lost all traffic, or a lot of traffic. So I took a quick look at this before.
A
Usually what I do with these kinds of questions is go to our internal tools, where I see something similar to what you would see in Search Console, and I try to look at a period before the change and after the change, in particular with regards to the overall visibility that the site gets, so the impressions. And sometimes that kind of matches when a site actually makes technical changes, and sometimes it doesn't really match. And one of the things that I noticed when looking at this site in particular is that it used to rank for a lot of terms where I would consider it not to be that relevant.
A
So, from my point of view, that's something where essentially this website got a lot of traffic for queries where I would say it could be considered almost like a bug on our side that this happened. And obviously we show a number of different sites for these kinds of queries; it's not that we just show that one official site for that.
A
But it's still something where I would say, if you're not WhatsApp and you're suddenly no longer ranking for queries like "how to log into whatsapp," then, from that point of view, it's not really a bug on our side that this has changed. And sometimes these kinds of queries can drive a lot of impressions and, with them, probably a lot of traffic as well.
A
So if you just look at the overall traffic to a website, it can look like there was a significant change. But if you look at the actual, kind of, relevant traffic for a website (I think this is something like a news-type website)...
A
If you look at the traffic that is relevant for this kind of website, then that looks like it's actually pretty stable. And it's sometimes tricky to figure out which part of the traffic that you're seeing, or the impressions that you're seeing, is actually relevant to a website. And it's hard to also have a high-level view of the website while trying to ignore those less relevant things.
A
So, for example, we saw this with our own Search documentation, where we worked together with the SEO folks to try to de-emphasize specific queries. In particular, I think it was the query, I don't know, "google video" or something like that, where lots of people were searching for "google video" and we had a page on video structured data for Google, which technically kind of matches those keywords, but practically it didn't really make sense for those users.
A
So when we looked at the overall impressions for our site, we saw this fairly high number, but actually that wasn't really useful for us as a metric, because it contained this big segment of traffic for things that are kind of irrelevant for our site. And that, I think, is a bit tricky, and means that when you recognize this kind of situation, you can't just look at the overall traffic to a website and say: oh, this is a metric that should remain stable or go up.
A
So, from my point of view, looking at this specific site, I didn't see anything technically incorrect with regards to this migration. It's essentially just our search systems trying to figure out what we should really show for queries like this, and maybe it's not that particular site, and probably it's not really a bug that we would not show that. And my feeling is that the timing here was almost more accidental with regards to the technical changes that were made, and it's something that would have happened anyway over time.
A
Let's see, the structured data for paywalled content, I think we covered that. Does Google have someone tasked with the job of recording each update in a book and noting which aspects of Search are impacted? If so, is it called Google's Book of Secrets? Yeah, I don't know, Alan, I don't really have a good answer for you.
A
So even if we gave you this kind of logbook of Google's search changes, I think it's really hard to turn that into something useful and say: well, now that I know what Google has changed, I will always be updating my site to match that. Because you can't really do the reverse of that.
A
So that's kind of tricky, and the other aspect is, like, you want this to be tracked in Excel, which, I don't know, doesn't seem like a good idea. When it comes to breadcrumb structured data, does it have to be exactly the same as the breadcrumbs that a visitor would see on a page? I sometimes see a condensed version of breadcrumbs on the page, while the structured data is a complete breadcrumb path. Are both acceptable options?
A
So, if you're doing something like showing a shorter version of a breadcrumb on a page, and we can't match that, then it might be a bit hit and miss whether we actually pick up that breadcrumb markup and use it. If you're taking individual crumbs, or, I don't know, the individual items in the breadcrumb list, and you're just showing some of those but not all of them, then it might be that we just pick up those, or it might be that we still pick up the rest, because we see that overall a lot of the breadcrumb matches.
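For context (the markup itself isn't spelled out in the recording), breadcrumb structured data is typically a BreadcrumbList in JSON-LD. A full path for a page that only displays a condensed trail on screen might look like this, with hypothetical URLs:

```json
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home",
      "item": "https://example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Products",
      "item": "https://example.com/products/" },
    { "@type": "ListItem", "position": 3, "name": "Widgets",
      "item": "https://example.com/products/widgets/" }
  ]
}
```

The condensed on-page version in the question would show only a subset of these list items, which is the partial-match situation being described.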
A
I think the main exception that I'm aware of is the FAQ markup, where you have questions and answers. There, from our point of view, the important part is that the question is actually visible, and the answer can be something like a collapsed section on a page, but the question at least has to be visible.
A
I
have
a
few
sites
from
the
same
owner
that
talk
about
different
aspects
of
the
the
same
topic.
Overall,
let's
say
gaming,
for
instance,
and
different
sites
target
different
parts
of
gaming
or
different
games.
Would
it
be
considered
against
the
guidelines
if
the
footer
of
each
site
of
one
links
to
the
others
so
from
from
our
point
of
view,
as
long
as
these
are
significantly
different
sites?
A
That's perfectly fine. As long as you have a reasonable number of sites, then I don't see a problem with that. By "reasonable number of sites" I'm saying, I don't know, maybe one or two handfuls of sites that you link like this. If you have something like hundreds of sites that you're linking like this, then to me that starts looking a lot...
A
...like doorway pages, doorway sites, and that would be something I would avoid. But from a technical point of view, you could do something like this, linking between the different versions, or the different sites that you have, if it's something, I don't know, along the lines of five, ten sites maybe, maximum, from a practical SEO point of view.
A
Let's see, a question about multilingual sites. We run a site with under 300 indexed pages, all in English. We're looking to translate about half of these pages into Spanish, which will be placed in a subdirectory on the same domain, like /es, and tagged as alternate language versions of the English content. Is it okay to translate only some of the pages' content, or should we translate everything to exactly mirror the English website and stand the best chance of ranking in other locations?
A
We look at the language of pages individually. So if you have some pages in Spanish, we just look at those Spanish pages when someone is searching in Spanish. It's not the case that we would say: oh, there are a lot more English pages than Spanish pages here, therefore the Spanish site is less important.
A
So from that point of view, it's not something where you must have everything translated. Obviously, for users, sometimes it makes sense to have as much content as possible translated, but usually this is something that you incrementally improve over time, where you start with some pages, you localize them well, and then you add more pages, different pages as well, over time. With all of that, the hreflang annotations are also on a per-page basis.
A
So if you have some pages in English and in Spanish and you link those, that's perfectly fine. If you have some pages just in Spanish, that's fine, you don't need hreflang; some pages just in English, that's also fine. So from that point of view, this seems like a reasonable way to start. Cool. Okay, let me see, let's maybe switch back to some more live questions towards the end. Rohit, I think you've been patient for a while. Hi.
F
Okay, so it's a good question about crawling. The website I'm talking about is a WordPress website; it automatically generates multiple unwanted URLs. What I want to understand is: is there a way where I can stop the crawler from finding these URLs? I know I can noindex them, and those are all noindexed URLs, but then I can see them in Search Console under the excluded part. I mean, is this going to affect the crawl budget of the website? Because it's a news website.
A
Okay, so I think at that size of a website, on the one hand, I would not worry about the crawling budget, because we can usually crawl that many pages fairly quickly.
A
Yeah, but the other thing, I think, with regards to noindex is: the noindex is a meta tag on the page, so we have to crawl the page to see the meta tag, which means you can't avoid us checking the noindex pages. We have to check them from time to time.
A
However, if we see that there's a noindex on the page, then usually over time we crawl those pages less often. So we will still double-check every now and then, but we won't check them as much as a normal page that is otherwise indexed. The other approach is to use robots.txt; with the robots.txt file you can block the crawling of those pages completely.
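As a sketch of the two options being compared (the URL patterns are hypothetical WordPress examples, such as internal-search and tag-archive URLs):

```
# Option 1: robots.txt -- blocks crawling entirely; the bare URL
# can still end up indexed without its content (see below).
User-agent: *
Disallow: /?s=
Disallow: /tag/

# Option 2: a meta tag in each page's <head> -- keeps the page out of
# the index, but the page must still be crawled for the tag to be seen:
#   <meta name="robots" content="noindex">
```

The trade-off described in this answer is exactly this: noindex costs occasional crawls but removes the URL from results; robots.txt saves the crawls but can leave a content-less URL indexed.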
A
The disadvantage is that sometimes the URL itself can be indexed in the search results: not the content on the page, but just the URL alone. And if the content is otherwise similar to things that you have on your website, so if you, I don't know, for example, have a football news website, and you have some articles that are blocked and some articles that are allowed for crawling...
A
Then, if someone is searching for football news, they will find the indexable versions of your pages, and it won't matter that there are other pages that are blocked by robots.txt. However, if someone explicitly does a site: query for those blocked pages, then you would be able to see those URLs in Search, but again, for the normal user, they're basically not visible.
A
So from that point of view, in a situation like yours, on the one hand, I would not worry about the crawl budget, and, from a practical point of view, both the noindex and the robots.txt would be kind of equivalent, in the sense that this content would probably not appear in the search results. We would still need to crawl it if there's a noindex, but the numbers are so small that they don't really matter.
A
We might still index it with a URL if it's blocked by robots.txt, but again, if you have other content on your website that's similar, it doesn't really matter. So which one should you choose? I don't know; I would choose the one that is easier to implement on your side. If, for example, you have WordPress and you can just have a checkbox on the post that says "this page: noindex," maybe that's the easiest approach.
M
Can you hear me? Sure, yes. Yeah, actually I have some questions about...
M
About, you know, people using a huge traffic bot to manipulate the Google ranking, yeah.
M
I'm Vietnamese and I'm in Vietnam, and I saw the locals in Vietnam doing SEO.
M
Yeah, "seo service." They try to use the traffic bot to drive traffic for this keyword and its variations around "seo service," just like "best seo service" or something. And I see the ranking, I see the traffic: it grew from 20 to 30,000 in volume in just one week, just one week, for all of the variations, and the ranking, and the website appears from nowhere to top one.
A
I don't know; it's hard to say. I think one really tricky aspect there is that we don't use something like that as a ranking signal. So it feels like you might be seeing something in your tools, when you look at that, which is not actually the reason for the change that you're seeing in Search.
A
It might be, for example, that maybe other sites just happened to drop out of the search results at that time, and because of that we ended up showing something that was less visible previously. But it also sounds like you might be seeing some kind of, almost like, a search quality issue, in that it's...
A
...I'll pass that on to the team here as well, to double-check. But I think...
A
It's really tricky, especially when you look at things like search volume, because that's something that the...
A
I don't know, I feel it's really tricky when you're looking at kind of external tools and saying, like, oh, this other person's site is getting so much traffic. Because I've also seen cases where the tools have reported that a site is seeing a big drop in traffic, and then the site itself is like: what?
N
Hi John, I also have a question about crawling. We see in our log files, and we've also proven via IP that it's Googlebot, a lot of crawling from the organic bot to UTM-parameter URLs: Google Display and Universal App Campaigns.
A
I don't know offhand. The one place where, with Googlebot, we also crawl pages that you list in ads campaigns, I think, is for product search. So if you have a product search feed, or Merchant Center feed, I'm not sure what it's called, set up, then we would also crawl those pages with Googlebot to make sure that we can pick them up for Merchant Center. And if you have tagged URLs in there, then it can happen that we will keep those tagged URLs and reprocess them.
A
I don't know offhand how it works with Merchant Center. It might also be that other people are able to submit these kinds of products from your site in a separate feed, so it might not necessarily be you who's submitting them, but maybe someone who's working on your behalf, or who has the permission to do that as well. But that seems like the most likely reason for something like this.
A
I mean, the other aspect is: if we find links to these pages somewhere, we will try to crawl them. So if you have tagged internal links within a website, we will still try to pick that up and crawl that. And if you have things set up in JavaScript, then maybe you have tracking URLs with these parameters set up somewhere, and when we process the JavaScript, it looks like there's a link to those tracking URLs.
N
Okay, yeah, because it's quite large; those URLs being crawled are basically 50% of our crawling budget, or of the crawling by the organic bot. But the thing is, I've also been looking into Search Console, and in Search Console that's not reported, so we don't see these parameter URLs in Search Console. Might that be an indication of something?
A
It depends on where you look in Search Console. In the crawl stats report, if they're being crawled through the Googlebot infrastructure, then they should also be listed there. On the index coverage and search performance side, I don't think we would list them.
N
Okay, okay, so a mystery. Sorry, and I have another question, if I may. (Sure.) Does Google ever crawl websites without being Googlebot?
N
So, I don't know, for example, in cases where you're trying to check, I call it, like, showing two different versions to users and to bots, or something like this. I don't know if there are any systems like that.
A
I don't think we would crawl like that. We might check individual URLs, and there are lots of other systems that kind of go into that as well, which could be something like, I don't know, Google Translate, for example, that might use...
A
From my point of view, I don't think we would process it like that, other than, like I mentioned, the Merchant Center side of things, or the ads landing page checks. Those are also a large number of URLs, but they're based on what you submit to those systems appropriately.
A
Cool. Okay, let me take a break here.
A
Sure, thanks. I'll get back to the other questions, for those that still have their hands raised, but just so that we have kind of a reasonable-length recording: thank you all for joining in, even on Christmas Eve, and sending so many questions our way. I hope you have a good holiday time. I'm still not 100% sure if I'll do one next week, but maybe, so, like, the absolute last chance to get an office-hours in this year. We'll see.
A
But in any case, I wish you all a good start into the new year, and a good holiday time if you're taking time off. Merry Christmas, if you celebrate Christmas; otherwise just kind of a quiet week with fewer emails, hopefully. Cool, all right. Thanks a lot, and see you next time.