From YouTube: English Google SEO office-hours from November 19, 2021
Description
This is a recording of the Google SEO office-hours hangout from November 19, 2021. These sessions are open to anything search and website-owner related, like crawling, indexing, mobile sites, internationalization, duplicate content, Sitemaps, Search Console, pagination, multi-lingual/multi-regional sites, etc.
Find out more at https://goo.gle/seo-oh-en
Feel free to join us - we welcome folks of all levels!
A
All right, welcome everyone to today's Google Search Central SEO office-hours hangout. My name is John Mueller. I'm a Search Advocate here at Google in Switzerland, and part of what we do are these office-hours sessions where people can join in and ask their questions around their website and web search, and I wonder what we'll be talking about today. So we have a bunch of questions that were submitted already, but I see some of you already have your hands raised, so maybe we'll just start off with you. Let's see, Akash, I think you're first.
B
Hi John. So my question is that over the past two days a few of my web pages have been failing the mobile usability test because the text is too small. Could it be that if you fail one of the mobile usability errors, your pages won't be indexed?
A
They should definitely still be indexed. Even if they're not completely mobile friendly, we would still index them. The mobile friendliness criteria is something that we use as a small factor within the mobile search results, but we definitely still index those pages. And sometimes this kind of issue can come up temporarily, where maybe we can't crawl one of the CSS files for a brief time, and then we don't see the full layout.
B
Okay. Is it different from how the mobile usability tool works in a browser, when it starts rendering the page? Because the problem is that the tool says the mobile usability test fails due to the small text, but in Chrome on mobile I see the page loads properly; there are no such issues.
A
Yes. In particular, we have to follow the robots.txt, and in the robots.txt file it can happen that you block something from being crawled, which could be something like the CSS file, which kind of tells us how the page should look. So in a browser, the browser doesn't look at the robots.txt, but when we crawl the page, we do look at the robots.txt, so that could definitely be something where it's different when we look at it than when you look at it.
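For illustration, a minimal sketch of the kind of robots.txt rule that can cause this difference; the /assets/css/ path is a placeholder, not something from the discussion.

```
# Hypothetical robots.txt: a rule like this stops Googlebot from fetching the
# stylesheets, so the layout Google renders can look broken (for example, tiny
# text) even though Chrome, which ignores robots.txt when rendering, shows the
# page correctly.
User-agent: *
Disallow: /assets/css/
```

Removing that Disallow rule for the CSS, or adding an explicit Allow for it, lets Googlebot render the page the way users see it.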
B
Thank you so much.
A
Sure. All right, Mayor SEO?
C
Hi, how are you? I actually have three short questions. I work in the news industry, so we are publishing around 20 articles each day. When I was optimizing the SEO, I told them we need to keep the title to 65 characters only, and they were very upset about that, because a news story should have a long title and a good title. So after struggling with them, I searched the web and I didn't get a good answer. Is it really a ranking factor that the title length should have some limit?
A
No, we don't have any recommendation for the length of a title. So picking a number from your side and saying that on mobile this much room is available, so as an editorial guideline we'll say 65 or whatever you want to choose, that's perfectly fine. But from Google, from the search quality and ranking side, we don't have any guideline that says it should be this long or not.
C
A
No, no, I mean, it's good to ask these questions. I think it's good practice to have some of the words in the URL, so it's more like a readable URL, but it's not a requirement from an SEO point of view.
C
The last thing is about the coverage report in Google Search Console. There is something called not indexed, excluded, shown in gray or something, and we have more than one million pages on that. When I clicked it, it's redirect pages. So I fixed the redirect issues on the website, but is there anything like validating the fix for that coverage error?
A
Okay, I mean, that's something that just settles down automatically, so it's not even something where I would say you need to submit a validate fix or anything special. As we recrawl those pages, we'll see the redirect works and goes to a different place, and we can just follow that. So that's perfectly fine.
E
Yes, I just wanted to mention the new design for PageSpeed Insights. I wish it could be reverted back, because I have to keep on scrolling down. I see you're laughing, because you knew this was going to happen. But my main concern is a lot of sites. I remember when HTTPS was introduced, and there's a bunch of other algorithms inside, everything that makes the algorithm what it is today. So this is not something, again... We can't just continue to focus on making sure that this site is going to be, you know, 98 out of 100 and so on. But in February, when the ranking, I guess when PageSpeed Insights will count, what do we do? Because at this point in time there are certain sites I'm having issues with, LCP issues and CLS, and a lot of sites right now, some sites I guess, are on a very popular CDN platform. I'm not going to name them, but they told me, look, it's servers within your area that are creating a specific score.

And sometimes I would see the LCP and CLS at a perfect score, which would be almost 99 out of 100, and then it goes back to another score which has nothing to do with the site. So I'm getting worried, because Google sometimes crawls the site and then I get the error in Search Console saying, hey, you have CLS issues and LCP issues, and then I go and scan it and it's fine, it's like 99 out of 100, and then some pages on the blog are 100 out of 100. So then I'm like, what is going on here? And then with the new design that just got introduced, I'm seeing some areas where it has no data found, and I'm like, okay, I guess you guys need to crawl it and then data will come. John, this is just, I don't know.
A
I think there's one big aspect that plays a role there, which is important, in that, on the one hand, we have these lab testing tools like PageSpeed Insights and Lighthouse and things like that, where essentially you test it either from your browser with your connection, or with some server. WebPageTest, I think, does similar things, and some of the various other tools do that as well. That's kind of the lab testing that you can do, and that's more something that is useful for incremental improvement.

So essentially what you're probably seeing there is that users saw that something was slow. On an aggregate level we'd look at that, not on a per-URL level, and we give you that information in Search Console, like, some users saw this kind of an issue. And then you can essentially use the individual lab testing tools to see where this could be coming from. Is this something that I can fix? Is it something that was, I don't know, some quirk that happened in the network overall that you can't really do anything about? That kind of helps you to narrow things down.
E
The thing is, in GTmetrix, and, I know, webpagetest.org, it's a favorite of yours, because in a lot of hangouts you do have that shown, it's perfect there. So when I rescan it, of course you've got to wait, there are 25 people ahead of you and all that stuff, and it's not easy to re-run PageSpeed Insights, but when I do that, it's fine, it's perfectly fine. So what should I do then? Who do I trust?
A
Well, I mean, essentially what is happening in a case like that is that the lab testing tools are saying things are okay from a theoretical point of view, but in practice people see something different. All of the lab testing tools make assumptions about what users might see. So if you're running in a data center somewhere, you can't just act like, oh, my, I don't know, gigabit connection through the internet is what every user has; rather, they slow things down artificially and they try to act like a normal user.

But of course they can only go so far and make assumptions, and what your actual users see might be different. It might be that your actual users are using something faster than the assumption, or using something slower. And essentially the Search Console part of that, the search ranking side, is based on what users actually see, and not based on some artificial number that some data center sees.
A
Yeah, I mean, what you can also do, what some sites do, is try to instrument the pages themselves. So basically you add JavaScript to the individual pages on your site, or the pages that are being flagged as slow, and you collect the actual data that users saw in analytics. That gives you a little bit of a faster feedback loop, where you can try things out and see, okay, what happens if I make this image even smaller, or what happens if I remove, I don't know, AdSense or analytics or whatever.
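As a rough sketch of the kind of instrumentation described here, the snippet below records the largest contentful paint and cumulative layout shift that real visitors experience and beacons them to your own analytics endpoint. The /rum-collect URL and the reporting details are assumptions for illustration, not part of the discussion.

```ts
// Minimal real-user-monitoring sketch (hypothetical /rum-collect endpoint).
let lcp = 0;
let cls = 0;

new PerformanceObserver((list) => {
  const entries = list.getEntries();
  // The most recent entry is the current LCP candidate for this page view.
  lcp = entries[entries.length - 1].startTime;
}).observe({ type: "largest-contentful-paint", buffered: true });

new PerformanceObserver((list) => {
  for (const entry of list.getEntries() as any[]) {
    // Layout shifts caused by user input do not count toward CLS.
    if (!entry.hadRecentInput) cls += entry.value;
  }
}).observe({ type: "layout-shift", buffered: true });

// Send whatever was measured when the visitor leaves or backgrounds the page.
document.addEventListener("visibilitychange", () => {
  if (document.visibilityState === "hidden") {
    navigator.sendBeacon("/rum-collect", JSON.stringify({ url: location.href, lcp, cls }));
  }
});
```

Comparing these field numbers against the lab scores is what gives the faster feedback loop described above.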
E
All right then. It's fun at the same time, as you're getting the hang of it, and then, yeah. So it's not something we should worry about.
D
Hello, just checking everybody can hear me? Yes? Good, okay. A couple of things. First one: I did write a question in there with regards to a concern about doorway pages. This is kind of linked in with Google Ads, I know this is Search and all that kind of stuff, but I came across an article that basically explained a person's experience where they created a bunch of landing pages, exactly what I just did. They were being flagged as doorway pages, and not only were they delisted, their Google Ads account was actually shut down. I can't have either of those happen, I can't afford that. I'm a small, local, service-based business; I don't have thousands of pages, I've got like seven keyword-targeted landing pages. In the Google discussion forums somebody suggested to just noindex those landing pages, which I did, but I'm also thinking, those seven pages are targeted for specific keywords, and they could be very beneficial for organic search. I don't know, what's the risk level of being labeled as, oh, this is a doorway?
A
So I think with seven pages you don't have any problems at that level. Even if someone from the webspam team were to manually look at that, they would see, oh, it's seven pages, it's not like thousands of pages. It would be different if you were, for example, a nationally active company and for every city in the country you had a separate page; then that's something where we'd say, well, this is way, way past acceptable.

But when you're talking about a handful of pages, like the seven that you mentioned, I don't think that would be a problem, purely from the webspam point of view. Even if someone were to manually look at that, they'd say it's seven pages, it's not going to change the search results completely.

I mean, I don't think it's ideal, and that's kind of the other aspect there. It's probably not an ideal situation when you have pages that are very similar like that. But again, it's not something that the webspam team would take action on. I mean, I can't promise this, because I don't know what your site is, and I can't tell them, don't do anything with this site. But purely from what I know of them, they would not take action on this. So your search listing, and probably also the ads side (I don't know the details of the ads side, but probably also that part), would be perfectly fine with seven pages.

I think overall it's not optimal, though, so I would think about ways that you can make those pages actually useful for those individual niches that you're targeting, and really go past, I don't know, swapping out the city name in the title or something like that, to where you have maybe directions from that city to your place of business, or some additional information that gives people a little bit more context there.
D
Well, they're not city keywords. Basically, I'm a photographer, I shoot headshots specifically, so my keywords are like headshot Toronto, headshots Toronto, professional headshots, stuff like that, right? All the content is pretty much the same, like, I shoot headshots; I don't know how many other different ways I can describe what it is I do, right? Okay, so that's why the content is very identical.
A
Okay, in that case I would pick one page and make it canonical, because essentially what you have there is just variations of the same keywords. When you're saying, like, photography Toronto, Toronto photography, for example, those are just variations of the same keyword, and for the most part we figure that out. And even within the natural text on a page you can mention both of those; it's not going to blow the page up if you mention those seven variations.
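A minimal sketch of what that looks like in practice; the URLs below are placeholders for the variation pages and for the one page chosen as canonical.

```html
<!-- On each keyword-variation page (e.g. /headshots-toronto,
     /professional-headshots-toronto), point rel=canonical at the
     single page you want indexed. -->
<link rel="canonical" href="https://example.com/toronto-headshots/">
```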
D
Yeah, I went down this route because, honestly, I've been fighting with Google Ads. Is there something like this for Google Ads, where we can really ask people?
A
From my point of view, you could combine all of that into one page. The other thing that I think is also worth maybe looking into is that you can block pages from Search and keep them for Ads, kind of like that noindex that was mentioned in the forum. If you keep those pages and you say, well, I don't want them shown in Search, you can choose a canonical or use a noindex on those pages, but within Ads you can still use them as landing pages.
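As a small illustration of that setup, assuming nothing else about the page: the landing page stays live as an Ads destination while a standard robots noindex asks search engines to keep it out of the index.

```html
<!-- Ads-only landing page: usable as a paid landing page,
     asked to be kept out of Google's search index. -->
<meta name="robots" content="noindex">
```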
E
Also, Adrian, I suggest maybe you check your SSL certificate; for some reason it's like I'm getting blocked coming to the site. I went to, it's Adrian and then your last name, right?
D
Sorry, John, one other question before we go. This is with regards to Search Console. I have created a custom post type in WordPress that creates the archive page for reviews, testimonials. I'm not seeing any of that enhancement show up in Search Console, even though the page has been crawled.
A
Are they reviews, or what did you mark up? Yeah? Okay. So we probably wouldn't show that as reviews in the search results, because it's more like a testimonial. Reviews would essentially need to be something which is based on a specific product on that page, and the reviews need to be things that users leave directly on that page.
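For context, a minimal sketch of the kind of markup that matches what's described here, a review of a specific product left directly on that page; all names and values below are placeholders.

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example product",
  "review": {
    "@type": "Review",
    "author": { "@type": "Person", "name": "Example Customer" },
    "reviewRating": { "@type": "Rating", "ratingValue": "5" },
    "reviewBody": "Feedback the customer left directly on this page."
  }
}
```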
D
Okay. All the reviews, all the content I'm taking for the reviews, is from my, well, it's not called Google My Business anymore, but the reviews there, yeah. So how would I add any kind of schema for you guys to see that?
A
I think it's kind of tricky, because we try to recognize this situation automatically, and sometimes we don't recognize it properly and just show it anyway. So perhaps when you're searching around, you see other sites that have that shown, where it's just that we didn't recognize that this was actually not left directly on the site. But from a policy point of view, we try not to show reviews that are left somewhere else and copied over to a website.
D
Okay. I am getting one review snippet showing up, and it's basically the one that's tied to the business object, so I have a section in the business schema that says review, and it just has, like, an overall review kind of thing, but yeah, not the individual ones. Okay, well, that was my whole plan.
A
All right, let me run through some of the submitted questions, and then we'll get to more of the live questions as well. Let's see, the first one is about Search Console verification. "We have the code on every subdomain, but we're also using a tool which requires domain-level verification. We're thinking about doing it through DNS records. Would that invalidate the current setup and lose the data that we already have?"
A
I think that would be a really weird coincidence, but I'm not aware of any kind of collaboration with Bing where we swap off the crawl requests, so I don't see that being related. What might happen sometimes is that, on Google's side, when we recognize that a server is overloaded and kind of slow and showing server errors, then we'll tend to crawl less. And it could be the case that when Bing crawls a lot, we see the server is generally slower, so we'll crawl a little bit less, and when we see that the server has more capacity and is a little bit faster, then we'll crawl more, which might be kind of related to when Bing happens to crawl less. Theoretically that's possible; from a practical point of view, I kind of doubt that it would be happening, just because usually websites have so much capacity for users that some crawling from Bing and some crawling from Google is not going to slow down the whole website.

"Will AMP pages impact the Googlebot requests? We saw research a few days ago on Twitter that Google's crawl volume of non-AMP pages increased recently, while the crawl volume of AMP pages decreased."
A
So we take into account all crawling that happens through the normal Googlebot infrastructure, and that also includes things like updating the AMP cache on a website. So if you have normal pages and you have the AMP versions and you're hosting them on the same server, then the overall crawling that we do on that website will be essentially what we try to balance out, and that includes the AMP and the non-AMP pages.

That's not an issue. It's usually more of an issue when you have tens of millions of pages and Google barely gets through crawling all of them; then, if you add another kind of duplicate of everything, of course it makes it a lot harder. But if you have, I don't know, thousands of pages, adding another thousands of pages for the AMP versions is not going to throw things off.
A
"We have a client having almost identical pages per location", and I thought this was your question, Adrian, "piano course Birmingham and piano course London. The only difference in those pages is the location. Shall I canonicalize those pages? What would be the best practice?" Kind of like I mentioned before, if you have a handful of pages, it's something where, purely from a doorway-page point of view, I wouldn't really worry about that.

So if you have something that you're offering in one city, and you're offering it in another city as well, then give some information about why you have these pages separately, and make it so that when people are searching for it, they really find something that matches that, not just, oh, piano courses in the UK, and a handful of city pages there.
A
So that's kind of one approach there. The other approach is, like I mentioned before, canonicalizing and just picking one of those pages to show. I think if you have something distinct that you want to provide, then having multiple pages is perfectly fine.

If it's something where you're just offering variations of the same thing, then sometimes folding them together is a better strategy. The advantage of folding them together is that you make a little bit of a stronger page, rather than two pages that are kind of okay-ish from a strength point of view. And the general idea here is that Google has some information about your website, and if you kind of dilute that across a whole bunch of pages, then each of those pages individually will be a little bit less valuable than if you concentrate everything on fewer pages. So if you think you can make a really strong page, just piano courses, then that might be a good approach, especially if there's a competitive market there.
A
If, on the other hand, you think that these individual locations are distinct enough that you want to have individual pages, say, you have real physical addresses in these locations, or different opening hours, or different kinds of teachers in these locations, then that definitely makes sense to keep separate, because you're providing something unique that people can find value in there. So that's kind of the approach I would take there.

It's not that there's one answer that works for all websites here; for some it makes sense to fold together, for some it makes sense to keep separate, depending on the website. Sometimes you even combine that: if you have an e-commerce site, you might say, well, this model of shoe is all the same model of shoe, but maybe you have one unique size that really stands out, or a unique color variation that really stands out.
A
Maybe you'll split that off into a separate page. So those are kind of the thoughts behind that in general.

"Does disavowing a 301-redirected domain have the same impact as disavowing all the backlinks of the redirected domain? We have a 301-redirected domain pointing to our home page that has 18,000 spammy backlinks. Can we just disavow the domain redirected to us, or do we need to disavow all the 18,000 backlinks? We know that in most cases Google might already discount these links, but we want to be safe."

So in a case like this, I think it's kind of tricky, because we do try to figure out what the destination and the origin of individual links are, and we try to take that into account. So purely from a theoretical point of view, you would probably want to disavow as many of those individual links as you can.
I think you can probably also get pretty far by just disavowing that domain that redirects to your home page, and handling it at that level. If this is the only spammy link that you can find for your website, I probably wouldn't even worry about it too much. I think disavowing that redirect takes care of a lot of that issue.
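A minimal sketch of the corresponding disavow file; the domain name is a placeholder. A domain: line covers every URL on that domain, so the 18,000 individual backlinks don't need to be listed one by one.

```
# disavow.txt (example)
# Disavow the redirecting domain at the domain level.
domain:spammy-redirecting-domain.example
```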
A
"Rather, it appears on its own in the index after several days or even weeks, sometimes bouncing in and out of the index. This happens on well-established sites with 50,000 organic monthly visitors. Why is this?" I don't know; it's hard to say in a broad way why things are not being indexed so quickly on your website. But in general, the Request Indexing tool in Search Console is something that passes it on to the right systems, but it doesn't guarantee that things will automatically be indexed. I think in the early days it was a much stronger signal for the indexing system to actually go off and index that, but one of the problems that happens with these kinds of things is, of course, that people take advantage of that and use the tool to submit all kinds of random stuff as well.
So over time our systems have grown a little bit safer, almost, in that they're trying to handle the abuse that they get, and that does lead to things sometimes being a bit slower, where it's not so much that it's slower because it's doing more, but it's slower because, well, we're trying to be on the cautious side here. And that can mean that things like Search Console submissions take a little bit longer to be processed.

It can mean that we sometimes need to have confirmation from crawling, and kind of a natural understanding of a website, before we start indexing things there, and from our point of view that's kind of expected. One of the things that I think has also changed quite a bit across the web over the last, I don't know, couple of years, probably longer, is that more and more websites tend to be technically okay, in the sense that we can easily crawl them.
We can crawl them, whereas in the past, when something would not get indexed, you might say, well, maybe there's something set up incorrectly on the website, and you'd try to find that problem from a technical point of view. Nowadays most submissions that we get are technically okay, so we can go off and index them, which means, because there's still a limited capacity for crawling and also for indexing, we kind of need to be a little bit more selective there. So I imagine, overall, that's kind of what you'd be seeing there, if we're not picking things up as quickly as, I don't know, we might have in the past.

"I have a domain that hasn't been used for four years. The blog I had was doing great in its niche, but because I didn't want to sell it, I deleted all the content and left the domain parked. I want to revive the content on it, but I want to take a slightly different approach."
A
So if the content was gone for a couple of years, probably we need to figure out what this site is; it's essentially starting over fresh. So from that point of view, I wouldn't expect much in terms of a bonus because you had content there in the past. I would really assume you're going to have to build that up again, like any other site. Like, if you have a business and you close down for four years and you open up again, then it's going to be rare that customers will remember you and say, oh yeah, I will go to this business, and it looks completely different, they offer different things, but it used to exist. I think that situation is going to be rare in real life as well, I guess, if you will. So I would assume that you're essentially starting over here.
A
So it's kind of hard to say with just that information, because for the most part we don't just remove things from our index; we kind of pick up new things as well. So if you're adding new content at the same time and some things get dropped along the way from our index, usually that's kind of normal and expected, because essentially for pretty much no website do we index everything on the website.

So if you're adding hundreds of pages a month and some of those pages get dropped, or some of the older pages or less relevant pages get dropped over time, that seems kind of expected. To minimize that, I think it's something where essentially you need to show Google, or your users, I guess, what the value of your website is overall, so that Google says, well...
A
Let's see. "For job posting schema markup, what are the most important parts for getting indexed and ranking highly? We've seen spikes in indexed jobs that drop off after a week or two, with no understanding as to why." I don't have much insight into the job search, or Google for Jobs, I think it's called, side of things, but essentially this also falls into similar categories of indexing: what can you do to make sure that your content is better indexed? And especially the things that I mentioned so far, I think those are really what's relevant.

If you have kind of a stronger website structure, you can also guide things a little bit, in that, if we can index, let's say, 100 out of 200 pages on a website, then using the website structure you can guide us a little bit toward which part you want us to focus on, and the way that is usually done is with internal linking.
For example, you take the important pages of your website and you make sure that you're linking to the other parts that you care about. So that could be, like, on the home page you have a section that says important pages, or popular products, or new products, or new jobs, or whatever it is that you want to have highlighted in the search results, and from the home page you link to that part of your website directly. Then we'll kind of go and look at that and say, well, the home page is probably the most important page for most websites, and it's telling us that these pages are really important, so we'll focus more on those pages. And, for example, if you have a jobs website and you have new jobs that you want to have indexed as quickly as possible, make sure that from the home page you're linking to those new jobs. Many sites do that automatically, kind of intuitively.
But you can also do this in a very deliberate way, and clearly tell us exactly what it is that you'd like to have shown in Search. It doesn't have to be the newest ones; it can be things like the pages where you make the most money, or the pages where the competition is the strongest, or anything. Essentially, just tell us what you think is important on your website through internal linking, and we'll try to follow that and highlight it a little bit more in Search.
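As a small illustration (the section heading and URLs are made up), this is the kind of home-page block being described: plain links from the home page straight to the pages you want crawled and indexed first.

```html
<!-- Home page: a section linking directly to the newest or most important pages. -->
<section>
  <h2>New jobs</h2>
  <ul>
    <li><a href="/jobs/senior-backend-engineer">Senior backend engineer</a></li>
    <li><a href="/jobs/marketing-manager">Marketing manager</a></li>
  </ul>
</section>
```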
A
"We migrated our site about seven months ago to a new domain. Should I remove the old sitemap from the old site?" Probably yes. So usually when you migrate a website, you end up redirecting everything to the new website, and sometimes you keep a sitemap file of the old URLs in Search Console, with the goal that Google goes off and crawls those old URLs a little bit faster and finds the redirects, and that's perfectly fine to do in a temporary way.

But at the same time, you're redirecting to the new ones, so which ones do you really want to have indexed? At that point you essentially want to remove that conflict as much as possible, and you can do that by just dropping that sitemap file and giving us all the signals that you can that point at your new pages, so that, once we've seen that redirect, we can just focus purely on the new pages that you have on your website, or the new domain, or whatever move that you've made.
A
"Google snippets and knowledge graphs seem to be getting better all the time, and often users don't have to visit a website to get the answers they need. What does the future hold for web publishers, big and small, in terms of getting SEO traffic?"

I don't see the need for websites ever really going away, because that information is really there in detail on your website, and sometimes people just want something really quick. Maybe they want the phone number or the address of your business, and then they go off and visit that business directly. But essentially, for anything more than just a snippet of information, you want to find out the full context and get some more information there. So I don't really see that going away.
A
It's also definitely not our goal to be that one place where everyone goes and just gets the answers directly, because we know we have to work together with the ecosystem, together with anyone who's making websites, to make sure that the things that we're providing in Search provide value for website owners as well. Because it's easy for website owners to say, well, I don't want to take part in this Search; I'd rather just be findable on Facebook, or in social media, or somewhere else. And we want to make sure that there's kind of an equal deal there, in that we can show your content and we can send users your way, but you also get something out of that, in the sense that you're getting all of this traffic as well. And that is something that all of the teams at Google that work on Search care deeply about.

But I understand that this is something that folks also worry about, because they see these fancy features in Search, and it's hard sometimes to understand what the bigger picture there is, or what the net effect is across websites. And I think also, if you run across situations where you're like, well, I really don't like the way that Google is showing this, because it's really something that I'd prefer people to look at on my website, then give us that kind of information; contact me on Twitter.
A
"Occasionally, when crawling a website, I run across a spider trap, infinitely expanding URLs, and I've been wondering how Googlebot handles such situations. Does it somehow ignore those URLs and focus on the rest of the normal URLs on the site, or does Googlebot get stuck in some way and miss crawling URLs as a result?"

Yeah, that's a complicated question, and it is something that sometimes causes problems. For the most part, I think we end up figuring this out. What happens with this kind of spider trap area is, for example, maybe you have an infinite calendar where you can scroll into March 3000 or something like that, and essentially you can just keep on clicking to the next day and the next day, and it'll always have a calendar page for you.
That's kind of an infinite-space kind of thing. For the most part, because we crawl incrementally, we'll start off and go off and find, I don't know, maybe 10 or 20 of these pages, and then we'll say, well, there's not much content here, but maybe we'll look a little bit deeper, and we go off and crawl maybe 100 of those pages. And we start seeing, well, all of this content essentially looks the same, and they're all linked from this long chain where you have to click next, next, next, next to actually get to that page. At some point our systems are going to say, well, there's not much value in crawling even deeper here, because we found a lot of the rest of the website that has really strong signals telling us this is actually important, and we found this really weird long chain here; then overall we'll say, well, these are probably not that important.
We don't have to crawl them that often, if at all, if we want to keep them; rather, we'd focus on the rest of the site. And I think that relative understanding of URLs across a website is something that we don't really show in Search Console, because it's more part of our internal systems. It's already complicated just to show all the URLs and kind of highlight that we've been crawling things.

We'll just avoid crawling much deeper there, and you'll also see that, over time, some URLs get crawled very frequently, maybe once a day or even more often, and other URLs get crawled less frequently, and that's very, very roughly an indicator of what we think is important on your website. It's not a perfect correlation with our sense of importance, because there's also an aspect there of wanting to make sure that we pick up changes.
A
Okay, so many questions, and we're running out of time. Let me see if I can get some questions from here. I think what I'll do is just take the hands that are raised at the moment and then look at the chat; it looks like people are asking questions there too. I also have a bit more time afterwards, if any of you want to stick around after the recording.
G
Hi John, how are you? So my question is related to an e-commerce site. There's a friend of mine whose client owns an e-commerce store, and this question basically comes from that client. This e-commerce business is based in one of the metro cities in India, but they deliver products throughout the country, and the products they sell online are, you know, furniture or home decoration stuff, something like that. They have product and product category pages on their site, which are doing fine for them.

But the thing is, the client said that a few of their competitors are creating product-plus-location pages, something like "buy product X in city A", "buy product X in city B", and these competitors don't have any physical presence in those cities for which they are creating these pages.
A
But then you end up in a situation where you really have thousands of pages that are essentially the same, just with different city names in there, and that would be considered doorway pages on our side. From the webspam point of view, the webspam team might take action on that, and the algorithmic side of Search will probably also look at that and say, well, these pages are all very similar and just targeting variations, and then they might take action on the site as well. So that's something I would generally not recommend doing.
G
This is what we advised the client, but he said, look at the competition, they are ranking and they're getting traffic. So, oh...
A
Okay, Amit?
F
Hi John. We have a news site that is publishing only in the English language, and we want to create the same articles in different languages, like Russian and Hindi. So how can we do that? I have done my research and I have found that I have to create a separate publication for that. But my question is regarding that: should I define a canonical version and an alternate version there, or what should I do?
A
So that's usually the approach to take. And with regards to the canonical: with the canonical, you tell us which URL to focus on, so the canonical should be the individual language versions. It shouldn't be one language as the canonical for all languages; rather, each language has its own canonical version, like, this is the French version and the French canonical, this is the Hindi version and the Hindi canonical. So it shouldn't be linking across the languages.
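A minimal sketch of what that looks like in the head of, say, the English version; the URLs and language codes are placeholders. Each language version carries its own self-referencing canonical, and hreflang alternates link the versions to each other.

```html
<link rel="canonical" href="https://example.com/en/article/">
<link rel="alternate" hreflang="en" href="https://example.com/en/article/">
<link rel="alternate" hreflang="hi" href="https://example.com/hi/article/">
<link rel="alternate" hreflang="ru" href="https://example.com/ru/article/">
```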
F
Okay, but my site is approved on Google News and ranking well there, and I have AMP pages crawled there, right? So should I use the alternate version, like, we have an alternate version in the Hindi language or in the Russian language? What would you recommend on this?
A
Yeah, so with AMP pages and normal web pages and different language versions, it gets very complicated. In the AMP documentation they have a page on, I don't know what it's called, probably something like multilingual content, that has a diagram with regards to the hreflang annotations that you should have there. I would check that out. With regards to the Google News side of that, I don't know how News would handle that.
F
So in the next session we can discuss more about this.
A
Yeah, I mean, I think it's complicated with AMP pages and normal pages and internationalization. So when you look at the graph, you will probably say, oh, this makes a lot of sense, but if you have a lot of different pages and a lot of different language versions, it's a lot of work to get all of those details right.
F
Yeah, can you tell me where I can find that diagram, or what should I search for?
A
Let me see if I can... I think I have it; I'll just put it in the chat.
A
Okay, cool. I see we still have some hands raised; I will definitely get to you, but let me pause the recording first, and then we can continue with the rest of the questions, and also the questions in the chat, since it looks like things are happening there. So if you're watching this on YouTube, the recording: thank you for watching to the end. All of you here, thank you for joining and submitting so many questions.