From YouTube: English Google SEO office-hours from June 4, 2021
Description
This is a recording of the Google SEO office-hours hangout from June 4, 2021. These sessions are open to anything webmaster related like crawling, indexing, mobile sites, internationalization, duplicate content, Sitemaps, Search Console, pagination, multi-lingual/multi-regional sites, etc.
Find out more at https://goo.gle/seo-oh-en
Feel free to join us - we welcome webmasters of all levels!
A
All right, welcome everyone to today's Google SEO office-hours hangout. My name is John Mueller. I'm a Search Advocate on the Search Relations side, and part of what we do are these office-hours hangouts, where people can join in and ask their questions around their website and web search.
A
Wow, lots of people still joining, cool. It looks like we have a bunch of people already raising their hands, which is great, so we can go through some of you all. There are also a bunch of questions submitted on YouTube. If you want, you can add more questions on YouTube, or if you're here live, feel free to raise your hand as well, and we'll see how far we can make it today.
B
So I wanted to ask: is it really important for Googlebot to be able to see a cookie consent message that we serve to users? Because we kind of decided to serve it upon user interaction, so Googlebot won't be able to see it. I wonder if that can cause us any problems in terms of Google friendliness.
A
So
if
you
have
it
set
up
like
that,
I
think
that's.
Fine
lots
of
sites
serve
the
cookie
consent
banner
to
googlebot
as
well,
just
because
they
serve
it
to
everyone.
That's
usually
also
fine.
The
important
part
is
essentially
that
googlebot
is
not
blocked
from
being
crawled
or
blocked
from
crawling
the
website,
so
that
you
don't
have
a
kind
of
an
interstitial
that
blocks
the
access
to
the
rest
of
the
content.
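As an aside, a minimal sketch of the setup the questioner describes, deferring the banner until the first user interaction so that a non-interacting crawler never renders it; the showConsentBanner() helper and the event list are assumptions, not anything specified in the session:

```js
// Sketch: show the cookie consent banner only after the first real user
// interaction. A crawler that never scrolls, clicks, or types will not
// trigger it, matching the behavior described in the question above.
// showConsentBanner() is a hypothetical helper that injects the banner.
function deferConsentBanner(showConsentBanner) {
  const events = ['scroll', 'click', 'keydown', 'touchstart'];
  const onFirstInteraction = () => {
    // Remove all remaining listeners once any one of them has fired.
    events.forEach((e) => window.removeEventListener(e, onFirstInteraction));
    showConsentBanner();
  };
  events.forEach((e) =>
    window.addEventListener(e, onFirstInteraction, { passive: true })
  );
}
```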
D
Hi John, I have kind of a softball question, but it's something that I'm running into right now, going back to basics. I have a directive to use keywords, specifically target keywords, in meta tags, in the h1, and this many times in a piece of content. And that really just seems outdated to me, especially with all the advances in semantic search, MUM, and all that other stuff that's coming down the pipe. So just a basic question: do you think that's still a legitimate SEO tactic, or should we not be focused on using a particular keyword this many times on a page?
A
I
in
general,
the
the
number
of
times
that
you
use
a
keyword
on
a
page.
I
don't
think
that
really
matters
or
makes
sense
when,
when
you're
writing,
naturally,
usually
that
resolves
itself
automatically
and
also
with
regards
to
the
individual
keywords-
I
I
think
that's
something
where
I
wouldn't
disregard
it
completely,
but
at
the
same
time
I
wouldn't
over
focus
on
exact
keywords,
so
in
particular
things
like
singular
and
plural
or
kind
of
like
the
different,
the
different
ways
of
writing.
Individual
words.
A
That's
something
that
you
probably
don't
need
to
worry
about,
but
mentioning
what
your
site
is
about
and
kind
of
like
what
you
want
to
be
found
for.
That's
something
I
would
still
do
so
in
in
particular
what
we
sometimes
see
when
we
look
at
things
like
news
articles.
If,
if
a
news
news
site
doesn't
really
understand
seo,
they
might
write
in
a
way
that
is
more.
A
So
that's
something
where,
from
from
an
seo
point
of
view,
if
there's
something
you
want
to
rank
for,
I
would
still
mention
that
on
the
page
I
wouldn't
go
overboard
with
the
number
of
mentions.
I
wouldn't
go
overboard
with
all
of
the
synonyms
and
different
ways
of
writing
it,
but
like
mentioning
it
at
least
once
definitely
makes
sense.
D
Okay, thank you.

A
Sure. Espen?

E
Hi John. So does it only affect the canonical version, or does it affect the entire duplicate content cluster which this canonical is a part of? So, for instance, if I enter a URL which, in my opinion, is the one to be excluded, but it's basically the non-canonical variant...
F
My question relates to templated content above the fold, from a mobile-first perspective. This relates to a blog restructuring we're working on, where we're moving from all of our blog articles living in a blog subdirectory to building out topic clusters, with specific topic landing pages. Based on initial designs, the topic landing pages are all going to include the same hero banner, with the same header and then the same two sentences, and then the unique content below. Each of these topic landing pages has a unique URL and a unique title, but they have that same header and above-the-fold content; right below that is a unique subheader, which would be the h1 tag of the page. So I just want to get your thoughts there. I know Google in the past has said that they weigh above-the-fold content, and I think it's something important for us to look at, but is it detrimental to non-brand rankings for those topic landing pages?
A
Now
we
so
the
the
important
part
for
us
is
really
that
there
is
some
amount
of
unique
content
in
the
above
default
area.
So
if
you
have
a
banner
on
top
and
you
have
generic
hero
image
on
top-
that's
that's
totally
fine,
but
some
of
the
above
the
fold.
Content
should
be
unique
for
that
page,
and
that
could
be
some
something
like
a
heading.
That's
visible
in
a
minimum
case,
but
at
least
some
of
the
above,
the
full
content
should
be
there.
A
Yeah
I
mean
it's,
it's
probably
also
something
where
you
want
to
look
at
how
users
interact
with
those
pages
afterwards,
but
that's
that's
kind
of
more
from
from
a
non-seo
perspective,
but
it's
it's
always.
I
think
important
for
cases
like
that
that
you,
you
take
a
look
and
see
what
what
actually
happens
with
the
users
afterwards
awesome.
Thank
you
so
much
sure
promotash.
G
John, I have a question regarding domain age, and whether the domain expiry matters. Suppose I have a domain that is a couple of years old and it will expire in the next six months. Does Google check the age of the domain for SEO?
A
I don't think so, at least not that I know of, because it's also something where some of the top-level domains and registrars just have everything private by default. So you wouldn't be able to compare that anyway, or use it for anything useful.
G
So,
john,
so
like
I
am
little
bit
worried
right
now,
so
what
happens
is
like
now?
Both
of
us
are
talking
in
front,
so
we
can
clear
out
things.
So
there
is
lot
of
crap
means
wrong.
Information
and
data
spread
widely
across
internet.
It
is
copied
and
the
content
length
keeps
on
increasing.
Let's
suppose,
one
seo
expert
in
300
words
on
on
page
seo,
says
10
top
points
that
you
should
do
the
next
seo.
Experts
come,
he
writes
20
points
and
the
content
that
increases
200,
000
words.
Then
the
next
expert
comes.
G
There
are
only
five
six
people.
Only
if
you
search
on
google,
they
will
come
only
okay.
They
are
the
main
culprits
who
are
spreading
all
the
wrong
information.
Then
the
next
one
comes.
He
writes
2000
content
and
he
lists
20
points
and
another
person
comes
here.
Has
200
google
algo
points
and
he
has
4500
words,
and
he
claims
also
like
I
have.
I
have
been
increasing.
My
word
crown
and
google
is-
has
increased
my
rank
from
number
three
to
number
one.
G
So
it
is
not
directly
my
question,
but
how
the
people
are
using
the
skyscraper
technique
to
building
a
bigger
building,
then
suppose
another
tower
next
person
is
writing
again
crap.
There
is
no
useful
information,
his
building
is
more
bigger,
then
another
is
more
bigger
and
they
are
like.
If
I
want
to
see
real
information,
then
the
actual
gurus
are
also
spreading
misinformation,
but
still
they
are
competing
around
the
five
six
gurus
are
competing
and,
like
many
people
are
taking
their
courses
and
on
and
on
okay.
G
In SEO, suppose I want to write something truthful. So we are discussing, I am writing, and in whatever we discuss, in 200 or 300 or 500 words, I write the real answers. So I think maybe, with new Google algorithms, and this is my suggestion, Google should also try to read the content, like the meaning of the content.
A
Yeah
yeah,
I
I
think
it's
I
think
it's
always
tricky
and
that
there's
also
the
the
aspect
of
sometimes
there's
just
old
and
outdated
information
out
there
on
specific
topics,
but
yeah
like
it's
yeah.
I
don't
know
people
like
to
write
a
lot
of
things
and
to
promote
the
things
that
they've
been
working
on.
So
it's
it's
hard.
G
Yeah, but why do the skyscraper techniques still seem to work? They showcase examples like: three months ago I had a piece of content with 500 words, suppose on on-page SEO, and I used to rank at number six; then I increased it to 1,000 words and I moved to number five. And as he slowly increases it, he naturally climbs the ladder up to number one.
G
Yes, you pronounced it absolutely correctly. It is difficult, though. I hope you are doing well. My question is related to Google Search Console, and it has more to do with the sampling of data; that's what I'm stuck on and confused about. In the main property in Search Console, if I create a sub-property, what I see is different numbers for the sub-property's clicks.
G
So when I'm analyzing the numbers at the main property level, I might skip over that portion, that subfolder, because I see only four thousand clicks, and maybe the rest of the site sections are doing great. But when I create a sub-property, it gives me a huge number, which is a 10-times difference. So what I did eventually is create many sub-properties inside Google Search Console to get better numbers, which gives me a better picture: okay, this subfolder is doing great.
G
It's
actually
not
only
four
thousand
or
two
thousand
it's
actually
twenty
thousand
thirty
thousand
like
that.
So
what
should
I
do
so?
Should
I
continue
pulling
numbers
from
sub
properties,
or
should
I
go
to
the
main
property
and
consider
the
numbers
from
there
because,
if
I
add
up
all
the
sub
properties,
the
total
goes
beyond
the
numbers
which
I
see
at
the
main
property.
A
Yeah,
I
I
think
it
depends
on
the
the
size
of
your
site
and
the
the
way
that
the
traffic
comes
to
your
site,
because
there
is
a
limit
of
the
number
of
data
points
that
we
collect
per
site
per
day,
and
perhaps
that's
that's
what
you're
seeing
there
so
from
from
my
point
of
view,
if,
if
you're,
seeing
more
useful
numbers
or
more
actionable
numbers
that
way,
I
I
think
that's
totally
fine.
I
don't
know
if
that's
something
that
the
the
average
site
needs
to
do.
G
Yep,
but
the
only
thing
is
the
concerning
part
is
that
if
I
continue
pulling
the
numbers
from
the
sub
property,
that's
fine,
but
when
I
look
at
this
main
property.
So
if
I
have
to
look
at
the
overall
numbers,
then
it
gets
little
confusing,
because
the
sub
properties
are
speaking
a
different
story
and
the
main
property.
The
total
overall
website
numbers
are
saying
a
different
thing.
A
Yeah,
I
I
mean
you're
you're,
looking
at
a
different
databases
there,
so
that,
from
from
that
point
of
view,
comparing
those
different
sub
properties,
that
probably
doesn't
make
sense.
But
if
you
need
to
look
at
more
detailed
information
within
one
of
those
subsections
of
your
website,
then
maybe
that
does
make
sense
to
to
look
at
it
on
that
level.
A
Yeah,
I
I
mean
I
I
think
for
for
most
sites
like
we,
we
chose
those
numbers
to
make
sure
that
for
most
sites
the
data
was
sufficient
and
useful
and
that
the
totals
kind
of
line
up.
But
if
there
are
definitely
individual
cases
where
it
makes
sense,
to
dig
more
into
detail
and
kind
of
verify,
subsections
of
a
site.
A
I
have
no
idea
yeah,
I
I
I
mean
it's
something
where
we
we
tend
not
to
talk
about
what
what
is
lined
up
for
the
future.
I
I
don't
know
it
feels
weird
to
to
have
like
a
premium
version
of
search
console
for
something
like
that,
but
like
who
knows,
I
I
think
like
seeing
what
people
are
doing
with
search
console
and
seeing
where
they're
running
into
limitations.
H
Hi John, how are you? Okay, so my question is about Core Web Vitals. My team and I have been preparing and making optimizations, whether theme-related or in the CSS.
H
So
what
we
have
witnessed
is
the
data
in
the
google
search
console
it's
completely
different
to
when
we
go
to
the
incognito
mode
of
the
website
and
go
to
inspect
and
go
to
lighthouse
and
generate
a
report
from
there.
So
over
there
I
can
see
91
performance
and
everything
is
good
largest
content
for
pain
time
to
interactive
everything
is
green.
However,
if
I
go
to
google
search
console,
I
can
see
like
100
core
urls,
so
we
are
in
a
middle
situation
where
we
don't
know
where
to
go
from
here.
A
So
so
we
differentiate
between
the
the
lab
data,
which
is
kind
of
like,
if
you
test
it
directly,
which
is
what
you
would
see
in
page
speed,
insights
or
in
lighthouse
when
you
look
at
it
in
incognito
in
your
browser
and
the
other
variation
of
the
data
is
the
field
data,
which
is
what
users
actually
see
when
they
go
to
your
website
and
that's
the
data
that's
shown
in
search
console
and
from
from
there.
A
The
differences
are
essentially
that
the
the
lab
tests
that
you
do
they
make
a
lot
of
assumptions
with
regards
to
probably
users
will
have
this
kind
of
device
or
this
kind
of
connectivity
or
have
this
kind
of
configuration,
and
they
will
try
to
kind
of
use
those
assumptions
to
to
figure
out
what
it
might
be.
But
what
they
actually
see
in
practice
could
be
very
different
and
that's
probably,
what
you're,
seeing
there?
H
So, having said that, Google would consider the practical data over the lab data to, you know, rank the pages?
A
Yes. In Search we use the field data from real users. There's one tricky thing with the field data, in that it takes 28 days to update in Search Console, for various reasons.
A
What
you
can
do
is
use
something
like
google
analytics
together
with
an
extra
script
on
your
site,
to
track
the
field
data
yourself
as
well.
So
then
you
would
have
the
the
lab
data,
your
own
field
data
and
the
search
field.
Data,
and
usually
you
would
see
kind
of
the
differences
between
the
the
lab
and
the
other
field.
Data
right.
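As a rough sketch of that extra-script idea (not something spelled out in the session): the open-source web-vitals JavaScript library can report real-user metrics to an analytics endpoint. The gtag event fields below follow a common pattern but are assumptions, and newer versions of the library rename these functions to onCLS, onLCP, and so on:

```js
// Sketch: collect field (real-user) Core Web Vitals with the web-vitals
// library and forward them to Google Analytics via an existing gtag().
import { getCLS, getFID, getLCP } from 'web-vitals';

function sendToAnalytics({ name, value, id }) {
  gtag('event', name, {
    event_category: 'Web Vitals',            // illustrative grouping
    // CLS is a small unitless score, so scale it up before rounding.
    value: Math.round(name === 'CLS' ? value * 1000 : value),
    event_label: id,                          // unique per page load
    non_interaction: true,                    // don't affect bounce rate
  });
}

getCLS(sendToAnalytics);
getFID(sendToAnalytics);
getLCP(sendToAnalytics);
```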
I
Hi, I had a question about, yeah, I know Google doesn't like cloaking on a site, but that's essentially what happens now when you use Google Analytics A/B testing, for example, because it switches the content underneath when it does the A/B testing. So anything that is optimized for SEO cannot use Google Analytics A/B testing. Is that right?
A
Kind
of
kind
of
so
the
the
important
part
for
us
with
a
b
testing
is
that
the
a
b
testing
is
not
a
permanent
situation
and
that
the
a
b
testing
is
something
where
google
bot
essentially
also
falls
into
that
a
b
test,
and
essentially
we
we
can
kind
of
see
what
what
users
are
seeing
and
when
it
comes
to
a
b
tests.
The
important
part
for
us
is
also
that
the
kind
of
the
purpose
of
the
page
remains
equivalent.
A
So
if
you're
a
b
testing
a
landing
page
and
you're
selling
one
product,
then
it
shouldn't
be
that
instead
of
selling
a
car,
suddenly
you're
selling
a
vacation
or
a
flight,
or
something
like
that,
it
should
be
something
where
the
purpose
of
the
page
is
the
same,
so
that,
if
we
understand
this
is
a
a
page
that
has
a
car
for
sale
for
example,
then
we
can
send
users
there
and
if
the
page
looks
slightly
different,
when
users
go
there
like
that
happens,
sometimes
you
you
personalize.
Sometimes
you
do
a
b
testing.
A
All
of
that
is
essentially
fine,
so
the
a
b
testing
that
you
would
do
with
analytics
usually
is
in
that
area.
Where
you
say,
oh,
I'm
changing
my
call
to
action.
I'm
changing
the
colors
the
layout
slightly,
but
you're
not
changing
the
purpose
of
the
page.
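To make the point about Googlebot falling into the test concrete, here is a hedged sketch of variant assignment that treats every client the same instead of special-casing crawlers; the cookie name and fifty-fifty split are assumptions:

```js
// Sketch: assign every visitor, crawlers included, to a random A/B bucket
// and keep it sticky via a cookie. There is no user-agent sniffing, so
// Googlebot falls into the same distribution of variants as real users.
function getVariant() {
  const match = document.cookie.match(/(?:^|; )abVariant=([^;]+)/);
  if (match) return match[1]; // returning visitor keeps their bucket
  const variant = Math.random() < 0.5 ? 'a' : 'b';
  document.cookie = `abVariant=${variant}; path=/; max-age=${60 * 60 * 24 * 30}`;
  return variant;
}

// Both variants must keep the page's purpose identical; only details like
// the call to action, colors, or layout should differ between 'a' and 'b'.
document.body.dataset.variant = getVariant();
```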
I
Yeah, totally; we are not changing the purpose of the page, but we do plan to run some UX enhancements through the experiment. So it's kind of risky, you know, to...
A
No,
it
shouldn't
be,
it
shouldn't
be
so,
especially
if
the
purpose
of
the
page
remains
the
same,
then
that's
something
where
even
if
googlebot
were
to
see
both
of
the
av
versions,
then
we
would
be
able
to
index
a
page.
Normally
it
wouldn't
change
anything
for
us.
So
that's
that's
perfectly
fine.
The
the
cloaking
side
that
is
more
problematic
is,
if
I
don't
know,
you're
selling
a
car
and
then,
when
googlebot
looks
at
it
it's
showing
a
car.
I
Thank you.

C
Okay, but John, regarding this same thing: if ranking goes down because of A/B testing, is it a good sign that Google is not finding the different version? And if we roll back to the older page, should the ranking come back?
A
I
don't
know
I
my
assumption
is
the
ranking
would
not
change
with
normal
a
b
testing,
because
you
you
have
the
same
content.
Essentially,
the
purpose
of
the
page
remains
the
same.
If
you
were
to
significantly
change
the
page
like
remove
all
of
the
textual
content
in
the
in
the
b
version
and
googlebot
sees
that
version,
then
that's
something
where
the
ranking
might
change,
but
for
the
the
normal,
a
b
testing.
J
My
question
is:
we
are
a
wordpress
site,
so
we
have
a
blog
section.
We
have
also
a
services
section.
Our
services
sections
are
labeled
as
pages
on
the
wordpress
backend,
whereas
our
blog
sections
are
labeled
as
posts.
Our
services
section
gets
a
lot
of
traffic,
but
our
blog
section
does
not
get
a
comparable
number
of
traffic.
So
is
it
because
google
tweets
pages
more
favorably
than
blocks,
or
maybe
we
are
missing
out
on
other
funds,
like
blog
marketing,.
A
I
I
don't
think
googlebot
would
recognize
that
there's
a
difference
so
usually
that
difference
between
kind
of
posts
and
pages
is
something
that
is
more
within
your
backend
within
the
the
kind
of
the
cms
that
you're
using
within
wordpress
in
that
case,
and
it
wouldn't
be
something
that
would
be
visible
to
us.
So
we
would
look
at
these
as
it's
an
html
page
and
there's
lots
of
content
here
and
it's
linked
within
your
website.
In
this
way,
and
based
on
that,
we
would
rank
this
html
page.
A
I think, I mean, I don't know your website, so it's hard to say, but what might be happening is that the internal linking of your website is different for the blog section than for the services section or the other parts of your website. And if the internal linking is very different, then it's possible that we would not be able to understand that this is an important part of the website.
A
So the number of words in your articles, that's totally up to you. I think some people like to have a guideline with regards to the number of words, but that's really an internal guideline for you, for your authors; it's not something that we would use for SEO purposes.

J
Yes, thank you, John.
A
Sure.
Let's
see
thomas.
K
Hi,
john
hi,
so
I
have
a
question
about
google
search
console.
My
website
has
around
120
external
links
and
about
40
of
them
are
non-working,
japanese
domains.
I
have
no
idea
where
they
came
from
and
what
do
I
have
to
do
with
them?
So
yeah.
A
I
probably
you
don't
need
to
do
anything
with
them
if,
if
these
are
just
random
links
from
the
internet,
I
I
would
just
ignore
them,
sometimes
like
it's
not
specific
to
japanese
links,
but
sometimes
spammers
include
normal
urls
within
kind
of
the
urls
that
they
use
to
promote
spam,
and
that
means
like
on
on
random
forums
and
blogs.
They
will
drop
these
urls
as
well,
and
sometimes
that
ends
up
with
a.
I
don't
know,
a
lot
of
links
that
are
posted
in
in
non-english
or
in
foreign
language
content
and
I've.
K
Okay
and
the
like,
almost
50
percent
count
it
doesn't
matter.
I
shouldn't
be
worried
that
I
get
some
kind
of
penalty
for
that
or
yeah.
A
No,
no,
I
mean
the
those
those
are
the
kind
of
links
that
I
I
mean.
On
the
one
hand,
if
they
don't
exist
anymore,
then
we
will
ignore
them
over
time
anyway.
So
like
that,
doesn't
matter
if
you
disavow
them
or
not,
if
they
don't
exist,
they
don't
have
any
value
and
on
the
other
hand,
these
are
the
kind
of
links
that
we
see
so
often,
since
I
don't
know
10
20
years
and
we
we
just
ignored
that.
A
Just
because
we
want
to
show
you
everything
that
we
know
about
so
yeah,
it's
I
I
I
agree
like
in
a
case
like
this.
It's
like,
oh,
if
google
knows
these
are
stupid
links,
then
maybe
they
shouldn't
show
them,
but
in
in
search
console
we
we
try
not
to
make
a
judgment
with
regards
to
the
links,
and
we
also
include
things
that
are
nofollow
and
all
of
that,
so
it's
like
we.
K
Okay, thank you very much.

A
Sure. All right, let me run through some of the submitted questions, and then I'll get back to all of the hands that are still raised. Wow, so many people. Let's see; maybe I'll see how I can go through these a little bit faster, and I have a bit more time afterwards as well. Let's say I have 500 physical shops that are selling my products, and I want to create a specific landing page for each of them. Would this be considered doorway pages?
A
No,
that
would
be
essentially
fine.
It'd
be
kind
of
like
having
different
products,
because
these
are
unique
locations.
These
are
physical
locations.
Having
individual
pages
for
them
is
is
perfectly
fine.
Sometimes
it
might
make
sense
to
combine
these
and
put
them
on
on
a
shared
page,
such
as,
if
you
have
a
lot
of
shops
in
specific
countries,
maybe
just
list
the
shops
there
instead
of
individual
pages
per
shop,
but
that's
totally
up
to
you,
then.
Second,
from
a
ranking
point
of
view,
does
google
treat
nofollow,
ugc
and
sponsor
drill
attributes
any
differently?
A
We
we
do,
try
to
understand
these
and
try
to
treat
them
appropriately.
So
I
could
imagine
in
our
systems
that
we
might
learn
over
time
to
treat
them
slightly
differently,
but
in
general
they're,
all
with
the
same
theme
in
that
you're
telling
us
the
these
are
links
that
are
placing
because
of
this
reason
and
google
doesn't
need
to
take
them
into
account.
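For reference, those hints are expressed through the rel attribute on links. A hedged sketch of applying them, for example tagging all links inside user comments as UGC (the .comment selector is an assumption for the sketch):

```js
// The three hints, as they appear in markup:
//   <a href="..." rel="sponsored">  paid, affiliate, or sponsored link
//   <a href="..." rel="ugc">        user-generated content (comments, forums)
//   <a href="..." rel="nofollow">   generic "don't vouch for this" hint
// Sketch: automatically mark every outbound link in user comments as UGC.
document.querySelectorAll('.comment a[href^="http"]').forEach((link) => {
  link.rel = 'ugc nofollow'; // the values can be combined, space-separated
});
```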
A
Let's
see
the
wordpress
site,
we
looked
at
briefly
how
how
might
a
poor
site
structure
affect
indexing?
Some
of
my
older
articles
are
getting
de-indexed,
but
these
were
articles
on
page
eight
to
nine
of
my
blog
I've
since
created
static.
Category
pages
that
make
it
easier
for
google
to
keep
track
and
to
find
these
pages,
but
like
what
could
be
the
indexing
issues
here,
it's
it's
certainly
the
case
that
we
don't
index
all
content
of
all
websites
on
the
web.
So
at
some
point
it
can
happen
that
we
say.
A
Oh,
these
pages
are
not
critical
for
the
web.
Maybe
we're
not
showing
them
in
the
search
results.
Maybe
just
generally,
we
don't
think
these
are
critical
for
the
web,
so
maybe
we
will
drop
them
from
our
index
so
that
we
can
focus
more
strongly
on
the
important
pages
within
your
website
and
that's
something
where
we
use
the
internal
site
structure
quite
a
bit
to
try
to
figure
that
out
in
that.
A
How much of an impact is disavowing links in bulk going to make? We recently disavowed many backlinks on our site; they were all HTTP sites with low domain authority, and many of them were comments with links back to our site. However, we haven't seen any positive improvement, and there are currently no manual actions against us. Is the disavow not necessary these days?
A
Let's
see,
could
you
some
questions
around
the
sources
behind
google
news,
so
I
don't
really
have
much
insight
into
the
google
news
side.
So
if
you
have
questions
about
google
news,
I'd
recommend
going
to
the
news
publisher
forum
and
posting
there,
so
I
don't
really
have
much
insights
on
like
the
the
labeling
and
how
kind
of
things
have
to
be
set
up
for
that
sub
properties.
We
talked
about
briefly,
does
googlebots
still
heed
pagination
and
breadcrumbs,
or
does
it
affect
the
ranking?
What's
the
best
practice?
A
So if you have pagination and breadcrumbs on your site and you remove them, then that does mean that the internal structure of the site is now different. A good way to double-check how those factors play a role within your website is to use an external crawler, some kind of tool that you can run across your website to crawl it the way that it is now, and then you could analyze from there: are these breadcrumb links...?
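A hedged sketch of what such a reachability check could look like with a tiny homegrown crawler; the start URL, page limit, and regex-based link extraction are simplifications (dedicated site-audit crawlers do much more):

```js
// Sketch: a tiny same-site crawler to inspect internal linking, e.g. to
// check which pages stay reachable after removing breadcrumb links.
// Uses the global fetch available in Node 18+.
const start = new URL('https://example.com/'); // placeholder start URL
const seen = new Set([start.href]);
const queue = [start.href];

async function crawl(limit = 200) {
  while (queue.length && seen.size < limit) {
    const url = queue.shift();
    let html = '';
    try {
      html = await (await fetch(url)).text();
    } catch {
      continue; // skip unreachable pages in this sketch
    }
    // Naive href extraction; a real crawler would parse the DOM properly.
    for (const [, href] of html.matchAll(/href="([^"#]+)"/g)) {
      const next = new URL(href, url);
      if (next.origin === start.origin && !seen.has(next.href)) {
        seen.add(next.href);
        queue.push(next.href);
      }
    }
  }
  console.log('Reachable pages:', [...seen]);
}

crawl();
```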
A
A question on 301 redirects, in a scenario where a group of URLs has been changed but, for some reason, the 301 redirects have not been set up right away. Roughly how long is the time frame that you have to implement redirects, to transfer the ranking authority from the old to the new pages and prevent ranking drops?
A
So
I
it's
tricky
because
there
is
no
specific
time
for
this,
especially
because
there
are
different
variations
of
this
kind
of
problem
situation
that
you
have
here,
in
particular
if
the
old
content
still
exists
and
you've
created
a
copy
of
that
on
a
new
url.
A
Then
in
a
case
like
that,
we
will
treat
those
two
urls
as
being
a
part
of
a
same
cluster
and
we'll
try
to
pick
a
canonical
url
between
those
two
urls,
and
it
can
happen
that
we
switch
over
to
your
new
url
for
that.
And
if,
if
that's
the
case,
then
essentially,
we
will
forward
all
of
the
signals
from
the
old
url
to
the
new
url
automatically,
even
without
a
redirect
in
place
so
like
in.
In
that
scenario,
probably
you
will
not
see
a
big
difference
if,
at
some
point
later
on,
you
add
a
redirect.
A
The other situation is when the old page is gone: then, in a first step, we would essentially lose all of the information we have about this page, because suddenly it's a 404, and we would treat the new page as being something new. We would essentially say, well, there's a new page here, and we would not have any connection between the old page and the new page; and at some point we will drop the old page from our index and lose all of those signals.
A
So
that's
kind
of
in
in
that
situation,
where,
if
you
delete
things
and
just
move
it
somewhere
else,
then
probably
after
a
certain
period
of
time,
I
don't
know
how
long
that
would
be
depends
on
the
website.
You
would
not
see
any
improvement
from
adding
redirects
and
in
a
case
like
that,
it
would,
from
my
point
of
view,
it
still
makes
sense
to
start
adding
redirects
there,
just
so
that
you're
sure
that
if
there
is
any
small
value
still
associated
with
those
old
urls,
then
at
least
that
is
still
forwarded
over.
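As an illustration of wiring up such redirects (the framework, paths, and mapping are assumptions; the session doesn't prescribe any particular setup), an Express-style handler might look like:

```js
// Sketch: 301-redirect a group of moved URLs so that signals from the
// old addresses are forwarded to the new ones.
const express = require('express');
const app = express();

// Hypothetical old-to-new mapping for the moved pages.
const moved = {
  '/blog/old-post-1': '/topics/seo/post-1',
  '/blog/old-post-2': '/topics/seo/post-2',
};

app.use((req, res, next) => {
  const target = moved[req.path];
  if (target) return res.redirect(301, target); // permanent redirect
  next(); // everything else is served as usual
});

app.listen(3000);
```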
A
So
those
are
kind
of
the
the
main
scenarios
there.
Can
you
tell
us
something
about
the
new
lighthouse
tree
map?
I
I
haven't
actually
taken
a
look
at
the
lighthouse
tree
map
and
probably
for
questions
around
lighthouse.
It
would
make
more
sense
to
check,
in
with
the
folks,
from
the
chrome,
side
and
kind
of
pick
their
brains
on
that
on
what
what
you
should
be
focusing
on
there.
A
They can get pretty complicated. Some insight into the optimization of free listings on Google Shopping: if we manually edit the description and title to match keywords that we're interested in positioning our listings for, will we be penalized if these keywords do not appear on our site? For example, editing the free listing to include the words low-cost or cheap in the description of a product, like a gold-plated ring, where those words are not referenced on the referenced domain for that listing. I don't know; my feeling, or my understanding, is:
A
We
do
try
to
map
the
landing
page
with
the
products
that
you
have
in
your
merchant
center
feed
and
if
they
don't
align,
then
we
will
have
to
make
a
call
with
regards
to
which
which
of
these
versions
do
we
actually
use.
So
that's
something
where
I
would
generally
recommend
trying
to
make
sure
that
these
two
versions
align
as
much
as
possible
so
that
you
don't
give
google
kind
of
this
situation
where
you're
saying
this
data
doesn't
really
match.
A
Essentially, if you can give the data in a clear way, such that Google can actually take it into account immediately, then usually that's a lot better in the long run than this kind of short-term tweaking of individual keywords, where you probably don't see much value out of it anyway. A question about two different anchor texts pointing to one URL: which anchor link does Google value?
A
Does
google
value
the
size
of
the
anchor
text
or
size
that
the
anchor
text
takes
up
on
the
screen?
I
don't
think
we
have
that
defined
at
all.
So
those
are
the
kind
of
things
where
from
from
our
systems
like
it
can
happen
that
we
we
pick
one
of
these
and
we
try
to
understand
which
one
is
the
most
relevant
one.
A
A question regarding AMP: we're a new site, and we're thinking about implementing AMP, but Google announced that AMP is not necessary to rank in the Top Stories carousel and that the AMP badge will be removed. So my understanding is that we need to focus on Core Web Vitals and fine-tune our website to make it fast and create high-quality content.
A
So?
Yes,
we
we
did
announce
that
amp
is
no
longer
required
for
the
top
stories
carousel
and
instead
we
will
be
focusing
on
things
like
the
core
web
vitals
and
the
page
experience
factors
that
to
try
to
understand
which
pages
we
should
be
showing
there.
The
I
think
the
important
part
with
amp
is
that
it's
a
really
easy
way
to
make
pages
extremely
fast
and
to
kind
of
make
sure
that
you're
kind
of
in
in
an
easy
way,
almost
by
default,
achieving
the
the
metrics
for
the
core
web
vitals.
A
So
that's
something
where,
if,
if
you're
trying
to
make
your
pages
fast-
and
you
don't
know
which
which
framework
to
use-
and
maybe
amp
is
a
good
good
approach
there,
maybe
that's
also
something
where
you
can
take
individual
elements
out
of
amp
and
just
reuse,
those
on
your
pages
and
then
over
time
kind
of
migrate.
More
of
your
pages
to
the
amp
framework.
But
I
would
see
it
more
as
a
framework
rather
than
a
kind
of
a
feature
that
you
have
to
turn
on
or
turn
off.
A
If your site is built on WordPress, you can just enable the AMP plugin, and sometimes that can automatically shift your site over onto the good side of the Core Web Vitals. Okay, we're kind of running towards the end, so maybe I'll shift back to some live questions from you all, and, like I mentioned, I have a bit more time afterwards as well if any of you want to stick around and ask even more questions. Let's see; Niraj, I think you're on top now.
C
Yeah, John. So actually I was talking to another SEO, and he has this problem, he was saying. There are so many websites like this; this is related to jobs. What happens with jobs is that one website has a structure where, in India, it just shows one generic page, asks the user to enter his city (in fact, forcefully), and then redirects that particular user to the jobs for that respective city.
C
So I was just thinking: can this be cloaking from Google's side, if we are providing one page to Google, but for users in the same country we are just asking for the city and then redirecting them to that respective page? Please just give me your ideas on this.
A
So
the
the
important
part
for
cloaking
in
in
a
case
like
this
is
that
googlebot
sees
the
same
country,
the
same
content
as
other
users
from
the
same
country
where
googlebot
is
crawling
from
woodsy
and
most
websites
we
crawl
from
the
us.
So
if
a
user
in
the
u.s
were
to
access
this
website,
they
should
see
the
same
thing
as
googlebot
would
see.
A
From
our
point
of
view,
that's
perfectly
fine,
so
it's
really
just
a
matter
that
googlebot
sees
the
content
that
other
users
would
see
in
the
us
and,
if
you're
doing
something
more
unique
in
a
country
where
googlebot
is
currently
not
crawling
from.
That's
that's
perfectly
fine
and
up
to
you,
the
the
important
part
here
is,
I
think,
also
to
keep
in
mind
that
googlebot
will
be
crawling
from
the
u.s.
A
So if someone is looking for a job in San Francisco, then we would say, oh yeah, here's one on this website. But if someone is looking for a job in, I don't know, Mumbai or some other place in India, then we might not necessarily know that the homepage of this website would be relevant, because we never see jobs from India listed on the homepage. On the lower-level pages of the site there's usually less personalization, but especially with the homepage...
A
That's
one
area
that
you
kind
of
need
to
watch
out
for,
if
you're
doing
a
lot
of
personalization
my
recommendation,
there
would
be
to
say
you
have
one
part
of
the
home
page
that
is
very
personalized,
that
it
kind
of
matches
where
the
user's
location
is
and
another
part
of
the
home
page
is
essentially
very
static,
something
that
is
something
that
all
users
would
see.
And
then
you
can
kind
of
guide
google
to
index
the
home
page
and
say
it's
like
there's
some
u.s
jobs
information
here,
but
there's
also
some
worldwide
jobs.
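A hedged sketch of that personalized-block-plus-static-block idea; the Express setup, the geo lookup, and the job data are all stand-ins rather than anything recommended verbatim in the session:

```js
// Sketch: a homepage with a personalized section for the visitor's region
// plus a static, worldwide section that every client, Googlebot included,
// receives.
const express = require('express');
const app = express();

const jobsByRegion = { US: ['Engineer, San Francisco'], IN: ['Analyst, Mumbai'] };
const worldwideJobs = ['Remote developer', 'Support, any location'];

// Stand-in for a real IP-to-region lookup service.
function lookupRegion(ip) {
  return 'US';
}

const list = (jobs) => jobs.map((j) => `<li>${j}</li>`).join('');

app.get('/', (req, res) => {
  const region = lookupRegion(req.ip);
  res.send(
    `<h2>Jobs near you (${region})</h2><ul>${list(jobsByRegion[region] || [])}</ul>` +
    `<h2>Jobs worldwide</h2><ul>${list(worldwideJobs)}</ul>` // same for everyone
  );
});

app.listen(3000);
```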
C
John, regarding this cloaking: I was also going through the Google documents, and I personally feel that there should be more content available, because for cloaking there is very little content available where you say that if you have one behavior for users and a different one for Googlebot, then it could be cloaking.
A
That's
that's
good
feedback
to
have
yeah.
So
what
what
I
would
recommend
doing
in
in
a
case
like
this
is
trying
to
find
the
most
related
page
within
our
documentation
and
using
the
feedback
link
there
to.
Let
us
know
that
it
would
be
useful
to
have
more
examples
or
more
details
there,
but
I
I
can
also
pass
that
on
to
the
team,
but
the
the
feedback
link
is
is
kind
of
the
best
way
to
get
the
the
information
directly
to
you.
Yeah.
C
Sure
I
will
do
it,
and
my
only
concern
was
that
let
us
share
it
with
youtube.
I
will
also
do
that.
One
and
the
do's
and
don'ts
will
always
be
welcome.
Like
please
do
this.
Please
don't
do
this
so
at
least
no
need
to
like
getting
in
getting
confused
and
then
coming
and
asking
you
the
same
question.
L
Can you hear me now? Sorry about that. Yes? Okay, sorry about that. So, in a previous question about keywords, you mentioned how it's important that the writing is good, that people use plurals correctly, and it made me think about a headline I saw a few days ago from a major outlet that read: so-and-so announces they're engaged to so-and-so.
L
So
I
read
it
and
I
said
good
for
that
outlet
for
correctly
referring
to
that
particular
non-binary
individual
as
they
and
then,
of
course,
my
mind
immediately,
went
to
seo
and
wondered
whether
google
understands
that
this
isn't
for
grammar,
but
rather
inclusive
english
is
google's
natural
language
processes.
L
Does
it
understand
a
plural,
followed
by
a
singular
form
of
verb
like
they
is
doing
this
or
they
is
doing?
That?
Is
that
that
that
that's
grammatically,
correct.
A
I mean, it's something where usually our systems would learn this automatically; we would not manually define English grammar to be like this. And I could imagine that especially these kinds of shifts in language, where over the years something becomes more and more common, are something that probably takes a bit more time for our systems to learn automatically. And probably, if we were to run into situations where we obviously get it wrong, and we see feedback about that...
A
Yeah, I don't think that would be affecting the ranking there, because if we're taking that headline apart, we would probably pick up the two individuals, and we would focus on: oh, these two are now related, or kind of mentioned in the same headline. The individual words are probably less critical for us.
A
Okay; otherwise, we'll get back to you. Let's see.
M
So the question is: is there a span of time after which, if you have not crawled a specific page, you delete the links contained on that page, let's say from the link graph? Or do you keep those links until Googlebot crawls that page again? And just to clarify, I mean the links that are contained on the page, not the links that are pointing to the page.
M
Yeah
yeah
yeah.
If
you
grow
up
a
page,
for
example,
a
years
ago
and
after
one
year
you
stop
using
the
links
on
that
page
because
you
didn't
crawl.
The
page
again,
that's
that
now
I
I
don't.
A
What
would
probably
happen
is
the
the
rest
of
the
web
and
kind
of
the
that
website
would
evolve
over
time,
and
it
could
happen
that
those
links
are
less
relevant
after
some
period
of
time,
but
we
would
still
keep
them
so,
for
example,
if
you
have
a
link
in
an
article
on
the
new
york
times-
and
it's
currently
an
article
on
the
home
page,
then
that
link
might
be
seen
as
very
relevant
for
us,
but
five
years
later,
that
link
is
within
an
archive
section
of
the
new
york
times
and
somewhere
hidden
away
in
the
seller.
A
Then
that
link
would
still
be
there,
but
it's
a
lot
less
relevant
within
the
context
of
the
new
york
times,
so
those
are
kind
of
the
changes
that
that
tend
to
happen,
but
it's
not
that
we
would
say.
Oh,
we
haven't
looked
at
this
link
in
five
years.
Therefore
we
will
ignore
it.
A
Okay,
let
me
take
a
break
here
for,
for
the
recording
and
you're
all
welcome
to
stick
around
a
little
bit
longer
and
we
can
answer
more
questions
as
well,
but
just
so
that
we
have
a
reasonably
sized
recording.
Thank
you
all
for
joining
in
thanks
to
it's
cool,
to
see
so
many
of
you
here
wow
and
thanks
for
all
of
these
questions
that
you
submitted,
and
hopefully,
if
you're
watching
this
on
youtube.
You're
you're
definitely
welcome
to
join
us
in
person
at
some
point
in
the
future
as
well.
A
I
try
to
do
these
on
a
weekly
basis,
alternating
between
early
by
time
and
later
by
time,
to
try
to
get
more
time
zones
involved,
but
you're
always
welcome
to
join
in.
So
with
that,
let
me
pause
the
recording
and
then
we
can
move.