From YouTube: English Google SEO office-hours from October 16, 2020
Description
This is a recording of the Google SEO office-hours hangout from October 16, 2020. These sessions are open to anything webmaster-related, like crawling, indexing, mobile sites, internationalization, duplicate content, Sitemaps, Search Console, pagination, multilingual/multi-regional sites, etc.
Watch out for new sessions, and add your questions at https://www.youtube.com/user/GoogleWebmasterHelp/community
Feel free to join us - we welcome webmasters of all levels!
Subscribe to the Google Search Central Channel → https://goo.gle/SearchCentral
A
Hi everyone, and welcome to today's Google SEO office-hours hangout. My name is John Mueller. I'm a Search Advocate at Google in Switzerland, and part of what we do are these office-hours hangouts, where people can join in and ask their questions around their websites and web search. A bunch of stuff was submitted already on YouTube. We can go through some of that, but if any of you want to get started with the first question, you're welcome to jump in now. Hey, Joe.
B
Hi, hey. So we have a competitor who, on his website, has a great number of product reviews, and after a mutual agreement we would like to use some of his reviews on our website, alongside our own reviews, for the benefit of visitors who would be interested in them, and they would help them to decide on the product.
B
So both our and his sites are well established and belong to the top 10 in the niche. Would Google see this as something that's OK, or could it harm our or his rankings? And, in order to avoid duplicate content and Google seeing their reviews on our website — because we don't really want Google to see them; they are there only for our users —
B
We are thinking about lazy loading them. Would this be okay?
A
So I guess, starting with the last part: lazy loading is fine. It's also fine to have parts of your pages where you say these should be excluded from the snippet — so using data-nosnippet, for example. If you're using reviews that you didn't source yourself, then you shouldn't use them in structured data.
A
That's kind of the other thing. I think the trickier part is really with regards to the link there. It sounds like you're kind of exchanging things of value, and, from my point of view, that kind of falls into the bucket of link exchange, and that could be something where the web spam team would say this would be problematic. My general recommendation there would be:
A
If,
if
this
is
on
a
larger
scale,
then
I
would
definitely
work
to
make
these
links
no
follow,
so
that
users
can
click
through
if
they
want
and
there's
some
value
in
that,
at
least,
but
that
you're
sure
that
it
doesn't
come
across
as
we're
buying
the
reviews
from
them
in
exchange
for
for
backlinks.
C
Hi John. Well, so I actually have a few questions; the first one is the main question. If we move a domain to another domain entirely, does that mean that the new site will get all the rankings the old website had?
A
It
should
transfer
all
of
the
signals,
yeah
yeah,
if,
if
everything
is
otherwise
fine,
then
all
of
that
should
transfer
normally
and
we
we've
worked
for
a
really
long
time
to
make
kind
of
these
domain
moves
as
smooth
as
possible,
and
I
think
for
the
most
part
they
work
really
well,
and
sometimes
we
we
do
run
into
weird
edge
cases,
but
it
should
essentially
transfer
everything.
A
It's trickier if you have one domain that already has existing content and you move something there, because then you're kind of merging two websites. But if you're really just moving from one domain to another, one-to-one, and it's essentially just changing the domain name, then that's something we should be able to pick up very easily.
C
So here's my next question. We have a client who builds swimming pools, and they also build spas. What they did before: they had one website for the swimming pools and another website for the spas.
A
So that sounds like you're merging sites, right? Yeah. In general, we do try to figure out what the right approach is with merging sites, but it's harder than when you're moving from one domain to another, because you don't really know what the final outcome should be. So essentially what happens is: on a page-by-page basis, we try to move things over, and then, on the whole, across the whole website.
A
So
that's
something
where
I
would
expect
some
of
these
pages
on
a
page
by
page
basis,
if
you're
just
moving
them
to
essentially
just
transfer
over
and
for
a
lot
of
the
rest,
it
will
probably
take
a
bit
of
time
to
settle
down,
and
I
I
imagine,
will
see
a
positive
effect
of
merging
these
two
sites
together.
But
it's
something
where
it's
it's
hard
to
say
what
the
final
outcome
will
be,
because
it's
not
like
all
of
the
traffic
from
site.
C
Okay, and the last question — this is, as it happens, the opposite of the merging. So at one point they built garden sheds and they built playgrounds for kids. Now, for the kids' playgrounds, they want to create a separate domain for the playground section, and all the products they have on the website — they want to move those products to the new domain. Now the question is: will the new website, the new domain, have the rankings the old website has for the playground-related keywords?
A
Maybe. I think splitting sites up is almost harder than merging them, but it's a similar situation, where you can't do it purely on a page-by-page basis, because things like internal linking will be very different if you take things out of one website. I think sometimes it makes sense to separate things out. For the most part, I generally recommend concentrating things rather than separating them out, but sometimes there are really good reasons to split things out across separate domains.
D
Thank you, John. Can I ask a question regarding site moves? — Sure. — Could it be the case, you know, if you're moving to a domain that was previously associated with, you know, selling illegal drugs or pornography — could it affect the transfer of PageRank and signals? Or does it have some sort of algorithmic demotion affecting the site?
A
Yeah,
I
think
that
can
happen
so
so,
for
example,
if
there's,
if
the
old
domain
or
the
the
domain
that
you're
moving
to
was
was
an
adult
content
domain
and
then
that's
something
where
it
can
happen
that
our
safe
search,
algorithms
kind
of
stick
to
that
classification
and
say
well,
this
is
a
this
is
adult
content
and
it
can
take
quite
a
bit
of
time
for
that
to
kind
of
settle
down
and
get
dropped
with.
Regards
to,
I
don't
know
other
other
kinds
of
spam.
It
really
depends
a
bit.
A
I
mean
other
kinds
of
spam:
it's
not
not
necessarily
spam
if
it's
just
adult
content,
but
with
regards
to
other
kinds
of
I'd,
say
like
problematic
or
tricky
content,
it
is
something
where,
for
a
large
part,
we
try
to
replace
it
with
the
the
new
version,
but
sometimes
there
are
effects
that
are
also
outside
of
the
website
as
well
such
as
maybe
there
are
lots
of
external
links
pointing
to
the
site
with
problematic
anchors.
A
That
might
be
something
where,
on
the
one
hand,
we
might
be
ignoring
these
already
or
it
could
be
even
that
they're
causing
problems
for
the
website.
So,
if
you're
moving
to
a
site
that
has
a
longer
history,
then
that
is
something
to
kind
of
consider
that
it
might
be
harder
to
get
it
into
a
neutral
state.
Again.
D
Yeah, I mean, I've seen one case where, you know, we moved to a spammy domain. Initially, traffic dropped by, I don't know, 75% at the time, but after broad core updates, around six months later, we actually recovered most of the traffic. I don't know — yeah, that's very weird, I think.
A
But
I
think
that
can
happen
yeah.
That's
I
in
general,
with
our
algorithms.
We
try
to
make
it
so
that
they
adapt
over
time
to
things
like
that
and
if,
if
we
move
things
over
and
we
have
kind
of
these
residual
signals
for
the
the
new
domain
that
you're
moving
to,
then
that
should
settle
down
over
time
and
sometimes
that
that
takes
a
couple
of
months.
Sometimes
that
takes
a
year
or
so
and
it's
depending
on
what
what
the
domain
was
doing
beforehand.
It's
it's
like
easier
or
it's
harder.
E
John, that's very related to my question. So when we migrated our domain — before we even decided to do the migration — we checked for any links to the new domain, and we checked its content history. There were only two links, and no content up for, like, 10 years, at least according to the Wayback Machine.
E
What other things should we have looked for? Like, what's within our control to check for a new domain, so that we don't run this risk? Because that seems to be the indication we're getting — that that's what's going on with us. So we don't want to run into this in the future, and we also still don't really know what to do right now.
A
Yeah, I don't know. So I saw your question this morning, and one of the things on our side is that we're still looking into some of the other signals that were kind of missing there. At the moment, I'm mostly trying to push the team into just getting it resolved as quickly as possible. I think that first step was good, but it does turn out that there are other things that are kind of stuck there as well.
A
But
I
I
don't
know
what
exactly
else
you
should
be
watching
out
for
so
I
think
the
the
obvious
ones
are
really
the
things
that
you
looked
at
like
the
links
to
the
site,
the
previous
content.
F
Sure. Hi John, I wanted to ask about search results in a carousel in Europe, and the eligibility criteria.
F
For that, could you explain, you know, how to get featured there? Because our domains are market leaders in some of the countries in Europe, and we feel that we should be eligible for that, but there is, like, no documentation whatsoever. — Which carousel did you mean? — The search-results one, where you see, like, related results. For example, if you're looking for a mechanic, it would feature, like, Gumtree or OLX.
A
Okay,
I
don't
know
if,
if
you
want,
let
me
just
drop
my
email
address
here.
If
you
want
to
maybe
send
me
an
email
with
what
exactly
you're
you're
seeing
and
what
what
your
site
is,
I
I
can
try
to
find
someone
and
pass
that
on
all.
A
— Awesome, thank you. — In general, a lot of these kinds of, I don't know, extra links — I don't even know what they're called — where we link to other sites for more information, are something that happens algorithmically. But I don't know how things are handled in that particular case on your side. — Okay, I'll send an email. Thank you. — Sure.
A
Okay,
let
me
go
through
some
of
the
submitted
questions.
We
have
written
this
question
on
top.
I
think
we
kind
of
looked
into
that
already.
Let's
take
a
look
at
the
next
one.
We
have
a
large
site
and
a
section
on
the
site.
We
have
a
forum.
This
forum
is
an
old
cms
and
it's
difficult
to
optimize
for
speed,
we're
looking
to
make
sure
that
we
have
good
core
web
vital
score
before
2021.
A
Basically
is
speed,
looked
at
on
a
page
by
page
basis
or
could
slow
speed
on
some
pages
of
your
site
affect
how
google
sees
your
site
as
a
whole
good
question,
so
in
general,
with
with
our
algorithms,
we
try
to
be
as
fine-grained
as
possible.
So
if
we
can
get
granular
information
for
your
site
and
recognize
the
individual
parts
of
your
website
properly,
then
we
will
try
to
do
that.
A
However,
it
depends
a
little
bit
on
your
site
and
how
much
data
we
have
for
your
site,
especially
when
it
comes
to
speed
where
it's
based
or
it
will
be
based
because
it's
not
not
live
yet
it'll,
be
based
on
the
core
web
vitals
and
the
chrome
user
experience
report,
data,
which
is
just
a
very
small
sample
of
the
the
people
that
visit
your
site,
aggregated
and
that's
something
that
doesn't
have
data
for
every
url
of
a
website
so,
depending
on
how
much
data
is
available
there
and
how
easily
it
is
for
us
to
figure
out
which
parts
of
your
site
are
separate
and
then
that's
something
that
we
can
do
easier,
or
that
is
a
little
bit
harder.
A
We
we
have
similar
things-
I,
I
guess
similar
mechanisms
across
various
other
signals
that
we
use
in
search.
One
of
them,
for
example,
is
with
with
adult
content
where,
if
you
have
a
part
of
your
website
with
adult
content
and
a
part
of
your
website
that
has
like
normal
content
or
other
content
on
it,
then
the
easier
we
can
recognize
that
these
are
separate
parts
and
separate
them
out
individually.
A
If
that
makes
sense
for
your
website
and
the
easier
in
in
your
case,
it
would
be,
for
example,
to
split
out
the
forum
from
our
site,
where
we
can
tell
oh
slash
forum,
is
everything
forum
and
it's
kind
of
slow
and
everything
else?
That's
not
in
slash
forum
is
really
fast.
If
we
can
recognize
that
fairly
easily,
that's
a
lot
easier,
then
we
can
really
say
everything
here
in
such
forum
is
kind
of
slow.
Everything
here
is
kind
of
okay.
A
On
the
other
hand,
if
we
have
to
do
this
on
a
per
url
basis,
where
the
url
structure
is
really
like,
we
can't
tell
based
on
the
url.
If
this
is
a
part
of
the
forum
or
part
of
the
rest
of
your
site,
then
we
can't
really
group
that
into
parts
of
your
website
and
then
we'll
kind
of
be
forced
to
take
an
aggregate
score
across
your
whole
site
and
apply
that
appropriately.
A
I,
I
suspect,
we'll
have
a
little
bit
more
information
on
this
as
we
get
closer
to
announcing
or
kind
of
closer
to
the
date
when
we
start
using
core
web
titles
in
in
search.
But
it
is
something
you
can
look
at
already
a
little
bit
in
search
console.
There's
a
core
web
vitals
report
there
and
if
you
drill
down
to
individual
issues,
you'll
also
see
this
url
affects
so
many
similar
urls
and
based
on
that,
you
can
already
kind
of
tell.
A
Okay
and
now,
like
a
really
long
question
from
david
who
doesn't
like
pinterest,
is
the
indexing
api
going
to
open
up
to
general
urls?
I
don't
know
my
guess
is
it'll
be
tricky
because
we
see
a
lot
of
abuse
with
all
of
these
submit
to
to
indexing
features
that
we
have,
and
I
I
don't
I
don't
know
if
it
would
make
sense
to
open
up
yet
another
channel
that
we
have
to
kind
of
figure
out
how
to
deal
with
more
views
on
is
discover
going
to
be
in
the
api.
A
I
I
hope
so.
I
will
take
this
question
as
a
nudge
and
ask
the
team
again:
can
we
have
fetch
and
render
api
that
has
a
static,
ip
or
a
unique
user
agent?
A
I
don't
think
so,
mostly
because
the
the
fetch
and
render
feature
or
the
the
inspect
url
feature
in
in
search
console
is
meant
to
reflect
what
googlebot
would
actually
see
and
is
not
meant
to
be
used
kind
of
as
a
single
request
kind
of
fine
tuning
for
for
google
bot
in
the
sense
that
you
have
a
specific
ip
address
or
a
specific
user
agent
there.
A
I
I
don't
think
that's
something
we'd
be
able
to
do
a
suggested
feature.
Google
parser
breakers,
there's
something
in
the
code.
You
can
fetch
that
can
break
your
parser.
For
example,
a
few
years
ago,
an
image
tag
in
the
head
would
break
parsing
and
therefore
terminate
the
head
prematurely.
A
We
want
to
make
sure
that
our
terrible
html
isn't
causing
issues.
I
guess
the
easy
solution
is
not
to
make
terrible
html.
It
sounds
like
you're
pretty
advanced,
so
you
can
figure
that
out,
but
the
the
part
with
individual
elements
in
the
head.
That
still
applies
in
the
sense
that
when
we
render
a
page
similar
to
a
browser,
if
there
are
elements
in
the
head
of
the
page
that
belong
to
the
body,
then
we
will
open
up
the
body
and
treat
the
rest
of
that
section.
A
Essentially
as
part
of
the
body
and
there
there
are
specific
elements
that
we
really
need
to
be
able
to
find
in
the
head
of
the
page
so
that
we
can
take
them
seriously.
That
includes
things
like
the
royal
canonical,
the
robot's,
meta
tags
and
hreflang
links,
for
example.
A
So
if
you
have
elements
on
top
of
the
head
of
the
page
that
essentially
break
the
head
in
in
the
dom,
that's
something
that
could
cause
problems
for
those
elements,
and
I
don't
think
you
explicitly
see
that
in
in
fetch
and
render
or
an
inspect
url
in
search
console,
because
we
we
just
say
well:
oh
it
looks
like
it
looks
like
the
webmaster
wanted
to.
Like
start
the
head
of
the
page.
A
Here
we
will
just
treat
the
rest
like
it
is
or
treat
to
rest
like
it
is
part
of
the
body,
so
we
wouldn't
necessarily
see
that
as
a
bug
or
something
that
is
the
site
is
doing
wrong,
but
rather
we're
just
trying
to
deal
with
the
broken
html
that
is
out
on
the
web.
A
Extend
the
api.
Some
of
us
are
using
screen
screen,
scraping
techniques
to
extract
a
lot
of
data
that
isn't
available
in
the
api.
Why
not
make
it
available
and
monetize
it?
So
I
don't
think
we'd
want
to
monetize
the
api.
That's
kind
of
the
the
first
step
there
in
in
the
sense
that
everything
around
money
is
really
hard
at
google,
especially
when
kind
of
dealing
with
customers
and
all
of
that.
A
Maintaining
some
of
these
features
that
not
a
lot
of
people
use
is
really
expensive,
bring
back
site.
Removals
pinterest
is
a
parking
meter
of
the
internet.
No
one
likes
to
see
them.
I
I
don't
know
bring
back
sight
removal.
It
sounds
like
you
want
to
remove
other
people's
sites.
I'm
sure
the
seo
of
pinterest
would
be
kind
of
against
that.
I
I
don't
know
I
I
don't
use
pinterest
personally,
but
I
do
see
a
lot
of
people
getting
value
out
of
that.
There
is
a
lot
of
good
content
there.
A
So
I
don't
know
I
I
think
in
general,
with
regards
to
site
removals,
especially
on
a
personal
level
where
you
could
say
like.
I
don't
want
to
see
the
site
anymore.
I
think
that's
an
interesting
idea.
I
think
implementation
is
just
really
really
hard
with
all
of
the
different
ways
that
sites
can
be
visible
in
in
search
nowadays.
A
Is
there
any
preferred
time
zone
for
the
last
notification
date?
Yes
well
in
in
the
last
modification
date,
you
should
be
specifying
a
time
zone.
It
doesn't
matter
which
time
zone
you
use,
but
it
should
have
a
time
zone.
I
think
that's
a
part
of
the
kind
of
date
time
standard
that's
used
in
in
the
xml
file.
A
How
sensitive
is
google
when
it
is
about
to
ignore
the
last
modification
date
in
general?
We
we
use
it
as
a
guide
to
understand
when
pages
have
changed,
and
it's
not
so
much
that
it
has
to
be
exact,
but
rather
we
have
to
be
able
to
understand.
Oh
this
is
this
page
has
changed
since
the
last
time
we
looked
at
it,
so
we
should
look
at
it
again
and
from
from
that
point
of
view,
it's
important
for
us
to
be
able
to
roughly
trust
the
last
modification
date.
A
In
particular.
The
thing
that
we
we
sometimes
see
is
that
people
generate
sitemap
files
and
they
just
use
the
current
date
for
the
last
modification
date
for
all
urls
and
that's
something.
That's
obviously
wrong.
When
we
look
at
the
sitemap
file
and
there
are
10
000
urls
in
there-
and
they
were
all
updated
in
the
last
minute-
that's
probably
you're
you're
calculating
the
last
modification
date
wrong
and,
from
our
point
of
view,
it's
not
so
much
that
we
want
to
penalize
the
website
for
doing
that.
It's
just
well.
A
Let's
see
question
goes
on
in
oops,
I
think
slightly
different
direction.
We
update
our
meta
robots,
frequently
index
and
no
index,
and
two
months
ago
we
implemented
last
modification
on
product
pages
which
are
back
in
stock
in
the
last
seven
days
and
mark
them
as
index.
But
we
didn't
see
any
impact
on
submitted.
Url
marked
no
index,
I
manually
checked
some
of
the
last
modification.
Urls
google
never
seems
to
follow
them
in
in
general.
A
I
think
this
fluctuation
between
index
and
non-index
is
something
that
can
throw
us
off
a
little
bit,
because
if
we
see
a
page
that
is
no
index
for
a
longer
period
of
time,
we
will
assume
that
this
is
kind
of
like
a
404
page
and
we
don't
have
to
crawl
it
that
frequently.
A
So
that's
something
where
probably
what
is
happening
there
is
that
we
see
these
pages
as
no
index
and
we
decide
not
to
crawl
them
as
frequently
anymore,
regardless
of
what
you
submit
in
the
sitemap
file.
So
that's
something
where
kind
of
fluctuating
with
the
the
metano
index
is
probably
counterproductive
here.
If
you
really
want
those
pages
to
be
indexed
every
now
and
then.
A
What
I
would
try
to
do
in
a
case
like
that
is
maybe
set
up
a
page
that
you
can
persistently
maintain
and
link
from
there
to
the
individual
products
that
you
want
to
kind
of
have
listed
or
not
listed,
and
then
maybe
focus
more
on
that
persistent
page
rather
than
on
the
individual
products.
That
kind
of
come
in
and
go
out.
A
So
I
assume
you're
you're
talking
about
the
sitemap
file
right.
Yes,
yes,
yes,
yes,
okay,
so
in
in
the
sitemap
file
for
the
last
modification
date,
the
date
time
uses
a
a
specific
standard,
and
in
that
standard
it
has
a
zone
attached
to
the
end.
If
you
have
a
z
at
the
end
of
the
the
date
time,
then
that
means
it's,
I
think
utc
time,
but
you
can
also
specify
different
time
zones.
There.
G
Okay,
so
the
end
of
every
last
board
tag.
We
have
to
battle
type
2.
A
Yes,
well,
I
I
would
make
sure
that
you're
doing
it
in
the
right
standard.
So
I
I
would
check
the
sitemap
file
documentation
and
look
up
the
the
datetime
standard
that
that
we
use
there.
It's
like
rfc
and
some
some
number
and
usually
there's
a
wikipedia
page
with
a
lot
of
examples
where
you
see
with
time
zone
without
kind
of
with
the
utc
or
the
z
at
the
end,
so
that
you
can
compare
that.
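[Editor's note: as a sketch of the date-time format discussed here — a W3C Datetime value with an explicit time zone for a sitemap `<lastmod>` tag — the following uses Python's standard library; the helper name is just for illustration.]

```python
from datetime import datetime, timezone, timedelta

def lastmod(dt: datetime) -> str:
    """Format a timezone-aware datetime as a W3C Datetime string
    suitable for a sitemap <lastmod> value (offset included)."""
    if dt.tzinfo is None:
        raise ValueError("lastmod requires a timezone-aware datetime")
    return dt.isoformat(timespec="seconds")

# UTC: the offset renders as +00:00 (the trailing "Z" is an equivalent spelling)
print(lastmod(datetime(2020, 10, 16, 9, 30, tzinfo=timezone.utc)))
# → 2020-10-16T09:30:00+00:00

# A non-UTC zone works too; what matters is that the offset is explicit
cest = timezone(timedelta(hours=2))
print(lastmod(datetime(2020, 10, 16, 11, 30, tzinfo=cest)))
# → 2020-10-16T11:30:00+02:00
```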
A
Sure. Okay. "In light of the recent indexing issues, any update on the last 45% of affected canonical URLs that still haven't been restored? Is it okay to submit affected URLs manually in Search Console? Can you shed some light on what happened here? What is Google at present doing to prevent this in the future?"
A
I
I
don't
have
any
big
updates
on
this
at
the
moment,
so
I
haven't
been
following
up
on
on
that.
I
think
danny
has
mostly
been
keeping
track
of
that
and
pushing
things
along
one.
One
thing
I
can
mention
here
is
you
can't
use
the
inspect,
url
and
submit
to
indexing
feature
in
search
console
because
at
the
moment
that's
kind
of
in
maintenance,
so
that's
kind
of
tricky
there.
A
I
don't
know
like
for
individual
sites
how
strongly
you
would
still
see
this,
because
my
understanding
is
usually
these
kind
of
issues
are
resolved
for
for
the
visible
urls
of
pretty
much
all
sites
fairly
quickly
within
a
day
or
so,
but
if
you're
still
seeing
a
really
strong
effect
from
this,
maybe
drop
me
a
note
on
twitter
and
I
can
take
a
look
with
the
team
to
see,
if
maybe
there's
something
else,
that's
happening
with
your
site.
In
particular,.
A
"I was wondering how Google Search evaluates the ad experience as a ranking factor. It's easy to estimate the readability of a font size, but I have trouble wrapping my mind around how much a huge sticky video on mobile would impact ranking — say it's 30% smaller, to comply better with the standard, but still a highly disruptive experience."
A
So
as
I
I
had
to
double
check.
But
as
far
as
I
know,
we
don't
use
the
ad
experience
report
as
a
ranking
factor
in
in
search.
A
Essentially, you don't really need to use it in Search, because Chrome would handle that anyway. But it is something where, probably, the effects that you would see in Search would be based on things like the above-the-fold content — or, I forgot what we call it; not the intrusive interstitials, but — if you have a lot of ads on the top part of your page, that's probably where you would see an effect in Search.
A
So I don't think I really answered your question there, but I'm not 100% sure what direction I should go in. "I wanted to know how we can stop paginated pages from ranking. I want them to be indexed, but not to rank, as I always want the first page to rank, and not page two or page three. But I can't even add noindex, as the listings mentioned on those second and third pages are important if anyone searches for a specific name."
A
"I want the listings mentioned on pages two and three also to rank, but I don't want pages other than page one to rank." So that's, I think, a tricky situation, in the sense that you can't really specify where you want your pages to rank, or whether or not they should be shown in the search results, if you want them to be indexed. If they're indexed, then our systems will try to figure out how to rank them appropriately.
A
What we generally recommend, when it comes to pagination, is to allow your paginated pages to be indexed and block your filtered pages from being indexed, because those are essentially links to the same products as before — unless there are individual filtered versions that you would consider something more like a category page. With regards to paginated pages, one way you can make sure that we focus more on the first page is to link incrementally between the pages of your paginated set.
A
So, instead of linking from page one to pages two, three, four, five, six, seven, and all of the other ones, link from page one to page two, from page two to page three, from page three to page four — so that, when we look at that, we see that the first page is, like, highly mentioned within your website.
A
So
clearly
this
must
be
very
important
page
and
the
other
ones
are
incrementally
less
important
because
they're
further
further
away
from
your
home
page,
and
by
doing
it
like
that,
you
can
have
those
pages
be
indexed,
but
we
will
understand
that
these
are
not
really
that
relevant
for
your
site
and
usually
that
does
end
up
with
us
kind
of
focusing
on
the
first
page,
especially
if
someone's
searching
for
a
category,
then
we
can
focus
on
that.
First
page
and
say
this
is
clearly
the
best
page
for
that
category.
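[Editor's note: the incremental linking pattern described here can be sketched as follows — a toy model with a hypothetical URL scheme; the point is that each page links only to its neighbors, so page one collects the most internal mentions.]

```python
def pagination_links(page: int, total_pages: int, base: str = "/category") -> list:
    """Return the neighbor links for one page of a paginated set.

    Page 1 links only to page 2; every later page links to its previous
    and next neighbors, so link distance from page 1 grows with page number.
    """
    def url(n: int) -> str:
        # Page 1 is the category page itself in this hypothetical scheme.
        return base if n == 1 else f"{base}?page={n}"

    links = []
    if page > 1:
        links.append(url(page - 1))
    if page < total_pages:
        links.append(url(page + 1))
    return links

print(pagination_links(1, 5))  # → ['/category?page=2']
print(pagination_links(3, 5))  # → ['/category?page=2', '/category?page=4']
```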
A
"The sitemap we submitted years ago keeps updating the submitted date in Search Console. It looks like it's getting resubmitted, but no one on the account has resubmitted it. This happens a couple of times a month. What might be happening there?" So, it's hard to say without knowing which site you're looking at.
A
Theoretically,
it
could
also
be
that
someone
else
is
resubmitting
your
site
file,
because
it
doesn't
need
to
be
tied
to
your
account
to
to
kind
of
ping
that
update
url.
I
imagine
it's
very
unlikely
that
someone
else
is
randomly
submitting
resubmitting
your
sitemap
file
like
this,
because
there's
there's
really
no
reason
to
do
that.
It
doesn't
help
them.
It
doesn't
really
change
anything
from
your
side.
So
from
that
point
of
view,
my
guess
is
it's
probably
just
your
your
cms.
That
is
doing
this
for
you
rendering
check
question.
A
We
have
a
website
single
page
application
where,
when
we
check
the
index
html
and
the
url
inspection
tool,
it
looks
fine
rendering
and
mobile
friendly
test
looks
good
as
well,
but
when
we
test
the
live,
url
the
html
is
minimal
and
the
screenshot
is
blank.
What
might
be
the
reason
for
this
discrepancy?
A
So
my
my
guess,
just
based
on
this
question,
without
looking
at
the
urls,
is
that
we
have
trouble
rendering
the
page
at
a
high
speed.
So
when
it
comes
to
google
search,
we
we're
very
patient
and
we
render
pages
essentially
if
they
take
a
little
bit
longer.
That's
fine.
If
some
of
the
embedded
content
isn't
available
yet
we
will
fetch
that
individually
and
then
use
that
for
rendering,
but
with
the
testing
tools.
A
We
want
to
give
you
an
answer
as
quickly
as
possible
and
when
we
we
see,
someone
is
using
one
of
these
testing
tools.
We
set
the
timeouts
a
little
bit
more
aggressively
just
so
that
we
can
give
people
an
answer,
and
that
means
that
if
you're,
using
these
tools
to
look
at
the
the
screenshot
or
look
at
how
things
are
rendered,
it's
very
possible
that
some
of
the
embedded
content,
especially
if
you're
pulling
things
from
an
api
or
from
your
back
end
and
that
they're
timing
out
for
that
specific
test.
A
On
the
one
hand,
if
it
works
in
search,
then
you're,
probably
fine,
because
search
like
I
mentioned,
is
a
little
bit
more
patient.
On
the
other
hand,
it
makes
it
a
little
bit
harder
to
debug.
So
I
would
try
to
look
into
how
much
embedded
content
is
actually
needed
for
your
pages
to
load.
You
can
look
at
a
waterfall
diagram
in
things
like
webpagetest.org
or
in
chrome,
developer
tools,
directly
and
kind
of
make.
A
a rough estimation of: is this something that I can improve?
A
So my recommendation there would be not to panic if it's working in Search, but still to look into ways that you can improve that over time. "Is it recommended to use keywords as-is in the content to get the best results in Google? Does the crawler use the fundamentals of AI to make a combination of keywords that are closely related to the content, and then rank them?"
A
So, kind of the simplest approach that we take is: we split a page into words, and we store all of the words in an inverted index, so that, when we know that someone is looking for a combination of words, we can find all of the documents that have those words in them. Then, based on that list of documents, we can reorder the list and say, well, this is the right order in which we should be showing these to people.
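[Editor's note: the inverted-index idea described here can be sketched in a few lines — a toy model for illustration, not Google's actual implementation.]

```python
from collections import defaultdict

def build_index(docs):
    """Map each word to the set of document ids containing it."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for word in text.lower().split():
            index[word].add(doc_id)
    return index

def search(index, query):
    """Return ids of documents containing every query word."""
    ids = [index.get(w, set()) for w in query.lower().split()]
    return set.intersection(*ids) if ids else set()

docs = {
    "a": "blue widgets on sale",
    "b": "red widgets in stock",
    "c": "blue paint in stock",
}
index = build_index(docs)
print(sorted(search(index, "blue widgets")))  # → ['a']
print(sorted(search(index, "in stock")))      # → ['b', 'c']
```

Reordering the matched set — the ranking step John mentions next — happens after this lookup and is where the actual complexity lives.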
A
These algorithms are there to help deal with cases where we don't really understand what the user is searching for. We still see that about 15% of all searches are completely new every day, and these aren't things that we can prepare for; we have to kind of estimate what the user is actually searching for. Are there any acronyms in there? Are there synonyms that we can use? Is there, like, a singular/plural that we can figure out? And that's something we've worked on for the longest time…
H
So, basically — because we are still working on that, and one question that we have, because we are still... And I think, with the ad experience question that you had in the chat, I think you were trying to mention the page layout algorithm. — Oh.
A
For a lot of these quality algorithms, I don't know if we have a specific time where we can say that it's been resolved now. It depends a little bit — I don't know specifically with the page layout algorithm, but, with a lot of these, we do look at it more on a broader level for a website, where we say, well, we can't test every page individually for this, but we've seen a large part of the pages of the site.
A
So, if you're making changes like that, and you're essentially affecting the quality of your pages overall, then that is something where I would expect it's not going to happen overnight; it's not going to happen within a week. It's more a matter of, I don't know, two, three, four, five months, maybe, where…
A
I could imagine that, if you're talking about something that is really a bigger part of the page, it could be affecting things there. But we don't have, like, specific numbers where we say: oh, the ad has to be at maximum this many pixels, or this many percent of the viewport. That's not something we have.
H
Yeah. One other question — and this is just trying to find out if we missed something: what could be some reasons that would cause a drop of traffic of 35 to 40% across subdomains? Not just the main domain, but across subdomains as well — what could cause this?
A
But it's really hard to say in general, because, on the one hand, we make algorithm changes, and the rest of the web changes all the time, and all of that kind of plays in together. Even if you don't make any changes on your website, that can, at some point, cause either a subtle decline or kind of a really strong drop.
H
Yeah. When you're talking about quality issues — would it cause quality issues if, let's say, for example, we are updating the content frequently, but not significantly?
A
That's perfectly fine, yeah — I don't think that would generally be a problem. Usually, what I see with regards to quality issues is really when, I don't know, you look at the website overall and you can tell that users aren't trusting the content as much anymore, or they're just not sure about the content anymore. That's something where I'd say there are more clear-cut quality issues — not just individual updates of a site, or making a lot of changes on a site, or making few changes on a site.
A
H
Yep, and last question: Yahoo syndicates a lot of our content. In the majority of cases, our content ranks first, because Google recognizes us as the source of the content. But Yahoo doesn't canonicalize the content that it is syndicating from us, and in some rare cases they manage to rank above us for content that was created by us. Could that be a cause for a drop like this?
A
No, I don't think so. I think that's something where, ultimately, the issue that you'll see is that these syndication sites essentially rank as well in the search results, and sometimes they rank above your content, for whatever reasons. But it's not a sign that we would say your site is lower quality because it's syndicating content to other sites. I think in general it would almost be the opposite: if other sites want to syndicate your content, then that's kind of a good sign.
H
Yeah. We keep circling back to us being affected by the page layout algorithm. We made changes, and we now understand that it will take some months for those changes to be recognized. But how can we make sure that those are the necessary changes? What else are we missing, or how can we check to make sure that after two or three months we don't realize, oh, we missed this thing and we should have changed this as well?
A
I don't think you can, unfortunately. I mean, there definitely isn't a testing tool that will tell you what specifically you need to change with regards to your website's quality. The best I would recommend is to get a lot of people's opinions on it. So that could be something where you could go to the webmaster help forum and ask, from a quality point of view, what else should we be taking into account? Sometimes you get a lot of feedback that is less relevant, maybe small details, but sometimes you also get feedback on issues that you've been trying to push out of your mind for a while, and that you actually should focus on a bit more.

H
Thanks a lot.

A
Sure.
I
Hi John, I have a question related to the merging of the websites that we talked about earlier. We're merging two pretty big websites, not for SEO reasons but for branding purposes, and there are some checklists out there. But I want to ask you: what is the most common error that people make when they're merging big websites? What do they forget?
A
I don't know if we really have a list of the common errors that people make. The one thing I would recommend doing is really tracking all of the things that you're doing: making sure you have a clear list of all of the URLs beforehand and where they should be going, so that you can check afterwards that all of the redirects are in place, that all of the internal links are working appropriately, and just making sure that you have all of these technical details really nailed down. Especially if you're doing it all at one time, it's kind of nerve-wracking, but I think the best you can do is really just make sure that all of those small details are covered properly.
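The "clear list of all of the URLs and where they should be going" advice can be sketched as a small checker script. This is an illustration, not an official tool: the example URLs and the `check_redirects` helper are invented, and the fetcher is injected as a parameter so you could plug in a urllib- or requests-based implementation (or a stub for testing).

```python
# Hypothetical helper illustrating the "track every URL" advice:
# given a mapping of old URLs to their intended destinations, verify
# that each old URL actually 301-redirects to the planned target.
# `fetch(url)` must return a (status_code, location_header) tuple;
# injecting it keeps the checker testable without network access.

def check_redirects(mapping, fetch):
    """Return (old_url, problem) tuples for URLs not redirecting as planned."""
    problems = []
    for old_url, expected in mapping.items():
        status, location = fetch(old_url)
        if status != 301:
            problems.append((old_url, "expected 301, got %d" % status))
        elif location != expected:
            problems.append((old_url, "redirects to %s, not %s" % (location, expected)))
    return problems
```

In real use, `fetch` could wrap `urllib.request` with redirect following disabled; during a migration you would run this over the complete URL list and fix anything it reports.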
I
Okay, thank you. And the last question: the developer seems to not want to do a full 301, just a 302. How big of a difference would it make to only do a 302, not a full permanent redirect?
A
My guess is it will take a little bit longer for the canonicalization to kick in and to focus on that new URL that you have there. But ultimately, I think in the long run it won't make a big difference. If you're moving permanently, though, then it feels like a 301 would be the right thing to do in any case.

I
I was hoping that you would give him more of a yes.
A
Yeah, I mean, it is what we recommend. A lot of sites get it wrong and we have to deal with that as well, but if you really want to make sure that everything is perfect, then a 301 is the right way to go.
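To make the 301/302 distinction concrete, here is a minimal sketch of a redirect handler using Python's standard library. The paths and target URLs are invented examples; this only illustrates the two status codes being discussed, not anything specific to the caller's setup.

```python
# Minimal illustration of permanent (301) vs temporary (302) redirects.
# Paths and target URLs below are invented examples.
from http.server import BaseHTTPRequestHandler, HTTPServer

# Permanent moves: search engines should transfer signals to the target.
MOVED_PERMANENTLY = {"/old-page": "https://example.com/new-page"}
# Temporary moves: the original URL tends to stay canonical longer.
MOVED_TEMPORARILY = {"/campaign": "https://example.com/landing"}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in MOVED_PERMANENTLY:
            self.send_response(301)
            self.send_header("Location", MOVED_PERMANENTLY[self.path])
        elif self.path in MOVED_TEMPORARILY:
            self.send_response(302)
            self.send_header("Location", MOVED_TEMPORARILY[self.path])
        else:
            self.send_response(404)
        self.end_headers()

    def log_message(self, *args):  # keep the example quiet
        pass
```

A crawler (or a quick `curl -I`) hitting `/old-page` sees the 301 plus `Location` header, which is the stronger signal that the old URL should be dropped in favor of the new one.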
J
Hey John, can I ask a question?

A
Sure, yeah.

J
Okay, so we have a software application website; basically it's a product website. Recently we decided that we should create a news portal kind of section, so that we can cover industry news, and we would be writing four to five articles every day.
J
The situation is something like this: because our website is a product website, the team is confused about whether we should create a separate section for news on this existing website, or whether we should just create a new domain for the news section. The confusion is: if we go with the existing website, what are the chances that Google might consider our product website to be a news portal only, after a certain period of time? Or vice versa: it's a product website, so Google might consider that it's not a news website, so why rank it in the news column?
A
I would recommend keeping it within your existing website. By keeping it within your existing website, you're automatically making sure that it has all of the weight that it can get from the rest of your website.
A
So that's something where, if you split it out into a separate domain, then essentially you're starting with a new website, and you have to work to get that well known out on the web. Whereas if you start with your existing site, then you already have something to work with. With regards to Google seeing it as a news site or a product site in general, that's not something I would worry about, especially if you can structure the URLs in a way that is clear: this is news, and this is the rest of your site, for example. So if you have a clear URL structure there, I definitely don't see any problem.
J
Yeah. With the existing approach, we decided to create a separate category for the news section, and we would be using news schema so that Google has no doubt. The worry was just that Google should not be confused about whether it's a news website or a product website; that was the confusing part. Yeah, thank you.
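Since the caller mentions adding news markup, here is a hedged sketch of generating a structured-data snippet for such a section. `NewsArticle` is the schema.org type commonly used for news content, but every field value below is an invented example, and the `news_article_jsonld` helper is hypothetical.

```python
import json

def news_article_jsonld(headline, date_published, author_name, url):
    """Build a schema.org NewsArticle object for a JSON-LD script tag.
    All argument values used below are invented examples."""
    return {
        "@context": "https://schema.org",
        "@type": "NewsArticle",
        "headline": headline,
        "datePublished": date_published,  # ISO 8601 date
        "author": {"@type": "Person", "name": author_name},
        "mainEntityOfPage": url,
    }

# Serialized form, ready to embed in <script type="application/ld+json">:
snippet = json.dumps(
    news_article_jsonld(
        "Industry update",                            # hypothetical headline
        "2020-10-16",
        "Jane Doe",                                   # hypothetical author
        "https://example.com/news/industry-update",   # hypothetical URL
    ),
    indent=2,
)
```

Emitting this block on each article page in the news category makes the "this part of the site is news" signal explicit, alongside the clear URL structure discussed above.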
K
We have had news from the first four years, and every week we do original news coverage, which gets linked back by other publishers. But we never appear in the top stories, while they do. What should we do?
A
It's hard to say, because for the top stories it's not so much that there's a specific meta tag or something technical that you need to do there; rather, it's an organic search feature, and depending on how our search algorithms look at that for your site, it can go one way or the other.
A
So that's really tricky to say. One thing you could do is maybe start a thread in the webmaster help forum, so that someone can take a look at your details and see if there's something specific that they can point out. But in many cases there is nothing specific or technical to point out where we could say: you need to do this specifically to be visible in the top stories section.
K
Oh, thank you.

A
Cool. Okay, maybe we can take a break here. I need to jump off and do some other things this time. Thank you all for joining; it's been great having you all here, and thanks for all of your questions. Like I mentioned, if there's something that's still kind of stuck, or that you need some help on, feel free to ping me on Twitter and let me know about that. Otherwise, yeah, thanks for all your questions, and have a great weekend, everyone.