From YouTube: English Google SEO office-hours from July 9, 2021
Description
This is a recording of the Google SEO office-hours hangout from July 9, 2021. These sessions are open to anything search & website-owner related like crawling, indexing, mobile sites, internationalization, duplicate content, Sitemaps, Search Console, pagination, multi-lingual/multi-regional sites, etc.
Find out more at https://goo.gle/seo-oh-en
Feel free to join us - we welcome folks of all levels!
Take the Search Central YouTube survey!
Click here to access the survey: https://goo.gle/Search-Central-YT-Survey
We appreciate your feedback!
A
We have a bunch of stuff already submitted on YouTube which we can go through, but we also have a bunch of people raising their hands already. So maybe we'll go through some live questions first and then go and see what all has been submitted on YouTube. Let's see, first on the list is Ashvani, I think.
B
Yeah, hi John. I have a question related to website revamping and redesigning. I have done this before: I did lots of transformations of websites, like from WordPress to PHP, to another platform, and from another platform back to WordPress, many times. But in the last six months I have revamped three websites in total, and with every website the same thing happens.

B
Before the revamp, their traffic was somewhere between 10,000 and 15,000 a month. After the revamp, say I did the revamp in January, then from February onward their traffic is down to almost zero, maybe 50 or 60 visits, from 10,000 or 15,000.

B
I have checked everything: their URLs, their meta tags, meta title, meta description, whatever is required for the on-page factors, and I checked the 301 redirects and all, but I am still facing these kinds of issues with the last three websites over the last six months. So I don't know what mistake I am making with every website. Before these six months, in 2020, these issues were generally not happening, but in 2021 I am facing these kinds of issues, and because of this the websites basically disappear inside Google.
A
Okay. In general, these are the kinds of things I would look at on a per-website basis. There is nothing on our side that says: if a website is revamped, then we must change its ranking. So that's kind of the first thing.

A
If you're seeing this with three websites, that sounds like maybe you're doing something unique with the revamp process, and not that there's something on Google's side that would be blocking revamps in general. With revamps there are sometimes a few things that come together, and it's sometimes tricky to figure out exactly what all is happening.
A
But
if,
on
the
other
hand,
those
factors
don't
align
like
if
the
urls
change,
if
the
layout
changes,
if
the
content
changes,
if
you
don't
have
redirects
from
the
old
urls
to
the
new
ones,
then
those
are
essentially
aspects
that
say
to
us
that
we
have
to
treat
this
as
a
new
website,
because
essentially,
we
crawl
from
the
start
and
there's
completely
different
content
or
it's
a
completely
different
in
setup
or
it's
a
completely
different
layout
or
the
urls
are
completely
different.
A
So essentially, from our side, we would say: oh, it's a new website; we will start over and try to understand it again. That's something I would watch out for there. The other thing is that we also make other kinds of ranking changes across the web, and sometimes when you do a revamp you get the timing in such a perfect way that it aligns exactly with when we make a core update or a bigger ranking change.

A
And then it's really hard to recognize: is this issue because of the technical change that I made, or is it because Google just generally would have understood my website differently anyway? Trying to figure out whether it's something you did with the revamp or something Google changed, I think that's a good first step. And to do some of that, it's really useful to double-check all of the technical details: get a map of all of the old URLs, check them in archive.org to see what they looked like before, confirm what they look like now, and use the different testing tools to make sure it's all crawlable and indexable.
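The mapping-and-checking step described here can be partly automated. Below is a minimal Python sketch, not an official tool; the `fetch` callable is injected so the check can run offline against stubbed responses, and in practice you would back it with a real HTTP client such as `urllib.request`:

```python
def check_redirects(url_map, fetch):
    """Verify that each old URL permanently redirects to its planned new URL.

    url_map: dict of {old_url: expected_new_url}
    fetch:   callable returning (status_code, location_header) for a URL;
             inject a real HTTP client in practice, a stub in tests.
    """
    problems = []
    for old, expected in url_map.items():
        status, location = fetch(old)
        if status not in (301, 308):
            problems.append((old, f"expected permanent redirect, got {status}"))
        elif location != expected:
            problems.append((old, f"redirects to {location}, not {expected}"))
    return problems

# Example with a stubbed fetch (URLs are illustrative):
responses = {
    "https://example.com/old-a": (301, "https://example.com/new-a"),
    "https://example.com/old-b": (200, None),  # forgot the redirect
}
issues = check_redirects(
    {"https://example.com/old-a": "https://example.com/new-a",
     "https://example.com/old-b": "https://example.com/new-b"},
    lambda url: responses[url],
)
```

The returned list gives one entry per old URL whose redirect is missing or points somewhere unexpected.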
B
Yeah John, everything is okay; whatever you said is all fine from my side, except one thing: I have changed the structure of the website. Initially it was plain text; now it is folded, you can see multiple sections. There are multiple sections on the website, so I guess this might be one factor where I am getting this.
A
That's right, yeah. It's something where changing the structure of a website will affect how Search looks at it, and it can be a positive effect too. So with the previous revamps that you did, if you went from a one-page website to a multi-page website, it might be that that was a good change for those websites, but it might be that the same change for the current website that you're working on does not make so much sense. So it definitely changes something.
B
Okay. Suppose there are multiple paragraphs on your page, and for instance I just swapped two of them: the third paragraph I shifted to the second position and the second paragraph I shifted to the third position. Does that really matter for ranking purposes?
A
Not
usually,
but
it
it
can
affect
a
little
bit,
because
we
do
try
to
understand
the
context
of
the
the
text
that
you
have
on
the
pages
and
if
you
move
one
paragraph
from
something
that
is
very
prominent
to
kind
of
an
area
where
it's
like.
Oh
this
is
a
side
note,
then
that
could
affect
how
we
see
that
information.
B
Okay, okay. And does an image change matter from an SEO point of view? Suppose, while revamping, I changed my image: earlier it was 30 KB and now a 15 KB image is uploaded, and the image is also different than before. So does that really hurt?
D
Actually, my question is related to link exchanges. To what extent is it permissible to exchange links without it being considered spam? Let's say I have been doing outreach to websites that my competitors have taken links from. Basically, as a human tendency, people want something in consideration, so whenever I reach out to them, they also ask for a link in exchange. So what are the best practices when it comes to backlinks, to exchanging backlinks?
A
A link exchange where both sides say, you link to me and therefore I will link back to you, that is essentially against our webmaster guidelines. So that's something where our algorithms would look at it, try to understand what is happening, and try to ignore those links. And if the web spam team were to look at it, they would also say: well, this is not okay. And if the majority of the links to your website are like this, then they might apply a manual action.
D
So does this come under natural backlinks, like we have to create quality content and naturally people will give us links, and only that is natural, or something, because...
A
That's
essentially
the
the
idea
behind
kind
of
natural
linking
I
it's
from
from
our
point
of
view,
it's
fine
to
contact
people
and
tell
them
it's
like
by
the
way.
I
have
this
great
content,
and
maybe
it's
something
that
you
would
appreciate
for
your
website
as
well.
That's
that's
generally
fine,
but
anything
beyond
that
where
you're
saying
like
well,
you
must
link
to
me
like
this
or
you
must
pay
me
money
or
I
will
pay
you
money
for
this
link
or
I
will
exchange
something
for
this
link.
A
That's
all
something
that,
from
from
our
point
of
view,
would
make
this
an
unnatural
link,
so
so
providing
it
to
people
and
then
and
promoting
it
and
saying,
like
here's
great
content
and
they
link
to
you,
that's
perfectly
fine,
providing
it
and
saying
like
I
will
do
an
exchange.
If
you
give
me
a
link,
that's
something
that
we
would
consider
unnatural.
E
Yeah, hi John. I want to ask you two questions. The first one: suppose there is a URL on my website which is performing well in Google Search, getting traffic and all those things, but for some unfortunate reason it returns a 404. So what I did is I created a new URL, set up a 301 redirect from the old one to the new one, and had the new one re-indexed. I want to ask: will my new URL be able to perform the same way as the old one?
A
I
mean
it's
it's
something
where,
with
with
the
redirect
you're
forwarding
a
lot
of
the
signals
from
the
old
url
to
the
new
one,
so
yeah
that's
kind
of
a
a
good
starting
point,
but
there's
a
lot
more
than
just
kind
of
what
you're
forwarding
with
the
redirects
that
that's
in
play
there.
It's
like
the
content
itself
has
to
be
relevant.
A
Things
on
the
page
have
to
be
appropriate
for
search.
All
of
that
has
to
align
too,
but
the
the
redirect
is
essentially
a
good
sign
for
us
to
say
well,
this
url
is
replacing
this
old
one,
and
ideally,
we
can
just
forward
everything
to
to
the
new
url.
A
It's
it's
essentially
the
same
like
you're
you're
changing
the
url,
but
we
we
can
forward
some
of
the
signals
there,
but
the
the
content
on
the
url
is
also
kind
of
what
is
important.
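The signal forwarding described here works best when each old URL reaches its replacement in a single hop. As a hypothetical illustration, this Python sketch follows a map of observed redirects and flags chains that loop or run too long:

```python
def resolve_redirect(start, redirects, max_hops=5):
    """Follow a map of {url: redirect_target} and report the final URL,
    the number of hops, and whether a loop / excessive chain was hit.

    redirects: dict built from observed 301s; a URL absent from the dict
    is treated as the final (200) destination.
    """
    seen = {start}
    url, hops = start, 0
    while url in redirects:
        url = redirects[url]
        hops += 1
        if url in seen or hops > max_hops:
            return url, hops, True  # loop or chain that is too long
        seen.add(url)
    return url, hops, False

# A two-hop chain: ideally /old would point straight at /new.
redirects = {"/old": "/interim", "/interim": "/new"}
final, hops, looped = resolve_redirect("/old", redirects)
```

Chains of more than one hop still work, but flattening them to direct redirects keeps the forwarding as clean as possible.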
E
Okay,
okay,
my
second
question
is
so
suppose:
why
is
it
happening
that
the
featured
snippet
are
different
for
two
locations?
So
suppose
I
am
searching
a
query
in
india,
the
featured
snippet
is
different,
but
for
the
us
location
the
feature
is
different.
Why
is
that?
Why
is
the
reason
behind
it.
A
It's
it's
essentially
a
snippet
from
a
search
result
and
sometimes
search
results
are
different
across
different
locations.
So
from
from
our
point
of
view,
that's
that's
completely
normal.
It's
you
know.
Sometimes
you
see
it
across
different
countries.
Sometimes
you
see
it
even
within
the
same
country
slightly
different.
E
So one case that could be possible is that the website ranking in the featured snippet in India is not ranking in the US. But what I'm saying is that my website, which is ranking in the featured snippet in India, is also ranking in the US. Then don't you think the featured snippet would also be the same?

A
Not necessarily.

E
Are there any best practices to acquire the featured snippet for the US location as well?

A
No, we don't have any guidelines for that.
A
Sure, Robin.
F
Yes, hi John, can you hear me? Great. So I have a question with regards to FAQ schema. How would you detect that a question is the same across different sites? How would you differentiate between sites? Because we have all our country websites on the same top-level domain, separated by different subfolders, and all of the websites have hreflang link tags, which tell us, for example, that this is the Canadian English site.
F
And
this
is
the
u.s
english
site,
but
from
when
we,
whenever
we
publish
english
content
from
our
sort
of
master
site
and
that
will
automatically
be
pushed
to
all
the
english
in
all
our
english
websites.
So
we
are
looking
into
creating
some
sort
of
faq,
page
or
faq
component
that
our
different
webmasters
can
use.
F
But
we
cannot
like
secure
that
they
won't
put
the
same
question
in
in
on
the
different
pages.
So
would
you
be
able
to
detect
this.
A
I'm
pretty
sure
we
would
be
able
to
detect
it
because
it's
like
a
same
piece
of
text
across
multiple
sites,
but
I
it
sounds
like
it's
something
that
that
makes
sense
for
for
your
situation,
especially
if
these
are
country
specific
sites,
then
we
would
use
something
like
hreflang
or
other
methods
to
try
to
figure
out
which
one
to
show
appropriately
and
then
some
sometimes
for
different
countries.
You
have
the
same
questions.
I
I
think.
That's
that's
perfectly
fine.
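For reference, the FAQ markup discussed here is the schema.org `FAQPage` type embedded as JSON-LD. A minimal Python sketch that builds such a payload; the question and answer texts are invented examples:

```python
import json

def faq_jsonld(qa_pairs):
    """Build FAQPage structured data (JSON-LD) from (question, answer) pairs."""
    return {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": q,
                "acceptedAnswer": {"@type": "Answer", "text": a},
            }
            for q, a in qa_pairs
        ],
    }

data = faq_jsonld([("What is the shipping time?", "Usually 3-5 business days.")])
# Embedded in a page inside a script tag:
script_tag = '<script type="application/ld+json">%s</script>' % json.dumps(data)
```

The questions and answers in the markup should match the text visible on the page itself.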
F
Yeah
yeah
because
yeah
we
we
are
an
industrial
company,
so
they
I
mean
the
the
we.
We
are
good,
pushing
the
same
products
all
over
the
world
so
but
we
would
probably
be
able
to
to
like
use
the
faq
and
possibly
get
the
image
to
faq
results
in
the
on
the
search
results
page.
A
I
think
that's
that's
perfectly
fine,
because
if
these
are
kind
of
international
sites,
it's
not
that
we
would
show
them
on
the
same
search
results
page.
So
it's
not
that
it
would
look
weird
it's
essentially
either
we
show
the
canadian
version
or
the
french
version
or
whatever
yeah
and
like
if
they
have
the
same
questions.
That's
generally
fine
for
for
internationalization.
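The subfolder-per-country setup described in this question is typically annotated with hreflang link elements so the right country version can be swapped in. A hypothetical sketch that generates them in Python; the locale codes and paths are examples, and on a real site each page in the group needs matching return links:

```python
def hreflang_links(base, locales, path):
    """Emit <link rel="alternate" hreflang=...> tags for one page that
    exists under several country subfolders, plus an x-default fallback."""
    tags = [
        f'<link rel="alternate" hreflang="{code}" href="{base}/{folder}{path}" />'
        for code, folder in locales.items()
    ]
    default_folder = next(iter(locales.values()))  # first locale as fallback
    tags.append(
        f'<link rel="alternate" hreflang="x-default" href="{base}/{default_folder}{path}" />'
    )
    return "\n".join(tags)

html = hreflang_links(
    "https://example.com",
    {"en-us": "us", "en-ca": "ca", "fr-ca": "ca-fr"},
    "/products/widget",
)
```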
C
Well, I was just going to comment: the FAQ documentation says you can only have it on one page, and we also have a lot of sites with global English and American English with the same questions, possibly slightly varied. So that's okay then? Yeah, yeah.
A
That
that's
generally
fine
like
what
problematic
is
for
us.
If
you
have
the
same
questions
and
answers
across
the
whole
site
or
if
you
have
like
a
hotel
site
and
every
hotel
listing
is
like
what
what
is.
I
don't
know,
cancellation
period
and
you
have
the
same
questions
and
answers
everywhere.
Then
that
starts
to
look
kind
of
annoying
yeah.
C
I
mean
ideally
we'd
like
to
put
it
on
individual
pages,
so
if
we
have
a
cocktail,
we'd
have
the
faqs
generally
and
then
on
the
direct
page.
But
at
the
moment
we
just
put
the
text
on
the
page,
but
to
my
actual
question,
we've
seen
some
really
weird
results
immediately
after
the
july
core
update
is
all
our
sites
are
big
beverage
sites.
You
know
it's
one
of
it's
a
big
drinks
company
so,
for
example,
on
the
rum
on
the
rum
brand.
C
You know, we can see the product pages, but we can't see the cocktails. And if you look in SEMrush, for example, at mojito, which is a very popular rum cocktail, we could see that before we were ranking on page one quite often, and then for all of the terms related to mojito you can see it just dropped to zero.
C
Now
we
thought
this.
Obviously
we
know
there's
some
problems
with
our
cocktail:
markup
recipe
markup,
but
he
was
thinking
if
the
algorithm
had
changed.
We'd
expect
to
see
them
drop,
lower
down
the
page
or
to
page
two,
but
not
a
complete
wipeout
and
we've
actually
done
research,
because
we've
got
a
vermouth
brand
and
that's
happened,
but
ironically
in
the
german
locale,
we
don't
see
the
same
effect
and
also
we've
looked
at
some
of
the
competitors
like
jack
daniels
and
there's
some
there's
some
sites
in
that
area.
Driesly
epicurious.
C
Some
of
these
have
dropped
completely
and
differed,
which
is
another
big
big
player
in
that
area,
which
is
a
review
sort
of
cocktail,
place
they've
sort
of
dropped
off
to
page
two,
but
it
seems
to
be
okay
on
mobile,
so
we're
still
ranking
quite
well
on
mobile.
Obviously,
that's
probably
about
60
to
80
percent
of
our
traffic.
But
it's
just
really
weird.
That's
completely
gone.
C
We
could
understand
if
the
algorithm
has
changed
and
our
sites
aren't
as
good
anymore,
but
we
wouldn't
expect
them
to
see
completely
go,
and
I
know
I
think
there
was
someone
else
who
had
a
similar
problem
with
his.
It
was
a
blind
sight,
so
it's
really,
there
seems
to
be
something
really
weird
happening
with
indexing.
C
Are
still
indexed
in
search
console
but
they're,
just
not
turning
up
at
all,
not
like
they've
dropped
from
like
position
4
to
position,
13
or
15..
So
really
that's
the
question
we
were
trying
to
like
raise
and
it
does
appear
to
be
across
the
industry
and
recipe
related.
A
Yeah,
so
I
I
looked
into
it
a
bit
with
the
other
site
that
posted
the
details
in
the
forums
and
it
it
turns
out.
We
we
have
a
system
that
tries
to
recognize
soft
404s
across
desktop
and
mobile
separately,
and
essentially,
what
happens
is
that
sometimes
we
see
pages
that
essentially
on
desktop
look
like
a
404
page,
so
we
say:
oh,
this
is
soft
404
on
desktop.
A
We
don't
need
to
index
it
and
on
mobile
it
looks
like
a
normal
page,
so
we'll
actually
index
it
there
and
that's
where
you
would
see
this
difference
between
kind
of
the
indexing
side
where,
if
on
desktop
or
mobile,
you
search,
you
do
a
site
query
for
the
url
and
it
shows
up
normally
and
on
the
other
device
type.
You
do
a
site
query
and
it
doesn't
show
at
all.
A
Then
that's
essentially
a
sign
that
for
that
device
type,
our
systems
have
picked
up
somehow
that
this
is
a
soft
404
page
and
the
tricky
thing
here
is:
I
mean.
On
the
one
hand,
this
is
fairly
new,
so
I
I
was
a
bit
surprised
that
we
do
this.
On
the
other
hand,
in
search
console,
we
do
show
soft
404s,
but
we
show
it
for
the
mobile
version.
So
if,
on
the
mobile
version,
everything
is
okay
from
your
side,
then
in
search
console
it'll,
look
like
oh
it's!
A
It's
indexed
normally
he's
like
what
is
your
problem,
whereas
on
desktop,
if
we
see
it
as
a
soft
404
there,
you
would
not
be
able
to
see
that
directly
in
search
console.
A
So
that's
that's
kind
of
something
new
that
that
has
started
happening.
I
don't
know
maybe
like
a
month
or
so
I
I
don't
have
the
exact
time
time
frame
there,
but
it
is
something
where
at
least
the
the
other
side.
With
with
the
blinds.
I
think
a
lot
of
those
examples
were
really
useful
for
the
team
to
figure
out
like
how
do
we
need
to
improve
this
classifier.
A
Maybe email me a bunch of URLs, also from the other sites where you saw this; that would be super useful to pass on to the team. Yeah.
A
404
is
kind
of
in
that
direction.
It's
like
they
return
200
and
our
systems
look
at
the
page
and
say
oh
for
whatever
reason
we
think
this
is
an
error,
page
or
kind
of.
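To illustrate the idea of a soft-404 classifier, here is a deliberately crude Python sketch. Google's actual system is far more sophisticated and its signals are not public, so the phrase list and the thin-content threshold below are invented:

```python
# Invented marker phrases; a real classifier would use many more signals.
ERROR_PHRASES = ("page not found", "404", "no longer available", "nothing here")

def looks_like_soft_404(status_code, body_text, min_words=40):
    """Toy heuristic: a page that returns HTTP 200 but reads like an
    error page (thin text containing typical not-found phrasing)."""
    if status_code != 200:
        return False  # a real 404/410 is not a *soft* 404
    text = body_text.lower()
    thin = len(text.split()) < min_words
    return thin and any(p in text for p in ERROR_PHRASES)

# The desktop rendering looks like an error page, the mobile one does not,
# which mirrors the per-device-type behavior described above.
desktop_html = "Sorry, page not found."
mobile_html = " ".join(["word"] * 100) + " full product description here"
```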
A
Sure, cool. All right, Andre.
H
Yes,
hello
right
now,
I'm
in
job
applications-
and
I
looked
at
the
implementation
of
json
ld
on
web
pages
to
because
google
now
supports
the
rich
snippets
for
job
offers.
I
think
I
asked
about
that
four
years
ago.
H
I
don't
know
for
how
long
it
is
in
there,
and
I
have
some
questions
for
special
points
there.
It's
about.
Let
me
have
a
look
experience,
requirements
and
education
requirements.
A
I
I
don't
know
if
they
would
be
shown
as
rich
snippets
or
as
kind
of
those
job
search
pages.
I
think
for
google
jobs
we
we
have
those
kind
of
interstitial
pages.
I
I
don't
know
off
hand
exactly
how
it
all
looks,
but
I
I
don't
think
we
would
show
it
as
a
rich
snippet
in
the
search
result,
but
rather
kind
of
like
tied
to
it
from
the
there
is
a
special
I
think,
a
special
layout
in
the
search
results
specifically
for
the
jobs,
kind
of
content.
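For context, the JSON-LD being asked about is the schema.org `JobPosting` type, which does include `experienceRequirements` and `educationRequirements` properties. A minimal Python sketch of such a payload; all field values are placeholders:

```python
import json

job = {
    "@context": "https://schema.org",
    "@type": "JobPosting",
    "title": "Backend Developer",
    "datePosted": "2021-07-09",
    "hiringOrganization": {"@type": "Organization", "name": "Example GmbH"},
    "jobLocation": {
        "@type": "Place",
        "address": {"@type": "PostalAddress", "addressCountry": "DE"},
    },
    # the two properties asked about in the question:
    "experienceRequirements": {
        "@type": "OccupationalExperienceRequirements",
        "monthsOfExperience": 24,
    },
    "educationRequirements": {
        "@type": "EducationalOccupationalCredential",
        "credentialCategory": "bachelor degree",
    },
}
jsonld = json.dumps(job, indent=2)  # embed in a script[type=application/ld+json]
```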
H
I
mean
I
could
show
it
to
you
and
I
would
still
have
a
question
about
the
link
to
the
website
how
google
checks,
if
there's
the
possibility
of
online
application
on
the
website.
Maybe
you
can
tell
us
something
about
that.
A
I think that is something we do try to figure out if there is a problem. And my assumption, I'm not involved with the jobs stuff directly, but my assumption is that we have a mix of manual and automatic checks for these kinds of things, because completely automatic, I don't think, would really be possible. So it has to be kind of this mix.
H
Okay,
do
you
have
any
tips
for
us
how
to
optimize
here
for
good
results,
or
is
it
just
something
that
is
based
on
location
of
the
job
offer.
A
I
think
mostly
the
location,
but
I
actually
don't
know
any
of
the
details
there
yeah
sorry,
maybe
what
I
would
do
in
in
your
situation
is
find
someone
who's
been
using
it
for
longer,
maybe
in
the
us
or
one
of
the
countries
where
it
was
rolled
out
earlier
and
try
to
pick
their
mind
and
like
get
their
insight
on
what
worked.
Well,
what
didn't
work?
Well,
because
it's
also
something
where,
from
from
my
side
like,
if,
if
there
were
a
secret
ranking
trick,
then
I
would.
A
Okay, cool. We still have a bunch of questions lined up. I think you're ready to go, Michael, but let me get some of the YouTube questions in first, just so that those folks also get an answer; I'll run through them fairly quickly. Let's see: our site is still being indexed in Google with the desktop crawler, and the site is facing indexing and ranking issues because of that. When will the next batch be moved to mobile-first indexing?
A
I
I
don't
have
a
timing
on
the
next
patch,
but
you
would
not
be
seeing
ranking
issues
because
of
a
site
that
is
still
on
googlebot
desktop.
So
it's
something
where
we
we
expect
sites
to
be
indexable
by
both
desktop
and
mobile
kind
of
users.
So
that's
something
where
indexing
side,
like
your
content,
should
be
available.
A
If
you're
seeing
ranking
issues
that
wouldn't
be
from
kind
of
the
mobile
first
indexing
or
not,
then
maurice's
question
we
talked
about
that
briefly
after
redirecting
and
changing
a
cdn
we're
seeing
a
drastic
rate
and
a
drastic
drop
in
crawl
rate.
Yes,
that's
very
reasonable.
If
you
change
your
website's
infrastructure,
then
we
will
change
our
crawling,
on
the
one
hand,
first
to
be
a
little
bit
conservative
and
make
sure
we
don't
cause
problems
and
then
later
on,
we
automatically
ramp
up
again.
A
So
if
you
change
to
a
different
cdn,
that's
a
significant
change
in
the
infrastructure,
and
we
recognize
that
change
and
we
hold
off
crawling
for
a
while,
and
then
we
ramp
up
again.
If
we
think
everything
is
fast,
if
a
website
is
migrated
to
react
from
html
php,
does
it
negatively
affect
the
whole
website?
A
What
are
the
major
things
to
be
taken
care
of
so
react
is
a
javascript
framework
and
if
you're
implementing
the
site
with
client-side
javascript,
then
that
is
a
significant
change
on
your
website.
On
the
one
hand,
you're
completely
changing
kind
of
the
front
end
of
the
website,
so
javascript
or
not.
That
is
a
really
significant
change,
so
you
would
expect
to
see
some
changes
in
seo
in
in
that
regard.
Some
of
these
changes
can
be
positive.
A
It's
not
that
you
automatically
drop
in
ranking
just
because
you
change
your
front
end
and
the
other
part
is
you're.
Moving
to
a
javascript
front
end
or
assuming
this
is
the
case,
and
then
for
that
you
need
to
take
into
account
that
javascript
sometimes
is
a
bit
tricky
with
regards
to
indexing,
we're
pretty
good
at
it
now,
but
it's
still
something
you
need
to
watch
out
for,
and
I
would
double
check
all
of
our
tools
and
guides
on
javascript
seo
to
make
sure
that
you
have
all
of
that
lined
up.
A
With regards to automatically ranking for all of your localized content: that does not happen. On the one hand, you have to provide clean internal linking within your website; we have to be able to crawl and index all of those pages, and then we essentially use hreflang to swap out the right one. So just automatically moving everything into a lot of different languages does not multiply the visibility of your website.
A
Oh, I think I provided a short answer on this on Twitter just before. So I double-checked the website, and on the one hand April is a few months ago, but it's still relatively recent, and we have to recrawl the whole website to understand that these noindex tags were dropped. And the main difficulty we saw there, or I saw when I looked at the website, is that the website is very slow, and that essentially means we're crawling much less than we would like to, and that means it will take much longer to see that the noindex has been dropped.
A
Removing
a
tool
from
indexing
using
the
removal
tool
will
also
remove
the
page
from
the
core
web
vitals
report.
No,
probably
not
so
the
url
removal
tool
just
hides
the
url
in
the
search
results.
It
doesn't
change
indexing
and
you
would
continue
to
see
that
url
in
other
reports
in
search
console.
So
that
includes
things
like
structured
data
reports,
amp
reports,
the
the
core
web
vitals
reports,
all
of
those
places
where
index
content
is
used.
You
would
still
see
that
url.
A
How
long
does
it
take
for
google
to
drop
pages
from
the
index
if
bots
encounter
a
5x6
status
code
due
to
problems
with
the
server
being
irresponsive
for
about
three
days,
google
d
index
nearly
one
quarter
of
my
content,
without
even
showing
any
errors
in
search
console?
How
long
will
it
take
for
my
content
to
be
back?
A
So
usually
we
recommend
a
503
status
code
for
somewhere
up
to
a
day.
Sometimes
it's
it's
okay,
if
it
takes
maybe
two
days
to
resolve,
but
if
we're
talking
about
an
air
time
frame,
which
is
more
than
two
days,
which
three
days
is
kind
of
close,
but
it's
still
a
little
bit
longer
and
then
we
would
start
dropping
things
from
from
the
index
there
and
usually
what
happens
is
we
would
probably
drop
the
things
that
are
most
visible
in
the
search
results?
A
On the other hand, that's also an advantage for you, because the URLs that we think we would show the most are also the ones we would recrawl the most. So we would try to pick those up fairly quickly, and my guess is after a week or so, for the most part, that will settle back down again and everything will be normal again. There is no long-lasting grudge against a website if it shows errors like this.
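The 503 advice above can be summed up in a small helper: during short outages, serve 503 with a standard Retry-After header so crawlers treat the downtime as temporary. The function below is an illustrative sketch, not any particular framework's API:

```python
def maintenance_response(retry_after_seconds):
    """Response to serve while the server is down for maintenance.

    A 503 tells crawlers 'temporary, come back later' instead of letting
    pages look broken; past roughly a day or two of sustained errors,
    de-indexing can start, so keep outages short.
    """
    headers = {
        "Retry-After": str(retry_after_seconds),  # standard HTTP header
        "Content-Type": "text/html; charset=utf-8",
    }
    body = "<h1>Down for maintenance</h1><p>Please try again later.</p>"
    return 503, headers, body

status, headers, _ = maintenance_response(3600)  # ask clients to retry in an hour
```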
A
Implementing
server
side,
rendering
for
improving
page
speed
and
keywords
ranking,
is
that
a
good
strategy
so
implementing
anything
that
improves
page
speed
is
usually
a
good
strategy,
especially
now
that
we
use
the
core
web
vitals
as
the
ranking
factor.
So
that
sounds
like
a
good
thing,
if
you're
implementing
it
in
a
way
that
does
improve
the
speed.
A
Obviously,
implementing
server-side
rendering
is
not
a
magic
bullet
that
will
automatically
make
your
website
faster.
But
if
you
can
implement
it
in
a
way
to
make
your
website
faster,
then
go
for
it.
Server-Side
rendering
does
not
automatically
improve
keyword
ranking,
so
it
basically
takes
the
existing
content.
A
You
have
and
just
provides
it
in
a
different
way
to
to
users
to
search
engines
and
because
of
that,
it's
not
like
a
magic
seo
bullet
that
will
catapult
your
site
to
position
one,
but
rather
it's
it's
kind
of
like
a
technical
change
that
you're
making
that
could
make
things
faster
for
users,
there's
a
lot
of
work
with
server-side
rendering.
So
I
wouldn't
just
assume
you
can
do
like
a
check
box
somewhere
in
your
cms,
and
suddenly
everything
is
server
side,
rendering
we
have
three
systems
in
our
website.
A
A
Let's
see
it
sounds
like
and
kind
of
like,
like
changing
the
urls
a
little
bit.
Would
that
hurt
our
crawl
budget?
What's
the
best
practice
to
adopt
in
a
scenario
like
this,
considering
the
these
are
kind
of
like
the
ideal,
url
paths
for
user
experience,
so
having
different
frameworks
on
a
website,
different
infrastructures
on
a
on
a
shared
domain
is
perfectly
fine,
that's
not
something
that
would
cause
any
problems
with
regards
to
urls.
A
The
main
thing
I
would
watch
out
for
here
is
kind
of
like
what
I
mentioned
before
is
if
you're
changing
things
like
the
url
structure,
then
that's
something
that
will
be
reflected
in
search
and
that
takes
a
bit
of
time
to
be
reprocessed.
So
if
you
change
the
url
structure,
make
sure
that
you
have
all
of
the
redirects
set
up
properly.
A
My
recommendation
is
to
crawl
your
website
locally.
First,
before
you
make
those
changes
so
that
you
have
a
clear
list
of
all
of
the
urls
that
that
are
available
on
your
website,
and
you
can
double
check
that
list
afterwards
to
make
sure
that
they're
all
redirecting
to
appropriate
new
urls.
So
that's
kind
of
the
the
main
thing
I
would
watch
out
for
here.
I
don't
think
having
different
frameworks
on
a
website
would
change
any
of
that.
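The local crawl recommended here can be started with the standard library alone. This sketch only does the link-extraction piece for a single page, collecting same-host URLs; a full crawl would fetch each collected URL and repeat:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkCollector(HTMLParser):
    """Collect same-host hrefs from one page's HTML."""

    def __init__(self, base_url):
        super().__init__()
        self.base = base_url
        self.host = urlparse(base_url).netloc
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href")
        if href:
            absolute = urljoin(self.base, href)  # resolve relative links
            if urlparse(absolute).netloc == self.host:
                self.links.add(absolute)  # keep internal URLs only

# Example page: one internal link, one external link.
page_html = '<a href="/about">About</a> <a href="https://other.example/x">ext</a>'
collector = LinkCollector("https://example.com/")
collector.feed(page_html)
```

After the migration, the collected URL list is exactly what you would feed into a redirect check.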
A
It's
really
like
you're
doing
a
revamp
of
your
site
structure,
and
because
of
that,
you
have
to
do
all
of
the
revamp
things.
One
of
the
difficult
or
I
don't
know
difficult
or
trickier
parts,
probably
that
that
makes
this
a
little
bit
harder.
Is
that
react
and
vue.js
are
traditionally
client-side
rendered
frameworks,
which
means
they're,
they're
javascript
powered,
which
means
that
anything
that
involves
redirects
needs
to
probably
be
rendered
so
that
a
crawler
can
recognize
those
redirects.
A
Is it a good sign that Google automatically indexes new posts within hours of posting, versus taking days or even a week to be indexed before? That doesn't sound like a bad sign to me, so I think that's generally okay. Next: does Google consider only the elements present in the first fold section to measure CLS, LCP, and FID, or does it consider the whole web page to measure the Core Web Vitals metrics?
A
I
would
double
check
the
core
web
vitals
documentation
that
we
have
on
the
I
think
web.dev
website
there's
a
ton
of
information
there
on
how
we
calculate
and
measure
these
things-
and
I
I
don't
think
kind
of
like
me,
trying
to
explain
that
in
five
seconds
would
would
really
be
useful
in
that
regard.
So
I
I
would
double
check
the
information
that
we
have.
There
also
keep
in
mind
for
core
web
vitals
for
page
experience,
we
use
the
the
live
field,
data
from
users,
so
things
that
people
actually
see.
A
We
don't
use
kind
of
the
lab
tests
where
you
would
like
theoretically
test
the
page
then
core
web
vitals.
Is
that
concerning
seo
landing
pages
only
or
is
it
also
important
for
sem
pages,
so
kind
of
ads
landing
pages?
I
don't
know
what
what
the
quality
score
things
are
with
regards
to
ads
landing
pages.
A
So
I
I
don't
know,
I
would
not
be
surprised
if
at
some
point-
or
maybe,
if
already,
if
google
ads
is
using
this
as
a
part
of
the
quality
evaluation,
but
I
I
have
no
no
insight
there
at
all
we're
trying
to
improve
our
discovery
integration.
We
have
some
doubts
regarding
max
image
preview
large.
What
would
happen
if
we
put
the
meta
tag
on
a
page
but
for
users
on
mobile
devices,
we
serve
a
hero,
image
smaller
than
1200
pixels.
A
Would
that
mean
we
need
to
serve
big
images
for
every
user,
regardless
of
the
form
factor,
wouldn't
that
go
against
the
core
web
vitals
recommendations?
Yes,
that's
that's
tricky
and
problematic.
The
solution
that
you
probably
need
to
look
into
is
responsive
images
in
particular.
There's
a
picture
element
that
you
can
use
in
html,
where
you
can
specify
different
resolutions
of
images,
and
with
that
we
can
recognize
the
higher
resolution
image
and
the
normal
resolution
images
and
for
users
it'll
automatically
show
the
appropriate
version
and
for
search
for
discover.
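The responsive-images approach pointed to here uses the HTML picture element, so narrow screens get the small hero image while a large version (1200 px or wider, as Discover's large preview expects) stays discoverable. A Python sketch that assembles the markup; the breakpoint and file names are examples:

```python
def picture_element(small_src, large_src, breakpoint_px=600, alt=""):
    """Build a <picture> that serves a small hero image on narrow screens
    and a large one elsewhere; crawlers can discover both variants."""
    return (
        "<picture>\n"
        f'  <source media="(min-width: {breakpoint_px}px)" srcset="{large_src}">\n'
        f'  <img src="{small_src}" alt="{alt}">\n'  # fallback / narrow screens
        "</picture>"
    )

markup = picture_element("hero-480.jpg", "hero-1600.jpg", alt="Product hero")
```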
A
So if we can't recognize that these are clearly separate URLs, then we will treat them as the same thing with regards to Core Web Vitals, and probably from the field data point of view we will mix them together. It's a little bit different if you have completely separate URLs; then probably what would happen is we would recognize that these are different parts of a website, and we would potentially group them separately.
A
This
is
kind
of
the
internal
or
private
content.
This
is
kind
of
the
public
content
and
then
potentially,
our
systems
would
be
able
to
group
those
separately
and
treat
them
separately,
whereas
if
essentially
everything
looks
the
same
to
us
and
from
a
user
point
of
view,
it's
like
they
happen
to
be
logged
in.
Therefore,
they
see
like
20
times
as
much
content
and
it's
much
slower
page.
A
So I would double-check the Chrome and web.dev information on Core Web Vitals just to be sure there. But it is something where, if you have significantly different content on the same website, especially if you're using the same URLs, then I would expect our systems not to be able to differentiate that clearly between them. Next: how's the integration between Google Search and AdSense?
A
Regarding
page
speed
or
page
experience,
I
don't
know
it's
it's
hard
for
me
to
say
I
I
don't
know
what
what
all
is
happening
on
on
the
adsense
side.
I
do
know
from
from
the
chrome
side
there
there
are
different
teams
involved
with
regards
to
page
experience
overall,
and
they
they
do
work
with
some
of
the
the
bigger
sites
out
there
that
are
providing
content
that
are
working
on
embeds
to
try
to
help
them
to
improve
their
kind
of
core
web
vital
score.
Overall,
these
are
not
teams
directly
involved
in
search.
A
So
that's
something
where
my
my
guess
is:
if
adsense
is
something
that
is
significantly
slowing
things
down,
we
see
that
as
something
that
is
shared
across
the
web
overall,
that
someone
on
on
the
chrome
or
the
partnership
side
would
try
to
get
in
touch
and
help
them
to
improve
things,
but
also
I
mean
they
have
a
lot
of
information
available
already
in
our
public
documentation
can
improve
things
on
their
side
too.
A
I was using my home page as a landing page, which ranked for some keywords. Now I need to use my home page as a home page, which isn't the landing page. When I redirected it, the rankings were transferred to the new page. However, when I disabled the redirect, the rankings decreased again. How can I redirect my old landing page and use the home page for something else?
A
I guess you kind of saw the answer there yourself, in that you can't both redirect and not redirect at the same time. With the redirect we forward the signals; without a redirect we keep the pages separate. So this is something where, if you're significantly changing your site, then it takes a while for us to understand that, and sometimes the new structure of your site is better; sometimes it's not so good.
A
Oh okay, wow, running through all the questions. Maybe I'll jump back to some of the live questions here as well, and I have a bit more time afterwards too, if any of you want to stick around and ask more questions; we can definitely do that. Let's see, Michael, you're up first.
I
Hi John. So I've noticed a lot of sites now include a short summary or list of key points at the top of their content, and those bulleted points, generally under the headline, duplicate, often verbatim, the content that the reader will then later find in the article. Is this helpful to Google, or is there any added value to this practice?
A
I don't think it changes anything for us. It seems like it's probably more of a usability thing, or maybe it encourages people to read on and find more information, but I don't think it would change anything from Google's point of view.
I
Yeah, I mean, I was wondering if people did this just because people are generally lazy and they'll read the summary. But for me, as a practice, when I'm reading it, it's just like: hey, I read this. You know, I read the bullet point, and then two paragraphs later I say: I just read this. It's not a great user experience.
J
Sorry, hi Jim, hi. A friend of mine has asked me to help him with his website.
J
He vanished from Google. So the first thing I did was connect Search Console, and looking at a site colon website-name search, it doesn't appear anywhere. I couldn't see any manual actions, so I wanted to ask for a reconsideration, because I think Google has banished it, I don't know. How do I do this without a manual action? Because I can't, yeah, see a URL.
A
So, you can only do a reconsideration request if there is an active manual action, because that's something that would only be looked at in cases like that, I think. I mean, you're welcome to drop the URL in the chat here and I can take a quick look after the recording stops to see what is happening there. But in practice, if none of the website is appearing there, there are two possibilities, or two common possibilities.
A
On the one hand, there might be something technical happening on the website: maybe it's returning an error code or something like that, and it looks normal in a browser, but it doesn't look normal to the search engine crawler. That can happen from time to time. Another thing is there's the URL removal tool, where you can remove your whole website and hide it in the search results.
A
But it's something where we've noticed that sometimes people use the tool to remove an alternate version of their website, and they don't realize the tool removes all versions of the website. So if you have www and non-www, for example, and you want to remove the non-www version with that tool, then it removes all of those versions. And then in Search Console it looks like everything is normal, but in the search results everything is removed. So those are kind of the common variations there, but I'm happy to take a quick look to see.
J
Okay, so once I have Search Console connected, would it see in the past if things had been happening?
A
A little bit, yes. The performance report and, I believe, the index coverage reports take maybe half a week or up to a week to be updated when you verify the site for the first time, and then they will show the older data as well.
A
All right, Jason.
G
Cool, hi John. So I've got a real fun one for you, and I also asked this one on Ask Googlebot on Twitter. About a month ago you answered a question about the maximum indexable page limits and talked a little bit about static versus dynamic pages. So I'm curious on your take on something: as part of our consumer journey, we use URL parameters to understand what content a consumer was interested in and clicked on, and then also subsequent to that.
G
The consumer experience is very similar, right? But it is, again, like I mentioned, different, with links that go to different content silos based on the URL param. So what's your guidance here? Should we rel canonical to the naked URL with no parameters, or is it okay to let Googlebot decide which URL is canonical? And also, if I could reserve a follow-up.
A
I would generally recommend trying to rel canonical to the clean URL if you can, just because it makes it a lot easier for us to understand that this is the URL that you actually want to have indexed. It doesn't have to be the clean URL; it can be one of the parameter versions as well. But essentially it provides some guidance for the situation where we look at the URL and recognize: oh, it's like 99% the same as the other URL that we saw; which one of these should we index?
A
It's not guaranteed that we would, but it kind of helps us. And the other thing is also that if you have all of these different parameter versions, and all of them were indexable, and we would look at them and say, oh, they're very similar but they're different, then we would end up indexing a lot of separate URLs for your website. And that means we would kind of dilute the value of your website across all of these different URLs, and it also means, well, we have more different URLs to index.
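John's recommendation, pointing every parameter variant at one preferred URL, amounts to a simple normalization rule a site can apply consistently. As a rough illustration only (this is not a Google tool, and the tracking-parameter names below are assumptions), the canonical URL could be derived like this:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical set of parameters that track the visit but don't change
# the page content; real sites would use their own list.
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "ref"}

def canonical_url(url: str) -> str:
    """Strip tracking parameters so every variant of a page maps to one
    clean URL, suitable for a <link rel="canonical"> tag."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING_PARAMS]
    # Rebuild the URL with only the content-relevant parameters, no fragment.
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))
```

The resulting clean URL would then go into a `<link rel="canonical" href="...">` element in each variant's head, so all the parameter versions declare the same preferred URL.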
G
Okay, so I guess this isn't the follow-up, but you know, would there be any kind of, I don't know, penalty associations with that at all, based on what I've outlined? Or is it just kind of the algorithmic aspect of it that you're thinking through on this?
A
If it's essentially the same content, and you're swapping out city names, something like that, then I could imagine that our systems would look at this and say: oh, it's a collection of doorway pages. And algorithmically we might say: oh, we'll just pick one of these pages and show that, and ignore the parameter in the future.
A
If,
if
it
were
really
excessive,
then
it's
possible
that
the
web
spam
team
would
look
at
it
and
say:
oh
they're
they're
trying
to
kind
of
rank
with
all
of
these
doorway
pages.
Therefore,
we
have
to
take
a
manual
action,
but
it's
it's
hard
for
me
to
recognize.
If
you're
talking
about
doorway
pages
or
if
you're
just
saying
well,
these
are
personalized
and
like
they
have
slightly
different
content
for
different
users,
because
maybe
they
came
in
with
an
ad
or
they
came
in
from
a
different
site,
or
something
like
that.
G
Fair enough. The unrelated follow-up, and thank you for the clarification, much appreciated. So we had a malformed 301 when we migrated content from a subdomain to dub dub dub a little over a year ago. The issue with the 301 was noticed about mid to late May, and we've noticed differences in our logs for how Googlebot is crawling that subdomain.
G
So
you
know
we
we
are
assuming
that
it's
it's
working
as
it
should
now
and
we've
validated
it
several
times,
but
we're
still
having
like
25
000
urls,
showing
in
on
a
site
colon
search
for
that
the
the
old
sub
domain.
So
I
guess
the
question
is:
are
there
any
next
steps
on
how
we
get
the
dub
dub
dub
version
of
those
pages
in
the
index.
A
It's
it's
possible
that
that
is
working
as
expected,
that
we've
already
kind
of
moved
most
of
the
urls
over
the
the
tricky
part.
There
is
with
a
site
colon
when
you're,
specifically
searching
for
something
that
you've
moved.
Our
systems
understand
that
the
content
used
to
be
there
and
we
will
try
to
show
it
to
you
anyway,
even
if
we've
already
moved
things
over
in
indexing.
A
So I would kind of double-check that you're not searching for, or trying to find, something that doesn't exist anymore and our systems are just trying to be helpful. And check to see if the cached page is actually the new one; if it is, then I would just let that go.

G
Cool, thank you.

A
Sure.
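Since a malformed 301 is the suspected cause here, one basic sanity check is confirming that every old-subdomain URL redirects to its exact www equivalent, path and query included; a common way site moves go wrong is collapsing everything to the homepage. A minimal sketch of that mapping, assuming a hypothetical shop.example.com to www.example.com move (the hostnames are placeholders, not from the discussion):

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical old subdomain and its replacement host.
OLD_HOST = "shop.example.com"
NEW_HOST = "www.example.com"

def redirect_target(url: str) -> str:
    """Return the www URL an old-subdomain URL should 301 to,
    keeping path and query intact."""
    parts = urlsplit(url)
    if parts.netloc != OLD_HOST:
        return url  # not on the migrated subdomain; leave unchanged
    return urlunsplit((parts.scheme, NEW_HOST, parts.path, parts.query, parts.fragment))
```

Comparing each crawled URL's actual `Location` response header against `redirect_target(url)` in the server logs would flag any redirects that drop the path or query.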
A
All
right,
let
me
pause
the
recording
here.
Like
I
mentioned,
you
all
are
welcome
to
hang
around
a
little
bit
longer
and
we
can
go
through.
Some
of
the
more
questions
looks
like
lots
of
people
still
have
raised
hands
so
lots
of
stuff
to
do,
but
it's
always
good
to
kind
of
limit
the
recordings
to
a
certain
amount.