From YouTube: English Google SEO office-hours from February 4, 2022

Description

This is a recording of the Google SEO office-hours hangout from February 4, 2022. These sessions are open to anything search and website-owner related, like crawling, indexing, mobile sites, internationalization, duplicate content, Sitemaps, Search Console, pagination, multilingual/multi-regional sites, etc.

Find out more at https://goo.gle/seo-oh-en

Feel free to join us - we welcome folks of all levels!
A: All right, welcome everyone to today's Google Search Central SEO office-hours hangout. My name is John Mueller. I'm a Search Advocate here on the Search Relations team at Google in Switzerland, and part of what we do are these office-hours sessions, where people can join in and ask their questions around their website in Search.
B: Hi John. My question is related to a few industries, let's say the stock industry and the mutual fund industry. In our titles, meta descriptions, and header tags we have added the value of these stocks. Today a stock price could be twenty dollars, tomorrow it could be thirty, and so on, so the number in the title keeps on changing every single day.
B
So,
just
like
you
know,
you
have
like
20
dollars
for
30
or
could
be
anything
so
just
by
changing
the
titles
every
single
day
or
the
head
attacks
or
the
meta
description,
the
values
within
it
does
it
have
a
significant
ranking
effects,
because
I've
heard
you
know
saying
that
your
url
should
be
clean
and
it
should
be
there
for
the
longer
duration
to
get
the
value
does
the
same
applies
for
the
titles
metas
and
the
headers
or
or
could
be
any
other.
You
know
factors.
B
Roughly
yeah,
but
not
too
much
it's
only
the
value.
That's
keep
on
not
just
changing
every
single
day.
So
let's
say
today's
google
stock
price
is
like
20.
So
this
is
my
title
for
to
today.
However,
for
the
next
day
it
would
be
like
today's
google
share
price
is
like
20
25,
so
only
the
number
is
changing.
However,
I
look
at
it
as
a
the
whole
title
is
changing,
so
does
it.
A: I think that's fine. It's something where we wouldn't give it any special weight if your title tag keeps changing, but if you want to update your titles regularly, that's totally up to you. The difficulty, I suspect, is more that if you change your titles on a daily basis, we might not recrawl that page on a daily basis. So it might be that you change it every day, but the title that we show in the search results is a few days old.
A: Just because that's the last version that we picked up from that page. But that's more of, I'd say, a practical effect rather than a strategic effect.
A: It helps us to recognize when something has changed, but it's not necessarily going to happen that we'll say, oh, we've seen this page change every day, therefore we'll recrawl it every day. It might be that we recrawl it every day; it might be that we recrawl it every week or every month. So it's not that the changes you make to the titles would affect how quickly we recrawl.

B: Okay, one more question?
B
More
question
so
sure
yeah.
Actually
I
was
having
a
particular
domain.
Okay,
so
it's
for
one
of
my
clients,
so
that
was
a
travel
domain,
the
name
itself
right,
but
they
were
into
business
for
travel,
business,
let's
say
10
to
15
15
years
and,
however,
they
have
stopped
working
on
that
particular
domain
and
then
restarted
that
particular
domain
for
an
e-commerce
website.
B
So
the
domain
remain
same.
However,
just
the
business
model
has
been
changed
from
travel
to
e-commerce,
so
we
have
so
when
we
started
again
for
adding
the
google
search
console.
We
got
like
more
than
lags
and
likes
of
links
in
the
links
section
of
the
google
search
console.
However,
there
were
some
penalties
as
well
which
responded
to
it
and
does
how
much
time
you
know,
like
google,
usually
take
to
understand,
there's
a
change
of
business
for
this
particular
domain
earlier
they
were
like
in
travel.
B: We have done a lot of PR and a lot of marketing for this particular domain, and it has been covered by so many bloggers, authors, and news websites. Can these things help? This change has also been made on our own website; we did write some content related to it. However, my main question is: how much time does Google take to understand that this particular domain was for X business, but has now shifted to Y business?
A
There's
any
particular
time,
so
that's
something
that
it
essentially
organically
happens
over
time.
If
you
reuse
an
existing
domain
and
you
put
something
else
on
it,
then
over
time,
we'll
learn
that
it's
it's
a
new
website
and
we'll
treat
it
accordingly,
but
there
is
no
specific
time
frame
for
that.
A: I guess the two aspects I would watch out for are: if the previous website was involved with a lot of shady practices, especially around external links, then that might be something that you want to clean up; and the other aspect is, if there's any webspam manual action, then obviously that's something that you'd want to clean up, so that you can kind of start from a clean state. It's never going to be completely clean.
B
So
so
is
it
more
like
to
disavow
all
those
backlinks
which
we
are
getting
for
the
for
the
travel
business
kind
of
thing
from
from
many
countries?
So
does
that
mean
that
we
should
disallow
all
those
links
from
different
countries
for
the
travel
significant
and
then
only
focus
for
the
commerce
one
or
doing
a
lot
of
pr
and
all
such
activities?
And
getting
you
know
the
news,
but
for
our
e-commerce
business.
A
So
if
I
don't
know
the
the
previous
business
bought
a
lot
of
links
or
did
a
lot
of
weird
things
with
regards
to
links,
then
that's
something
you
want
to
clean
up
and
that's
something
you'd
want
to
clean
up
regardless.
If
it's
a
previous
business
or
just
your
previous
seo
that
was
working
on
it.
So
that's
that's
kind
of
the
direction
I
would
have
just
because
it's
a
different
topic
or
maybe
from
a
different
country.
I,
like
that's,
not
a
reason
to
disavow,
like
I
don't
think
that's
problematic.
C: John, I have a follow-up on the same. Does it also make any sense... for links, I understand you said you should get the links cleaned up if there is any problem, but for content: suppose earlier it was a travel domain, so all the content was about travel. If the site owner moves to e-commerce, should he also approach the old website owners where they have published content about travel, and ask them to kindly remove the travel-related content?
A: No, I don't think so. I mean, you can, but I don't really think that's necessary, because websites evolve over time, and what worked as a link in the past might not work as a link in the future. It's not that this will cause you any problems if someone wrote about what the previous owner of a domain did.
F: Okay, can you hear me? Yes? Perfect. Sorry, okay. I have a question about the relationship between robots.txt failures and server connectivity failures and search traffic from Google, because I recently saw more than 55 percent robots.txt and server connectivity failures in Search Console.
F
So
we
had
seven
days
failures
in
november
and
it
happened
once
in
december,
so
our
server
team
blocked
some
ip
addresses
from
googlebot
since
so
since
november,
our
website
traffic
from
google
has
dropped
about
six
twenty
percent.
So
if
nothing
changes
during
the
day
during
their
date,
is
it
reasonable?
If
I
blame
the
server
team
for
the
decrease
of
their
traffic,
I'm
just
wondering
so
how
much
effect
on
the
problems
will
be.
You
know
from
the
robot
and
server
connectivity
to
google
searching
is.
F
Fixed
now
or
they
kind
of
opened
their
ip
address,
but
they
we
had
one
more
problem
in
december
and
they
don't
know
what
what
is
the
reason,
because
they
all
open
their
ip
address
and
they
don't
know
what
is
the
reason
from
their
google
search
concern.
A
So
I
I
think
there
there
are
two
things
maybe
to
to
watch
out
for
on.
On
the
one
hand,
the
if
you
have
server
connectivity
issues,
we
would
not
see
that
as
a
quality
problem,
so
it
wouldn't
be
that
the
ranking
for
your
pages
would
drop.
So
I
I
think,
maybe
that's
the
the
first
step.
So
if
you
see
that
the
ranking
of
your
pages
is
dropping,
then
that
would
not
be
from
from
the
technical
issue.
A: So that's kind of a simple way to figure out: is it from a technical problem or not? Are the pages gone from the index? If so, that's probably from a technical problem. Or are the pages just ranking lower? Then that's not necessarily from a technical problem. And if it is from a technical problem, like if these pages are gone, then usually we will retry those missing pages after a couple of days, maybe, and we will try to index them again.
A
So
if,
if
the
issue
was
in
november
and
you
fixed
it
in
november,
then
I
assume
you
should
not
see
any
effects
from
that
anymore.
Just
because
we
we
should
have
re-indexed
those
pages
in
the
meantime,
if
you
do
still
see
that
we're
not
indexing
the
pages
properly
from
your
website,
I
would
double
check
the
crawl
error
section
in
search
console
to
see
if
there
is
still
perhaps
a
technical
issue
where
sometimes
maybe
googlebot
is
blocked.
E
Okay,
so
I
have
small
questions
first,
should
the
title
be
with
the
company
name
like
we
are
in
a
news
industry
and
should
we
add
like
to
each
title
the
company
name
like
we
are
like
test
the
news
or
dash
news.
This
is
the
first.
You
can't
answer
it
if
you
can,
if
it's
small,
yes
or
no,
I
think
I.
A
I
think
it
makes
sense
so
when,
when
we
show
a
title
link
for
a
website,
if
it's
for
a
kind
of
a
lower
level
page
we'll
try
to
show
the
name
of
the
website
as
well
in
in
the
title
that
we
show.
So
if
you
add
it
yourself,
then
you
kind
of
like
already
include
that
information.
So,
okay.
E
So
I
think
it
says:
okay,
the
second
thing
some
of
our
competitors
is
like
copying
the
article
and
base
it
again
on
their
website,
but
they
rewrite
the
main
header
or
the
title
inside
the
article,
so
they
are
showing
before
us
in
the
search
results.
Is
it
a
good
practice
to
to
to
do
that?
Like
rewrite
the
title,
the
header
title
inside
the
article
or
the
or
the
main
title
inside
the
article.
A
I
I
don't
know
I
mean
if
they're
copying
your
content,
then
I
don't
think
that's
a
good
practice.
So
from
from
that
point
of
view,
I
I
don't
think
just
rewriting
the
headline
wouldn't
make
it
a
a
good
article,
it
it
if
they're
really
copying
your
full
content.
I
might
also
look
into
the
dmca
process
to
see
if
that
applies.
There.
E
Actually,
I
applied
three
or
four
copywriter
removal
and
they
removed
and
google
removed
their
articles,
but
for
the
first
two
or
three
days
they
are
showing
up
before
us
in
the
in
our
article
like
they
are
because
they
just
three
are
rewriting.
So
I
thought
should
we
do
that,
so
I
think
it's
a
no.
We
shouldn't
not
okay,.
E
Okay,
we
have
a
comments
section
under
the
article,
but
it's
not
showing
in
google
like.
If
I
searched
for
a
comment
like
someone
is
type
and
comment,
and
I
search
for
it,
I
don't
find
it
like.
The
article
is
not
showing
there.
Is
there
any
good
or
best
practice
for
the
comment
section
I
should
do
so.
Can
google
like
see
more
or
our
comments,
go
because
comments
is
like
discussion
about?
The
news
it's
happening
is
what
the
good
practice
of
this.
A
It's
up
to
you
like
you,
you
can
decide
if
you
want
to
have
that
shown
or
not,
but
it's
it's
essentially
a
technical
element
on
on
the
page.
So
it's
not
that
there's
a
setting
in
search
console
to
turn
it
on
or
off
it's.
Basically,
there
are
different
ways
of
integrating
comments
on
web
pages
and
some
of
those
ways
are
blocked
from
indexing
and
some
of
those
ways
are
easy
to
index.
A
So
if
you
want
to
have
your
comments,
indexed
then
make
sure
that
you,
you
kind
of
implement
them
in
a
way
that
is
easy
to
index
and
the
inspect
url
tool
in
search
console
will
tell
will
show
you
a
little
bit
what
what
we
find
on
the
page.
So
you
can
test
to
see.
Can
google
index
the
comments,
but
it
sounds
like
even
when
you
search
in
google
directly,
you
already
see
that
we
can't
index
them.
So,
yes,.
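As a rough sketch of the difference John describes (the markup and the widget URL below are hypothetical): comments that arrive in the server-rendered HTML are straightforward to index, while comments injected later by a third-party script often are not.

```html
<!-- Easy to index: the comment text is part of the HTML Google fetches. -->
<section id="comments">
  <article class="comment">
    <p>Great analysis of the rate decision!</p>
  </article>
</section>

<!-- Often not indexed: an empty container that a client-side widget
     (hypothetical URL) fills in after the page loads. -->
<div id="comments-placeholder"></div>
<script src="https://comments-widget.example/embed.js" async></script>
```

The rendered-HTML view in the Inspect URL tool is one way to check which case applies to a given page.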
E
Okay,
should
we
add
the
title
in
in
the
not
the
title,
the
dates
in
the
title
like
we
have
some
prices
of
currency,
so
our
currency
today
is
like
all
the
currencies
for
today.
Should
we
add
the
title
or
should
we
add
the
date
inside
the
title?
Is
it
if.
A: For news articles, I do think it makes sense to include the date in various places on the page, and that can include the title, just because, with news articles, we try to understand what the primary date of the page is. We do that by looking at all of the mentions and things that you have on the page, and if we can confirm that date with these mentions on the page, then it's easier for us to pick it up. But I think for a page that is changing constantly, like the currency prices you mentioned, or the stock prices from before...
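For illustration, here is one way a hypothetical news page could repeat the date consistently (the headline and timestamps are made up; schema.org NewsArticle markup is one machine-readable place a publication date can be stated):

```html
<title>Currency exchange rates today, February 4, 2022 - Example News</title>

<!-- Visible byline restating the same date -->
<p class="byline">Published February 4, 2022</p>

<!-- Structured data stating the date in machine-readable form -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "NewsArticle",
  "headline": "Currency exchange rates today, February 4, 2022",
  "datePublished": "2022-02-04T08:00:00+00:00"
}
</script>
```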
E: Sure. Okay, last one. Our URL now is like website/news/id-of-the-news. Should we add the category, like maybe "usa" or "arab-states" or something, as a segment before the news? Does having the category of the news inside the URL affect anything?
A: I don't think you would notice any visible change there. For tracking purposes, I think it sometimes makes it easier, because then you can look at analytics and say, oh, this category is very popular, or this category is not working so well. But from an SEO point of view, I don't think there would be any noticeable difference.
D
Hi,
john
hi,
I
had
few
questions
about
google
crawling
and
indexing
so
recently
what
we
listen.
We
are
facing
some
issue
for
some
of
our
client
website.
So
we
are
writing
content
and
posting
content
on
their
blog.
What
happens
last
four
months?
We
are
posting
content
on
the
website,
but
most
of
the
content
are
not
getting
indexed
by
google
and
when
we
inspect
url
we
usually
get
two
message:
one
is
the
url
is
crawled,
but
not
indexed
yet
or
url
is
discovered,
but
not
indexed
yet
so.
D
My
first
question
is
about:
if
it's
a
url
scroll
but
not
indexed,
does
it
mean
google
thinks
this
content
is
not
perfect
for
indexing
or
what
does
it
mean.
A
I
I
don't
think
it
means
anything
in
particular,
so
I
I
think
that's
always
kind
of
an
an
easy
early
assumption
to
say:
oh
google
looked
at
it
but
decided
not
to
index
it
most
of
the
time
like
when
we
still
crawl,
something
it
doesn't
necessarily
mean
that
we
will
automatically
index
it.
So
I
I
would
almost
treat
those
two
category
of
kind
of
not
indexed
as
as
a
similar
thing
and
it's
it's
tricky,
because
we
don't
index
everything.
D
And
the
next
one
is
in
this
scenario:
if,
if
we
like,
it
is
not
only
for
one
client,
we
are
facing
these
four
multiple
clients,
five
or
six
clients
at
the
same
moment.
So
is
there
any
other
way
like
we
are
trying
to
inspect
the
url
then
requesting
for
indexing?
We
are
trying
to
use
google
search
console.
Is
there
any
other
way
to
make
it
faster,
or
is
there
any
other
things?
If
we
do
this,
maybe
that
will
help
for
getting
indexed.
A: I mean, there are different things, which perhaps you're already doing. On the one hand, making sure that it's easy for us to recognize that the important content on a website is really good, which sometimes means making less content and making better content, so having fewer pages that you want to have indexed. The other thing is internal linking, which is very important for us to understand what you would consider to be important on a website.
A
So
things,
for
example,
that
are
linked
from
the
home
page
are
usually
some
a
sign
that,
like
you
care
about
these
pages,
so
maybe
we
should
care
about
them
more
things
with
external
links.
They
also
kind
of
go
into
that
category
of
where
we
see
other
people
think
that
these
pages
are
important,
then
maybe
we'll
see
them
as
important
too
and
then
sitemaps
and
rss
feeds
from
a
technical
point
of
view,
also
help
us
a
little
bit
better
to
understand.
Like
these
pages
are
new
or
they
have
changed
recently.
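As a small illustration of that last point, a sitemap entry can carry a lastmod date so crawlers can tell a page is new or recently changed (the URL and date below are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/blog/new-post</loc>
    <lastmod>2022-02-04</lastmod>
  </url>
</urlset>
```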
D: Okay, and the last question. When we saw that those articles were not getting indexed, we told our client to share the articles on their social media accounts, like Facebook, Twitter, or Instagram, and that actually helped us to draw some links from other websites to those blog posts. Now we are also using these blog posts for interlinking to other pages. So will links to those blog posts, which are not indexed by Google, have any impact on the website's ranking, or add any value?
A
I
mean
if,
if
we
find
external
links
to
to
those
pages,
then
chances
are
we
we
might
crawl
and
index
that
page,
it's
like
a
little
bit
higher.
I
guess
it
depends
a
little
bit
on
what
kind
of
external
links.
Of
course
there
are
like
links
from
social
media
directly,
usually
have
no
follow
attached,
so
we
don't
really
forward
any
signals
there
and
if
it's
something
where
we
can
recognize
well,
these
are
maybe
problematic
links
or
not
that
useful
links.
A
Maybe
we
will
ignore
those
too,
but
obviously,
if
if
we
can
tell
that
kind
of
some
something
is
seen
as
being
important,
we'll
probably
go
off
and
crawl
and
index
that
page
more
likely.
What
you
generally
won't
see
is
that
we
will
kind
of
like
forward
value
to
the
rest
of
the
website.
If
we
don't
actually
index
that
page,
because
if
we
decide
not
to
index
a
page,
then
it's
still
that
situation
that
well
we
don't
have
a
destination
for
those
links.
So
we
can't
do
anything
with
those
links
for
the
rest
of
the
website.
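The nofollow John mentions is a link-level annotation; social platforms typically add it to outbound links automatically (the URL below is hypothetical):

```html
<!-- A plain link: signals can be forwarded to the target page. -->
<a href="https://example.com/blog/new-post">Our new post</a>

<!-- What social platforms typically emit: rel="nofollow" asks search
     engines not to forward signals through this link. -->
<a href="https://example.com/blog/new-post" rel="nofollow">Our new post</a>
```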
G
Yeah
hi
john
hi,
so
our
website
is
like
based
out
of
india
mainly
and
like
our
users,
are
also
from
india.
So,
like
a
few
days
back,
what
we
did
was
we
moved
all
our
dynamic
traffic
behind
the
cdn,
but
what
we
have
been
observing
is
that,
like
in
the
search
console
under
crawl,
starts,
our
response
times
have
gone
up
from
like
300
milliseconds
to
like
around
a
second
so
like
we
are
not
understanding
and
also
the
crawl
rate
has
dropped.
G
It
dropped
from
around
2
million
requests
a
day
to
around
80k
requests
so
like
we
are.
Not
understanding
does
like
moving
to
a
cdn
impact,
how
google
crawls
our
website
and
like
put
the
network
latency
that
is
introduced
due
to
the
cdn
servers
being
present
close
to
from
where
google
is
crawling.
Our
site,
like
does
that
impact,
how
the
crawl
rate,
as
well
as
our
search,
ranking
as
well.
A: From a ranking point of view, this would not change anything, so maybe that answers the first question. But it can affect crawling: if you change your hosting significantly, what will happen on our side is that the crawling rate will move into a more conservative area first, where we'll crawl a little bit less to start, because we saw a bigger change in hosting, which could be a move to a CDN, a move away from a CDN, or from one CDN to a different CDN. And then over time...
A
I
don't
know
a
couple
of
weeks,
maybe
a
month
or
so
we
will
increase
the
crawl
rate
again
to
kind
of
see
where
we
think
it
will
settle
down.
So
essentially
that
drop
in
crowd
rate
overall
for
for
move
to
a
cdn
or
change
of
a
cdn
that
can
be
normal.
The
the
crawl
rate
itself
doesn't
necessarily
mean
that
there's
a
problem
because
like
if
we
were
crawling
two
million
pages
of
your
website
before
it's
unlikely,
I
I
assume
that
you
would
have
two
million
pages
that
change
every
day.
A
So
it's
not
necessarily
the
case
that
we
would
miss
all
of
the
new
content
on
your
website.
We
would
just
try
to
prioritize
again
and
figure
out
which
of
these
pages
we
actually
need
to
recrawl
on
a
day-to-day
basis.
So
just
because
the
crawl
rate
drops
is
not
necessarily
a
sign
for
for
concern.
What
would
worry
me
more
is
the
change
in
the
average
response
time,
because
the
the
crawl
rate
that
we
choose
is
based
on
on
the
average
response
time.
A
It's
also
based
on
server
errors,
those
those
kind
of
things,
and
if
the
average
response
time
goes
up
significantly,
we
will
kind
of
stick
to
a
lower
crawl
rate.
A
So
that's
something
where
I
assume,
if,
if
the
average
is
around
one
second
per
url
on
your
website,
I
feel
that's.
That's
quite.
G
So,
actually
about
that
the
average
response
time
has
actually
improved
for
our
users.
It
has
only
increased
for
the
google
crawler
so
yeah.
Now
we
are
not
able
to
explain
that.
Actually.
A: Then that's sometimes a little bit tricky to diagnose, to figure out what exactly is causing that delay. But from our point of view, the response time is critical for the number of crawls that we do per day, because we want to limit the number of active connections that we have to your server, and if the response time is high, then we run into that limit fairly quickly, whereas if we can crawl quickly, then we don't have that much of a problem with the number of concurrent connections.
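The relationship John describes can be sketched with a toy model (an illustration of the connection-limit arithmetic, not Google's actual crawl scheduler): with a fixed cap on concurrent connections, achievable throughput falls as average response time rises.

```python
# Toy model, not Google's algorithm: by Little's law, sustained
# throughput = concurrent connections / average latency.
def max_requests_per_day(concurrent_connections: int,
                         avg_response_seconds: float) -> int:
    per_second = concurrent_connections / avg_response_seconds
    return int(per_second * 86_400)  # 86,400 seconds in a day

# With a cap of 10 connections, 0.3 s responses sustain roughly
# 2.88M requests/day, while 1 s responses allow only 864k.
fast = max_requests_per_day(10, 0.3)
slow = max_requests_per_day(10, 1.0)
```

This mirrors the numbers in the question: roughly tripling the response time cuts the sustainable request rate by the same factor before any other crawl-scheduling decisions come into play.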
G
We
tried
running
tests
like
from
us
locations
like
with
cdn
without
cdn.
The
latency
that
we
are
observing
are
same.
So
is
there
anything
else
that
we
can
look
into
like
why.
A
I
don't
know
it's
it's
hard
to
say
I
I
would
use
maybe
some
testing
tools
that
are
running
within
google's
network.
G: So, John, here's the thing: what we observed was that the time it took for a request to be served from the US did not change at all, before or after we moved our URLs over. So could it be that previously our origin servers were directly requested by Google, and they were in India, so Google was considering the network latency over there to be very high, subtracting that network latency and showing a reduced response time in the crawl stats? And with the CDN and our servers...
G
The
the
the
site
would
be
served
from
us
location,
as
the
cdn
servers
would
also
be
present
in
us.
So
for
google.
Probably,
the
server
is
now
present
in
u.s
instead
of
india,
and
it
is
not
reducing
that
you
know
network
latency,
asset.
A
That
shouldn't
be
the
case.
No,
we,
we
really
use
the
the
actual
latency
that
we
see
not
anything
where
we
say.
Oh,
it
might
be
like
this
because
the
the
report
there
is
less
about
usability
for
users
and
really
more
about
kind
of
the
practical
crawling
from
google
side.
A
So
that's
something
where
we
really
report
exactly
what
we
saw
and
like
like,
I
said
what
what
might
be
happening
is
that
there's
something
along
the
the
change
of
that
request.
That
is
slowing
things
down.
It
might
also
be
that
the
kind
of
on
average,
the
requests
that
googlebot
makes
are
just
slower
because
of
the
urls
that
are
accessed.
So
I've
seen
this,
for
example,
when
when
a
website
has
a
lot
of
very
large
pdf
files,
then
obviously
per
request.
A
The
amount
of
bytes
that
need
to
be
transferred
is
very
high,
but
that
doesn't
mean
that
kind
of
on
average,
when
you
look
at
other
urls
that
it's
it's
a
very
long
time,
but
it's
kind
of
that
large
number
due
to
the
size
of
the
bytes
that
is
kind
of
pulling
things
down.
It
can
also
be
that,
for
example,
we
crawl
a
lot
of
search
pages,
and
maybe
the
search
functionality
on
your
site
goes
directly
to
the
origin
servers
and
takes
a
long
time
to
be
processed.
A
Then
again,
it
could
look
like
the
average
response.
Time
is
quite
high,
so
I
I
would
try
to
look
into
the
the
details
there.
Another
thing
is
perhaps
something
around
the
lines
of
your
dns
connection
or
anything
in
the
line
of
the
connection
there.
That
might
be
slower
for
for
googlebot
to
process.
Okay,.
A: Yeah, that would not be affecting the ranking. The technical side of things is really purely about getting all of your pages as quickly as possible, getting the new pages indexed, and if we aren't able to crawl as much, we'll try to prioritize more, but that wouldn't be affecting the rankings.
A
Okay,
let
me
go
through
some
of
the
submitted
questions
so
that
we
don't
lose
track
of
them
completely
and
I'll
go
through
more
of
the
the
live
questions
as
well.
Afterwards,
don't
worry,
I
think
the
first
one
was
the
cdn
question
which
we
just
had
the
next
one
I
have
is:
we've
undertaken
a
substantial
co-change
of
our
website
to
javascript,
with
server-side
rendering
to
static
html
it's
been
for
months,
and
our
search
rankings
are
not
recovering
after
a
drop.
A: It's really hard to say without seeing the website itself, but essentially, if nothing on the website has changed, if you just moved from client-side rendering to server-side rendering, then I would not expect to see any change in rankings or visibility for the website.
A
One
of
two
things
are
happening:
on
the
one
hand,
it
might
be
that,
with
the
change
of
your
infrastructure,
your
website,
layout
and
structure
has
changed
as
well,
so
that
could
include
things
like
internal
linking,
maybe
even
the
urls
that
are
findable
on
the
website
and
those
kind
of
things
they
can
affect
ranking.
A
The
other
thing
could
be
that
maybe
there
there
were
just
changes
in
ranking
overall
that
were
happening,
and
they
just
happened
to
coincide
with
when
you
made
the
technical
changes
on
your
website
and
that's
something
where
I
I
think
you
can
kind
of
try
to
figure
out
if
that
might
be
happening
by
really
comparing
your
previous
website
to
your
current
website,
to
figure
out
where
they're
really
significant
structural
changes
or
not
if
there
are
no
structural
changes
and
you're
seeing
a
change
in
rankings-
and
I
would
assume
this
is
more
of
a
general
ranking
change
rather
than
something
specific
to
this
infrastructure
change
that
you
need
within
some
industries.
A: I don't know; I think this is almost like a philosophical question. From our point of view, it's definitely not the case that we're trying to focus on big websites or anything like that, but from a purely practical point of view, obviously, if you're a small company and you're trying to compete with larger companies, then it's always going to be hard, and especially on the web.
A
Overall
they've
kind
of
grown
their
websites,
they
have
really
competent
teams.
They
work
really
hard
on
making
a
fantastic
web
experience
and
that
kind
of
means
for
smaller
companies
that
it's
it's
a
lot
harder
to
kind
of
gain
a
foothold
there,
especially
if
there's
a
very
competitive
existing
market
out
there
and
it's
less
about
large
companies
or
small
companies.
It's
really
more
about
kind
of
the
competitive
environment
in
general,
and
that's
something
where
you
kind
of
as
a
small
company.
A
You
should
probably
focus
more
on
your
strengths
and
the
weaknesses
of
the
competitors
and
try
to
find
an
angle
where
you
can
shine
where
other
people
kind
of
don't
have
the
ability
to
shine
as
well,
which
could
be
specific
kinds
of
content
or
specific
audiences
or
anything
along
those
lines.
Kind
of
like
how
you
would
do
that
with
a
normal
physical
business
as
well.
A
So,
with
regards
to
an
image
itself,
we
would
probably
not
find
a
lot
of
value
in
that
as
an
anchor
text.
If
you
have
an
alt
text
associated
with
the
image,
then
we
would
treat
that
essentially
the
same
as
any
anchor
text
that
you
have
associated
with
the
link
directly.
So
from
our
point
of
view,
the
alt
text
would
essentially
be
converted
into
a
text
on
the
page
and
be
treated
in
the
same
way.
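For illustration (hypothetical URLs and product): in an image-only link, the image's alt text effectively plays the role of the anchor text.

```html
<!-- No visible link text: the alt attribute is treated roughly the way
     the anchor text "Blue widget, model X-200" would be. -->
<a href="https://example.com/widgets/blue">
  <img src="/img/blue-widget.jpg" alt="Blue widget, model X-200">
</a>
```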
A: The question now is whether the page authority and page ranking will be negatively affected if there are many backlinks to domain A, and also future backlinks which will point back to domain A. Is the domain authority from the old domain automatically inherited by the new domain, even if there are lots of links? So I think there are two aspects here.
A
On
the
one
hand,
if
you're
moving
from
one
website
to
another-
and
you
use
the
redirects
kind
of
to
to
move
things
over
and
you
use
the
various
tools
that
we
have
like
the
or
the
change
of
address
tool
in
search
console,
then
that
helps
us
to
really
understand.
Everything
from
the
old
domain
should
just
be
forwarded
to
the
new
one.
So
that's
kind
of
it's
essentially
a
situation
where,
as
much
as
possible,
we'll
take
everything
from
the
old
domain
just
apply
it
to
the
new
one.
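As a sketch of the redirect side of such a move (hypothetical domain; an Apache .htaccess on the old host is just one common way to do this), every old path gets a permanent 301 redirect to the same path on the new domain:

```apache
# Hypothetical .htaccess on the old domain: send every request to the
# same path on the new domain with a permanent (301) redirect.
RewriteEngine On
RewriteRule ^(.*)$ https://new-domain.example/$1 [R=301,L]
```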
A
The
other
aspect
there
is
on
a
per
page
basis.
We
also
try
to
look
at
canonicalization
and
for
canonicalization.
We,
we
look
at
a
number
of
different
factors
that
come
in.
On
the
one
hand,
redirects
play
a
role
things
like
internal,
linking
play,
a
role,
the
rel
canonical
on
the
pages
play
a
role,
but
external
links
also
play
a
role.
So
what
could
happen
in
kind
of
probably
more
edge?
Cases?
A
I
guess
is
that
if
we
see
a
lot
of
external
links
going
to
the
old
url,
and
maybe
even
some
internal
links
going
to
the
old
url,
that
we
actually
index
the
old
url
instead
of
the
new
one.
Because
from
our
point
of
view,
it
starts
to
look
like
well.
Actually,
the
old
url
is
the
right
one
to
show
and
the
new
one
is
maybe
more
of
a
temporary
thing.
A
We
we're
having
a
problem
with
e-commerce
facets
filters
that
are
getting
indexed
even
though
they're
blocked
by
robots
text
and
have
a
canonical
tag.
That
point:
is
there
a
point
in
adding
no
index
tags
too?
So,
probably
not
the
the
short
answer,
I
I
guess
is:
if
the
url
is
blocked
by
robot's
text,
we
don't
see
any
of
those
meta
tags
on
the
page.
We
don't
see
the
real
canonical
on
the
page,
because
we
don't
crawl
that
page
at
all.
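The interaction John describes, sketched with a hypothetical faceted URL pattern: a disallow rule stops Googlebot from fetching the page at all, so any noindex or canonical tag on that page can never be read.

```text
# robots.txt: crawlers never fetch matching faceted URLs...
User-agent: *
Disallow: /*?color=

# ...so tags inside those pages are never seen, e.g.:
#   <meta name="robots" content="noindex">
#   <link rel="canonical" href="https://example.com/widgets">
# For noindex to be honored, the URL must be crawlable (not disallowed).
```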
A: So that would be kind of my first step there: to try to figure out, do normal users see these pages when they search normally? If they don't see them, then that's fine; you can just ignore them. If they do see these pages when they search normally, then that's a sign that maybe you should be focusing on other things on the rest of your website. And then a quick question about the Google favicon user agent: does it only crawl the home page?
A
How
does
it
respond
if
the
home
page
redirects
to
subfolder
within
the
same
domain?
Will
it
still
download
the
favicon
from
the
subfolder?
Yes,
it
should
so
the
the
fabicon
that
we
pull.
A
Should
I
I
believe
we
only
do
that
kind
of
one
favicon
per
domain.
I'm
not
100
sure,
but
I
I
think
that's
the
case,
but
if
you
redirect
your
home
page
or
if
you
redirect
the
fabicon
file
to
a
different
part
of
your
website,
we
should
be
able
to
still
pick
that
up
because
practically
what
would
happen
here
is.
We
would
follow
that
redirect,
but
we
would
probably
still
index
it
as
your
home
page
anyway.
A
So
from
a
practical
point
of
view,
if
you
search
for
your
name,
probably
we
would
show
the
the
root
url,
even
though
it
redirects
to
a
lower
level
page
wow
lots
of
questions
left,
but
also
a
bunch
of
hands.
Maybe
I'll
just
go
through
some
of
the
the
folks
with
the
raised
hands,
and
I
have
some
more
time
afterwards
as
well.
So
we
can
go
through
more
questions
too.
Let's
see
yogesh.
H
Hey
hi
john
hi,
it's
so
the
question
is
regarding
the
site
maps
and
I'm
working
on
the
project
of
like
e-commerce.
So
it
has
a
you
use
a
new
number
of
urls,
like
let's
say,
lacks
of
url,
so
we
have
like
segmented
so
the
with
compression
with
sitemaps
of
sitemaps.
H
So
from
past
month,
I
am
seeing
that
search
console
is
not
able
to
fetch
it,
but
the
twist
is
the
main
sitemap
the
root
side
map
is
able
to
index
and
it's
showing
status
success,
but
the
nested
sitemap
sitemap123
that
is
showing
couldn't
fetch
and
the
discord.
Url
is
zero
and
I
tried
doing
inspect
of
each
site.
Sitemap
they
were
showing.
The
fetch
is
successful,
so
just
want
your
inputs
on
that.
So
what?
What
will
be
the
scenario
in
this
case.
A
I
I
don't
know
it's
it's
hard
to
say
so
I
I
think
the
yeah
one
of
the
things
I
I
have
heard
I
haven't
actually
looked
at
it
in
detail-
is
in
search
console.
If
you
have
an
index
site
map
which
is
kind
of
that
site
type
of
sitemap
thing
that
you
said,
then
we
don't
show
the
details
of
the
detail
pages
directly
in
search
console,
so
essentially
to
get
that
information
you
might
need
to
submit
the
individual
sitemaps
individually.
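For reference, the "sitemap of sitemaps" being discussed is a sitemap index file; the child files it lists (hypothetical URLs below) are what would be submitted individually in Search Console to see their per-file details:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://example.com/sitemap1.xml.gz</loc></sitemap>
  <sitemap><loc>https://example.com/sitemap2.xml.gz</loc></sitemap>
  <sitemap><loc>https://example.com/sitemap3.xml.gz</loc></sitemap>
</sitemapindex>
```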
A: The other thing is that what might help here is to just change the URL of some of these detail sitemaps and see if that works, because it might end up that we try to recrawl some of these sitemaps and, for whatever reason, we think we can't do it, and we kind of remember that state. So if you use a new URL for the sitemap file, maybe it'll help us to understand it.
A: So I would try that for maybe a handful of them first, and if you see that it works, then maybe it makes sense to change the naming convention that you have.
H: So two things come to my mind. One thing is, a month before, everything was working properly: the sitemap was able to be crawled, and the nested sitemaps were also working properly. And if, let's say, I'm testing one batch of sitemaps and checking how it works, but I change that one sitemap to the next folder and test it, isn't the same data already there with Google? With the same iteration, I'm resubmitting it and creating confusion.
H: I mean, let's say sitemap1 and sitemap2 are already there in the sitemap index file, and then I'm just creating, let's say, xyz.com/sitemap1 and trying to test it. But in that sitemap1 there are already like 20,000 URLs; if I'm testing that, that data is already available with Google from the previous sitemap, yeah.
A
I
I
I
haven't
actually
tested
it
in
search
console
or
at
least
not
recently,
but
I
I
my
understanding
is
in
search
console.
We
don't
kind
of
show
the
details
of
the
sitemap
index
files,
so
if
you
submit
them
individually,
then
yes,
it's
the
same
sitemap
file
and
probably
we've
already
processed
that,
but
because
of
the
way
that
the
search
console
reporting
works,
we
would
be
able
to
show
you
that
information
directly
so.
I
Hi,
oh
yeah,
so
I
have
a
question.
Maybe
quick
one.
Regarding
product
reviews,
we
want
to
revamp
our
image
guidelines
and
we
want
to
be
able
to
distinguish
ourselves
in
this
space,
so
we're
thinking
about
different
ways
that
we
can
add
original
imagery,
some
more
stable
than
others.
A
I
I
think,
the
the
guidelines
that
we
have
for
reviews
or
the
recommendations
that
we
have
should
really
kind
of
be
focused
on
kind
of
unique
photos
that
you
create
of
these
products,
so
not
kind
of
like
artificial
review
photos.
I
I
don't
think,
like
our
our
systems
would
automatically
recognize
that,
but
it's
it's
probably
something
that
we
would
look
at,
at
least
on
a
manual
basis
from
time
to
time.
A
So
I
don't
know
just
like
looking
at
the
recommendations
that
we
have
with
regards
to
reviews.
It
feels
like
it's
not
really
in
line
what
we're
trying
to
do
there,
where
we're
trying
to
like
really
bubble
up
reviews
where
we
can
tell
that
someone
is
actually
testing
this
product.
In
I
don't
know
real
life.
A
Okay,
let
me
take
a
break
here
with
the
recording-
and
I
have
more
time
for
for
you
folks,
with
your
hands
raised
so
we'll
get
to
to
you
as
well,
if,
if
you're
watching
this
on
youtube,
thanks
for
sticking
around
to
the
end,
if
you'd
like
to
join
us
for
one
of
these
sessions
in
the
future,
I
post
them
in
the
community
section
of
the
channel,
usually
a
couple
days
before
we
we
do
the
sessions
themselves,
yeah
thanks
for
all
of
the
questions
and
hopefully
we'll
see
each
other
again
in
one
of
the
future
episodes
all
right.