From YouTube: English Google SEO office-hours from February 18, 2022
Description
This is a recording of the Google SEO office-hours hangout from February 18, 2022. These sessions are open to anything search & website-owner related, like crawling, indexing, mobile sites, internationalization, duplicate content, sitemaps, Search Console, pagination, multilingual/multi-regional sites, etc.
Find out more at https://goo.gle/seo-oh-en
Feel free to join us - we welcome folks of all levels!
A: All right, welcome everyone to today's Google Search Central SEO office-hours hangout. My name is John Mueller. I'm a Search Advocate on the Search Relations team here at Google in Switzerland, and part of what we do are these office-hours sessions, where people can join in and ask their questions around their website and web search.
A: It looks like we have a bunch of questions submitted, but we also have a ton of people raising their hands already, so maybe we'll just jump right in and start with the first folks. Let's see, Nero, I think you're on top of the list here.
B: Yeah, hi John, thanks for the opportunity. So my first question is: does Google Search Console consider image packs, video packs, and local listings in the rankings, or are those skipped and basically just the URLs counted?
A: We try to include everything that includes a link to your site. So if there is a block of images and one of those images is from your site and it links to your site, then that would count. Or if there's a local listing that is shown and there's a link to your website there, then that would count.
B: Okay, thank you so much. My second question, so this is like a case study: we had a page for electric bikes and scooters, and later on we understood that electric scooter is also, you know, a very big search when it comes to volume, and electric bike and electric scooter as products are two different things.
B: So we thought of creating two separate pages. At first we had just one page, and now it's been almost six to eight months. We used to rank well for electric bike and scooter, which was in the top five, when we had just the electric bike page, but now, after having separate electric scooter and electric bike pages, the electric bike page still ranks and the electric scooter page doesn't rank.
B: We have tried a number of things, like, you know, trying to separate the content of the two pages, generating more and more internal links, giving the electric scooter page links from the electric bike page, but it still doesn't show up. So should we, you know, wait for some more time so that Google can understand the intent, or would it be wise to, you know, merge those pages again?
A: Sometimes it makes sense to have pages that are specifically targeted on individual items, and finding that balance is something that is essentially up to you; it's more of a strategic question. I think the things that you're doing there sound like the right things, but whether or not this will lead to kind of success for that specific query is hard to say, and it might change.
A: Sure. All right, Praveen.
C: Hi John, how are you? So my question is about the product reviews update, and this is basically for my own understanding, I want to learn more about this thing. I just wanted to understand how Google identifies whether a page or site is related to product reviews. So let me give an example: there are case studies that tell you that it's mainly affiliate sites, you know, because they write about product reviews, pros and cons.
C: They compare products. But, for example, there's an e-commerce site which is selling pens, fountain pens, and that kind of stuff, and they also have a blog where they review their own products. They do write about pros and cons of their products and compare different products. So though it's an e-commerce site, the blog is also talking about product reviews.
C: Can it also be, you know, analyzed by the product reviews update?
A: I think they would be relevant for any kind of product review, so I wouldn't necessarily try to see, like, "does Google think my site is a product review site or not, and then I will do these good practices." Rather, if you think these good practices would apply to your content, then just do those good practices. So that's kind of how I would see it: not "does Google think I fit into this category, therefore I must follow these steps," but rather "do these steps make sense for my content."
C: Okay, just a follow-up question on this: the same e-commerce site, they have these product category pages where they have listed all the products, like pens. But it's a standard practice for most sites that they create a lot of content on these category pages, like two thousand, three thousand words of content, and that content also talks about product reviews; like, they're just saying these types of pens you can buy, pros and cons. It seems like a product review page more than a category page.
A: Usually things like category pages are pretty clear to understand, but again, I would think more about whether or not these recommendations make sense for that piece of content. Just because Google might get confused that it could be a review, because you have a lot of products on it, doesn't necessarily mean that it makes sense to follow those steps. But a lot of times, though, those recommendations still make sense.
C: Okay, okay, just one more question: it's about the Indexing API. So the Google docs mention that the Indexing API should be used for pages like job postings or broadcast events, you know? Is it possible that we can try this API for different types of content, like some news articles or, you know, blog content? That kind of, I mean.
A: All right, Isabel.
D: Well, our question is about our main brand keyword, the strange behavior we were talking about two weeks ago. It's about when we search for our brand.
D: We are in the first position in the search engine, and when we add our brand term, our website is filtered by Google. We would like to know the cause: why does Google filter our website when the organic position is the first one? When it is alone, we are first; when we add our brand, it is filtered.
D: This is the brand: when we look for it in Google in Spain, we are in the first place, and when we add the other term, our result is filtered.
A: Okay, okay, I'll check back with the team to see what they found so far.
F: All right. Hi John, have a good time! I have some questions about E-A-T. In the Quality Raters Guidelines, the author expertise is important. So do you think it's also important for the real algorithm?
A: I would assume that there is some indirect kind of work done to try to do similar things, yes. I mean, we put this in the guidelines so that we can kind of guide the quality raters to double-check these things, and if we think that it's something important, then I would assume that folks on the search quality side also work to try to understand that in a more algorithmic way.
F: Okay, thank you. In some articles I see people speaking about online brand mentions. I want to know your opinion in this case: do you think it's also important for the algorithms, an unlinked brand mention?
A: So, I mean, I don't think it's a bad thing, just for users, because if they can find your website through that mention, then that's always a good thing. But I wouldn't assume that there's, like, some, I don't know, SEO factor that is trying to figure out where someone is mentioning your website name.
A: Like comments on the bottom of an article? Yes. So what I think is really useful there with those comments is that oftentimes people will write about the page in their own words, and that gives us a little bit more information on how we can show this page in the search results.
A: So from that point of view, I think comments are a good thing on a page. Obviously, finding a way to maintain them in a reasonable way is sometimes tricky, because people also spam those comments and all kinds of crazy stuff happens there. But overall, I think if you can find a way to maintain comments on a web page, that gives you a little bit more context and helps people who are searching in different ways to also find your content.
F: Thank you so much. And for the last question: do you think a particular type of SSL certificate is also important for SEO now, or are free SSL certificates enough?
A: Free SSL is perfectly fine. The different types of certificates are more a matter of, I don't know, what you want to do with the certificate. From our point of view, we just watch out for: is this a valid certificate or not? And all of these certificates are valid.
E: Hi John, it's Fawa. Yes, great, thank you. So I have a question regarding multiple job postings on different domains with JobPosting structured data. As I have understood the guidelines, it's not so good to post structured data multiple times on the same page, let's say. So, as an international job provider, we have a corporate structure with multiple brand websites, but also multiple subdomains for one brand.
E: The question is: is it problematic if the same job postings are posted on the same root domain but on different subdomains, with structured data for the job posting? So let's say the same job is once on jobs.domain.de and the other one is on www.domain.de. It's the same job, maybe with different styling, but the same structured data is posted on both subdomains. Is there any problem with that?
A: So from that point of view, just having it posted multiple times on the same website, or on different subdomains, should be perfectly fine as well. I don't know the details of all of the guidelines around job postings, though, so it might be that there's some obscure mention that you should only list it once on each website, but I would be surprised if that were a problem.
A: I don't know how they would treat that, because usually we do try to dedupe different listings, and we do that for all kinds of listings. So if it's an image, or a web page, or anything else where we can recognize that it's the same primary content, we will try to just show it once. I would assume that we do the same thing for Google Jobs; I don't know the details there, though.
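As an aside, the JobPosting markup being discussed is JSON-LD embedded in the page, and the same markup can be emitted unchanged on several subdomains. Below is a minimal sketch; every field value is a hypothetical placeholder, and Google's documentation requires more properties (such as datePosted, description, and jobLocation) than shown here.

```python
import json

def job_posting_jsonld(title, org, url):
    """Build a minimal JobPosting JSON-LD dict and serialize it.

    All values here are illustrative placeholders; a real posting needs
    additional required properties per Google's structured-data docs.
    """
    data = {
        "@context": "https://schema.org/",
        "@type": "JobPosting",
        "title": title,
        "hiringOrganization": {"@type": "Organization", "name": org},
        "url": url,
    }
    # The same dict can be serialized onto jobs.domain.de and www.domain.de;
    # the structured data itself is identical, only the hosting URL differs.
    return json.dumps(data, indent=2)

print(job_posting_jsonld("Backend Engineer", "Example GmbH",
                         "https://jobs.example.de/backend-engineer"))
```

The resulting string would be placed in a `<script type="application/ld+json">` block on each page that lists the job.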
A: All right, Corinna.
G: Yeah, hi John. I have a question regarding internal duplicate content. So I have the content of a PDF file, a case study, that I submitted to my website. Now I want to present it as well in an HTML blog article. Does this have any negative impact for my site because of duplicate content?
A: So we wouldn't see it as duplicate content, because it's different content: one is an HTML page, one is a PDF. Even if the primary piece of content on there is the same, the whole thing around it is different, so kind of from that level, we wouldn't see it as duplicate content.
A: I think at most the difficulty might be that in the search results it can happen that both of these show up at the same time, and whether or not you want that to happen, that's almost more of a strategic question on your side.
A: So from my point of view, I wouldn't see it as a negative when it comes to SEO, but maybe you have strategic reasons to kind of have either the PDF or the HTML page more visible.
G: So then they'd compete against each other? Like, would you say they could?
A: Sure, yeah. I think, for the most part, PDFs will probably be less visible, just because they're less tied in with the rest of your website, in that in your internal linking you'll link to the web pages, and then from one of those web pages you'll link to the PDF. So they'll be a little bit kind of de-emphasized there from internal linking, but they could appear in the same search results and they could kind of compete with each other there.
H: My question, my main one, is centered around the idea of paginated content. So if I have, say, a long discussion thread with, you know, maybe 100 or more comments, it's probably intuitive to split it over multiple pages, so the length of the initial page isn't too long for people to scroll.
H: So the question is: let's say a new comment is posted towards the end of the discussion thread. It gets added onto the end, which could appear, say, on page 4 or page 5 or beyond, because it's the newest comment. And then, across all pages of the discussion thread, you know, the date it was updated will reflect the most recent activity.
A: I think that's ultimately up to you, so that's something where I would try to think about which of these comments you want to prioritize. I assume, if something is on page four, then we would have to crawl, like, pages one, two, three first to find that, and usually that would mean that it's kind of further away from the main part of the website, and from our point of view, what would probably happen there is that we'd crawl it less often and give it less weight.
A: Whereas if you say the newest comments should be the most visible ones, then maybe it makes sense to kind of reverse that order and show them differently, because if the newest comments are right on the main page, then it's a lot easier for us to recrawl that more often and to give it a little bit more weight in the search results. So that's kind of, I guess, up to you, how you want to balance that.
H
Sure
and
that's
true,
because
some
of
the
messages
are
timely
and
we
would
want
google
to
know.
Oh,
this
is
the
newest
content.
Being
added
onto
this
series
of
pages.
Am
I
correct
in
understanding?
Remember
those
old
tags
link
rel
equals
next
and
previous
those
were
depreciated,
correct,
yes,
okay,
so
that
doesn't
even
matter
okay
and
then
for
for
a
canonical
url.
H: You know, I already have several views of a thread. For instance, it could be sorted; the default sort is conversation order, but there's also sorting the newest messages first, as you kind of alluded to, though that's not the default, and then there's also sorting by the top voted, like, in order of what was voted up the most. However, in each case the canonical is always on the default sort, because I'm always concerned about duplicate content across pages when it's simply viewing the same content in different ways.
H: Does Google place, you know, a lot of weight on the canonical, or is there still a chance Google might crawl it in those other sort orders?
A: I would assume that we sometimes crawl it in the other sort orders, but we would probably use the canonical for the most part, because for canonicalization we take into account different factors, and that includes the rel canonical. It also includes things like internal linking, and I assume the internal linking will also go to that default-sort-order version, so probably most of those signals align already on the default-sort-order version.
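The arrangement described here, where every sorted view of a thread declares the default conversation-order URL as its canonical, can be sketched as a tiny helper. The `sort` query-parameter name and the URLs are assumptions for illustration, not taken from the discussion.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def canonical_link_tag(url):
    """Return the rel=canonical tag for any sorted view of a page.

    Hypothetical sketch: sorted views (?sort=newest, ?sort=votes) all
    point their canonical at the default conversation-order URL.
    """
    scheme, netloc, path, query, _ = urlsplit(url)
    # Drop only the sort parameter, so all views share one canonical URL
    # while other parameters (e.g. the page number) are preserved.
    params = [(k, v) for k, v in parse_qsl(query) if k != "sort"]
    canonical = urlunsplit((scheme, netloc, path, urlencode(params), ""))
    return f'<link rel="canonical" href="{canonical}">'

print(canonical_link_tag("https://forum.example.com/thread/42?sort=newest"))
```

Because the internal links also point at the default view, the rel=canonical and internal-linking signals align, which matches the behavior described above.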
A: But so, no, I don't know; I thought we had a page on that, but I think we might not actually have finished that up. Essentially, what happens when we render a page is we use a fairly high viewport, like if you have a really long screen, and we render the page to see what the page would show there. Usually that would trigger some amount of infinite scrolling in whatever JavaScript methods you're using to trigger the infinite scrolling, and whatever ends up being loaded there...
A: That would be what we would be able to index. So that's something where, depending on how you implement infinite scroll, it can happen that we have kind of this longer page in the index. It might not be that we have everything that would fit into that page, because, depending on how you trigger infinite scroll, it might be that, oh, you're just loading the next page, kind of thing, and then we might have two or three of these pages loaded on one page with infinite scroll, but not everything.
A: Thanks. All right, Stefan.
I: Yes, hello, I have an issue with... I explained it in the comments on the YouTube page already, where we have a website which has been ranking for several years. It's a translated version of the main website.
I: Not yet, but I could put it in the chat, or in the posting here in the chat.
I: Let me see: the main site has been online since 2005. The translated German version was online from 2010 until the end of 2020, and it was offline from December 2020 to June 2021.
A: Yeah, okay. It feels like it should be catching up in the meantime, but I can pass that on to the team to double-check if there's anything on our side.
A: All right, let's maybe jump to some of the submitted questions, and I'll get back to all of the many people with raised hands towards the end as well. I have a bit more time afterwards, too, to kind of extend if we have need for more questions.
A: Let's see. "Is there a good ratio or consideration on Google's side when it comes to the amount of pages? In other words, are the positions of high-traffic ranking pages hurt by many, let's say 50%, of the total pages on a domain not being indexed, or being indexed but not receiving traffic?" From our point of view, there's no specific ratio that we would call out for how many pages a website should have, or how many indexable pages a website should have; that's ultimately really up to you.
A: So that's something where usually I would recommend having fewer pages rather than more pages, and that kind of plays across the board, in the sense that, from a ranking point of view, we can give these pages a little bit more weight, and from a crawling point of view, it's easier for us to keep up with them.
A: So especially if you're starting off with a new website, I would recommend starting off small, focusing on something specific that you want to achieve, and then expanding from there, and not just going in and saying, well, I have 500,000 pages and I want Google to index them all. Because especially for a new website, when you start off with such a big number of pages, then chances are we won't crawl and index all of them quickly.
A: Let's see. "When using URL Inspection in Search Console on established, highly visible pages, how concerning is it that the referring pages listed are long-retired microsite domains?"
A: That would not bother me at all. In particular, the referring page in the Inspection tool is kind of where we first saw the mention of your pages, and if we first saw them on some random website, then that's just where we saw them, and that's kind of what we list there. It doesn't mean that there's anything bad with those pages.
A: So, from our point of view, that's purely a technical thing. It's like: well, here's where we found it. It's not a sign that you need to kind of make sure that it was first found on some very important part of your website.
A: Usually you'd use the referring page when you run across pages where you think: why did Google even find this page, or where did this come from? If there are weird URL parameters in there, or if there's something really weird in the URL, where you're saying, well, Google should never have found this page in the first place, then looking at the referring URL is something that helps you figure out where this actually came from.
A: Is this something that I did, or did maybe some random person on the internet drop a broken link to my website somehow, and that got picked up? That can help you a little bit to figure out: is it something you need to fix, or is it just the way the internet is?
A: The redirect is fine; I think that's a good practice to have. What I would also do is just make sure that the new URL of the sitemap file is directly submitted in Search Console, or at least listed in the robots.txt file, so that we can go directly to the sitemap file and don't have to go through the redirect. I don't think that changes anything significantly; it just makes it a little bit cleaner, in that we can go directly to the sitemap file and get all of those URLs.
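Listing the sitemap in robots.txt, as mentioned, is a single `Sitemap:` directive with an absolute URL. A small sketch of what that file could contain; the domain and path are placeholders, not from the question.

```python
def robots_txt(sitemap_url):
    """Compose a minimal robots.txt that advertises the sitemap URL.

    Sketch only: the point is the Sitemap directive, which lets crawlers
    fetch the new sitemap directly instead of following a redirect.
    """
    lines = [
        "User-agent: *",
        "Allow: /",
        f"Sitemap: {sitemap_url}",  # absolute URL of the *new* sitemap file
    ]
    return "\n".join(lines) + "\n"

print(robots_txt("https://www.example.com/sitemaps/sitemap-index.xml"))
```

The `Sitemap:` line is independent of the `User-agent` groups and may appear anywhere in the file, per the sitemaps protocol.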
A: Let's see, a question about the Indexing API and e-commerce sites. I think we touched upon that before; the Indexing API is really just for job postings and kind of these live events. "In the Search Console report, 97% of the crawl requests are refresh and only three percent are discovery. How to optimize this and let Google discover more pages?"
A: We don't really have any guidelines on how that balance should be and how you should try to tweak it. In general, it's kind of normal, especially for an older, more established website, to have a lot of refresh crawl, because the amount of pages that we know about grows over time, and the amount of new pages that comes in tends to be fairly stable.
A: So it's pretty common, especially for a website that is kind of established and just slowly growing, to have a balance like this, where most of the crawling is the kind of refresh crawling, and not so much the discovery crawling. I think it would be different if you had a very short-lived website.
A: Maybe, I don't know, classifieds, or kind of local news, where you have a lot of new articles that come in and the old content becomes irrelevant very quickly; then I think we would tend to focus more on discovery. But especially if you have something like an e-commerce site, where you're growing the amount of content that you have slowly and most of the old content remains valid, I would tend to say that the amount of refresh crawling there is probably going to be a bit higher.
A: "During the last few weeks, I've noticed a huge drop in crawl stats, from 700 to 50 per day. Is there a way to understand from the Search Console report what could be the cause of this drop? Could it be slow page load? How can I correctly read the crawl request breakdown?" So on our side, there are a few things that go into the amount of crawling that we do.
A: On the one hand, we try to figure out how much we need to crawl from a website to keep things fresh and useful in our search results, and that kind of relies on understanding the quality of your website and how things change on your website. We call that the crawl demand. And, on the other hand, there are kind of the limitations that we see from your server, from your website, from your network infrastructure...
A: With regards to how much we can crawl on a website, we try to balance those two, and the restrictions tend to be tied to two main things: on the one hand, the overall response time to requests to the website, and, on the other hand, the number of errors, specifically server errors, that we see during crawling.
A: So if we see a lot of server errors, then we will slow down crawling, because we don't want to cause more problems. If we see that your server is getting slower, then we will also slow down crawling, because, again, we don't want to cause any problems with the crawling. So those are kind of the two main things that come into play there.
A: The difficulty, I think, with the speed aspect is that we have two essentially different ways of looking at speed, and sometimes that gets confusing when you look at the crawl rate. So, specifically for the crawl rate, we just look at how quickly we can request a URL from your server. And then there's the other aspect of speed that you probably run into.
A: That's everything around Core Web Vitals and how quickly a page loads in a browser. The speed that it takes in a browser tends not to be related directly to the speed that it takes for us to fetch an individual URL on a website, because in a browser you have to process the JavaScript, pull in all of these external files, render the content, kind of recalculate the positions of all of the elements on the page, and that takes a different amount of time than just fetching that URL.
A: The other thing that comes in here as well is that, from time to time, or, well, depending on what you do, we try to understand where the website is actually hosted, in the sense that we might recognize that a website is changing hosting from one server to a different server. That could be to a different hosting provider, that could be moving to a CDN, or changing CDNs.
A: Anything like that, and our systems will automatically go back to some safe rate, where we know that we're not going to cause any problems, and then step by step increase again. So anytime you make a bigger change to your website's hosting, I would assume that the crawl rate will drop, and then, over the next couple of weeks, it'll go back up to whatever we think we can safely crawl on your website. And that might be something that you're seeing here.
A: The other thing is that, from time to time, our algorithms that determine how we classify websites and servers can update as well. So it can certainly happen at some point, even if you don't change anything with your hosting infrastructure, that our algorithms will figure out: well, oh, actually, this website is hosted on this server, and this server is one that is frequently overloaded, so we should be more cautious with crawling this website so that we don't cause any problems. And that's something that also settles down automatically over time.
A: Usually over a couple of weeks, and if that were the case, then probably things will settle down and kind of get back into a reasonable state again. The other thing that you can do in Search Console: you can specify a crawl rate. I believe it's in the settings, per site, and that helps us to understand that you have specific settings for your website, and we'll try to take that into account.
A: And finally, because so many things come into this: one thing that you can also do is, in the Help Center for Search Console, we have a link for reporting problems with Googlebot, and if you notice that the crawling of your website is way out of range for what you would expect it to be, then you can report problems with Googlebot through that link.
A: So many things about crawling. Let's see. "We recently added Web Stories to my website, which is focused on marketing a destination. I put the name of the country, Jordan, in the title. The good news is, I got thousands of impressions on them. The bad news is that it drove my overall click-through rate down to 0.5 percent. I am worried that this low click-through rate is going to affect my search results for the whole website."
A: "Any advice?" It will not affect the search results for your website, so that's kind of the first thing to be really clear about. It's not going to be that, like, some pages have a very low click-through rate, therefore we will rank this website lower. That's not how it works.
A: So from that point of view, I would look at these Web Stories: if they work well for you and for what you're trying to achieve with your website, I would keep them; if the Web Stories themselves don't work well for you, then maybe tweak them or find a different format to use there. But I would not worry that the performance of the Web Stories somehow relates to the ranking of the rest of your content, because that's definitely not the case.
A: I think with Web Stories themselves, doing SEO for them is sometimes tricky, because there's very little text on them. So if you're getting impressions for them, then you're already doing some really good work. But in general, these are essentially HTML pages; they're presented in a slightly different way in the search results and in Discover. It's not going to be that those Web Stories will negatively affect the rest of your website's ranking.
A: I see you also have your hand raised, so maybe I'll just go over to you, if there's anything more that you'd like to add to the question.
A: I don't know. So we include them in Search Console as kind of normal pages as well, so there's a little bit of information there. But I do think it's challenging, just because there's so little textual content on these Web Stories, and from a search ranking point of view, we almost need to figure out a little bit better how we can recognize this small amount of information in a Web Story and rank them appropriately, because they're very often really nice to look at.
A: Let me see some more questions here. "How quickly does Google consider the effect of internal linking in rankings, in terms of link juice? Is it recommended to highlight new content by channeling link juice to it?" Wow, I don't know. Link juice is always one of those terms that people have very conflicting feelings about, because it's not really how our systems look at it.
A: With regards to internal linking, I do think this is one of the most important elements of a website, because it's a great way for you to tell us what you consider important on your pages. Most websites have a homepage that is seen as the most important part of the website, and especially the links that you can provide from those important pages to other pages that you think are important, that's really useful for us. And it can be that these are temporary links, too.
A: But of course, if you remove those links, then that kind of internal connection is gone as well. And with regards to how quickly that is picked up, I would assume that's essentially picked up immediately, as soon as we recrawl and reindex those pages. So it's not that there's any kind of extra latency that we would add, where we would say, well, this internal link was just now added, therefore we'll look at it in a week, from our point of view.
A: The next question I have in the list is, like, super long, and we're kind of low on time. So maybe I'll just jump back to some of the live questions here; we have so many people still with raised hands. Let's see.
K: Hi John, I have two short questions. So basically the first one is about a smaller market that doesn't have so much competition, so it's not the United States, right? So we're making a lot of upgrades on our site, like adding frequently asked questions; we're getting good organic links to our pages; and we have more content than our competition. But we're firmly placed in the fifth position, and we can't move now for months.
A: I can't, I mean, I don't have anything specific to say, like, "in general, if you're in position five and you're doing a lot of things, you should also do this." When it comes to search and rankings, there are just so many different things that come into play there. So it's not really like there's this one trick to get past position five. Essentially, it's, I don't know, a function of all of the competition that you have there.
A: All of the other signals that we've collected for your site, for your content. So it's almost like "keep doing good things" is kind of the best advice.
K: Yeah, because some results in front of us are just translations of, say, English sites, which is automatic; it's not even, you know, content that is deliberately made. Yeah, yeah. So, okay. The second question is: we started a new page about two months ago, and we have pages that are discovered but not indexed.
A: That can be forever. It's something where we just don't crawl and index all pages, and it's completely normal for any website that we don't have everything indexed. And especially with a newer website, if you have a lot of content, then I would assume it's expected that a lot of the new content will, for a while, be discovered and not indexed, and then get indexed over time.
A
So from that point of view, it's not that I would say you should just wait a little bit and then suddenly things will get better with crawling and indexing. It's more: continue working on the website and making sure that our systems recognize that there's value in crawling and indexing more, and then over time we will crawl and index more.
K
Okay, thank you.
M
As we researched, we found that some competitors are doing it this way: their category page is www. with a country subdirectory, like a "canvas-english" version, and both the page content and the product URLs on that category page all start with that country subdirectory.
M
So
we're
thinking
about
doing
it
in
this
way,
but
as
we
discussed
with
our
tech
team,
they
said
it
requires
maybe
more
infrastructure
inference
in
infrastructure.
So
maybe
we
think
if
we
can
like
only
set
the
display
url
in
subdirectory,
but
but
the
product
urls
on
the
category
page
is
the
exact
same
at
the
english
site.
A
I think it's sub-optimal, because it's almost like you're creating half of an international setup and not the full international setup. What I would try to look at there is whether or not the traffic from individual countries to those category pages is sufficient for you to make this kind of a change, in the sense that if the category pages themselves are not super visible in the search results, then you'd probably gain little by adding the international aspect just to the category pages themselves.
A
What you could do is, in Search Console, filter for the category pages, if they have a specific URL pattern, and see how much traffic they get from individual countries. Then, based on that, try to consider what would change if the traffic from maybe the top three countries went to individual country versions, just for those category pages. I could imagine it will be visible in the search results.
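The Search Console check described here can also be done offline: export the performance report and sum clicks per country for URLs matching your category-page pattern. A minimal sketch in Python; the column names ("page", "country", "clicks") and the "/category/" pattern are assumptions about your export and URL layout, so adjust them to your own site.

```python
# Sketch: aggregate clicks per country for category pages from a
# Search Console performance export. Column names and the URL
# pattern are assumptions -- adapt them to your own export.
import csv
from collections import Counter

def clicks_by_country(rows, url_pattern="/category/"):
    """Sum clicks per country for pages whose URL contains url_pattern."""
    totals = Counter()
    for row in rows:
        if url_pattern in row["page"]:
            totals[row["country"]] += int(row["clicks"])
    return totals

def load_rows(path):
    """Read an exported CSV into a list of dicts."""
    with open(path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f))

# Usage with an exported file (hypothetical filename):
# totals = clicks_by_country(load_rows("performance_export.csv"))
# totals.most_common(3)  # the "top three countries" John mentions
```

From `most_common(3)` you can estimate how much of the category-page traffic would actually move to country versions, before committing to the engineering work.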
A
But is it visible enough to be worth the effort that you put into it? Because it could very well be that you put a lot of work into this and the change in the category pages' international visibility is minimal, and then you decide, oh, I don't want to do anything for internationalization, because it's such a small effect.
A
So, I don't know. From my point of view, I would be worried that you put a lot of work into it and don't actually get a lot of new value out of it if you just take the category pages and don't think about what the primary pages for the website are. It might be that the category pages are actually the primary pages, and then putting the internationalization work into them is actually the most important part.
A
I don't hear a lot of feedback from people saying that this makes a big difference, so I don't know if it's actually something you need to do, especially if it's a more complex setup. But I would try to make it so that it's as clear as possible which country is relevant for the individual URLs, kind of with a clear path in the URL there. I think there was a question someone submitted as well about using the country as a URL parameter at the end.
A
Theoretically,
you,
you
can
do
that,
I
think
for
our
systems.
It
makes
it
a
lot
harder
to
recognize
which
urls
belong
to
which
country.
A
So
I
would
see
that
as
being
less
likely
for
us
to
pick
up
for
geo
targeting,
obviously,
if
you're
using
ahreflang,
then
that's
kind
of
less
of
an
issue
there,
because
you
can
do
that
on
a
per
url
basis.
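Per-URL hreflang annotations are expressed as `<link rel="alternate" hreflang="…">` elements in each page's head. A minimal sketch that generates that set for one page's variants; the example URLs, the `country` parameter, and the locale codes are hypothetical, not taken from the site discussed.

```python
def hreflang_links(variants):
    """Build the <link rel="alternate" hreflang=...> tags for one page.

    variants: dict mapping hreflang codes (e.g. "en-us") to absolute URLs.
    Each variant page should carry the full set, including a reference
    to itself, and ideally an x-default for unmatched locales.
    """
    return "\n".join(
        f'<link rel="alternate" hreflang="{code}" href="{url}" />'
        for code, url in sorted(variants.items())
    )

# Hypothetical example: country expressed as a URL parameter, which
# hreflang supports even though it is harder for geotargeting.
tags = hreflang_links({
    "en-us": "https://example.com/category/shoes?country=us",
    "en-gb": "https://example.com/category/shoes?country=gb",
    "x-default": "https://example.com/category/shoes",
})
```

Because the annotation is attached to each URL individually, it works regardless of whether the country lives in a subdirectory or a parameter, which is the point John is making.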
M
Okay,
I
see
thank
you
so
another
two
questions
is
about
crawling
so
like
we
are
a
huge
e-commerce
site
and
as
we
checked
our
crew
report
and
we
found
that
there
are
huge
amounts
of
urls
in
the
discovered
but
unindexed
at
that
time
and
about
twice
about
highest
than
the
index
pages.
M
So
a
this
and
that
diction
of
our
side
problem
and
how
can
we
identify
the
problem
and
solve
it?
Because
we
think
it's
an
a
huge
issue
like
there
are
so
many
index
pages.
It
should
be
indexed
pages,
but
not
indexed,
and
we
also
checked
many
times
for
our
like
basic
technical
points
like
robot
texts
and
maybe
mental
texts
in
code
source.
A
So
in
particular,
we
find
urls
of
kind
of,
like
all
all
kinds
of
urls
across
the
web
and
a
lot
of
those
urls
don't
need
to
be
crawled
and
indexed
because
maybe
they're
just
variations
of
urls.
We
already
know,
or
maybe
they're,
just
some
random
forum
or
scraper
script
that
has
copied
urls
from
your
website
and
included
them
and
are
broken
away,
and
we
we
find
all
of
these
links
all
the
time
on
the
web.
A
So
it's
it's
very
normal
to
have
a
lot
of
these
urls
that
are
either
crawled
and
not
indexed
or
discovered
and
not
crawled,
just
because,
like
so
many
different
sources
of
urls
across
the
way.
A
So
what
I
would
do
is,
first
of
all,
try
to
maybe
download
a
list
of
a
sample
of
those
so
that
you
can
look
at
individual
examples
and
try
to
classify
which
of
those
urls
are
actually
ones
that
you
care
about
and
which
of
these
are
ones
that
you
can
ignore
and
anything
that
looks
kind
of
really
weird.
As
a
url,
I
will
just
ignore
it.
A
I
don't
think
you
need
to
do
anything
specific
for
that,
unless
you
can
tell
that
this
link,
this
url
was
found
within
your
website,
so
that's
kind
of
the
the
main
classifications
that
I
would
take
there
and
the
ones
that
you
do
care
about.
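The triage suggested here, download a sample, keep the URLs you care about, ignore the weird ones, could be roughed out like this. The path prefixes and bucket names are assumptions about a typical e-commerce URL layout, not anything Search Console defines.

```python
from urllib.parse import urlparse

def classify_url(url, known_path_prefixes=("/product/", "/category/")):
    """Rough triage for a 'discovered but not indexed' sample.

    Buckets (labels are our own, not Search Console terminology):
      "care"   - looks like a real product or category page you want indexed
      "ignore" - parameter variations, or URLs that look broken/scraped
    """
    parsed = urlparse(url)
    if parsed.query:
        # Likely just a variation of a URL Google already knows.
        return "ignore"
    if any(parsed.path.startswith(p) for p in known_path_prefixes):
        return "care"
    # Anything that looks really weird as a URL: safe to ignore.
    return "ignore"
```

Running every sampled URL through a classifier like this quickly shows whether the "discovered but not indexed" bucket is mostly noise or mostly real pages that need better internal linking.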
A
So
if
these
are
individual
products
or
categories
that
are
just
not
being
found,
try
to
figure
out
what
you
can
do
in
a
systematic
way
to
make
sure
that
all
of
these
urls
are
better
linked
between
each
other,
because,
especially
with
a
larger
e-commerce
site,
it's
it
can
get
tricky
because
you
kind
of
like
can't
look
at
every
url
individually,
all
the
time,
but
sometimes
there
there
are
kind
of
like
tricks
that
you
can
do
where
you
say.
Well
anything
that
is
first
level
category.
A
My
guess
is
that
you're
probably
doing
a
lot
of
this
right,
and
it's
just
that
with
a
very
large
website.
It's
it's
kind
of
normal
that
we
just
can't
pick
up
everything
so
to
some
extent
like
I,
I
would
kind
of
just
accept
that
as
well.
A
Google
can't
crawl
and
index
everything
one
of
the
things
you
can
do
to
help
mitigate
that
a
little
bit
is
if
you
recognize,
for
example,
that
these
are
individual
products
that
are
not
being
crawled
and
indexed
then
make
sure
that
at
least
the
category
page
for
those
products
is
crawled
and
indexed,
because
that
way,
people
can
still
find
some
content
for
those
individual
products
for
on
your
website.
So
it's
not
that
people
are
not
finding
your
website
for
those
queries.
M
I
say
so
yeah
we'll
try
a
few
several
methods,
you
said,
and
yes,
as
we
checked
few
samples
in
averages
and
jsc
backhand
and
most
pages
are
our
product
pages,
and
we
also
have
some
category
pages
that
that
are
on
index.
So,
basically,
these
pages
are
the
most
we
care
about
and
we
want
them
to
index.
I
think
there
is
no
like
there
is
no
url
like
some
unimportant
in
the
in
the
in
the
jsc
backhand,
the
google
chrome.
A
Yeah, so one thing I would also try, which perhaps you're already doing, is to see if you can crawl your website yourself, so that you have a little bit more direct data on how a website like yours can be crawled, and there are various crawling tools out there for very large e-commerce sites.
A
I
think
it's
tricky
just
because
there's
so
many
urls,
but
maybe
there's
a
way
for
you
to
kind
of
like
limit
it
to
one
segment
of
your
website
and
by
crawling
the
the
website
yourself.
You
can
also
see
which
of
these
urls
are
linked
like
very
far
away
from
the
home
page
and
which
of
these
are
linked
closer
to
the
home
page
and
based
on
that,
sometimes
you
can
tweak
the
site
structure
a
bit
to
kind
of
make
sure
that
things
are
reasonably
close
or
reasonably
stable.
With
regards
to
the
distance
from
the
home
page.
M
Oh,
I
see
sure
that's
that's
another
method
that
we
didn't
like
yeah.
We
didn't
think
of
so
based
on
this
situation.
Do
you
think
if
we
continue
to
like
add
new
product
pages
and
add
new
category
pages
to
our
website?
Will
these
new
pages
be
ranked
further
back
in
the
in
the
google
quick
like
they
will
rank.
A
It
depends
on
on
how
you
implement
them
or
integrate
them
within
the
rest
of
your
website.
So
just
because
something
is
new
doesn't
mean
we
will
crawl
it
less
frequently
it's
more.
We
try
to
figure
out,
like
is.
Is
this
prominently
linked
within
your
website?
So
is
the
internal
linking
kind
of
telling
us
this
is
important
or
not,
and
if
it's
important,
then
we
will
crawl
and
index
it,
even
if
it's
a
new
page,
so
just
because
something
is
new,
doesn't
mean
that
we
won't
crawl
and
index
it.
M
Oh
so
like,
if
we
build
new
pages
like
in
the
google
chrome
crazy,
maybe
they
are
not
in
the
at
the
later
position
than
this
existing,
not
index
yeah
yeah,
let's
go
to
here,
we'll
try
the
message
you
you
said
and
thank.
A
All right, I'm going to pause the recording here. I still have more time; it looks like we have millions of people raising their hands, so we will try to see how many we can get through. Thank you for sticking around for so long.
A
If
you're,
watching
this
on
youtube,
you're
welcome
to
to
join
us
live
in
one
of
the
future
episodes
I
tend
to
post
the
link
for
these
a
couple
days
ahead
of
time
in
the
community
section
of
our
channel,
so
feel
free
to
jump
in
there
if
they're
you're,
keen
on
asking
questions
in
person.