From YouTube: English Google SEO office-hours from August 13, 2021
Description
This is a recording of the Google SEO office-hours hangout from August 13, 2021. These sessions are open to anything search- and website-owner-related, like crawling, indexing, mobile sites, internationalization, duplicate content, Sitemaps, Search Console, pagination, multi-lingual/multi-regional sites, etc.
Find out more at https://goo.gle/seo-oh-en
Feel free to join us - we welcome folks of all levels!
A: All right, welcome everyone to today's Google Search Central SEO office-hours hangout. My name is John Mueller. I'm a Search Advocate here at Google in Switzerland, and part of what we do are these office-hours hangouts, where people can join in and ask their questions around search. A bunch of questions were submitted already, but we also have people here live, so maybe we can start with the people who are here now and already have their hands raised. Let's see - Michael, I think you're first on the list. Good morning, John. Hi.
B: My question is around Core Web Vitals in Search Console. What concerns me is that I've seen an increase in URLs that have been bucketed into the "poor" category, but unfortunately, with PageSpeed Insights, it's impossible for me to validate or corroborate that information. Do you have any information on that?
A: Okay, so I think the whole topic of speed is surprisingly complicated, in that you look at it and it has, like, three measurements, and you think, oh, I can figure three things out - but there are just so many things involved.

B: I'm sorry to interrupt, John - this is actually a CLS issue, so this one is not speed, but a similar problem.
A: Yeah, exactly. And I think, maybe first of all, just to be clear: the whole page experience update is completely unrelated to the core updates. The core updates that you might see happening, or that we announce, are more about understanding the relevance of the site - so more about the quality and the content of the site - and the page experience update is purely about the Core Web Vitals and the other page experience factors.
A: So that's maybe one thing to keep in mind - these are really separate. In Search Console, what we show is essentially field data, which we also call real-user metrics, or RUM data: data that's collected from users when they go to your website. So that's essentially the scale on which we operate there - the, I don't know, most pure measure that we can get, essentially, because this is what users actually see. PageSpeed Insights and some of the other tools essentially assume that users have this kind of connection, this kind of a screen, do this kind of thing on their phones, and we'll try to estimate what those metrics could be. And because of that, you might see differences between the data in Search Console and the data that you see when you run the tests yourself, because the tests, when you run them yourself, are essentially estimations, and in Search Console you will see what users actually saw. So that's, I think, the main difference there, and that applies to speed, CLS, and First Input Delay.
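[Editor's note: the field data John describes is also exposed publicly through the Chrome UX Report (CrUX) API, which can be used to corroborate the numbers Search Console shows. A minimal sketch, assuming you have a CrUX API key - the key and origin below are placeholders:]

```ts
// Query the CrUX API for the field CLS data behind the Core Web Vitals report.
const CRUX_API_KEY = "YOUR_API_KEY"; // placeholder - issued via the Google Cloud console
const endpoint =
  `https://chromeuxreport.googleapis.com/v1/records:queryRecord?key=${CRUX_API_KEY}`;

async function fetchFieldCls(origin: string): Promise<void> {
  const res = await fetch(endpoint, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      origin,                               // aggregate field data for the whole origin
      formFactor: "PHONE",                  // field data is segmented by device
      metrics: ["cumulative_layout_shift"], // only ask for CLS
    }),
  });
  const data = await res.json();
  const cls = data.record.metrics.cumulative_layout_shift;
  // The 75th percentile drives the good / needs-improvement / poor bucketing;
  // for CLS, the "poor" bucket starts above 0.25.
  console.log("p75 CLS:", cls.percentiles.p75);
}

fetchFieldCls("https://example.com");
```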
A: And then, when you test it yourself, you might think, well, this metric looks okay, but what users actually see is slightly different. With regards to CLS, I think that's sometimes super hard to pinpoint. What I would recommend doing there is trying to make some simplified pages and trying to see if you can see what users might be seeing, so that you can narrow down a little bit which elements on the page are actually causing the shift.
A: What you can also do is instrument the pages yourself. On web.dev we have some JavaScript libraries that you can use, where essentially you add the code to your pages, and then it reports the metrics that users are actually seeing, through Google Analytics or some other analytics tool that you use. That lets you get data a little bit faster, because in Search Console, because of the way that we aggregate the data, it's always about 28 days delayed.
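[Editor's note: the web.dev libraries John refers to are presumably the open-source web-vitals package. A minimal sketch of the instrumentation pattern documented there, assuming gtag.js is already loaded on the page; these are the v2 function names, which newer versions rename to onCLS, onFID, onLCP:]

```ts
import { getCLS, getFID, getLCP, Metric } from "web-vitals";

declare function gtag(...args: unknown[]): void; // provided by the gtag.js snippet

function sendToAnalytics(metric: Metric): void {
  gtag("event", metric.name, {
    event_category: "Web Vitals",
    // CLS is scaled up so the value survives integer rounding in analytics
    value: Math.round(metric.name === "CLS" ? metric.delta * 1000 : metric.delta),
    event_label: metric.id, // groups all deltas from one page load
    non_interaction: true,  // don't let these events skew bounce rate
  });
}

// Report the values users actually experience, without the 28-day lag.
getCLS(sendToAnalytics);
getFID(sendToAnalytics);
getLCP(sendToAnalytics);
```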
A: So if you make a change now, it takes almost a month for you to see that in Search Console, and if you instrument the pages yourself, you can do things like A/B testing and track that. Essentially, if you release something today, you can check tomorrow what users actually saw, and you can narrow things down a little bit faster. And once you've figured out which elements on the page are causing this CLS shift, then you can work to improve that on your side.
A: So that's kind of the direction I would go there: using the field data that you see in Search Console as a way of recognizing, oh, there's an issue; then trying to narrow that down iteratively, and using the lab tests to confirm that you're on the right track and to try to reproduce what users are actually seeing. Cool - let's see, Julian.
C: Hey John, how's it going? Hi. So I have a two-part question about Google evaluating website quality. In past office hours you've talked about how, if the overall quality of a website is an issue - specifically if it has tens of thousands of pages or more - aggressively de-indexing low-quality pages is a good idea, which makes sense: removing bad pages from the index that haven't had a chance to meet a certain quality threshold. So we have a website in that camp, and in the last couple of months we've made an effort to do just that. And in past office hours you've said it can take months, or even up to half a year, for Google to recalculate the quality and treat a website differently in search. So question one would be: do you sort of stand by that time frame, and can anything affect the time frame - is it sort of correlated to the size of a website? And then a quick follow-up for question two.
C: So, for instance, would you look for gradual improvement, or would it sort of be flat for many months and then you'd see things shoot up, if we indeed did do the right thing? Because I guess it's a scary proposition to remove thin pages from your index, then wait six months, and then see if you've done enough, you know. Does that all make sense?
A: Yeah, I think the timeline is probably about right. I'm sure there are situations where it can go a little bit faster, but on the whole, that's probably something which takes, I don't know, half a year or longer for things to settle down. And that's, I think, partially due to us needing to reindex everything on the site and understand the site again, and partially also just because of all of the quality signals that we collect.
A: Those signals just take a long time to be built up, and so that's something where a couple of months, half a year, maybe even a bit longer - I think that would be kind of the norm. I'm sure there are situations where it can be faster, but for the most part, it's probably in that range. With regards to recognizing when you're on the right track...
A: I think that's super hard, because on the one hand, you have the situation where you're removing pages, so you might see kind of a drop in traffic from those pages. Though overall, if you look at the metrics for your site, the pages that you remove are probably pages that don't get a ton of page views.
A: So maybe that's something that wouldn't pull down your overall traffic from search - or not that much - but it takes a long time for things to start seeing more and more traffic and being picked up as higher quality. So one thing I would try to do is to find some other proxy metrics that you can use for recognizing the quality of your site. That could be something like looking into Analytics at things like - I think they call it the engagement rate now - or other metrics like time on site: something where, when you look at the metrics now, or when you compare them to the metrics you had maybe a month ago, before you started working on this, you can tell that, actually, those were pretty bad, and with these improvements we see this shift in user behavior.
A: And it's not so much that we would use that user behavior directly in search, but it's something that's more like a leading indicator for you, to let you know that you're on the right track. So that's kind of the direction I would look at there. On the one hand, you shouldn't see a big drop from the pages that you remove.
C: Okay, thanks - that really helps. So would one strategy that you'd recommend be: could you test a small subset of pages - take some of the better-quality ones and maybe do a campaign around them, to sort of get them shared around a little bit, if you believe that they're good quality? Would that expedite the reconsideration of just those pages, by any chance? Or is it more like: no, if you have a website quality issue as a whole, those might see a suppression too, no matter what?
A: I think, for tracking the metrics yourself, if you're looking at Analytics, that's definitely a good way to look at it, where you can drive traffic there and, instead of, I don't know, 50 people a day coming to these pages, you have a thousand people coming to those pages - you have a little bit clearer metrics from an SEO point of view. From a kind of Google search quality point of view, I don't think taking a small subset of a site and improving the quality of that would be enough to make us see the site overall as better.
A: I think, for the most part, we would look at it, or try to look at it, as an overall thing, and it might be that the small set that you improved moved the overall average up a little bit. But that's not something I would wait for, because if you're talking about half a year, and you improve 10% of your pages, and after half a year you realize, oh, I need to improve even more - you've kind of wasted a lot of time there.
E: Hey John, a couple of months ago, on one of these hangouts, I was asking some questions about the book structured data and, you know, getting an entry point in the search results. So we went away and made some changes to how we had implemented that, and made it all correct, because there were some errors. And the last time we spoke, you very kindly offered to give the internal team a nudge, or basically just give someone a heads-up to check the inbox on that Google form.
A: Probably easiest just to send me a short tweet with some kind of details of what it actually was, so I can double-check what they said there - or drop some information here in the chat, and then I can pick that up afterwards.

E: Yeah, amazing. Thank you very much - that was it, short and sweet.
F: Cool, thanks, John. So, some background: our systems and content delivery network are designed to let all real users view our content and to filter out some bots, while letting others, like Googlebot, through. And just for some additional context: earlier this year we changed our server monitoring suite, and we thought we'd carried over all of the reporting we needed to ensure that, you know, Googlebot could access our content. Unfortunately, it seems we missed some, and we noticed in Search Console some 500-series errors cropping up, starting late last month.
F: The question came up from our technology team whether this represented real user impact, and why we would look specifically at Googlebot, and not real-user metrics, to prove that there's an issue here. So, given that context, I have a few questions. The first is just to get the technology concern out of the way: from your perspective, how does Googlebot view 500-series errors, and could you give any clarity on established thresholds at which point Googlebot will, you know, crawl content less based on those errors?
A: We don't have any strong thresholds on that, but essentially, what happens with 500 errors is that we'll try to retry them, and if we continue to see the 500 errors, then we will slow down crawling. And if we continue to see that they are 500 errors, then we will drop those URLs from the index.
A: So that's something where, if individual pages have a 500 error every now and then, it's no big deal: we will retry them, they'll remain indexed, and the next time we retry them, that's fine. But if a large part of a site consistently has 500 errors, then we might assume that maybe we're causing the problem, and we'll slow down crawling of the whole site, and at some point we'll say, well, it looks like these pages are really gone.
A: We're going to drop them. So that's essentially the effect that you would see there. And if you're talking about a large site and wondering what percentage of 500 errors is okay - I don't know; my feeling is, if you're seeing something more than one percent, then that sounds like something is kind of broken, and that probably would be something where we would start to slow down. But I don't think we have any hard thresholds where we'd say it's exactly like this.
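[Editor's note: a minimal sketch of checking that rough one-percent figure against your own access logs. The log path and common-log format are assumptions - adjust the regex to whatever your server actually writes:]

```ts
import * as fs from "fs";
import * as readline from "readline";

// Share of Googlebot requests that got a 5xx response, as a percentage.
async function googlebot5xxRate(logPath: string): Promise<number> {
  const rl = readline.createInterface({ input: fs.createReadStream(logPath) });
  let total = 0;
  let errors = 0;
  for await (const line of rl) {
    if (!line.includes("Googlebot")) continue;    // requests claiming to be Googlebot
    const status = line.match(/" (\d{3}) /)?.[1]; // status code after the quoted request
    if (!status) continue;
    total++;
    if (status.startsWith("5")) errors++;
  }
  return total ? (100 * errors) / total : 0;
}

googlebot5xxRate("/var/log/nginx/access.log").then((rate) =>
  console.log(`Googlebot 5xx rate: ${rate.toFixed(2)}%`)
);
```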
F: Okay, cool. Thank you. Also, our server logs show a 200 given to Googlebot at the same timestamp as the 504 in Search Console. Our content delivery network is telling us that if Googlebot gets a 504 from a CDN, then it'll automatically try to fetch from the origin. Could you confirm, deny, or possibly confuse that for me?
A: I don't think we do anything special with regards to 504, but I'd have to double-check. In the Search developer documentation we just put up a page with all of the HTTP status codes and how we react to them. I don't think we have anything where we would say we'd defer to the origin rather than the CDN, because we don't actually see that difference - from our point of view, we access the domain name.
A: Daniel?

G: Yeah, so I've got clients currently going through a site migration, and they've changed the architecture quite a bit and shifted some things around.
G: I just wanted to share an example of the current site and the new site - I've put the current site in the chat. So if you navigate to that page, you'll see that their subcategory pages are split into tiles: you've got all the adhesives - adhesive vinyl, foil sheets, etc.
G: So these are all on-page links. And then, if you look at the new staging site, they're doing away with those internal links on that page - across all those top category pages - and inserting them into the main navigation. My main concern going through a site migration - which is always, I think, an intense experience - is to make sure that we don't lose any visibility, and that the old site really maps onto the current one. So my question is around how you handle links in the navigation as opposed to on-page links. My understanding is that if we've got, let's say, a lot of natural backlinks going to that category page, all that link equity is getting split between all the links off the page. But if we no longer have those links on that specific page, and we've put them into the navigation, is Google really treating the navigation the same? Because I would have thought that, all of a sudden, those individual links now become thousands and thousands of links into the subcategory.
G: So my main concern is that we go from one site to the other and then we start to see rankings drop, because that page no longer has the power that it did - because it's now not a link off a page, but a link off the navigation. So I kind of wanted to get a good idea, because I'm sure they'll absolutely kill me if we do lose any visibility during this process - so any information or guidance you can give me at the minute would help. I've advised them to revert back to the tiled type of layout, but because they're collection-type pages - it's Shopify - that probably won't work. So yeah, it'd be good to get your feedback and guidance on the different structures.
A: What I would watch out for is to try to avoid changing the URLs. With regards to the URLs that you had there, one is kind of like page/adhesives and the other is collection/all-adhesives, so you're changing page URLs there. That's something that always means we have to restructure what we have indexed for your site and understand what the context of the new pages is.
A: You have to make sure that the redirects are in place, or what have you, there - so that's the one thing. With regards to moving those links from a collection page to the navigation, where probably that navigation is shared across the whole site: I think both of those approaches could work. What would probably happen here is that the category pages would lose a little bit of visibility, in favor of the lower-level pages that you're linking to directly from the navigation, because it's flipped around then.
G: Well, I would have thought that, you know, all the main category pages would be in the navigation, and then the subcategories - but they have over 3,000 collection pages, so they won't all fit. So those other pages will be on, like, an A-to-Z page as well, and that's another thing that concerns me, because now we're kind of taking these URLs and putting them into a corner of the website, all linking off one page. So I think that might be an issue.
A: ...Make sure that the lower-level pages - kind of the long tail - are a lot stronger. You're kind of balancing, I don't know, almost short tail versus long tail, and it's almost like a strategic decision where there's no right or wrong. It's more like: well, in your situation, in your market, with your site, what makes more sense - do we need to concentrate more, or can we already spread out a little bit broader? And from my point of view, it's not a matter of technically correct, because both of these could be technically correct; it's more like, what do you want to focus on?
G: Yeah. So would we just say that if these - we'll call them subcategory pages - went into the navigation, Google sees them as more important, because we've put them in the main navigation? So, in theory, fingers crossed, they wouldn't lose visibility - maybe they might gain, or at least maintain, what they've got?
A: I mean - so I don't know the whole site, so it's hard for me to say offhand, just by looking at these two pages, but I'm assuming that the main category pages were already linked across the site on the old version.
A: On the new one, the main category pages will be linked, as well as subcategories and maybe even products - something along that line. So, from my point of view, what would happen there is: the main category pages, because they're part of this bigger set of pages that are now linked across the site, would lose a little bit of relative strength - some of that strength would be distributed to the other pages. So it's not that the main pages would remain the same just because they're still linked from the same pages.
G: You're saying - yeah, that's interesting. Okay, okay. And - sorry to take up so much time - but is it counted as thousands of links? Because, again, every single page is a new URL, and we've got a hundred thousand pages. So is that a hundred thousand internal links, or are you just looking at it as, you know, it's a navigation, so you're just looking at it once? I don't know.
A: I think we would still count them as internal links, but I wouldn't worry so much about the number. One of the things that I like to do with my test sites is to run something like Screaming Frog over them and generate, I don't know, the tree graph or something that they have.
G: That's very interesting - thank you very much, I appreciate that. That gives me something to go back with.
A: Okay, cool. Let me go through some of the submitted questions - I'll get to the rest of you as well, don't worry, and we have a little bit more time afterwards to go through the remaining stuff, but just to make sure that the submitted questions don't get ignored. "Twenty percent of my pages are not getting indexed. It says they're discovered but not crawled."
A: So usually, if we're talking about a smaller site, then it's mostly not the case that we're limited by crawling capacity, which is the crawl budget side of things. If we're talking about a site that has millions of pages, then that's something where I would consider looking at the crawl budget side of things, but for smaller sites, probably less so.
A: With regards to quality: when it comes to understanding the quality of a website, that is something we take into account quite strongly with regards to crawling and indexing of the rest of the website, but it's not necessarily related to the individual URLs. So if you have five pages that are not indexed at the moment, it's not that those five pages are the ones we would consider low quality.
A: If you have a smaller site and you're seeing that a significant part of your pages is not being indexed, then I would take a step back and try to reconsider the overall quality of the website, and not focus so much on technical issues for those pages. Because I think, for the most part, sites nowadays are technically kind of reasonable: if you're using a common CMS, then it's really hard to do something really wrong, and it's often more a matter of the overall quality.
A: The other thing to keep in mind with regards to indexing is that it's completely normal that we don't index everything off of a website. If you look at any larger website - or even any mid-size or smaller website - you'll see fluctuations in indexing: it'll go up and down, and it's never going to be the case that we index 100% of everything that's on a website. So if you have 100 pages and, I don't know, 80 of them are being indexed, then I wouldn't see that as a problem that you need to fix.
A: Let's see - a question about the home page versus landing pages when it comes to SEO rankings: "If I have a small brochure website, should I focus on the keywords I want to rank for on the home page? For example, I would want to rank for roofing company and location. I could set up the home page as a construction company with different services, but I also want to rank for roofing. Is there a best practice to follow for these situations?"
A: I think that makes a lot of sense, and sometimes it makes sense to pick an area where maybe there's not a lot of competition - maybe where you're one of the few companies in that area that offers that service - just so that you can slowly build up your reputation, and the things that you do on your website, for those kinds of queries, and then expand from there. So that would generally be my recommendation here when it comes to smaller businesses.
A: Let's see. "What would be a better strategy: having a 404 page for pages that don't exist, or redirecting any non-existent page to the home page? Our website is recovering from a malware attack, in which tens of thousands of pages were made that were redirecting to some shady website. We're fixing it, but now we probably have 150,000 pages with 404 errors in Search Console." Yeah, I don't know - recovering from malware is always super annoying and a lot of work.
A: So hopefully it goes well, and hopefully you can lock things down to avoid this kind of situation in the future. In practice, in a situation like that, I don't think there's any big difference between serving a 404 page and redirecting to the home page. If you redirect to the home page, we'll probably see that as kind of a 404 for the page that was redirecting there, and we would treat it similarly - we would call this a "soft 404" page.
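[Editor's note: a minimal sketch of the two options being compared, as a hypothetical Express catch-all handler; only one branch would actually be used:]

```ts
import express from "express";

const app = express();

// ... real routes registered above ...

// Catch-all for pages that no longer exist.
app.use((req, res) => {
  // Option 1: a plain 404 - the signal Google acts on directly.
  res.status(404).send("Page not found");

  // Option 2 (alternative): redirect to the home page instead. As John notes,
  // Google will likely treat this as a "soft 404" for the old URL, so the end
  // result is similar.
  // res.redirect(301, "/");
});

app.listen(3000);
```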
A: I think, in practice, both of these will be similar in terms of the time that it takes to settle down. What I would try to do in a case like this is focus on the most important pages of your website: make sure that all of those work really well and are updated in search. And for all of the rest - I don't know, probably a 404 is the easiest approach here, where, if you just remove those pages, it returns 404 by default, and I would just give that maybe half a year, or even longer, to settle down and drop out of the index.
A: It's easy to get dragged into a situation where you're focusing on an issue that users don't actually see in practice, and that's something to watch out for here, I think. So I think a 404 or redirecting - both of those are okay.
A: Some people also use sitemap files to make sure that we crawl those pages a little bit faster. I don't think that really matters when you're talking about 150,000 pages that used to have malware on them - I would just let them drop out naturally. Let's see - a long question, and this is from Mike. And I see you're up next, Mike, so maybe you can just run us through it briefly.
H: Yeah, sorry - I tried to put a short version down, but I just couldn't, so it's probably best if I just talk through it; I'll probably be quicker like this. I'm trying to decide whether it's worthwhile creating unique URLs for each country that the site is relevant for. The site's relevant for, say, five English-speaking countries, and at the moment there's just one URL, and since the majority of the traffic is from the US, that's fine. And if the user's not from the US, then there's a pop-up which says "click here", and it will load their country's content and set a cookie.
H: So that's not a problem going forward. Now, I came across you talking in one of those webmaster hangouts from 2016, where you were saying that if you've got lots of very, very similar content - or maybe you were talking about identical content - it may not be the best idea to create unique URLs for different countries and to use hreflang and geotargeting as well. I've done lots of research on this, and I still just completely don't know whether it's going to be counterproductive to do this.
H: What I seem to understand is that hreflang basically just swaps out the URL, so there'll be no ranking boost, and the only benefit would be an improved - a lowered - bounce rate, which could improve ranking, which may or may not be worth it on the tech side of things. But with regards to the geotargeting: I guess the first question is, if we were to use subfolders with geotargeting, does that just mean setting up the country in International Targeting in Search Console for each country? And would that work, given that there will be a lot of duplicate content across each country?
H: It's got reviews for TV shows on streaming services. So, you know, someone will be searching for, like, good horror movies on Netflix, or something like that - most of the time; I mean, sometimes they'll add their country in there. And a lot of the pages, you know, will have the review, but there'll be a list, and the list will change, because the availability will be different in each country. So that means it definitely needs to be localized - it'd be very useless to the user to see, oh, this is available, when it isn't. But there will be a significant amount of duplicate content, because a lot of the shows are the same, particularly across English-speaking countries.
A: So, maybe taking the easiest one first: when it comes to geotargeting, that would help with ranking - it boosts the ranking when we think that a user is looking for something local.
A: So, for example, if someone is searching for washing machine manuals, then we can just show washing machine manuals globally - they're not looking for something local. But if they're looking for washing machine repair, then obviously they're probably looking for something more local, and in a case like that, we would try to show the more local content.
A: When it comes to reviews of TV shows or movies, I suspect there's not that much local intent in the queries there, so you would probably not see a significant change in ranking from geotargeting. That's kind of my assumption there - I don't know the market that much, whether people are actually looking for something local when they're looking for, I don't know, horror movie reviews or something like that. It might be.
A: But offhand, I would assume it's more of a global thing, right? So geotargeting probably doesn't make that much sense.
A: I think it could, because you're kind of diluting the value of your content across all of those different country versions, yeah. But it's also something where, I think, in your situation, geotargeting probably wouldn't make much sense. Okay, good. When it comes to hreflang, I could see that being something that helps there.
A: But you also have the situation there that you essentially create separate URLs, and they have to be indexed separately and ranked separately. When we show them in search, we can swap them out, but they have to be kind of standing on their own. And if you're taking something like a horror movie review page and just creating different country versions of it, I could imagine that kind of reduces the strength of your pages relative to the rest of the competition that's out there.
A: So if you're, like, by far number one for this kind of query, then going off and doing hreflang for different country versions - go for it. But if you're really in a strong-competition area, where maybe you're ranking number five or number ten on the search results page, then adding hreflang there just makes it a lot harder for you to grab more of a foothold.
A: So that's kind of the direction I would head there. The nice thing about hreflang, though, is that you can do it on a per-page basis, so you can try this out for individual areas. You can try it out for things like maybe the detail pages on your site, maybe the home page, maybe specific categories, and just kind of see what happens there.
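[Editor's note: a minimal sketch of what per-page hreflang annotations look like. The country codes and URL scheme are made up; note that every listed variant has to carry the same set of annotations, pointing at all the others:]

```ts
// Build the <link rel="alternate" hreflang="..."> tags for one page.
const countryVersions: Record<string, string> = {
  "en-us": "https://example.com/us/horror-movies-on-netflix",
  "en-gb": "https://example.com/uk/horror-movies-on-netflix",
  "en-au": "https://example.com/au/horror-movies-on-netflix",
};

function hreflangTags(versions: Record<string, string>, defaultUrl: string): string {
  const tags = Object.entries(versions).map(
    ([lang, url]) => `<link rel="alternate" hreflang="${lang}" href="${url}" />`
  );
  // x-default covers users who match none of the listed locales.
  tags.push(`<link rel="alternate" hreflang="x-default" href="${defaultUrl}" />`);
  return tags.join("\n");
}

console.log(hreflangTags(countryVersions, "https://example.com/us/horror-movies-on-netflix"));
```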
A: Is it really something where you see a change in user behavior when they go to those pages, if they're coming from the right country, or is it something where you don't see a significant change? If you don't see a significant change in how users react, then it's probably not worthwhile to go off and implement that, yeah.
H: There's a problem with the server - there's a very long response time, which has Google crawling at a really low rate. Annoyingly, at the same time, there have been lots of thin and duplicate content pages being indexed. To get the crawl rate up, is it better to leave the setting - you know, to manually set the crawl rate to the maximum, if that's above what was previously being crawled - or is it better to let Google just do some tests itself with its own crawler?
A: I think both would work. I suspect, if you set it to let Google figure it out, it'll pick up a little bit faster, because it'll be able to go past that setting that you applied there - but both of those should work. And it's also something where, as soon as we can recognize that your server is actually a lot faster, then usually within a couple of days, maybe a week or so, we will pick up crawling again anyway.
H: Cool. All right, so I have just a final question, sorry: is it all right to have lots of thin and duplicate content noindexed, or is that going to harm the...
A: Sure. Maybe we'll just go on and move to the live questions again. Contact Mentor?
I: So the thing is, when the name of my site is searched - like "Contact Mentor" - there is another site which is writing a review of my site, and the review is very spammy: it's written as if my website is from a suspicious domain provider and registrar, and it shows low ratings, and, without any evidence, they are showing a kind of wrong thing. In terms of domain registrar, I'm using GoDaddy.
I: So there is nothing suspicious from that side, and if I check other popular sites which show reviews, my site is showing fine. So the thing is, I had emailed their admin asking why the information is like that, but I just got an auto-reply response asking me to kind of pay money to get the site removed.
A: I don't think we would do anything special in a case like that. It's more something where, over time, you kind of build up the value of your own website, and more and more we'll just focus on your website in the search results. So it's not something where we would go off and say, oh, the information on this one page, on this one site, is not good - we will remove it from search, or we will rank it lower in the search results. We wouldn't do that kind of manual intervention.
A: Essentially, the best approach here is just to move forward and keep working on your own website, ignoring the other ones there, even if they are sometimes ranking for your name as well. It's, in a sense, similar to what is considered reputation management, where essentially someone is creating some content on your name - in this case, on your site's name - that you don't like.
A: And if you look up reputation management, there are a handful of strategies that you can follow there, which include things like making sure you have some social profiles with your name, making sure that other people actually talk about your site in the way that you think is correct, and making sure that the good information is out there as well. And that's something that just takes time.
A: It's not something where you can just focus on the bad content and try to get that removed. You essentially have to move forward in a positive way and say, oh, I see there are people trying to pull me back, but I will continue on my path, moving forward in a positive way.
I: So the thing is, my site is kind of a mentorship site, so even one person seeing that is a significant loss. And also, I have no issue with, like, bad information - but if it is false, like completely false, and I can kind of show that, then would that help?
A: Anyway, we wouldn't see that as spam, though. A lot of information on the web is false, and there are some situations where it's really critical for us to make sure that the information we show is correct - areas like medical information.
A: But if it's information about another business, about another website - people have different opinions, and sometimes they post something incorrect, they mix things up. That's not something where we would go in and say we need to fix it manually. And even in cases where it's medical sites or financial sites, it's not that we go in there and manually say, oh, this is bad medical information.
A: It's hard work sometimes, and sometimes people are jealous of the successes that you have. So, I don't know - it's annoying, but you kind of have to move forward in a positive way.
J: Hi John. So I've been checking a few pages in Search Console which aren't indexed, but they are indexable, and the canonicalization seems to be all in order, and I'm trying to see how I can figure out why these pages aren't indexed. Is there a way we can see that, either in Search Console or via any other way?
A: No - I mean, it's something where, even internally, trying to figure out why a page is not indexed is really complicated, and we don't have anything along that line externally available in Search Console where you can say, like, "but why don't you like this page, Google?"
A: Oh well. Let's see - Ritual?
K: So I have a few questions this week. One question is related to the sitemap: if I see some pages that are not present in the sitemap, but they're still appearing in Search Console as 404 not found - what can we do for these types of issues? What do you suggest?
A: Yeah, it can affect the subcategories there, because usually the subcategories would be linked from the main category page. It depends a little bit on how you set up your website, but usually they're linked from there, and if that category page is not in our index, then maybe we don't have links to the subcategory pages, and maybe we won't be able to crawl them.
K: My keywords are declining, and I have implemented so many techniques to improve them, but I'm not getting any results. Can you suggest any good techniques which I can implement to improve these declining keywords and CTR, or blog keywords - apart from social media promotion, apart from content optimization? Are there any techniques I can go further with?
A: So I think what I would do in a case like this is maybe take a step back and go through the SEO starter guides. We have an SEO starter guide, and there are various SEO tools that also have kind of SEO starter guides, which I think are really good. Work your way through those and see if there are any aspects that you're missing. For a lot of those things, you're going to say, oh, this is obvious, I already know all of this.
A: But there are going to be individual things in there where you say, oh, this is actually a good idea, and this is not something that we have done in the past. And that, I think, is a way of going from doing some things really well to knowing that, actually, you have the whole foundation covered really well - and then you're at least sure that you're not missing anything.
A: It doesn't guarantee that you'll rank better for all of those keywords, but it makes sure that you have all of those foundational things ready. So, like I said, there's the SEO starter guide on our site, and I think the one from Moz is a really good SEO starter guide out there as well.
A: Some of the other SEO tools also have really good guides, and I would just work your way through all of these from start to end, take notes on the things that you haven't been doing, and then try to implement some of those things. Maybe nine out of ten things are things you already know about, and ten percent are things that are new - and those ten percent can sometimes be the difference between being kind of a mediocre website in search and being one that is actually pretty good and has a good foundation.
A: Thanks - cool. All right, with that, I'll take a break here with regards to the recording. I'll still be here, so those of you who still have your hands raised, or who have more and more questions, feel free to stick around; but just to make sure we have a reasonable length for the recording, I'll pause it here. If you're watching this on YouTube, thanks for watching, and, as always, feel free to like and subscribe to the videos, and join us in one of the next hangouts if you'd like to - it's always interesting to see what pops up. All right - let's see... stop the recording.