From YouTube: English Google SEO office-hours from January 21, 2022
Description
This is a recording of the Google SEO office-hours hangout from January 21, 2022. These sessions are open to anything search & website-owner related like crawling, indexing, mobile sites, internationalization, duplicate content, Sitemaps, Search Console, pagination, multi-lingual/multi-regional sites, etc.
Find out more at https://goo.gle/seo-oh-en
Feel free to join us - we welcome folks of all levels!
A: All right, welcome everyone to today's Google Search Central SEO office-hours hangout. My name is John Mueller. I'm a Search Advocate on the Search Relations team here at Google, and part of what we do are these office-hours sessions where people can join in and ask questions around web search, and we can try to find some answers. A bunch of questions have been submitted on YouTube already, so we can go through some of those, but I already see people raising their hands here. So maybe we'll just start with some of the live folks here and, wow, that was a cue; suddenly lots more hands. Let's see, go for it.
B: Hello John, nice to meet you. Today I'm going to ask: is the value of internal links similar across the page? For example, is there a difference in the value of internal links in the header, in the footer, or in the content?
A: It's pretty similar. I don't think there's anything quantifiably different about internal links in different parts of the page. I think it's different when it comes to the content in different parts of the page, where we try to figure out what is unique to a page, but with regards to links, I don't think there's anything.
A: I don't know. That doesn't mean that we would not show the site at all.

B: Okay, thank you. Even if the backlinks are in the millions?

A: Even then. I think one of the things that I kind of like about our algorithms is that we have so many different factors, and you don't have to be perfect with all of these factors. Sometimes people do bad things accidentally; maybe they follow bad advice.
C: Sir, I want to ask: after the November Google Search Console update, my website is not being crawled.
A: I think there are two possibilities. One is that maybe there's a technical issue. I don't think that's necessarily the case here, because it sounds like some of the pages are being crawled normally. The other is essentially just that we don't crawl everything all the time, and we don't index everything on the web. Sometimes we have to prioritize things, and that can mean that we try to understand what the overall value of a website is.
D: So over the last year we've been making a lot of technical improvements to the site, and our customers seem to be pretty happy with it. However, since the end of October, the number of pages indexed by Google has dropped dramatically, by about 25 percent, around 500,000 pages; that was for all known pages. The submitted ones have dropped by over 50 percent.

D: We've done the kind of research we could imagine would be relevant, and the thing that we found is that if there are no reviews on a product page, the schema validator is unhappy because there's no review mentioned, so maybe we're doing something wrong there. But that's the only big change that we can see. We've gone from slow but steady growth all year to a dramatic drop in the last two months.
A: So, just because structured data is not completely valid on a page wouldn't mean that we would drop it from indexing. That seems unrelated to me. I imagine maybe the report in Search Console shows all of these errors, and you look at them and say: well, I don't care about the markup there. And that's fine; it's not a sign that we think your website is bad because the structured data isn't valid.

A: It's just that we want to let you know, in case you wanted to use this structured data, that it's not working. But that wouldn't affect crawling or indexing or ranking, or anything like that.
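That distinction, review markup being optional rather than required, can be sketched in a few lines. The following is a minimal illustration (the product data and the helper name are hypothetical, not anything from the session): it emits Product JSON-LD and simply omits the `review` property when there are no reviews, so the markup stays valid instead of triggering validator warnings.

```python
import json

def product_jsonld(name, price, currency="USD", reviews=None):
    """Build Product JSON-LD, omitting `review` entirely when there are none."""
    data = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": name,
        "offers": {
            "@type": "Offer",
            "price": str(price),
            "priceCurrency": currency,
        },
    }
    if reviews:  # only emit the property when real reviews exist
        data["review"] = [
            {
                "@type": "Review",
                "reviewRating": {"@type": "Rating", "ratingValue": r["rating"]},
                "author": {"@type": "Person", "name": r["author"]},
            }
            for r in reviews
        ]
    return json.dumps(data, indent=2)

# A product with no reviews simply has no `review` key, which is valid markup.
print(product_jsonld("Example Widget", 19.99))
```

A page using this would embed the output in a `<script type="application/ld+json">` block; the point is just that an absent optional property is fine, while an empty or fabricated one is not.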
A: It's kind of hard to say offhand what might be causing that. It could be similar to the previous question, in that our systems are kind of unsure about the overall quality of your website; when it comes to such a large website, you're kind of looking at the mass of numbers there.

A: So, for example, sometimes we index pages with different parameters attached to them, maybe like analytics tracking parameters, and it can easily happen that we suddenly index, I don't know, 100,000 of those pages. They're all indexed, and in the graph it looks like that's a big thing, but if we were to drop all of those pages, it wouldn't change anything for your website, because these are kind of accidentally indexed pages.

A: So in the graph that could look very dramatic, in that it goes up, all of these things are indexed, and then it goes down and you're like: oh, what broke? But it might just be that our systems are fixing an issue with regards to indexing that doesn't really affect the rest of your website.
D: Okay. One thing that we noticed was that this is the first time we've ever seen "crawled, but not indexed"; maybe that's just because before it's been "not crawled and not indexed". This is kind of a new thing where we're like: okay, we feel like this is telling us something, but we're not quite sure how to interpret it, what to parse from it.
A: Yeah, I don't think there's really much you can pull out from that. I think the two statuses, "crawled, not indexed" and "discovered, not indexed", are essentially equivalent, in that we know about the URL; we confirmed that we've heard about it, but we decided not to index it. And that's something where we're looking with the indexing teams to figure out:

A: is this really a general problem, because we hear more and more reports about this, or is it essentially just more visible than it used to be? Because even in the past we would always only index a portion of a website, but we never showed people that in Search Console; we just focused on the traffic that you're getting, not on why we're not indexing individual pages.

A: But if you want, you can drop your URL here in the chat and I can pass that on to the team, especially if you have some sample URLs from your website where you're feeling: well, this is really important content and it's not being indexed, and is it something you're doing or something we're doing, kind of thing.
A: All right, Marcos.
E: Hello John. I'm trying to find some more information about the images I've recently seen increasingly in the organic search results, directly in Search, not in the Images tab, because there's no documentation on this in Google Search Central. I tried to find some common things about the images, some differences, but I don't see anything connected to the resolution, or size, or the ranking position.
A: From our side, these are essentially just a kind of snippet, so it's not based on any particular markup that you're doing on the pages. It's not something specific that you define on the pages. It's really just that we recognize these images are on these pages, and for whatever reason our algorithms think that showing some images, these snippet images, would help users decide which of the results to click on. So from our side it's something like:

A: yes, we are showing images in snippets, and I think in the past we didn't do that so much, but it's not based on anything specific that you can control, other than maybe deciding that you don't want any images on these pages indexed and using the, what is it, noimageindex meta tag. But it's not something where you can say: I want these images indexed, or these images shown in the snippet. It's essentially that we're picking images from the page.
E
No,
no,
not
not
bad
images,
but
sometimes
it
picks
one,
sometimes
six,
sometimes
no
in
images.
Even
there
are
images
on
the
on
the
landing
page.
So
I
don't
see
the
connection
why
or
where
this
decision
happens
to
show
or
not
to
show
an
image.
A: Yeah, I don't know what we can do to make it a little bit easier to understand, but maybe we should at least document that we do show images in the snippet and that it's not something that you can easily control. That's a good point.
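For reference, the noimageindex opt-out John mentions is an ordinary robots meta directive. Below is a small sketch, using only the standard library, of how such a directive is expressed in a page and how a crawler-style parser could detect it; the sample HTML is hypothetical.

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collect directives from <meta name="robots" content="..."> tags."""
    def __init__(self):
        super().__init__()
        self.directives = set()

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and (a.get("name") or "").lower() == "robots":
            self.directives |= {
                d.strip().lower() for d in (a.get("content") or "").split(",")
            }

# Hypothetical page that allows indexing but opts its images out.
page = '<html><head><meta name="robots" content="index, noimageindex"></head></html>'
parser = RobotsMetaParser()
parser.feed(page)
print("noimageindex" in parser.directives)  # True
```

As John says, this only removes images from image indexing; there is no corresponding directive to force images into (or out of) the snippet.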
A: All right, and then we have, let's see, Agency Web Plus.
F: Yes, perfect, this should be better. Thank you for having me here. We have a client that runs a franchise business in France, and so they have multiple websites, one for each location where they run the business.

F: The thing is that every website has the exact same content, except for the location, address and phone number. The first two websites are performing very well in terms of positioning for some related search keywords, but every new location that has been added is based on the first two websites.
A: I can see how it would be correct for our systems to flag it as duplicate content, if it is duplicate content. So that's, I think, kind of a tricky part there. The best way to make sure we don't see it as duplicate content is really to make sure that it's not duplicate content, and the unique address and phone number, I think, is a big step in that direction.

A: But it's sometimes not enough, because we do try to understand what the main content of a page is, and we use that essentially for recognizing duplicates. That's where I suspect you're running into issues at the moment, in that the main content of the page is essentially the same for all of these sites, and because of that we see them as being duplicates.

A: I think there are two approaches that I would recommend here. One might be to pick one domain and focus on that, and have individual landing pages for the locations within that one domain.

A: That would probably also make the content much more visible in Search because, instead of having to promote all of these individual websites, you promote one strong website, and that makes it a little bit easier to rank that content in Search. So that would be the one approach. The other would be: if you know that this is going to be a very small number of different locations, then maybe there is something that you can do to make the content a little bit more unique across these locations.
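One concrete way to give each location page genuinely location-specific markup, alongside its unique address and phone number, is per-location LocalBusiness structured data. The sketch below uses hypothetical location records, not the client's real data, and builds one JSON-LD block per landing page:

```python
import json

# Hypothetical location records; a real franchise would supply its own.
locations = [
    {"name": "Example Franchise Paris", "street": "1 Rue de l'Exemple",
     "city": "Paris", "phone": "+33 1 00 00 00 01"},
    {"name": "Example Franchise Lyon", "street": "2 Place de l'Exemple",
     "city": "Lyon", "phone": "+33 4 00 00 00 02"},
]

def local_business_jsonld(loc):
    """One LocalBusiness block per landing page, each with its own name,
    address and phone number rather than copied boilerplate."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "LocalBusiness",
        "name": loc["name"],
        "telephone": loc["phone"],
        "address": {
            "@type": "PostalAddress",
            "streetAddress": loc["street"],
            "addressLocality": loc["city"],
            "addressCountry": "FR",
        },
    }, ensure_ascii=False, indent=2)

for loc in locations:
    print(local_business_jsonld(loc))
```

Markup alone does not make pages non-duplicate, as John notes; the visible main content still needs to differ, but distinct per-location data is the starting point.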
F: So, to sum up, the first option, if we have a large number of locations, is to regroup all the locations under a single domain and create some sort of landing pages for the specific locations.
A: Yeah, I mean, the other approach is to focus on things like Google My Business, or Google Business Profile I think the new name is, and to see the websites more as kind of a business card for the Google Business Profile listing. In cases like that, you don't necessarily need to have all of these websites indexed.
F: So yes, of course, if the brand were very well established already and very well known, there would be no problem at all, but the thing is that the brand isn't known yet.
A: Yeah, but in that case I would really focus on making one very strong website, because that also makes it a lot easier to rank: you just promote that one strong website. You can use it for any marketing campaigns that you do, and all of the links, the kind of signals that you get for your business, are concentrated and then spread out to the individual locations, rather than having each individual location fight for itself.
G: Hi John, good morning. So we have a pretty strong US .com presence, with no real international implementation at the moment, but we are starting to build out into Canada and the UK, and we are kind of at a crossroads today: should we go with ccTLDs, or keep it under just the .com? We're a real estate company, so our pages are pretty location specific, and we're probably going to end up in about a dozen countries within the next year or two.

G: So my thought process is being able to take advantage of targeting locally in Search Console with the ccTLDs, which I can't do with the global .com. Technology is not an issue for us, so either way kind of works. So I was wondering: is there an advantage to going one way or the other, or is it easier for Google one way or the other?
A: I think there are multiple things that come into play. One is probably the aspect of having multiple sites versus having one really strong website, like with the previous question. If you're talking about a dozen different locations, maybe that's less of an issue; maybe that's something that works out.

A: The other thing is that, with geotargeting, we do use that when we recognize that people are looking for something local. So whichever way you go, I would try to make sure that we can figure out what you are targeting for the website, and a ccTLD is super obvious, so that's, I think, a good approach. Subdomains are an option; subdirectories would be an option. What I wouldn't do is just use URL parameters or something like that, where it's like:

A: a main global website, and you can search for an individual country, and then somewhere in the URL that country is also mentioned, because that would essentially mean that we can't do geotargeting at all for the website.
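The contrast John draws, a ccTLD or country subdirectory versus a country hidden in a query parameter, can be illustrated with a toy classifier. This is only a sketch with hypothetical URLs and a deliberately crude heuristic, not how Google actually detects targeting:

```python
from urllib.parse import urlparse, parse_qs

def country_signal(url):
    """Classify how a URL exposes its target country (toy heuristic)."""
    parts = urlparse(url)
    host = parts.hostname or ""
    tld = host.rsplit(".", 1)[-1]
    if len(tld) == 2:  # e.g. example.ca: the clearest signal
        return ("cctld", tld)
    segments = [s for s in parts.path.split("/") if s]
    # crude check for /ca/ or /fr-ca/ style country prefixes
    if segments and len(segments[0]) in (2, 5):
        return ("subdirectory", segments[0])
    qs = parse_qs(parts.query)
    if "country" in qs:
        # a parameter buried in the URL: not usable for geotargeting
        return ("parameter", qs["country"][0])
    return ("none", None)

print(country_signal("https://example.ca/listings"))              # ('cctld', 'ca')
print(country_signal("https://example.com/fr-ca/listings"))       # ('subdirectory', 'fr-ca')
print(country_signal("https://example.com/listings?country=ca"))  # ('parameter', 'ca')
```

The takeaway matches the advice above: the first two shapes give a crawler a clear, section-level country boundary, while the third does not.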
G: Right, right, okay. Right now on the dev site we're doing something like /frca, or we're looking at the .ca for Canada, but we're trying to figure out which makes sense. Our US site has 40 million pages; our Canada site would probably have half a million pages, with only, you know, five percent of them actually being one-to-one, because they're location specific. So I don't know if that might make any difference.
A: But I think it also depends a little bit on what your marketing goals are. If you want to really position these as, like, we're the real estate company for Canada, then you kind of want to have your own domain. If you want to position it as, we're a global company and we also do Canada, then having that within your existing domain might be okay. So I think, from a technical point of view, all of these options are open; which way you go is almost like a strategic decision.
H: Hey John, can you hear me? Great. So this time I only have one urgent and important question; it's about indexed pages. We just found that, from around January 13th, our indexed pages dropped by over 90 percent, and these are indexed pages with actual traffic, in the GSC backend and also in our log report. So we checked our GSC.

H: We also checked the samples that are indexed, and also the crawled-but-not-indexed ones on our site. We didn't find anything unusual, like with the canonicals, and the meta tag isn't marked as noindex; no, there is not that issue. So we just couldn't find a way to identify the problem.
A: No, I don't know offhand. It sounds very similar to one of the previous questions, so I'm kind of worried that I need to dig into this a little bit more. If you can send me some sample URLs, I'd love to take a look; or maybe drop them here in the chat, and then I can pick them up afterwards and check them out with the team. I think the one aspect that you probably also want to check is whether or not we can actually crawl them properly.

A: I imagine you already looked into that, but it's always good to double check there. But again, if you can drop me some URLs, I'm happy to take a look at that with the team, because one site kind of going down in indexing is one thing, but if there are multiple people coming in with this kind of topic at the same time, then we should probably take a better look.
H: Okay, so, like you said, we have checked the URLs with the live test in Google. Yes, they are all crawlable and also indexable, but the worst thing is they are not indexed. And we also checked our sitemap and the robots.txt, all good, nothing unusual, and also the mobile friendly test.

H: Okay, and yeah, that's one unusual thing we spotted; we're guessing whether that's a problem. But like I said, we already analyzed our log report and found that the rate at which Google actually crawls these URLs is very low, maybe two percent to three percent of a huge number of these URLs.
A: Yeah, I mean, what always happens is that we discover a lot of URLs for websites, and if we don't think that they're important, we will kind of keep them in our list, and at some point we'll try to crawl them. I suspect these are just kind of random URLs that we discovered over time, and we try to crawl them from time to time to see if there's anything that we're missing.
A: Yeah, I mean, that happens, and sometimes we collect these URLs for a longer period of time and then try them maybe a year later. So it's sometimes hard to track back, but it's not a sign of a problem. It's really just our systems recognizing: oh, we have some extra space to crawl URLs, and we also happen to have this big collection of URLs that we don't know anything about, so we will just try them.
H: Okay, great. So, as for the indexed pages, do you have in mind some technical aspects that could maybe affect this, maybe result in this?
A: It's hard to say. I mean, usually the main issue is really about the overall quality of a website, which kind of goes into the decision whether or not to index individual URLs, and that's something that can also change over time. It's not so much that the quality of your website changes, but our perception of the quality of the website can change over time, and that's usually the main element that comes into play there.

A: And if you see these kinds of indexing changes happening over a short period of time, then it could be that our systems have just kind of changed the way that we evaluate quality for your website, and suddenly everything is in a slightly different bucket; whereas if you see them over a longer period of time, then it's usually more that, over time, our systems are less and less confident about the website.
H: Also, one weird situation we spotted: the thing is, we have one main domain and several subdomains for different countries, and the situation I just described, with the dropped indexed pages, only occurs on one subdomain site, and only on several specific pages on that one subdomain. On this subdomain we have product pages, category pages, and also some online discount pages, and we found this situation on the specific online discount pages, not on other pages.
A: Yeah, I mean, if it's a change in the perception of quality by country or by language, that is also normal from our point of view. But if you're finding it in a specific kind of pages, then that might be something to look into a little bit more as well, especially if you're saying these are discount pages, I don't know, maybe coupon pages, those kinds of things, because sometimes certain types of pages almost fall into the category of, like...
H: Sure, sure, we'll send them to you later. And like you said, we also have these specific pages on other subdomain sites, like the Spanish site and also some German sites; they are all good, nothing unusual. Only that one French site.
A: Okay, I see. We still have lots of hands raised, but also lots of questions submitted, so I'll jump over to some of the submitted questions and get back to the people who have their hands raised as well.
A: "We don't use trailing slashes in our URLs, so when a new folder is added to Search Console, the trailing slash is automatically added to the address, and no data is captured and reported for the non-trailing-slash version. Is there a way to add a folder and capture the stats for the non-trailing-slash index page as well?"

A: So if you have kind of the home page of one section of your website and it doesn't have a trailing slash, then we would see that as a page within the higher-level site, and so on the domain level you would probably see all of these. If you want the data visible independently, you kind of have to pull that out from the higher-level property in Search Console.
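Pulling the slash-less index page out of the higher-level property's data amounts to filtering exported rows by URL. A small sketch, with hypothetical URLs and click counts standing in for a Search Console performance export:

```python
# Rows mimic a Search Console performance export (page URL, clicks);
# the URLs and numbers are hypothetical.
rows = [
    ("https://example.com/shoes", 120),      # section index page, no slash
    ("https://example.com/shoes/red", 40),
    ("https://example.com/shoes/blue", 25),
    ("https://example.com/hats", 10),
]

def clicks_for_exact_page(rows, url):
    """Stats for one exact URL, e.g. the slash-less section index page."""
    return sum(clicks for page, clicks in rows if page == url)

def clicks_for_section(rows, prefix):
    """Stats for a whole section, including its slash-less index page."""
    return sum(clicks for page, clicks in rows
               if page == prefix or page.startswith(prefix + "/"))

print(clicks_for_exact_page(rows, "https://example.com/shoes"))  # 120
print(clicks_for_section(rows, "https://example.com/shoes"))     # 185
```

The same exact-URL filter is available inside the Search Console UI as a page filter on the higher-level property; the code just makes the distinction between "the page itself" and "the folder" explicit.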
A: "My website, averaging around 200,000 sessions a day, was hit by a technical problem and the site was down for 14 to 15 hours just two days ago. While yesterday's traffic was roughly normal, today a lot of our pages have gone missing from Google searches. The site has been stable for eight years; we've never had a problem like this before. What do you recommend?"
A: So usually, if you have this kind of technical issue for a short period of time, it can happen that these pages will drop out of our index, and usually they will pop back in fairly quickly as well. What usually happens is that the pages we crawl more often probably get picked up first and get noticed during this technical issue, and maybe we dropped them during that time.

A: The best way to protect against this kind of issue is to make sure that you have some system in place that can serve a 503 result code when things go wrong. It might be that it doesn't trigger automatically, but even if you can manually turn this 503 result code on, essentially what happens then is that, when we crawl the pages during that time and see the 503, we will say: oh, there's a problem here.
A: So should you notice that something like this is happening, and if you can serve a 503 for a day or two, then you should not see any changes in your search indexing at all. If it's longer, then obviously you could still see some effect, but at least for that day or two you're kind of protected. And in the case that you can't do that, like here, I would assume that this will just come back automatically; I don't think there's anything manual that you need to do.
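The "serve a 503 during an outage, even manually" advice can be sketched with the standard library alone. The maintenance flag, port, and Retry-After value below are assumptions for illustration, not anything prescribed in the session:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

MAINTENANCE = True  # flip manually during an outage if nothing automatic exists

def status_for_request(maintenance):
    """503 plus Retry-After while down; a normal 200 otherwise."""
    if maintenance:
        return 503, {"Retry-After": "3600"}  # ask crawlers to retry later
    return 200, {}

class MaintenanceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        status, headers = status_for_request(MAINTENANCE)
        self.send_response(status)
        for name, value in headers.items():
            self.send_header(name, value)
        self.end_headers()

# To actually serve:
# HTTPServer(("", 8080), MaintenanceHandler).serve_forever()
```

The key design point is that 503 signals a temporary condition: crawlers back off and retry rather than treating the pages as gone, which a 404 or an error page served with status 200 would not communicate.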
A: A backlink question: "We're using PR services to spread our PR articles. The service publishes on various websites, mostly with nofollow, but sometimes they don't have a nofollow on these sites. What should we do with those kinds of dofollow links? Should we disavow them, or not use the PR service at all?" From our point of view, if you're taking care of the bulk of the issue, then that's generally okay, and that's not something that I would worry too much about.

A: If it's the case that you're using this PR service and 99 percent of the links are normal links to your website and one percent has a nofollow on it, then that seems a little bit fishy. But if you're saying that most of these links are nofollowed, like they should be for PR articles, I would not worry about this.
A: "How do we notify Google?" So, in a case like this, where you're splitting or merging websites, you can't use the change-of-address tool in Search Console, because it relies on the fact that the move is a one-to-one move from one domain to another domain, and as soon as you're splitting or merging websites, that's not a one-to-one move anymore. That's essentially something that has to be processed on a per-URL basis.

A: So for these kinds of things, essentially what you want to do is just set up the redirects properly and follow the normal guidelines that we have for site moves, and just keep in mind that the Search Console setting for change of address is probably not suitable there. Also, the Search Console setting will try to test some sample pages on your site for that redirect, and it might be that it looks like everything is okay, but I think it would still be wrong to use that setting if you're splitting a website up, just because it could potentially mess up signals a little bit. I doubt that it would really cause problems, but I don't think you would have any advantage from using that change-of-address tool if you're not actually moving from one domain to another.
A
How
important
is
it
for
these
localized
pages
to
get
their
own
schema,
or
can
google
pull
enough
information
from
the
primary
english
version
of
the
page
that
lists
all
worldwide
offices
and
phone
numbers?
So
I
I
think
there
are
two
aspects
here.
On
the
one
hand,
if
you
have
different
local
versions
of
a
page,
we
would
not
pull
the
structured
data
from
a
different
page
to
try
to
apply
to
that
local
version
of
the
page.
A
A
My
understanding
is,
we
don't
show
this
in
the
search
results
anyway.
At
most,
it
might
be
shown
in
like
a
knowledge
panel
on
the
side,
which
would
probably
also
be
pulled
from
your
google
business
profile
listings.
So
probably
this
particular
type
of
structured
data
is
not
critical
for
for
your
website
anyway.
A
So
from
that
point
of
view
like,
if
you
want
to
put
it
on
these
pages,
go
for
it.
If
you
can't
put
it
on
all
of
these
pages,
then
it's
like,
probably
not
critical,
because
we
don't
show
it
anyway.
If
it
were
something
that
you
wanted
to
have
shown
in
the
search
results,
then
I
would
make
sure
that
you
put
it
on
all
of
the
pages.
There's
no
magical
kind
of
like
taking
it
from
one
version
of
the
site
and
applying
it
to
another
version.
A: "About a year ago, you mentioned that there were experiments with a badge in the search results regarding page experience and Core Web Vitals. Is this still something that we're going to see in the future?" So I can't promise what will happen in the future, unfortunately, and since we haven't done this badge so far, and it's been, I think, over a year, my feeling is it will probably not happen.

A: I don't know for certain, and it might be that somewhere a team at Google is working on making this badge happen and will get upset when I say this, but at least so far I haven't seen anything happening with regards to a badge like this, and my feeling is that, if we wanted to show a badge in the search results for Core Web Vitals or page experience, we would probably have done that already.
A
So
I
it's
really
hard
to
say
there.
How
frequently
does
the
google
news
algorithm
surface
or
accept
or
review
new
news
sites
on
the
platform?
A
I
don't
know,
I
don't
have
much
insight
into
the
google
news
side,
so
I
can't
really
say
much
with
regards
to
how
frequently
things
change
there
we're
moving
our
website
from
magento
1
to
magento
2..
In
the
past
we
sold
internationally,
so
we
had
sub-folders
for
different
countries
and
now
we're
going
to
focus
only
on
uk
and
we
don't
need
those
sub-folders.
A: So, first of all, I don't know what is involved with switching from Magento 1 to Magento 2. It might be just a back-end change that Google doesn't notice at all. So from that point of view, I would double check things like the URL structure of your website and how that changes; the internal linking, if anything changes there; the content that you're displaying and how you're displaying it to users; all of those things.

A: Essentially, all of these changes together would be kind of like restructuring your website. So it's something where you would expect to see some fluctuations in Search as all of that bubbles down, and the important part is really that you make sure that everything is cleanly moved to the new location.

A: You will see fluctuations, because we have to recrawl the whole website and understand the new structure, and that will happen in either of these cases. I think the aspect to keep in mind is that these kinds of fluctuations can last for a while; it can take a month or two to settle down again.
A
So
it
would
be
good
from
a
practical
point
of
view,
to
try
to
find
a
time
where
you're
not
reliant
on
search
traffic
as
much
or
maybe
where
you
can
do
some
extra
marketing
campaigns
or
something
else
to
kind
of
even
things
out,
or
at
least
so
that
people
who
are
involved
with
the
website
have
the
right
expectations
that
this
is
going
to
take
a
little
bit
of
time
and
there
will
be
some
fluctuations
in
search,
let's
see,
and
maybe
the
last
one
from
from
the
list
here
as
a
website
owner
seo,
can
we
be
100
sure
that
every
single
answer
we
get
on
this
youtube
channel
is
200
factually
true,
or
should
we
not
solely
rely
on
these
answers
and
always
do
our
own
research?
A: I don't know, that's like a big question. I don't know how I could be 200 percent factually true; that's kind of a high bar in general. Most of the things that we talk about are based on the documentation, and especially the technical details you can look up in the documentation, so there should always be a way to double check what is happening here.

A: Some of these things are more strategic questions, and especially for these strategic questions, it's something where you can use different strategies and come to the same goal. It's not that this one strategy is exactly what you need to do; it might be that you could do it in a different way. And the other part, which I think is super critical to keep in mind, is that I'm answering specific questions from individual people here, and they have very unique websites, and it's sometimes not possible to just take one answer and apply it to all of the web. I think that's very tempting nowadays, in that, especially on Twitter, you have short-form text and it's easy to say:

A: oh, John said you should get rid of all country versions and only keep a UK website. But that might apply to this one website; it doesn't apply to everyone, and taking something simple like that will just result in lots of discussions, and then people saying, oh, John is wrong. Maybe I am wrong, but it's based on this one question here and this one website, and it's not something that you can generalize to everything on the web.

A: So my recommendation here would always be: if this is something that you're relying on for a business, don't take just one person's comment on it, but rather use that as input to make your own decision. It might be that you decide to go in the direction that I mentioned; it might be that you decide: well, I don't care what Google says, I think my other approach is better, or I have other consultants who are telling me something different.
K: Hi, good morning John, can you hear me? Okay, great. Thanks for having us again. My question today is about internal linking. I found an interesting tweet from you this week which says that internal linking is most important for website structure, more so than URL structure.

K: If I got this right, my practical question would be: does it make sense to look at the internal links from important pages of a website, to see if they have links from other important internal pages, and to do some kind of link hygiene, in the sense of removing links to less important pages so that the links to the important pages have more weight? And does it make sense to calculate the internal PageRank, which is also possible?
A
I mean, it's something you can do. It's a bit tricky, because we try to be smart with how we process internal links, and especially some very common pages get a lot of links, like an about-us page or terms of service. They're linked from across the whole website, but at the same time we understand this is a pattern that is normal, and it doesn't mean that we should rank the terms of service page for anyone who's searching for the company name.
A
So it's something where, on the one hand, the internal linking is something that you can control, but I wouldn't go overboard and say, well, I'll remove links to pages that I don't think are critical. That's especially something that happened, I think, when we introduced the nofollow: people would say, oh, all links to my terms of service will be nofollowed, and that doesn't change anything. It's a lot of work and you have to maintain it forever, but it doesn't change anything for your website.
A
This graph, where you have different flowers and trees or whatever to show the structure of the website: when you look at that, sometimes you can tell at first glance, is there a clean structure, or is it completely messy? And if it's completely messy, then I think there is definitely room to clean that up and to make it clear what the structure should be. By making a clearer structure, you're helping us to understand which pages you think are more important.
A
So that's definitely something I would try to find ways to clean up. I mean, it's not that I'm saying your website will rank better if you have a clean structure; it's more: if we understand your website should rank in this range, which of these pages are the most important ones? That's something that you're telling us there, and that's something that gives you value, in that you're sending people to the pages that you care about.
A
So that's certainly something I would look into doing. I think that's a good use of time, especially for larger websites that have evolved over many years. It just gets really messy organically over time, and every now and then it takes time and energy to restructure things and make sure that everything is well organized again, and that is something that you will see effects from.
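The kind of structure check described here can be approximated offline. As a rough illustration (not any official tool; the page HTML and host names below are made up), one could collect the internal links per page and then graph or eyeball the result:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkCollector(HTMLParser):
    """Collects href values from <a> tags on one page."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

def internal_links(page_url, html, site_host):
    """Return the set of same-host URLs that this page links to."""
    parser = LinkCollector()
    parser.feed(html)
    return {
        urljoin(page_url, href)
        for href in parser.hrefs
        if urlparse(urljoin(page_url, href)).netloc == site_host
    }

# Tiny inline example; a real check would feed in crawled pages.
page = '<a href="/about">About</a> <a href="https://other.example/x">Ext</a>'
links = internal_links("https://example.com/", page, "example.com")
# links == {"https://example.com/about"}
```

Running this over every crawled page yields an adjacency map of the site, which is exactly the structure the visualization tools draw.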
K
Okay, thanks. And what about the internal PageRank, which is quite easy to calculate? Would you recommend doing something like that, to see which pages have the most weight from internal links, or would you say that's not necessary?
A
I don't know. I do it for things that I play around with, but I think it's something for people who like to play with these things. Sure, it's kind of interesting, but the aspect that you can't really model in there is that individual pages will get different external links, and that essentially affects the internal PageRank as well. If everyone is linking to your terms of service page, then suddenly it is something that has a lot of PageRank. And PageRank is something that we use in our systems, but we use lots of other things too.

A
So it's kind of an interesting, I don't know, almost like a gadget from a technical point of view, but I wouldn't see it as something that is super critical from a practical point of view. It's more like, oh, you like to mess with numbers and play with graphs: sure, you can calculate this, but I wouldn't see it as something that is reflected one-to-one at Google.
A
Also, within the PageRank functions, I think there's the dampening factor and different other things that you can mess with, and you can play around with that, yeah.
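The calculation being discussed can be sketched in a few lines. This is the textbook PageRank iteration with the damping factor John mentions (0.85 is the conventional value), run over a made-up four-page site; it is not Google's actual implementation:

```python
def pagerank(graph, damping=0.85, iterations=50):
    """graph: {page: [pages it links to]}. Returns {page: score}."""
    pages = list(graph)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in graph.items():
            if not outlinks:
                # Dangling page: spread its rank evenly across the site.
                for p in pages:
                    new[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
        rank = new
    return rank

# Hypothetical internal link structure.
site = {
    "home": ["products", "about", "terms"],
    "products": ["home", "terms"],
    "about": ["home", "terms"],
    "terms": [],
}
scores = pagerank(site)
```

In this toy graph, "terms" ends up with the highest internal score because every page links to it; that is precisely the pattern John says Google's systems recognize as normal and discount, which is why internal PageRank is only a rough signal of what a site owner considers important.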
L
I have a question. I run a site in the Netherlands, and of course in the Netherlands we speak Dutch, but we also use a lot of English terms in our language and in our marketing terms for this site. So we have pages that contain Dutch and also contain English, sometimes in pieces like H1s or those kinds of elements on the page. I was wondering, how does Google deal with such a page that has two languages on it, so not separated by hreflang? It's a Dutch website, but how does Google deal with that? Could it influence ranking? Or will the region codes in the hreflang alone be sufficient, or are there other things that we should look into?
A
So if you just have it in Dutch with some English terms in there, then you wouldn't use hreflang. That's, I think, the one aspect. Also, within the HTML you have, I think, the language code that you can specify; that's something that we don't use.
A
If we can't determine one primary language, then we might use multiple languages and assign those to the page. You can sometimes see that if you do something like a site: query for your website and then go into the advanced search settings and specify a language; you can sometimes see which language is being recognized for your website. If you try other languages, you might see that, oh, it's being recognized for Dutch and English, which doesn't mean that it has less weight in Dutch. It's just, well.
A
From that point of view, it's something you can double-check there. I think the one situation I would watch out for is if your page is recognized as being in a language that is not correct. For example, if you have an English website on vacation homes in Spain, and all the addresses are in Spanish and all the place names are in Spanish, and we think the whole page is only in Spanish.
L
Yeah, yeah. What I understand is, because we have multiple language sites: we also have an English site and we also have a German site. So we do use hreflang, but we only have the region code, not the language code, on it. But if somebody would then search for an English term, which we do have pages for on the Dutch site but in English, Google sort of tries to find whether you can connect that to that search term, and then sends somebody to the Dutch pages.
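For reference, the hreflang annotations discussed here can be sketched as follows. This is a minimal, hedged example with hypothetical URLs; note that an hreflang value is a language code (e.g. "nl"), optionally followed by a region ("nl-NL"); a region code on its own is not a valid value.

```python
def hreflang_tags(variants, default_url=None):
    """variants: {"nl": url, ...}. Returns <link> lines for the <head>."""
    lines = [
        f'<link rel="alternate" hreflang="{code}" href="{url}" />'
        for code, url in sorted(variants.items())
    ]
    if default_url:
        # x-default marks the fallback page for unmatched languages.
        lines.append(
            f'<link rel="alternate" hreflang="x-default" href="{default_url}" />'
        )
    return "\n".join(lines)

# Hypothetical Dutch/English/German site variants.
tags = hreflang_tags(
    {"nl": "https://example.nl/", "en": "https://example.com/",
     "de": "https://example.de/"},
    default_url="https://example.com/",
)
```

Each language version would carry the full set of annotations, including a self-referencing one, so the versions point at each other symmetrically.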
A
Okay, cool, thanks. Cool, okay! Let me pause the recording here. For those of you watching the YouTube recording, thanks for watching until the end. If you'd like to join us live, watch out for the link; we post these in the community section of the channel. Feel free to join in and, I don't know, join the office hours in person, or at least add your questions to this session so that we can go through them. Cool, all right. So let me pause the recording here.