From YouTube: English Google SEO office-hours from April 16, 2021
Description
This is a recording of the Google SEO office-hours hangout from April 16, 2021. These sessions are open to anything webmaster-related, like crawling, indexing, mobile sites, internationalization, duplicate content, Sitemaps, Search Console, pagination, multilingual/multi-regional sites, etc.
Find out more at https://goo.gle/seo-oh-en
Feel free to join us - we welcome webmasters of all levels!
A: All right, welcome everyone to today's Google Search Central office-hours hangout. My name is John Mueller, I'm a Search Advocate at Google here in Switzerland, and part of what we do are these office-hours hangouts, where people can join in and ask their questions around their websites and web search. It looks like a whole bunch of you are here today, so that's pretty cool.
B: Hi, so my question is: when we are googling the brand name with the country, instead of ranking our home page, Google is ranking the page which contains the country name in the page content or meta title. So how do we fix this?
A: What I would usually do with these kinds of things, where you run across a query where you think your site is ranking in a weird way, is to first try to figure out if that's actually a problem, in the sense of: are people actually searching like this for your company, and what are they expecting to find? Depending on what you see there, you might notice, well, actually it's only me that is searching like this for my company.
Or you might see lots of people are searching like this for your company. And if you see lots of people are actually searching like this for your company, then, obviously, since these are your pages, you can adjust them and change them however you would like to have them be, and based on that, we'll try to show that in search.
C: Okay, so I have a question. I've also asked it in the YouTube comments, regarding canonicalization and your official guidelines about not using noindex in order to sort of control this a bit better. So why do you recommend not using noindex in order to fix it if, for instance, Google is picking up the wrong canonical, or ignoring the canonical tag?
A: So I think the "not using noindex" advice is almost more of a theoretical recommendation than something that I would say is a requirement, in the sense that if you're saying that something is a canonical of a different page, then you're saying that they're equivalent. And if you're saying one of them should not be indexed and the other one should be indexed, then you're kind of saying, well, they're not really equivalent, because you want them treated in very different ways.
From a practical point of view, I don't see this causing any problems. So if you're in a situation where you really strongly want one page to be the canonical, and you've done everything else that you need to do to make sure that the internal linking is in the right place, the sitemap file has the right URLs,
all of that, then using noindex like that is something that is not going to cause any problems. So that's, I think, always kind of the tricky aspect there. As humans, we like to simplify things a little bit and try to fit a simple model in our mind, and theoretically they're different, they shouldn't be treated the same. But from a practical point of view, sometimes you do what you have to do, and it just works. So it's not that your site will rank worse if you do it like that.
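For concreteness, the combination being discussed is a page that carries both signals at once in its head. A minimal sketch, with example.com as a placeholder:

```html
<!-- on the duplicate page, e.g. https://example.com/page?sort=price -->
<!-- rel=canonical says "that other URL is the one to index" -->
<link rel="canonical" href="https://example.com/page">
<!-- noindex says "keep this URL itself out of the index" -->
<meta name="robots" content="noindex">
```

In theory the two signals conflict (a canonical claims the pages are equivalent, a noindex asks for them to be treated differently), but as described above, in practice the combination doesn't hurt.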
C: Awesome, thanks.
A: Sure. Let me just see... people are raising their hands, which is fantastic, a great way to get people lined up. And Mirage, let's see.
D: Yeah, hi John.
A: Hi.
D: John, actually, I was just going through some international targeting guidelines. Nowadays a lot of people have to expand internationally, if business requires it, and I had some confusion about the different multilingual setups. I just checked out all the Google documents, but there is still some confusion.
D: So one thing that you have recently made very clear is that if I have different versions in different languages, then hreflang is not really required if the wrong page is not ranking.
D: Right. But the other thing, the problem that I am having, is I could not understand what country targeting is and why it is required. So there are different case scenarios, right? One version is when I have the .com version and I am just creating different folders and targeting them to different countries; the other case is when I'm having these country-specific domains and targeting them.
A: Yeah, so it is a complicated topic. I think any website that is expanding globally runs into these issues, and there tend to be two approaches. One is just: I will make lots of different versions and hope that one of them works. But I think, from a practical point of view, that's probably not so great. In general, we differentiate between geotargeting and the hreflang annotations. So with geotargeting, that's a setting in Search Console; if you have a country-specific top-level domain, that applies there.
So if you have something that is specific to an individual country, and you know that users are searching kind of with a local intent in those countries, then with geotargeting you get a little bit of a ranking boost. With hreflang, on the other hand, it's more that if you already have pages that are ranking in these individual countries, we make sure that Google is showing the right version of that page.
So they're subtly different in that regard. And I think the tricky part, that you kind of touched upon as well, is: you can have a global website that works well in lots of countries, and you could also have country-specific pages, or country-specific websites, that also work well in different cases. And I think the decision on whether or not to have a global site or country-specific versions is tricky, and it's hard to come up with an absolute answer for that. On the one hand, there's the aspect of: do you really have something country-specific that you're offering, or are you just taking a kind of globally available product and making it available in all countries?
So, for example, if you have, I don't know, documentation on JavaScript, then people are not going to search in their country for documentation on JavaScript and expect to find something from their country. In their language, obviously, but not necessarily from their country. On the other hand, if you need a washing machine repairman or something like that, then probably they're going to search with the kind of mindset that they would like to actually find someone who's local and who is nearby, who can help with this problem.
So those are different aspects that you kind of have to take into account. I mean, you don't have to take them into account, but I think if you take all of these things into account, it's a lot easier to come up with an efficient approach and say: okay, for these kinds of queries, I can tell people want to have something local, and I do have something local, so I will create some geo-specific content for those different pages. And for other types of queries,
I see users are just searching for general information, maybe a product manual or something like that, and for that I don't need to have a geo-specific page. And then, based on that, you can make a decision: is it worthwhile to set up a country-specific website, or is it essentially just trying to cover a tiny bit more, but not significantly enough more that it's worthwhile to create a whole new website for that? So, yeah.
A: I think there's a lot of thought that goes into this, and I think if you're just getting started and you want to expand globally with a website, I would definitely try to find some SEO consultant, someone who can help you, who has done this before with other websites, who can give you some of the tips and tricks and kind of help you to avoid some of the pitfalls along the way.
F: So, just a quick story about where my question comes from. We're a leading SEO agency in Switzerland, which has quite a lot of clients, and we're actually one of the biggest SEO agencies when it comes to words and articles in the German, Austrian and Swiss region. So we have about 500 pages, plus or minus, and we have about 550,000 words. So crawl budget is a good question when it comes to our website.
F: But the biggest question so far that I had is: what about the pagination of, let's say, the blog articles and other kinds of articles? So, let's say, the tags, categories and pagination. What does Google think about them? Should they be noindex? Should they be nofollow? What kind of text should they contain? Because I couldn't find a trustworthy source on it.
A: I think that's always a good question. So what I try to differentiate with regards to pagination is why you're doing this, the pagination part. In particular, is it that you're splitting content up into multiple pieces, kind of paginating a very long article and splitting it into part one, part two, part three? Or is it more a matter of: you have links to individual pages that you would like to have found?
In the first case, we need to be able to find part two, part three, part four within your website, and those pages also need to be indexable: they shouldn't have a canonical pointing to another page, for example, but rather they should be easily findable and indexable. So if it comes to content that is split up, make sure it's absolutely indexable.
When it comes to essentially just a matter of finding links to other pieces of content, then I think it's something which you could argue both ways, in the sense that, with a lot of websites, the content is so well cross-linked that we don't necessarily need to see page five of a category page. So maybe you'll say page five of the category pages should be noindexed, or you can use the rel canonical pointing back to the first page of the set.
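As a sketch, the two options mentioned here for a deep category page could look like this in the page head; example.com and the URL pattern are placeholders, and you would pick one option, not both:

```html
<!-- on https://example.com/category?page=5 -->

<!-- Option 1: keep deep paginated pages out of the index entirely -->
<meta name="robots" content="noindex">

<!-- Option 2: suggest consolidating on the first page of the set -->
<link rel="canonical" href="https://example.com/category">
```

Note that Google treats rel canonical as a hint rather than a directive, so option 2 is a suggestion that Google may or may not follow.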
Those are all things that you can do to reduce the number of pages that need to be crawled from a website, so that's kind of the thinking there. And with regards to crawl budget, I would really only look at this if you're in a situation where you have hundreds of thousands or millions of pages; then it's more a matter of crawl budget, where we say, well, we're running out of time.
We can't actually crawl everything here. If you're in the order of thousands or tens of thousands of pages on a website, then, from our point of view, we can deal with that. It's a lot, but it's not an excessive amount, and if your server is up to it, maybe we'll even crawl it all in one day. But if you have hundreds of thousands or millions of pages, or hundreds of millions of pages, then you really need to make sure that crawling is as efficient as possible.
So those are kind of the aspects I would look at there.
F: Beautiful. Thank you, John.
A: Sure. Paulina?
G: Hi, so I want to go back to hreflang.
G: I posted a question on the community tab. We have a scenario where we have a huge global website where we have to set up hreflang in a sitemap, because HTML is just not feasible, and I found best practices for where you have hreflang in the HTML, where you have separate mobile and desktop pages: the mobile pages point to mobile pages, the desktop pages point to desktop pages. How do we do this when we're setting up the page references in a sitemap?
G: It's... so we don't have them separately. We have annotations set up for some of the websites, but not for all. So half of them are properly set up, half of them are not yet properly set up.
G: We have m-dot, and we also have a different site with the /sp. So we have a couple.
A: Wow, okay, that sounds complex. Yeah, so I think, in a case like that, what I would try to do first of all is map out the whole site, get all of the URLs, and figure out how they're connected. Essentially, our guidelines with regards to sitemaps for m-dot pages are such that you don't need to list them in a sitemap file. But I think in your case, where you have hreflang annotations that you want to apply to the m-dot pages, I would absolutely list them.
So it's not that there is any downside to listing the m-dot pages; our recommendation is essentially just based on the fact that you don't need to, that you can save time by skipping it. But if you really want to apply annotations for those m-dot pages, then I would list them in the sitemap file, and then, on an individual basis, you can also include the hreflang annotations between the m-dot versions.
G: Okay, so, technically, with this... because I don't think we can have, like, an m-dot something where m.example is the English page and then the www example is also the English version. So we can't set it up like that. So it would be the m-dot with an annotation, plus an hreflang?
A: Yes. So you would list the m-dot in the sitemap file, and then you have, I think, that separate block with the hreflang annotations in the sitemap file, and on a per-URL basis you would list the alternate m-dot versions. So the English m-dot page would point at the French m-dot page and the German m-dot page, things like that.
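A minimal sketch of what such a sitemap entry could look like, using m.example.com as a placeholder: the hreflang annotations go into `xhtml:link` elements inside each `<url>` block, and each language version gets its own entry listing the full set of alternates.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://m.example.com/en/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="https://m.example.com/en/"/>
    <xhtml:link rel="alternate" hreflang="fr" href="https://m.example.com/fr/"/>
    <xhtml:link rel="alternate" hreflang="de" href="https://m.example.com/de/"/>
  </url>
  <!-- the /fr/ and /de/ m-dot pages each get their own <url> entry
       with the same three alternates -->
</urlset>
```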
A: Good luck.
G: Thanks.
A: Joy?
H: [question inaudible in the recording]
A: Probably not. I think, in general, that would be fine. I would kind of see that as a way of, I don't know, recognizing that Googlebot doesn't actually have an ad blocker installed.
So it's kind of a unique setup that Googlebot has with regards to rendering pages, and I think that would kind of be okay. With regards to cloaking, the cloaking team mostly tries to watch out for situations where you're really showing something different to users than to Googlebot; and with regards to ad blocking, or, I don't know, other kinds of things where you have to be logged in to actually see the content, that's kind of different.
So, I don't know, I'm not completely a fan of all of these anti-ad-blocking setups, but if you need to do it, then I think that's an appropriate approach.
If it's an HTML overlay on top of the existing page, then I don't see that as being problematic, because we would still see the actual content in the HTML, kind of behind that. That's similar to if you have something like a cookie banner or a cookie interstitial, where you're essentially showing just an HTML div on top of the page. From our point of view, if we can still index the actual content from the page, then that's fine.
H: Okay, awesome. And then one last question, on the new product reviews update: a lot of the websites that do cover product reviews also cover deals on specific days. Do we anticipate that algorithm affecting deals-specific posts? Think of, like, a Prime Day or something like that.
A: I don't know... it's hard to say. I think the change is primarily based on really kind of review-based websites, and for all of the websites that have a significant amount of review-based content and also have other content, it's sometimes tricky to say in advance exactly how that will apply.
A: Joshie... oh my gosh, I'm probably having trouble with the names. Sorry.
I: It's an Indian name. Hi John, I have two questions for you. One is: is it necessary to optimize the whole website if you want to rank a particular page or URL, or is it okay if you don't optimize the whole website?
A: That's a good question. So, in general, we do look primarily at the page when it comes to ranking, but it's always embedded within the context of your whole website. So that's something where I would say it's worthwhile to improve your website overall, so that we actually understand a lot better how this particular page is really important, and what the value of that particular page is. In particular, things like internal linking, and kind of the understanding of the headings and the content of the pages.
All of that helps us, even if it's outside of just that one page that you care about. So if you care about one particular page on your website, then you should make sure that the rest of your website that's associated with that page is also improved as much as possible.
I: Okay, okay. My second question is: one of our competitors is using redirected expired domains. They were not even in the top 50 rankings, and suddenly they are ranking on the first page, the first SERP. So does Google ignore redirected expired domains, or does it pull down the website when it finds the use of redirected expired domains? And how much time does Google take to find that out?
A: Yeah, I think this is a tactic that is as ancient as the web: you buy some old expired website, and then you redirect it to your website. We have a lot of practice with it. So it's something where sometimes it feels like sites are getting away with it, or seeing some value out of it, but essentially, from our algorithms' point of view, we do try to recognize this and handle it appropriately.
So it's not something that I would recommend doing. I think, from a business point of view, sometimes it makes sense to buy an existing website and say we will work together as a business and have a larger client base, something like that, but that's very different from just buying an expired domain name.
I: Okay, so does Google ignore that, or does it push down the rankings if it finds it?
A: For the most part, we would ignore that. That's kind of the default approach that I see from our side: when we can recognize that something kind of sneaky is being done, and if we can ignore it, then we'll try to ignore it. If there are situations where the sneakiness is so built into the website that we can't just ignore that one part, then it's possible that our algorithms will say, oh, we don't know how to treat this website anymore, we should be more careful.
J: [question inaudible in the recording]
A: I would focus on the links that you see in Search Console. So, with regards to redirected pages, we don't see a redirect the same as a link. So if you think that... let me just back up a little bit. Essentially, what a link is for us is kind of a connection between two canonical pages, and if you're redirecting one page to a different page, then the canonical will be either the destination or the source page there.
And if, from there, you have a link to one of your websites, then essentially that link is between those two canonical versions. So, from that point of view, if you're worried about individual links, I would look at what we show in Search Console, because in Search Console we show the destination and the source of those links; that's what we would take into account, and that's what you can use for the disavow tool.
I think, for the most part, if you're looking at it on a per-URL basis like this, then you're probably looking at it in too much detail, and probably it's not something that you need to change unless you have a manual action. So if you don't have a manual action, then I wouldn't look into all of the URLs that are redirecting to your site and try to tweak things there.
J: No, I don't have a manual action, but I have a client who is running a domain marketplace, and every domain that he is selling redirects to a product page. So some of the domains that are redirecting might have many backlinks, and, for example, I noticed that the disavow tool has a limit of 50,000 URLs. My client has more than a couple hundred thousand redirecting domains. How should I handle them?
A: Yeah, so what I would do there is try to find an approach that works outside of the disavow tool, and handle it in that way. So, for example, what you can do is set up a separate site that works kind of as a jumping-off place, where all of these domains are redirecting to that particular site, and that particular site then redirects to your primary site. And then you can use robots.txt to block crawling of that site, or that page, that's in between.
So essentially you have all of these domains that you're selling, and they redirect to one central location; that location is blocked by robots.txt, so Googlebot doesn't see what happens there, and from there you redirect to your normal pages. So users, when they go to those other domains, see the redirect normally, but Googlebot essentially sees everything going to one particular place, and then it gets lost. That way you're kind of making sure that nothing from those links is affecting your normal website.
So it sounds like something kind of sneaky to do, but essentially, in a case like this, where you have a lot of redirects going to your pages and you want to make sure that they don't negatively or positively affect your website, having this kind of robots.txt-blocked jumping board in between essentially solves the problem for you.
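A sketch of this setup, with go.example.net standing in for the hypothetical intermediate domain: each sold domain redirects to go.example.net, whose robots.txt blocks all crawling, and which in turn redirects visitors on to the real destination.

```text
# robots.txt on go.example.net (the intermediate hop)
# Googlebot stops here, so none of the inbound redirects
# pass anything on to the destination site.
User-agent: *
Disallow: /
```

Regular users still follow the full redirect chain; only crawlers are stopped at the middle hop.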
A: Sure. Okay, let me run through some of the submitted questions, and I'll get back to all of you with your raised hands as well. Let's see; some of them we might have already covered, too. "We recently restructured our website to make it less convoluted, due to many subpages. Since the site is now pretty straightforward, we're not sure if including a breadcrumb navigation is still necessary."
The other aspect is the kind of UI that we show in the search results where, more and more (I don't know about more and more, but it's something that I see more often), we don't show the URL in the search result, but rather try to show some kind of breadcrumb navigation for that individual result.
So for a smaller site, maybe that's not as critical, but it's kind of a nice way to give some sense of structure for users when they're searching and finding your pages in the search results. So those are kind of the primary aspects there. If you feel you don't need breadcrumbs from a UI point of view, and you're sure that the internal linking works well within your website (you can test this with the various crawling tools that are out there),
and if you look at the search results pages and you're saying, well, these look okay to me, then I don't think you need to add any breadcrumb navigation on top of that. On the other hand, if you're saying the internal linking is not that great, or you would like to have these kinds of clear breadcrumbs in the search results, then adding the breadcrumb structured data seems like a good move.
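The breadcrumb structured data referred to here is schema.org `BreadcrumbList` markup; a minimal sketch, with placeholder names and URLs:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1,
      "name": "Home", "item": "https://example.com/" },
    { "@type": "ListItem", "position": 2,
      "name": "Widgets", "item": "https://example.com/widgets/" }
  ]
}
</script>
```

The trail should mirror the position of the page within the site, with the last item representing the current page.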
"When switching a website from Squarespace to WordPress, will this affect my Google search rankings?"
A: Yes, most likely it will, and it can be either positive or negative for your website in terms of what the final effect will be. So any time you're changing your CMS, you're making pretty big changes on your website, and sometimes people make these big changes because they want to improve things for SEO.
So it can be positive if you do them well; it can also be negative if you make this migration in a bad way. In particular, when we think about a website, we don't think just about the text on the pages. So if you copy and paste the text to the other website, that's kind of one part of a website, but the other part is everything around the website.
All of these things add up, and for us it kind of paints a bigger picture of how we should show this website in the search results. So if you're making a change by switching from one provider to a different provider, then, on the one hand, this is a chance to rethink your website and think about what you could do to significantly improve things overall.
On the other hand, it's also a chance to make sure that you're creating a checklist for yourself and saying: these are all of the things I want to make sure remain in place when I move over. So, things like: I want to make sure that all of the old URLs continue to work, or that they at least redirect; make sure that you have the right headings on the right pages, not that you end up with just your company name as a heading for every page on your website.
Think about the images as well: are you getting a lot of traffic from image search? If so, maybe you should think about images even more than you would otherwise. So there's a lot of work involved in trying to figure out what you need to watch out for with a migration like this. I think the good part is that a lot of these more mainstream CMS setups
have a lot of practice with migrations, and for the most part, if they have tools to export and import, then that'll work out fairly well. So in a lot of cases it'll probably be pretty smooth, but I would really recommend, if you care about search, make sure to read up on everything around site migrations, make a lot of checklists, and list all of the pages that you have and everything that you care about, so that,
when you make that migration, you can double-check to make sure that you really have everything covered, and that the tools that you have available actually do what they're supposed to do. Because the tricky part is, by the time you notice that something has gone wrong with a site migration,
it's often a month or two later, and then you're already kind of going down the wrong road, and fixing that is sometimes a lot harder than if you can recognize and fix it right when you're doing the migration.
"Will AMP stories be more prominent in the search results?" I don't know. My guess is they will be, at least for a while, to kind of see how things are showing up, but I really don't have any insider information on AMP stories.
"As user experience is one of the determining ranking factors, could you give us a quick insight into how user experience is measured by Google?" Yeah, I think that's super tricky. So, on the one hand, we don't have a pure user-experience ranking factor kind of thing; rather, we have individual things which kind of map into user experience, and the two parts that I thought I'd call out are, on the one hand, I think it's called the page layout algorithm.
I'm not sure what the official name is, but essentially it's the algorithm where we try to recognize when there's significant unique content above the fold on a web page. That kind of maps into user experience in terms of: when a user clicks on a page from the search results, is it just filled with ads, or is there actually content there? And that's something we do care about. So that's something that kind of maps as a ranking factor, and kind of also goes into usability a bit.
The other aspect is everything around Core Web Vitals, which is coming up, I think, next month or so, where we take into account things like speed, and usability with the Cumulative Layout Shift and First Input Delay as kind of a measure of how quickly a page is responsive, and take that into account with regards to ranking.
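One way to observe these metrics on your own pages is Google's open-source web-vitals JavaScript library; a sketch (loading it from unpkg is just one convenient option, and the metric set shown matches the library's v3 API):

```html
<script type="module">
  // log each Core Web Vital for the current visit to the console
  import {onCLS, onFID, onLCP}
    from 'https://unpkg.com/web-vitals@3?module';
  onCLS(console.log);  // Cumulative Layout Shift
  onFID(console.log);  // First Input Delay
  onLCP(console.log);  // Largest Contentful Paint
</script>
```

These are field measurements taken in the visitor's browser, which is also what feeds the Core Web Vitals reports in Search Console.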
So those are kind of the primary aspects that I am aware of that would fall into user experience as a ranking factor. But it's tricky, because it's not that we have user experience itself as a ranking factor; it's just that there's a little bit of overlap.
"As Google introduced passage ranking, does this shift importance from optimizing the page as a whole to optimizing certain sections of a page? Can a good passage ranking boost the overall ranking of a website?"
A: I don't know... so, I would still focus on individual pages. I think the passage ranking part is more for situations where you have really long pages; then it's easier for Google to recognize that this particular part of a web page is important, and to rank that appropriately.
Then this question, which I didn't quite understand: "What is the treatment that Google gives to a public URL for a whole website used for test purposes only?" So I think, for things around test websites or staging websites, for the most part we don't try to recognize them and say, oh, this is a staging website, and treat it differently. Rather, the tricky part is that sometimes we do find them in search, and then we end up showing them in search, and you as a site owner might not be happy about that.
If you want to be sure that Google does not crawl and index your staging website, then I would make sure that you're using some kind of authentication on the staging website, so that Google can't actually access the content, but you can still use it on your own to test things out; at least Google won't be able to access it. So that's kind of the recommendation there. The advantage of using authentication, versus things like robots.txt or noindex on pages, is that with authentication,
you recognize right away if you accidentally include that on your primary website. So if you push that staging website live and you leave authentication in place, then that's pretty obvious. Whereas if you push your staging website live and you leave the robots.txt block in place, or the noindex in place, then you really have to watch out for that, and it happens all the time that sites end up being blocked by robots.txt or noindex because a staging website got pushed live like this.
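As a sketch of the authentication option: HTTP Basic Auth in front of the staging host. This assumes an Apache server and a staging.example.com host, both placeholders; other servers have equivalent mechanisms.

```apacheconf
# .htaccess (or vhost config) on staging.example.com
# Basic Auth keeps both visitors and Googlebot out until they log in.
AuthType Basic
AuthName "Staging - authorized users only"
AuthUserFile /etc/apache2/.htpasswd
Require valid-user
```

Unlike a robots.txt Disallow or a noindex tag, this fails loudly if it's accidentally carried over to the live site, which is exactly the advantage described above.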
So I don't know if your question went into that area; otherwise, feel free to ask again, or ask here live.
"We're using an approved third party to collect our service and product reviews, and these are displayed on our website using a widget, but the data is actually held against our listing on the review site itself."
"How would Google feel if we copied these reviews and displayed them on our web pages, as opposed to just using the widget to display them?" So I think there are two aspects there. On the one hand, there's the whole aspect of: are you allowed to do this with your service provider or not? I don't know how those things are handled.
The other aspect is the structured data side. From Google's point of view, you can include these reviews on your pages if you essentially treat them more as testimonials rather than reviews. And with regards to structured data on the pages: you can only use the review structured data on your site, in a case like this, if you're collecting the reviews yourself, not if you're taking reviews from someone else's site and copying them to your site. So those are kind of the aspects there.
If you want to copy and paste these reviews, and you're allowed to do so, then just make sure that you're not using the review structured data for that.
"Is user experience for both desktop and mobile taken into account when it comes to Core Web Vitals as a ranking factor, or is it only the mobile experience for the user that counts?" At the moment, the plan is only to use the mobile user experience. So in the desktop search results,
A
We
plan
not
to
use
these
kind
of
core
web
vitals
with
regards
to
ranking,
at
least
for
the
moment
on
in
the
mobile
search
results.
We
you
take
into
account
the
mobile
core
web
vitals
of
a
page.
That's
why
also
in
search
console,
we
focus
more
on
the
mobile
side
of
things
with
regards
to
the
core
web
vitals,
the
other
factors
that
are
used
for
the
page
experience
ranking
element.
They
still
apply
to
both
desktop
and
mobile,
though
so
in
particular
https.
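The mobile field metrics he refers to are judged against the published Core Web Vitals "good" thresholds: LCP ≤ 2.5 s, FID ≤ 100 ms, CLS ≤ 0.1. A small sketch classifying field values against them; the helper name and sample numbers are illustrative:

```python
# Published Core Web Vitals "good" thresholds (as of this hangout):
# LCP <= 2.5 s, FID <= 100 ms, CLS <= 0.1.
THRESHOLDS = {"lcp_s": 2.5, "fid_ms": 100, "cls": 0.1}

def passes_core_web_vitals(lcp_s: float, fid_ms: float, cls: float) -> bool:
    """True if all three mobile field metrics meet the 'good' threshold."""
    return (
        lcp_s <= THRESHOLDS["lcp_s"]
        and fid_ms <= THRESHOLDS["fid_ms"]
        and cls <= THRESHOLDS["cls"]
    )

print(passes_core_web_vitals(2.1, 80, 0.05))  # True - a passing mobile page
print(passes_core_web_vitals(3.4, 80, 0.05))  # False - LCP too slow
```

Since only the mobile experience counts for this ranking element at launch, the values plugged in here should come from mobile field data, not desktop.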
A
I don't know. So I would focus on what actually provides authenticity to users, where users actually feel trust in a website that they're looking at. Obviously, if there are some specific qualifications that you have, I think it makes sense to highlight that on a page. But just taking random seals and copying and pasting them onto a website, I don't think that really impresses users, and it definitely doesn't impress Googlebot.
A
So that's something where I would focus more on what is actually acceptable to users and what makes sense there. Okay, we're running towards the end of the time here, or the end of the session. Maybe I'll just go back to more questions from you all for the moment. I also have a bit more time afterwards, so if you want to hang around a little bit longer, we can chat a little bit more as well.
A
Let me see. I think the first couple who still have their hands up, I think we chatted briefly. Maybe let's go with Ritu.
K
Good. So this week I have a few questions related to schema markup, site links, and Search Console. Firstly, I will start with FAQ schema. I have implemented FAQ schema on my web pages, but I'm not able to recognize whether it is live or not, like how it is showing in Search Console.
K
It is appearing that these pages having FAQ schema are valid, but I want to see whether it is live or not, and how it is appearing. Like when we apply review rating markup, it appears as a star rating in the Google SERP, but for FAQ schema I'm not able to recognize it. How can I recognize that?
A
I don't know if there's a simple trick to do it. One thing you can do is a site: query for your website, so just "site:" and then your domain name, or specific pages where you have this markup, and for the most part we will try to show the markup that we would show there.
A
So that could be one way to see what it might look like. The other thing with FAQ markup in general, or schema markup overall, is that we just don't show it all the time. So sometimes you need to try different queries out and see where it would show up. In Search Console, in the performance report, you can filter down by some of the rich results types.
L
Not always, yes, I have.
K
I checked the search performance report. There I have seen the search appearance option, and under it I have seen Web Light results and the valid pages that have FAQ schema. But I want to see how it will appear when I implement FAQ schema on the pages, how it will show on the Google SERP. That's what I want to see, because it is only showing that these pages have FAQ schema.
A
If you just want to see what it looks like, you can also use the Rich Results Test, and I believe it has a preview link there where you can preview what the page would look like with that type of structured data. So that might also be an option.
K
Okay, so one more question: it is related to site links. Actually, sir, I have some pages which I want to appear in the Google SERP as site links, desired pages. I built some backlinks for them, and I did internal linking also, which I have been doing for the last year, but still those pages are not appearing. What more tactics can I use so that they appear in the Google SERP?
A
Yeah, site links are something that happens algorithmically. It's not based on any particular markup on a web page, so for the most part it's not something that you can explicitly target to try to make it appear.
A
It's something where, if we recognize that there's a need for the user, when they search in this particular way, to show them more results from a website, then we'll show site links. But it's not something that you can just tweak on your website and we would show it, so I wouldn't necessarily see it as something that you would be able to optimize for. I think if we do show site links for a particular page, then that's something where you can think about: are these titles okay, is the right selection shown?
K
We don't have any. So one option that is coming in search appearance is Web Light results, and I'm not getting what exactly that is. Can you explain Web Light results to me?
A
Web Light, yes. So Web Light is a system that we have in place, which is active I think only in a handful of countries, where we see that users with very limited devices and very limited bandwidth are using the web to search, and where we can recognize that it makes sense for us to show them a version of a page that is much lighter, so that it is much faster for them to interact with. And that is essentially what is shown in Search Console there.
A
So we have a Help Center page on Web Light. I would double-check that; it also has information on how to opt out, if you say, I don't want this to happen. It gives you information on how you can test your pages, and what they would look like with Web Light. So depending on your site and where your users are, you might see that in Search Console, or you might not see it at all. But I would check out the Help Center page for Web Light.
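The opt-out that Help Center page describes is, to my knowledge, the HTTP `Cache-Control: no-transform` response header, which tells transcoding proxies like Web Light to leave the page alone. A minimal sketch of setting it; the helper function and header dict shape are illustrative, not tied to any particular framework:

```python
# Building response headers for an HTML page. Setting
# "Cache-Control: no-transform" is the documented way to opt a page
# out of transcoding services such as Web Light.
def response_headers(opt_out_of_transcoding: bool = True) -> dict:
    headers = {"Content-Type": "text/html; charset=utf-8"}
    if opt_out_of_transcoding:
        headers["Cache-Control"] = "no-transform"
    return headers

print(response_headers())  # includes "Cache-Control": "no-transform"
```

In a real deployment this header would be set by the web server or framework configuration rather than hand-built per response.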
K
If I want to target my website to multiple locations, in Search Console I have only one option; there is no option for multiple locations. What can we do in that case?
A
If you have one website that you want to use geotargeting for, for multiple locations, then you would have to just keep it global, and either not select anything in Search Console, or mark it as unlisted, I think, if you explicitly want to say it is meant to be global. But you can't take one website and use geotargeting for multiple countries.
A
What you could do, if you have country-specific content, is to split the website up, and then you can verify the individual parts separately in Search Console, depending on your domain name, and then you can do country targeting for those individually. But taking a website and splitting it up so that you have individual country versions is a pretty big step, so I wouldn't just try that out or blindly do that.
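A country split like the one he describes is commonly done with country subdirectories, each of which can be verified as its own Search Console property and given its own geotargeting setting. A hypothetical sketch of that URL layout; the domain is a placeholder:

```python
# Hypothetical layout: one site split into country subdirectories,
# e.g. example.com/in/ and example.com/uk/, each verifiable as a
# separate Search Console property with its own country target.
BASE = "https://example.com"  # placeholder domain

def country_section(country_code: str) -> str:
    """URL prefix for one country's version of the site."""
    return f"{BASE}/{country_code.lower()}/"

print(country_section("IN"))  # https://example.com/in/
print(country_section("UK"))  # https://example.com/uk/
```

Per the answer above, this restructuring is a big step; it only makes sense when the content genuinely differs per country, not as a quick targeting tweak.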
K
So may I know what the meaning of "not set" is? I have seen lots of tutorials and went through lots of articles about it, but I'm not getting exactly what "not set" means, like when in Google Analytics we see the term "not set", with traffic coming through as "not set".
A
I don't know how that works. So what might be happening is that Google Analytics doesn't see the query that is associated with a user coming to a page, but I thought they used something like "not provided" or something like that.
A
Yeah, and essentially for most of the traffic that comes from search, we don't include the query as a parameter. So that's why Analytics, while they see that the traffic is coming from Google, they don't know which query it was.
K
But for "not set" we don't have any options, like if we want to see the traffic sources for "not set", we can't see them. I don't know.
A
Okay, let me take a break here and just pause the recording. I still have a bit more time, so if any of you want to hang around more, we can get through some of the remaining raised hands as well; feel free to stay along. But otherwise, thank you for joining, thanks for all of the questions so far, and I hope this was useful for you all. Maybe we'll see even more people, I guess, in one of the future hangouts, if you'd like to join. All right, and with that, I'll stop here.