From YouTube: English Google SEO office-hours from February 12, 2021
Description
This is a recording of the Google SEO office-hours hangout from February 12, 2021. These sessions are open to anything webmaster-related, like crawling, indexing, mobile sites, internationalization, duplicate content, sitemaps, Search Console, pagination, multilingual/multi-regional sites, etc.
Find out more at https://developers.google.com/search/events/join-office-hours
Feel free to join us - we welcome webmasters of all levels!
B
I know Danny Sullivan said it's going to look exactly like any other snippet. It's not going to look like a featured snippet, it's not going to look like a weird snippet - even though you guys had an image of a snippet looking different in the tweets about it.
B
One, can you confirm that? And two: are the pages ranking in snippets using the scroll-to-text feature that's in some featured snippets and other elements? Those two questions.
A
With regards to the scroll-to-text part, I believe that's something we're still experimenting with. So it's not necessarily that it will automatically use that, or that it will always use that or not, but my understanding is that's still something where we're not 100% sure how it will be embedded in the long run.
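For context, the scroll-to-text feature John refers to works through text fragment URLs. A minimal illustration - the URL and quoted phrase below are made up, not from the discussion:

```html
<!-- A text fragment appended to a URL asks supporting browsers to scroll
     to and highlight the first occurrence of the quoted text. -->
<a href="https://example.com/article#:~:text=featured%20snippet">
  Jump straight to the highlighted passage
</a>
```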
A
I don't think there would necessarily be a big difference there. I did see some tweets from people saying, "Oh, we're starting to see this more in Search Console since you launched this," and I don't know if that's more of a coincidence or if that's directly tied to it or not.

B
Okay. In summary: you don't know, I don't know.

A
Yes, all right.

B
Thank you.

A
Not even "it depends," yeah. All right, other questions? Maybe I'll know some answers along the way.
D
I can ask a quick one, if that's okay. Hey John.

A
Okay, go for it.

D
So this is related to the unavailable_after robots value, or tag. I'm curious - we're working with a classifieds site where that tag would definitely be helpful in order to stop Google from retrying those 404 expired ad pages. It's just that we're concerned that a lot of users are able to refresh their ad.
D
So in that case it won't expire on the date it's supposed to expire. They might refresh it for another week or even more, and they might keep refreshing it if they choose to. Is the unavailable_after option still a good one? Would updating it make Google understand that the date keeps moving forward into the future?
A
Yeah, I mean, if we refresh that page for crawling and indexing, then we would see the updated unavailable_after meta tag, so that would work. It's not like you can only specify it once and then it's cemented like that. If we recrawl that page and see the new tag, we'll take that into account. So I guess the situation would be kind of tricky.
D
So is there any way to make sure that doesn't happen? I mean, would using the sitemap and the last-modified date help, if we choose - is that available?
A
Okay, sure, yeah. I mean, it's something where you could change the last modification date and say it changed now. It's not guaranteed that we would recrawl the page right away, but it's one of those signals you can use to tell us, "This page has changed - double-check to make sure you're not missing anything."
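The sitemap signal John describes looks roughly like this in practice - a minimal sketch, with a hypothetical ad URL:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- Bump <lastmod> whenever the ad is refreshed, so Google has a
         signal to double-check the page; recrawling is not guaranteed. -->
    <loc>https://example.com/ads/12345</loc>
    <lastmod>2021-02-12</lastmod>
  </url>
</urlset>
```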
D
But what happens if I set the unavailable_after tag for tomorrow, let's say, and the crawler knows that from tomorrow going forward it should not crawl the page anymore - and let's say a week from now the user decides to reactivate their ad. Will the last modification date in the sitemap make Google understand that, well, maybe I should recrawl that page because it seems it's live again now?
A
I mean, the unavailable_after meta tag doesn't say not to recrawl the page afterwards. It's more that the site owner is saying this will probably be a noindex or a 404 afterwards. So it's more that Google can drop it out a little bit faster without needing to recrawl it. It's not a sign that we would stop refreshing that page, okay?
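For reference, the unavailable_after rule being discussed is set in a robots meta tag - the date below is an arbitrary example:

```html
<!-- Tells Google the page can be treated as gone after this date,
     roughly as if it became noindex/404. It does not stop recrawling. -->
<meta name="robots" content="unavailable_after: 2021-02-19">
```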
D
But it will still help with the kind of overall crawl budget - making Google slow down on recrawling these pages, okay.
A
The idea is that we would drop it even without needing to recrawl it. So our assumption is that if you have the unavailable_after meta tag set to a date, that page is no longer available afterwards, and it will be noindex or 404 afterwards. So our systems - I think, the way that I understood it - are essentially going to treat it as if there is a noindex on there after that date, even if we don't recrawl it by then.
A
Yeah, I mean, it's something where, if you have a seasonal page that has a limited lifetime, then you can definitely use that - especially to let us know about things that expire quickly, to let us know about kind of the long-tail content that we don't recrawl that often. That's really useful. Classified sites, for instance, I think are a really good example of that. For seasonal content itself - like if you have, I don't know, a Christmas page...
A
Usually that's going to be very prominent within your site during that season, so we're going to recrawl it quite a bit, and if you add a noindex to that page at some point, we'll pick that up fairly quickly. So usually for that kind of seasonal content the unavailable_after meta tag isn't really critical - it's really for the content that we don't recrawl that frequently.
A
No, it's totally optional to have a robots.txt file. If there is no robots.txt file, there are essentially no restrictions for robots. So that's a perfectly fine setup.
F
And
the
next
question
is
about
blog
post
category
and
blog
post
tag,
the
post
category
and
post
track.
Does
it
have
any
impact
on
ranking
of
the
blog
post.
A
Not necessarily. It's not that we would try to recognize tags on a page, but these are links, and potentially they go to kind of a category page or tag page, and that would be another page that we could index, or that we could use to pick up links to your articles. So it's not that there's any inherent magic around tags - it's just that it creates more links and more pages within your site.
F
Thank you.

A
Okay, let me run through some of the submitted questions. We'll definitely have more time for your live questions as well, and if you have any comments or, I don't know, more questions about the questions that we go through, feel free to jump in.

"A large site has legacy code that generates parameters on internal links. These parameters are unique for every session. If the site serves Googlebot these pages with the parameters stripped, would that be considered cloaking?" Technically, yes, we would consider that to be cloaking.
A
It's essentially something where you're kind of providing an optimization for Googlebot that you're not providing for users. And it's more a matter of making it hard for yourself to maintain your site, because you always have to look at both versions, and if you never see the version that Googlebot sees, it's very easy to run into a situation where suddenly Googlebot gets error pages, or Googlebot gets bad links - and every time you check that with, maybe, a local crawler, you don't see those broken links.
A
So
that's
something
from
kind
of
a
best
practices
point
of
view.
I
try
to
avoid
that,
but
it's
not
that
you're
going
to
get
flagged
or
get
a
manual
action.
Because
of
that,
if
a
niche
news
portal
is
moved
to
a
subdomain
of
a
larger
general
news
portal,
can
its
visibility
be
affected
by
the
main
domain?
A
So,
for
example,
before
the
migration,
both
sites
rank
for
top
positions
on
a
lot
of
topics
now
many
times
only
the
main
domain
portal
is
able
to
reach
top
positions
and
the
first
page,
while
the
subdomain
has
dropped
considerably
so
anytime.
You
do
this
kind
of
migration,
where
you're
kind
of
splitting
a
site
into
separate
parts
or
you're
taking
multiple
sites
and
combining
them
into
one
one
main
site.
A
I
would
expect
to
see
fluctuations
with
regards
to
kind
of
ranking
in
general
kind
of
temporary
fluctuations,
but
also
long-term
fluctuations,
and
it
can
be
that
the
overall
result
is
something
that
is
much
stronger
than
the
individual
ones
before
it
can
also
be
that
the
overall
result
is
kind
of
a
mixed
bag
of
things
and
causes
problems
in
the
long
run.
So
that's
something
where
it's
it's
really
hard
to
determine
ahead
of
time.
A
What
the
actual
effect
will
be
when
you
go
into
the
situation
of
merging
or
splitting
sides
so
from
from
a
theoretical
point
of
view.
Yes,
it
can
happen
like
this.
It
is
something
where
you
can
put
in
time
and
get
some
help
from
experts
and
more
experienced
folks
to
see
like.
Should
you
expect
problems,
or
will
this
probably
be
okay,
but
it
is
different
from
the
general
site
move
situation
where
you
just
move
one
site
to
a
different
domain,
you're
taking
everything
from
one
site
and
just
passing
it
on
to
a
new
one.
A
As
far
as
I
know,
we
don't
do
anything
with
kind
of
audio
versions
of
content.
We
also
wouldn't
see
that
as
duplicate
content.
So
it's
not
that
you
have
to
avoid
that.
I
mean
duplicate
content
itself,
isn't
really
something
you
really
have
to
avoid,
but
even
if
you
kind
of
wanted
to
avoid
the
situation
that
you're
suddenly
ranking
for
the
same
things
with
different
pieces
of
content,
the
audio
version
is
something
that
we,
as
far
as
I
know,
would
not
even
process
separately.
A
So
at
most,
we
might
see
that
as
a
video,
a
piece
of
video
content
and
show
that
also
with
a
video
snippet,
but
essentially
it
wouldn't
help
or
detract
from
a
page's
overall
ranking.
G
John,
wouldn't,
wouldn't
it
show
us
if
they've
embedded
an
audio
file,
for
example,
to
play
on
the
page?
Doesn't
that
do
anything
I
mean
if
you've
got
a
page
that
literally
just
has
text
versus
a
page
that
has
text
pictures,
video
audio
and
therefore
provides
more
variety
and
depth.
A
I
I
don't
think
we
we
would
look
at
that
and
say:
oh,
there
are
different
kinds
of
content
here,
it's
it's
a
better
page
because
of
that
it
might
be
that
they're,
indirect
effects
like
if
users
find
this
page
more
useful
and
they
recommend
it
more,
that's
something
that
could
have
an
effect.
But
it's
not
the
case
that
we
look
at
the
types
of
content
on
a
page
and
say:
oh
two
types
versus
five
types,
like
the
one
with
five
types,
is
better.
A
I
think
it's
a
bit
different
with
video
and
images
in
that
images
and
video
themselves
can
rank
independently,
like
in
in
image,
search
or
in
video
search.
You
can
also
have
the
same
piece
of
content
be
visible
in
those
other
surfaces,
but
for
audio.
We
don't
really
have
a
separate.
I
don't
know
audio
search
where
that
page
could
also
rank.
I
I
think
the
closest
that
could
come.
A
There
is
the
the
podcast
search
that
we
have
or
the
podcast
kind
of
one
box
thing,
but
that's
really
tied
to
kind
of
the
podcast
content
type
where
you
have
a
feed
of
podcast
information
and
we
can
kind
of
index
it
like
that,
but
just
having
audio
on
a
page
by
itself.
I
don't
think
that
would
change
anything
automatically
in
our
systems.
A
Okay. "A website has a small snippet..."

C
Sorry - go ahead. Oh, hi, sorry, I have a question about mobile-first indexing. I'm working on a site that just went through a site move - I actually spoke to you about it a couple weeks ago - but basically it's just a change of domain name, so it didn't change the URLs or site architecture. The old site was being indexed by the mobile Googlebot, but now it's being indexed by desktop and it hasn't switched over yet. It's only been a week, actually, since it launched.
C
So
I
I
know
it
can
take
a
bit
of
time,
but
I
was
wondering
if
there's
maybe
a
sandbox
effect,
because
the
domain
itself
was
pre-existing
and
it
was
kind
of
a
park
domain,
but
it
was
a
site
that
was
probably
pretty
previously
indexed
by
google.
C
It's
currently,
according
to
the
crawl
stats
report
in
search
consoles,
currently
being
crawled
96
of
the
time
by
smartphone
googlebot.
So
I'm
wondering
if
like
how
long
I
guess
might
we
expect
for
it
to
switch
over
to
mobile
first
or
if
there
is
potentially
a
sandbox
effect,
because
it
was
previously
a
site
being
indexed
by
desktop.
A
So
I
I
wouldn't
worry
about
the
mobile
first
part
for
for
something
like
that,
because
we
we
kind
of
have
the
the
timeline
set
for
switching
everything
over
to
mobile
first
anyway.
So
that
will
happen
in
I
don't
know
what
is
it
in
march
or
april?
I
don't
know
what
timeline
we
had
there.
A
So
that'll
happen
anyway,
but
with
regards
to
kind
of
moving
to
a
previously
existing
domain,
where
there
was
part
content,
you
can
definitely
see
some
temporary
effect
there,
not
so
much
in
terms
of
kind
of
like
a
sandbox
effect
or
something
like
that,
but
more
in
terms
of
if
we've
always
seen
a
no
index
page
on
this
site
for
the
longest
time,
then
probably
we're
going
to
assume
that
it's
still
no
index
for
a
while-
and
you
might
see
kind
of
this-
I
don't
know
moment
where
is
for
I
don't
know
I've
seen
it
happen
for
maybe
a
week
or
two,
maybe
up
to
three
weeks
where
it's
just
our
systems,
assume
that
this
is
still
a
part
site
and
essentially
treat
the
new
content.
C
Okay, yeah, we are starting to see some recovery with rankings and stuff that had dropped in the last week. But I was just wondering if there's any sort of negative impact from being indexed by desktop, given that we're using a responsive design and it's the same architecture as the previous site.
A
No,
no,
that
that
definitely
wouldn't
be
the
case.
So
it's
not
that
there's
any
any
kind
of
ranking.
I
don't
know
penalty
or
anything,
holding
a
site
back
if
if
it
was,
if
it's
being
crawled
with
a
mo
with
a
mobile
or
with
a
desktop
crawler,
that's
really
just
a
technical
thing.
On
our
side,
are
we
crawling
with
mobile
or
desktop
and
then
essentially
the
same
content
goes
into
the
index.
C
Okay,
yeah,
because
I'm
seeing
something
that
I'm
not
sure
if
it's
related,
which
is
I'm
getting
some
amp
warnings
for
domain
mismatch
and
it's
it's,
it
seems
to
be
like
sometimes
the
old
version
of
the
new
ever
the
new
version
of
the
site
is
being
has
the
amp
page
indexed,
but
the
old
page
hasn't
been
recalled
yet
since
the
move.
So
I
guess
maybe
the
redirect
hasn't
been
spotted
yet,
and
I
was
wondering
if
that
is
because
of
the
primary
the
primary
league.
A
I
I
don't
think
that
should
be
a
problem,
because
if
we
understand
the
connection
between
your
kind
of
legacy
pages
and
the
amp
version
of
the
page,
then
we
would
crawl
those
appropriately.
So
that's
something
where
I
I
could
imagine
you
might
see
a
temporary
effect
until
all
of
that
settles
down
a
little
bit
where
maybe
we
have
the
amp
url
somewhere
linked
and
we
go
off
and
crawl
it
with
the
desktop
crawler
initially
and
then
we
realized.
A
Oh,
we
have
to
use
mobile
because
it's
amp
then
we
would
pick
that
up,
but
that's
something
that
I
would
expect
should
settle
down
fairly
quickly.
Like
I
don't
know
order
of
a.
I
don't
know
one
two
three
weeks
something
something
around
that
range.
A
All right, let's see. Okay, let me just run through some more of the submitted questions first, because we always seem to run short of time for them. We'll definitely have more time for live questions as well.

"A website has a small snippet of client-side JavaScript that updates the URL with the history API. For example, the server response shows /page and the browser address bar shows /page-updated. No resource named page-updated is seen in the network resources."
A
So
essentially,
what
would
happen
the
first
time
we
would
see
slash
page
if
it
does
the
history
api,
swapping
with
page
dash,
updated
the
next
time
we
would
try
to
crawl,
slash
page,
updated
and
use
that
as
a
version
essentially
for
for
indexing.
I
mean
it's
not
100
certain,
because
it's
essentially
a
question
of
canonicalization,
but
we
would
see
that
as
a
redirect
and
redirects
are
pretty
strong
sign
for
canonicalization.
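The setup the question describes can be sketched as follows - a hypothetical page served at /page whose script rewrites the address bar (both URLs are made up for illustration):

```html
<!-- Served at /page. The script swaps the visible URL to /page-updated
     without any navigation or network request. As John notes, Google may
     treat this like a redirect for canonicalization, so /page-updated
     should also resolve when fetched directly. -->
<script>
  history.replaceState(null, "", "/page-updated");
</script>
```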
A
The
important
part
here
is
that
slash
page
updated
is
actually
a
page
that
we
can
crawl.
So
it
shouldn't
be
the
case
that
you
kind
of
swap
out
the
url
for
something
that
is
not
existent.
If
you
go
there
directly,
so
that's
kind
of
the
main
thing
to
to
watch
out
for
I've
written
outsourced
about
600
articles
in
a
year
all
were
made
more
or
less
in
the
same
way,
still
some
rank
and
show
in
google
and
some
are
not
in
google
at
all,
even
those
that
are
more
than
six
years
old.
A
Then
I
manually
need
to
go
through
all
those
600
articles
and
make
some
changes
and
update
them,
and
then
they
start
showing
up
in
search
what?
What
should
I
do
here
essentially,
so
I
think
the
the
main
thing
to
keep
in
mind
is
we
don't
guarantee
indexing
of
all
pages
on
every
website.
In
fact,
for
most
websites
we
index
just
a-
I
don't
know
a
small
portion
of
the
total
website.
A
So
if
you
have
600
articles
on
your
website,
it
can
be
completely
normal
that
we
go
off
and
index,
maybe
100,
maybe
500,
maybe
somewhere
in
between
it.
It
would
be
unexpected
from
my
point
of
view
if
we
would
always
index
all
pages
on
all
websites,
because
there's
just
I
don't
know,
there's
a
limit
to
the
number
of
pages
that
we
can
index
in
our
system.
So
we
have
to
try
to
prioritize
a
little
bit
and
by
changing
pages
on
your
website,
it's
certainly
possible
that
you
trigger
something
where
google
starts.
A
Seeing
again,
that
these
pages
have
changed
and
goes
off
and
crawls
them
and
sees
well.
Is
there
something
important
that
we
need
to
index
here
and
then
maybe
it'll
index
it
maybe
it'll
be
indexed
for
a
while?
Maybe
it'll
drop
out
again
after
a
couple
of
weeks
or
a
couple
of
months
it.
It
really
depends
it's
not
something
where
there's
a
guarantee
that
we
go
off
and
crawl
and
index
all
of
the
change
pages,
or
that
there's
a
guarantee
of
having
kind
of
an
old
evergreen
page
will
rank
high.
Anything
like
that.
A
So
especially
if
you're
talking
about
a
large
number
of
pieces
of
content
on
your
site,
then
that's
something
where
it's
worthwhile
to
figure
out
a
system
on
your
own,
where
you
can
work
to
determine
the
quality
of
these
pages
and
to
work
to
try
to
improve
those
pages
over
time,
so
that
google
also
kind
of,
is
able
to
go
in
there
and
say
well.
These
are
really
fantastic
pages.
We
do
need
to
crawl
and
index
more
of
these
pages
and
then
over
time.
That'll
pick
up
again.
A
Let's
see
core
web
vitals
question.
We
know
that
all
user
interaction,
pages
or
urls
are
considered.
My
question
is:
does
core
web
vitals
need
to
meet
the
75th
percentile
across
all
devices
or
mobile?
Only
the
dock
says
segmented
across
mobile
and
desktop
devices,
but
it's
not
a
clear
interpretation
at
least
not
for
me.
A
I don't know - I don't think we have clearly defined exactly how the ranking boost would happen there, but essentially we would track these per device, and we'd be able to take them into account per device.
A
So
I
I
would
assume
it
would
not
be
the
case
that
if
your
desktop
pages
are
really
fast
and
your
mobile
pages
are
slow,
that
we
would
still
count
that
as
being
fast.
Overall,
I
my
assumption
is
that
we
would
try
to
separate
that
out,
but
I
don't
know
like
what
what
the
final
mix
will
be
in
terms
of
it's
like.
Do
you
just
have
to
meet
the
bar,
or
is
there
kind
of
a
flexible
range
that
you
can
reach
to?
To
kind
of?
A
Let's
see
really
long
question,
I
have
this
website
with
subdirectories
on
it
for
korean
and
japanese,
and
I
think
the
question
goes
into
a
little
bit
hr
flang.
Do
you
need
atrial
flang
how
the
links
between
these
versions
work
that
kind
of
thing
and
whether
these
these
different
language
versions,
kind
of
need
to
be
in
line?
A
So
I'm
like
super
simplifying
sorry,
but
I
I
think
in
in
general,
when
you
have
pages
in
different
languages
in
significantly
different
languages.
So
in
this
case
it
sounds
like
it's
swedish
content,
korean
content
and
japanese
content
and
that's
something
where
we
would
be
able
to
rank
those
pages
individually
and
it's
something
where,
if
we
see
someone
searching
in
japanese
we're
not
going
to
show
them
the
swedish
version
of
your
page,
because
it's
pretty
obvious
that
they're
searching
in
japanese
and
that
your
japanese
pages
would
fit
that
query
best.
A
So
I
think
that
makes
it
a
little
bit
easier
in
terms
of
the
the
whole
hf
lying
set
up
and
what
exactly
you
need
to
do
there,
but
it
kind
of
this
this
general
situation
of
if
the
the
queries
are
really
split
by
language.
A
You,
you
don't
really
need
to
to
focus
on
ahrefling,
though
it
would
be
different.
For
example,
if
you
had
something
like
a
global
brand,
where
someone
in
searching
in
japanese
would
search
with
exactly
the
same
word
as
they
would
search
in
in
swedish,
and
then
it
might
be
hard
for
our
systems
to
understand
like
this
user
in
japan,
who
is
searching
for
this
one
name,
do
they
mean
the
original
swedish
site?
A
Do
they
mean
the
local
japanese
site,
it's
hard
for
us
to
understand
and
in
a
case
like
that,
we
would
use
hreflang
to
swap
out
the
appropriate
country
language
version
for
that
user.
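For the global-brand case John describes, hreflang annotations are what let Google swap in the right version - a minimal sketch with hypothetical URLs:

```html
<!-- Each language version lists all of its alternates, including itself. -->
<link rel="alternate" hreflang="sv" href="https://example.com/sv/" />
<link rel="alternate" hreflang="ja" href="https://example.com/ja/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```

The same annotations can equivalently be supplied through an XML sitemap, as the follow-up question mentions.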
I
So I have a follow-up question.

A
Sure.

I
In the case where you have indicated hreflang appropriately through XML sitemaps, specifically related to English-speaking countries: what should one do if you see in Search Console that Google's not getting it right? We've indicated canonicals on both UK and US pages, for example, we've marked our sitemaps up with hreflang, and we're still seeing Search Console say, "Hey, we've actually selected the UK version," in the context of the US account.
A
That, I don't know, makes things more complicated in Search Console itself, in that it tries to show the canonical version. With hreflang, in cases where the same content is available for multiple countries, we can still recognize that this is actually duplicate content: we'll index one version, and we'll use hreflang to show the appropriate local URL when someone is searching.
A
So
that's
something
you
might
see
in
in
english,
where
you
have
the
same
content
in
english
in
german.
It's
it's
really
common
that
this
happens.
So
that's
something
we
we
see
quite
a
bit
here
in
europe
and
essentially
what
what
happens
is
in
in
search
console
and
the
url
inspection
tool.
If
you
check
kind
of
the-
I
don't
know,
let's
say
the
uk
version.
If
you
check
that
one
you
see,
the
canonical
shows
the
us
version,
then
you
that's
kind
of
what's
happening
there
in
that
we
understand
it's
the
same
content.
A
We
still
see
the
hreflang
they
are
so
we
can
swap
the
urls
out
in
the
search
results,
but
we
will
index
it
as
the
us
version
and
the
tricky
part
is
in
search
console.
We
report
on
the
canonicals
themselves,
so
in
the
search
results
we
might
show
the
uk
version
because
of
we
we
know
their
equivalent.
We
can
swap
out
the
urls,
but
in
the
performance
report
we
will
track
that
as
the
us
version
in
search
console.
I
So, two quick follow-ups to that. One is: I think a pretty good indicator of a geo mismatch would be looking at a geo-specific Search Console account and looking at the traffic makeup between different geos, right? So let's say, for example, I look at Canada and I see that 50% of their traffic is coming from the UK. That would pretty clearly indicate that there's some mismatch - is that appropriate?
A
The
from
from
our
point
of
view,
it's
something
where
the
the
reporting
and
search
console
is
is
confusing
in
cases
like
that
and
you
you
might
say
it
is
wrong
because
it's
reporting
on
the
canonical
but
not
reporting
on
the
one
that
is
actually
shown
but
from
from
an
indexing
from
a
ranking
point
of
view
it
it
should
be
working
out
fine.
A
If
you
need
to
have
it
differently,
then,
essentially,
what
you
need
to
do
is
make
sure
that
these
versions
of
these
pages
are
unique
enough,
so
that
we
don't
run
into
the
situation.
When
we
say.
Oh,
we
will
make
it
easier
for
you
and
just
pick
one
version
to
index.
If
we
clearly
understand
these
are
absolutely
unique
pages,
we
should
index
them
separately,
then
we'll
show
them
separately,
we'll
index
them
separately.
We'll
have
separate
canonicals.
A
There
is
no
number
or
anything
like
that,
but
yeah.
I
I
think
I
think
it's
always
tricky
in
a
case
like
that,
because,
theoretically,
from
a
ranking
point
of
view,
it's
okay,
it's
just
everything
around
tracking
is
super
complicated.
J
Hi, this is Mohammed from India. I have just stepped into the SEO industry and I have two questions for you. The first one is regarding domains and subdirectories. Suppose I have a domain called example.com and I have created four subdomains - like one.example.com, and so on, up to four. If I host a blog on all four subdomains, how will that be from the SEO perspective?
A
You
can
do
that.
I
I
think
it's
something
where
some
people
use
subdomains
some
people
use
subdirectories.
If
you
ask
the
seo
industry
overall,
you
will
get
lots
of
arguments
in
in
either
direction,
so
I
mean
technically,
you
can
do
that
from
from
google's
point
of
view
you
can
use
subdomains
or
subdirectories.
That's
that's
perfectly
fine.
A
I
I
would
first
of
all
try
to
think
about
what
what
possibilities
you
have
available
from
a
technical
point
of
view
on
your
side.
So
if
your
hoster
makes
it
really
hard
to
put
wordpress
into
a
subdirectory
and
it
has
to
be
on
a
subdomain
for
example-
then
maybe
that's
that's
kind
of
the
first
first
way
to
get
started
and
then
at
some
point
later
on,
you
can
decide.
A
J
Second,
question
is
related
to
backlinks.
I
have
heard
a
lot
about
backlinks
that
google
consider
quality
backlinks
when
it
comes
to
quality.
What
exactly
does
you
mean
about?
Does
you
mean
for
quality
quality
backlinks
and
how
google
analyzes
between
natural
and
paid
backlinks
yeah.
A
So
my
my
recommendation,
I
think,
especially
if
you're
getting
started,
is
not
to
focus
on
backlinks,
because
it's
very
easy
to
get
stuck
into
the
situation
of.
Like
you
said:
google
wants
quality,
backlinks
or
google
wants
natural
backlinks.
Therefore
I
will
make
my
backlinks
look
like
quality
or
I
will
make
my
unnatural
backlinks
look
like
they're
natural
and
it's
very
easy
to
to
spend
a
lot
of
time.
Focusing
on
that.
A
Google's
point
of
view
is
that
these
should
be
things
that
are
not
organized
by
you
that
are
not
paid
for
by
you
that
are
not
created
by
you,
but
rather
they
should
be.
Naturally
people
who
say
well.
This
is
really
cool.
I
really
like
that
similar
to
how,
if
you
make
a
website,
you
probably
have
seen
lots
of
other
sites
where
you
say
this
is
cool.
I
I
will
link
to
that.
I
will
refer
to
that
because
it's
it's
something
useful
for
my
users.
A
Thank
you
sure.
Let
me
run
through
some
more
submitted
questions
and
we'll
definitely
have
more
time
for
for
you
all
as
well.
Can
I
prevent
a
section
of
my
page
from
being
used
in
the
meta
description?
Google
is
choosing
a
component
within
the
product
information
as
a
description
of
the
page,
which
reads
really
odd
in
the
search
results.
A
Yes,
you
can
do
something
there
and,
namely
we
have
a
data.
No
snippet
http
attribute
that
you
can
apply
to
some
html
elements
within
your
pages
and
everything
that
is
within
these
html
elements
would
not
be
shown
in
the
snippet.
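The attribute John mentions is applied directly in the markup - for example (the element and text below are hypothetical):

```html
<p>This product description can appear in the snippet.</p>
<!-- Content inside data-nosnippet is excluded from Google's snippet. -->
<span data-nosnippet>Internal component details that read oddly in results.</span>
```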
A
But before jumping off and trying to block certain parts of your pages from appearing in the snippet, I would double-check that this is actually something that people are seeing, and not just something that only you are seeing while you're trying to diagnose your pages.

"Will the .gq ccTLD get AdSense approval?" I have no idea - I don't know anything about this TLD, and I don't know about the AdSense approval process. So, no idea.

"How to increase organic traffic on new blogs?"
A
I
I
don't
think
I
have
like
a
one
sentence
answer
for
that.
What
I
would
recommend
doing,
maybe
is
checking
out
the
seo
starter
guides.
We
we
have
one
that
we've
worked
on
for
quite
a
while
there's
some
from
some
some
of
the
bigger
other
companies
out
there
as
well.
A
How
can
a
team
technically
improve
seo
when
the
url
was
previously
associated
with
malware
malware?
The
url
was
changed
to
get
away
from
that,
but
still
running
into
issues
with
visibility
having
our
content
show
on
google
news.
So
I
don't
know
about
about.
A
Google
news,
but
essentially
malware
is
something
that
usually
someone
someone
who,
like
hacked
your
website,
added
to
your
pages
and
provided
some
bad
content
for
people
when
they
go
to
your
pages
that
maybe
didn't
text
them
with
a
virus
or
something
like
that
and
when
it
comes
to
malware,
that's
something
that
is
generally
flagged
by
the
safe
browsing.
A
Safe
browsing.
I
don't
know
this
is
safe
browsing,
I
don't
know
safe
browsing
team.
I
think
and
would
be
shown
as
a
warning
in
the
search
results
and
in
the
browser
as
well
and
essentially
when
that
is
resolved.
So
you
can
also
see
that
in
search
console
and
you
can
fix
that
on
your
site
and
we
will
automatically
recrawl
your
site
every
now
and
then
to
check
if
it's
fixed
and
if
it's
fixed,
then
we
will
remove
that
warning
and
everything
will
go
back
to
normal,
also
in
search
console.
A
So
there's
nothing
that
is
kind
of
for
the
longer
term
holding
back
a
site
if
it
was
hacked
with
malware
at
one
point,
usually
that's
something
that
jumps
back
like
completely
for
kind
of
like
on
and
off
switch
right
away.
A
That
said,
if
a
site
is
hacked
and
is
hacked
in
ways
other
than
just
malware,
for
example,
if
someone
adds
a
lot
of
content
to
your
site
that
doesn't
belong
to
your
site
or
if
they
add
phishing
pages
to
your
site
or
if
they
add
hidden
content
on
your
pages,
maybe
links
to
other
hack
pages
or
anything
around
that,
then
that's
something
we.
We
also
try
to
catch
as
hacked
content,
and
we
try
to
ignore
that
as
much
as
possible.
A
It
doesn't
have
to
be
the
case,
but
it's
it's
possible
and
in
in
a
case
like
that,
we
would
start
ranking
your
site
with
that
in
mind
and
if
you
fix
that
which
which
I
hope
you
do
find
all
of
those
things
and
are
able
to
fix
that,
then
over
time,
we
will
see
that
as
well
in
the
content
that
we've
recrawled
and
re-indexed
over
time.
A
But
that
is
something
that
is
less
like
the
kind
of
the
the
real
malware
where
it's
like
a
virus
on
a
page
kind
of
a
thing
where
it's
on
and
off,
but
more
something
where
over
time.
If
that
content
is
on
your
site
for
a
longer
period
of
time,
we'll
associate
that
with
your
site
and
if
you
remove
that
it
takes
a
bit
of
time
for
us
to
kind
of
de-associate
your
site
from
that
to
recrawl
all
of
these
pages
and
recognize.
A
So that's something where you need to make sure that you don't just remove the content, but rather you also close the hole they got in through. And for both of those things I would almost recommend getting help from someone who's experienced with hacked sites - which could be an expert who has worked on these for a longer period of time; there are various people who have done that.
A
It
could
also
be
as
a
first
step,
going
to
the
search
central
health
forums
where,
where
the
folks
there
are
very
experienced
with
websites
as
well,
who
may
be
able
to
find
some
of
these
issues
and
help
you
to
clean
out
any
of
the
remaining
hack
content,
because
it's
it
can
get
really
tricky
in
that.
A
It
might
be
that
there's
still
some
hack
content
kind
of
lingering
around
that
you
don't
see
when
you
look
at
it
with
your
browser,
but
that
google
would
see
when
it
crawls
your
website
and
getting
some
tips
and
help
on
that
can
be
quite
helpful
and
speed
things
up
a
little
bit:
oh
wow,
okay,
still
more
people
joining
okay,
we're
kind
of
getting
into
the
the
last
10
minutes.
A
I
I
have
a
bit
more
time
to
hang
around
if
any
of
you
want
to
stick
around
longer,
but
maybe
we'll
go
through
some
of
the
local
things
here
I
see
some
of
you
are
raising
your
hands
so
I'll.
Just
go
through
the
list,
as
as
I
have
it
there
that
four,
I
think
if
I
got
your
name.
K
I
don't,
I
just
have
a
couple
of
questions
short
ones.
Hopefully
so
corvital
said
that
field
data
is
more
important
than
lab
data
right.
So
what
happens
if
we
fix
our
vitals
10
days
before
the.
A
So if you need a couple more weeks before everything is propagated and looks good for your users as well, then take that time, get it right, and try to find solutions that work well for that. The tricky part with the lab and the field data is that you can incrementally work on the lab data, test things out, and see what works best, but you still need to get that confirmation from users as well with the field data.
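As an illustration of where that field data comes from: the Chrome UX Report (CrUX) API exposes aggregated real-user metrics per origin. This is a sketch under my own assumptions, not something from the session; the endpoint and field names should be verified against the CrUX API documentation, and an API key is required for real calls.

```python
import json

# Public CrUX endpoint, to the best of my knowledge; verify against the docs.
CRUX_ENDPOINT = "https://chromeuxreport.googleapis.com/v1/records:queryRecord"

def build_crux_request(origin, form_factor="PHONE"):
    """Build the JSON body for a CrUX queryRecord call for one origin."""
    return json.dumps({"origin": origin, "formFactor": form_factor})

# Usage sketch: POST build_crux_request(...) to CRUX_ENDPOINT + "?key=YOUR_KEY".
# The response aggregates roughly the previous 28 days of real-user data,
# which is why a fix takes a few weeks to be reflected in field metrics.
```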
A
Not preferences directly, but in the image search guidelines we do recommend that you use useful image names, specifically for image search. So instead of just using a number, maybe use, I don't know, some words describing the image as the image file name.
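As a small sketch of that advice (my own illustration, not anything from the session): turning a human description of an image into a descriptive file name, instead of a bare number like IMG_4821.jpg.

```python
import re

def descriptive_filename(description, ext="jpg"):
    """Lowercase the description, keep letters and digits, join with hyphens."""
    words = re.findall(r"[a-z0-9]+", description.lower())
    return "-".join(words) + "." + ext

# descriptive_filename("Golden Retriever puppy in the snow")
#   -> "golden-retriever-puppy-in-the-snow.jpg"
```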
K
Yeah, you would describe what's on the picture, actually, so that would be fine, I guess. Is there any reason why our English article shows up in Discover in the United States, but our Spanish one doesn't show up in Discover in Spain?
A
I don't know the exact triggering there from a Discover point of view, but I could imagine that we might look at things differently depending on the language. It's all kind of tricky with regards to Discover, though, because it's so much tied to what we see that users prefer to see in Discover.
K
Yeah, that's tricky. Okay, thanks. And the last one: if a page is, not two years, but they hold pretty good positions, what will happen to them? They're not actually doing anything, and we, you know...
A
M
Yes, hello, I have a few questions for you. So first of all, we were discussing this two weeks earlier: if Google sees a newer, more recent article for a topic, let's say a review or a product or whatever, does it mean that it prefers the fresher page over maybe the higher-quality one?
A
Not necessarily. We do try to take, I don't know, the age or the freshness of content into account for some queries, but it's not the case that that would always apply, and it's something that can also change over time. If, for example, something happens in the news in one location, then suddenly fresher content for that location will be more relevant for users, whereas if nothing has happened there for a while, then maybe the more stable reference content for that location would be relevant.
M
That's what I was hoping for, yes. But we have a few examples of reviews that we think are higher quality on our page than they are, you know, on others that rank higher, a lot higher than ours, and it's, you know, frustrating.
A
I don't think you can really do that. I mean, what you can always do is give us information about these kinds of queries where you think the results are bad, but when talking with the ranking team about these kinds of issues, they really need to see that it's really significantly bad. So not like, my page is just as good at number five as the page at number four.
A
What you could do is either post in the help forum; that's one way to do that. The folks in the help forum are able to look at that and give you some tips as well, and if there's something that is really weird, then they're able to escalate that to Googlers as well.
M
Okay, perfect. I thought it was only the community. Okay, and then I have another one: does it matter if we have share buttons or links to the social networks of our site? Does Google take this into account, maybe the like counts or, you know, the community on other social networks? No? No. Okay. I don't know if you can speak about Analytics.
M
Okay, but we noticed that we had some issues with Analytics for like a year, and I was asking, I mean, does Google Analytics filter inputs like hits and page views from other domains? Like, let's say our code is on someone else's domain; does this maybe change our statistics somehow? I mean, I would say that it should be, right now.
A
I actually have no idea how that is handled. I vaguely remember that there is some way to set that up, but I have no idea. I would double-check in the Analytics help forum; the folks there have definitely heard that and might have a tip for you there. Okay.
N
Yes, hey John, quick one about the home activities schema: any idea if that has rolled out outside of the US, or plans to do so?
A
I have no idea.
N
Yeah, the documentation still just says U.S. I'm assuming it is, but yeah, you'd like it in Canada, too.
A
I don't know, maybe you have to move. I don't know. Usually the folks that are handling the documentation of all of the structured data stuff are pretty on top of changes like that, but sometimes the teams also make changes without telling the bigger teams at Google, and then it's suddenly live in other locations and we didn't hear about it yet. But we try to keep that in sync as much as possible.
N
Okay, cool, thanks.
A
Cool. Muhammad?
J
Hey, hi John, this is Hamid from India. I have two questions, and I don't know whether this platform is the right platform to ask, but it's related to AMP Web Stories indexing on Google. So we have two or three websites: one is in English, obviously, and then regional sites. But what we have seen is that the English articles get indexed very soon and we get the results as well, but the regional languages don't pick up or don't come up on Google Search, and we don't get the page views. So is there anything to do? The question is whether there is a preference in languages, English compared to regional languages. We have now tried to put English text as well in the regional languages, but it's not working. So are there any preferences?
A
I don't know. Especially around Web Stories, I think there are two aspects that might be playing a role there. On the one hand, we show Web Stories in kind of that unique UI in the search results, but that's only in certain countries, and we also show them in Discover. And especially the first one, Web Stories in the search results with that special UI...
That's something that depends on whether or not that UI is shown in your country. For example, I think in Switzerland it's still not shown, so if someone were to make Web Stories specifically for Switzerland, that would be kind of awkward, because most people here wouldn't see them. The other part, in Discover, is probably where you're seeing similar things.
It's the same as the question a while back with regards to English and Spanish content in Discover, where these are essentially just different, I don't know, completely different environments or ecosystems in different languages. It's not automatically the case that if something in English is shown in Discover for users in English, any localized version would automatically be shown similarly. Those are essentially completely different things.
I don't know about the languages in India, but it's very likely that users interact with Discover differently across different languages, where maybe in English they look at Discover more, they see it more often, and your articles get more views and more clicks there, and maybe in some other languages they just don't use Discover as frequently, and therefore your pages don't get that many views or clicks there.
So it's not so much about trying to make your non-English pages look like they're English as well, because then you'd just be showing them to people who are trying to find English content, but rather continuing to build out your non-English content. And especially if you know that the Web Stories are not shown in the regions that you're targeting, then maybe it's worthwhile to say, I will focus more on the regions where I know it's shown for the moment, and then at some later point, when you know that it is shown in those regions, put more energy into those versions as well.
J
Okay, okay, thank you. Thank you so much.
A
Sure. Let me just pause the recording here. I'll stick around a little bit longer; it looks like there are still lots of questions left and lots of raised hands, so we can continue on a little bit. But thank you all for your questions so far. I hope you found this useful and insightful and that it was a reasonably good use of your time, and hopefully I'll see some of you again in one of the future hangouts as well. All right.