►
From YouTube: English Google SEO office-hours from July 2, 2021
Description
This is a recording of the Google SEO office-hours hangout from July 2, 2021. These sessions are open to anything search & website-owner related like crawling, indexing, mobile sites, internationalization, duplicate content, Sitemaps, Search Console, pagination, duplicate content, multi-lingual/multi-regional sites, etc.
Find out more at https://goo.gle/seo-oh-en
Feel free to join us - we welcome folks of all levels!
Google Search Central YouTube survey
Take the survey at https://goo.gle/Search-Central-YT-Survey
A
All
right
welcome
everyone
to
today's
google
search
central
seo
office
hours
hangout.
My
name
is
john
mueller.
I'm
a
search
advocate
on
the
search
relations
team
here
at
google
in
switzerland
and
part
of
what
we
do
are
these
office
hour
hangouts,
where
people
can
join
in
and
ask
their
questions
all
around
web
search
and
their
website
wow
looks
like
a
bunch
of
people
are
still
jumping
in,
but
maybe,
as
always,
maybe
we
can
get
started
with
some
first
live
questions
and
then
we'll
go
through
some
of
the
submitted
questions
on
youtube
as
well.
B
Thank
you,
john
hi,
so
we
are
working
on
a
website
that
uses
href
lang
and
I've
been
trying
to
figure
out.
What's
going
wrong
because
they're
ranking
in
france,
with
their
belgian
pages
they're
ranking
in
france
with
some
luxembourg
pages,
and
I
was
trying
to
figure
out
what
was
wrong.
So
one
of
the
things
I
found
out
is
that
they
didn't
make
a
general
language
page,
so
there's
no
main
fr
page.
So
that
was
the
first
suggestion
I
made.
But
I
was
wondering
since
they're
currently
focusing
on
germany
alone.
B
Would
they
need
to
have
a
d
e,
then
d
e
again
or
is
just
d
e
enough?
If
they're
just
focusing
on
germany,
just
d
e
would
be
enough.
B
Okay,
but
so
it's
it's
they're
selling
in
germany
alone,
it's
not
austria,
not
switzerland,
so
I
was
wondering
whether
they
need
the
dede
as
well
or
if
it's
just
the
e,
when
it
doesn't
matter.
I.
A
C
A
So
it's
it's
not
that
it
would
rank
higher
kind
of
in
germany
because
of
that,
but
more
it's
like
well!
If
someone
is
searching
for
the
brand
name,
for
example
where
we
might
have
the
french
page
and
the
german
page,
and
they
could
be
equally
relevant
because
from
the
brand
name
alone,
we
might
not
know
which
language
version
then
we'd
be
able
to
swap
to
the
german
language
version
for
users
who
have
the
their
setup
in
german.
D
Hi
mr
muller
hi,
I'm
the
director
of
its
search
engine
in
german,
so
we
are
trying
to
fight
issue
with
the
twitter
embeds
on
optimizing
for
core
web
vitals
and,
as
you
know,
it
causes
a
lot
of
troubles
like
content
layout
shift,
first
of
all.
D
Secondly,
it
does
start
loading
all
the
resources
instantly
on
page
load,
whereas
twitter
embeds
are
below
the
fold
and
one
have
to
scroll
to
see
it
right
thus
will
work
with
twitter
somehow
to
fix
that
issue
of
or
has
plans,
because
embedding
feeds
is
a
very
common
practice
across
all
the
bad
corvette.
Vital
is
a
very
big
focus
for
the
google,
so
I
do
think
that
it
could
be
fixed
with
cooperation
with
google
doesn't
have
any
plans
on
that.
A
I
don't
know
about
any
plans,
but
we
we
essentially
don't
have
any
special
treatment
for
any
other
providers,
essentially
of
specific
content.
So
it's
not
that
we
would
say.
Oh,
this
is
a
tweet.
Therefore
we
will
let
it
do
whatever
crazy
stuff
it
wants
with
core
web
idols,
but
it
is
something
where,
if
a
lot
of
sites
are
using
those
embeds,
it
feels
like
it
would
be
in
twitter's
best
interest
to
provide
a
way
to
enable
people
to
embed
those
in
a
reasonable
way.
A
There
was
a
recently
a
conversation
on
twitter
about
this,
where
someone
mentioned
that
they
were
using
kind
of.
I
don't
know
a
specific
css
setup
to
make
sure
that
these
tweets
don't
move
around
that
much.
I
think
it
was
something
like
minimum
maximum
height
and
width,
something
like
that
and
apparently
with
that
they
were
able
to
significantly
improve
things
around
cls
and
I
think
fcp
as
well.
A
So
that
might
be
something
to
look
into
to
see
if
that's
an
option
there
I
mean
it's,
it's
also
something
where,
if
you're
embedding
a
lot
of
these
from
them,
then
it's
like
going
to
them
and
saying:
okay,
you
need
to
get
this
working
better,
otherwise,
we'll
swap
it
out
against
something
else.
That
might
also
be
an
option.
D
A
So
you're,
like
you,
redirected
from
page
a
to
page
b
and
then
afterwards
you
decide
page
a
should
remain
indexed.
Yes,
yeah
and
page
b
should
drop
out
or.
D
Can
either
drop
out
or
can't
stay
alive?
So
what?
What
would
be
a
correct
path
to
handle
both
scenarios.
A
E
Hey
john
how's,
it
going
pretty
good
how
about
you
well
doing
well,
so
a
question
about
the
page
experience
update.
So
in
order
for
a
page
to
benefit
from
the
page
experience
update,
it
needs
to
have
all
the
marks
right.
It
needs
to
have
a
mobile,
mobile-friendly,
https
and
then
either
have
needs
improvement.
I
guess
or
good
core
web
vital
score.
Is
that
correct.
A
E
E
Little
I
just
envisioned
your
little
chart
every
time
I
talk
about
it,
yeah
exactly
yeah
graph
yeah.
So
if
previously,
before
the
page
experience
update,
you
had
a
url
that
was
benefiting
from
https,
whatever
little
benefit.
That
is,
if
the
page,
once
the
page
experience,
update
rolled
out
and
say
that
url
had
a
bad
core
web
vital
status,
would
then
it
have
lost
whatever
little
benefit
it
got
from
https,
because
now
it
doesn't
qualify
for
the
page
experience
update.
E
Okay,
cool:
I
can
cast
one
other,
just
a
little
small
question.
Well,
maybe
smaller,
if,
if
I'm
doing
a
site
migration
and
in
search
console,
I've
executed
the
the
site
migration
to
another
search
console.
Am
I
to
believe
that
it
has
not
fully
completed
in
google's
eyes
until
those
those
notifications
and
search
console
are
gone?
Or
will
I
get
a
new
notification
where
it
says
one
of
your
other
sites
is
moving
to
this
site
and
on
the
other
one
it
says
this
site
is
currently
moving.
A
Yeah,
I
I
think
there
is
a
time
out
there,
but
it's
a
search
console
based
time
out,
which
is
essentially
something
I
I'm
not
sure
what
the
number
is.
I
think
maybe
it's
180
days
or
or
90
days
somewhere
in
that
range,
where
essentially
that
that
status
essentially
drops
out,
but
that's
not
a
sign
that
the
migration
is
complete.
E
F
Hi
yong
hi
yeah.
My
question
is
related
to
like
language
targeting
the
question
is
like:
can
we
like
put
two
different
language
in
a
single
content?
If
suppose,
my
content
is
in
english
language,
and
now
I
wanted
to
target
portuguese
language
or
a
france,
france
language
can
we
include
portuguese,
plus
english
or
french
plus
english.
F
A
Doesn't
create
problem
right,
yeah,
that's
that's
perfectly
fine
for
languages.
We
primarily
look
on
a
per
page
basis,
so
we
want
ideally
to
have
one
clear
language
on
a
specific
page.
So
that's
something
where,
if
you
have
different
pages
with
different
language,
that's
perfectly
fine,
but
it
makes
a
lot
easier
for
us.
If
we
have
one
clear
language
on
a
page
because
then
we
can
clearly
say:
oh
this,
isn't
french
or
this
isn't
portuguese,
or
this
isn't
spanish
and
show
it
to
people
appropriately.
F
Okay
and
again
the
same
question
suppose
I
have
an
image
content
of
700
word
and
I'm
just
converted
all
those
content
into
portuguese
okay.
So
it
is
also
like
700
words.
Now
I
want
additional
200
or
300
word
content
in
portuguese
language.
So
does
it
reflect
like
any
changes
in
the
ranking
position
and
all
does
it
impact
our
seo
strategy.
A
The
these
pages
would
essentially
rank
independently.
So
if
you
have
some
content
in
english
and
slightly
different
content
in
a
different
language,
then
we
would
just
rank
those
pages
independently.
So
that's
something
where
sometimes
it
makes
sense
to
have
more
content
in
a
certain
language.
Sometimes
a
certain
language
just
has
more
words
for
the
same
content
or
fewer
words.
F
No,
and
just
like
suppose,
we
have
a
content
in
english
language
and
we
have
a
similar
content,
which
is
in
portuguese
language,
but
additionally,
I
want
to
add
a
few
more
information
in
portuguese,
so
I
just
wanted
to
add
few
more.
So
in
that
case,
I
am
asking
you:
is
there
any
problem
to
add
a
few
more
content
in
your
portuguese
language?
No,
that's
perfectly
fine,
okay,
and
does
it
like
if
I
use
a
passive
voice
or
active
voice.
Does
it
also
matter
in
terms
of
ranking.
A
F
A
That
can
happen,
so
I
mean
searches
is
very
dynamic,
so
it
can
certainly
happen
that
sometimes
a
page
ranks
really
well
and
then
a
few
hours
or
a
few
days
later
it
ranks
a
lot
worse
and
then
it
ranks
a
little
bit
better
again.
Usually
when
you
see
these
kind
of
fluctuations,
it's
because
our
algorithms
are
not
100
sure
yet
and
over
time
it
will
settle
down
somewhere
in
that
range,
but
it's
not
a
sign
that
there's
anything
specifically
wrong
or
bad
with
that
page.
A
A
I
mean,
if
you
significantly
change
your
website,
then
our
algorithms
have
to
understand
it
again,
so
that
sometimes
takes
time.
Okay!
Okay,
thank
you.
Thank
you
sure.
All
right,
nancy.
G
Hey
john,
first
of
all,
thank
you
very
much
for
doing
these
watch
every
week
and
really
get
a
lot
of
value
out
of
them.
I
work
with
a
lot
of
small
companies
and
we
try
to
apply
all
the
seo
best
practices.
You
know
that
google
lays
out
for
us
it's
a
little
harder
to
do
some
of
the
fancier.
You
know
back
end
code
kind
of
things,
I'm
more
of
a
content,
writer
and
seo
than
a
you
know:
digital
code
guru,
what
you
know.
G
How
can
I
help
these
sites
to
rank
well
when
they're
competing
against
these
big
enterprise
organizations
that
have
a
lot
of
resources-
and
you
know
money
and
and
people
to
throw
behind
their
seo
and
tend
to
you
know,
rank
higher
because
they
are
enterprise
corporations.
You
have
any
advice,
yeah.
I
think
that's
super
hard,
so
just
give
up.
A
No,
no,
no,
but
I
I
think
I
think
small
companies
have
a
lot
of
advantages,
especially
online
in
in
that
area,
because
it's
a
lot
easier
for
them
to
to
kind
of
shift
and
to
move
quickly
compared
to
large
companies.
So
I
see
that
at
least
within
google,
where,
when
we
come
up
with
a
really
cool
idea
that
we
should
implement
in
search
console
it's
going
to
take
like
one
or
two
years
until
we
get
there,
whereas
external
seo
tools
when
they
come
up
with
the
same
idea,
they'll
be
like
oh
I'll.
A
When
some
new
trends
come
up
when
something
I
don't
know
significantly
changes,
you
can
move
really
quickly
and
you
can
be
visible
like
that,
essentially
immediately
in
search,
whereas
with
large
companies
like
I
mean
sure
they,
they
have
a
lot
of
power
and
it's
like
a
strong
brand
and
all
of
that,
but
they
can't
move
as
quickly
and
that,
I
think,
is
kind
of
one
of
the
advantages
with
a
small
company.
A
Oh
it's
like
so
much
work
to
actually
get
that
set
up
and
to
kind
of
prioritize
that
and
then
it's
only
like
a
five
million
dollar
business
like
what
are
we
gonna
do
with
all
of
that
small
change
and
as
a
small
company
you're
like
whoa,
it's
like
a
couple
million
dollars.
That's
fantastic
right!
A
So
that's
something
where
I
also
see
especially
small
companies,
kind
of
have
the
ability,
especially
online,
to
focus
on
these
small
areas
and
say
well,
we'll
do
something
really
unique
here
and
maybe,
if
it
picks
up
a
really
large
company.
We'll
also
do
something
similar
a
couple
of
years
down
the
line,
but
like
we
have
that
kind
of
headway,
at
least,
and
can
kind
of
move
ahead
of
all
of
these
big
companies.
During
that
time,.
G
A
I
don't
know
I
kind
of
always
struggle
to
define
explicit
examples
that
that
we
can
mention
there,
but
I
can't
see
it.
A
I
see
it,
for
example,
with
with
seo
tools
because,
like
I,
I
work
together
with
the
search
console
team
and
it
is
something
where
the
a
lot
of
the
external
or
I
don't
know,
non-google
seo
tools
essentially
have
the
ability
to
kind
of
do
things
in
ways
that
wouldn't
really
be
possible
on
google
side,
be
it
for
policy
or
legal
reasons
or
whatever,
and
I
think
that
gives
these
tool
providers
a
little
bit
of
a
kind
of
a
heads
up,
even
in
in
a
market
where
I
don't
know,
google
is
like
this
giant
corporation
and
they're,
even
offering
it
for
free,
but
yet
all
of
these
small
seo
tool,
companies
or
smaller
seo
tool
companies
still
have
a
chance
to
kind
of
grow,
a
significant
user
base
which
is
relevant
for
them.
A
So
I
mean
that's
not
something
that
I
imagine
the
average
small
business
would
be
creating
seo
tools,
but
it's
like
one
of
those
areas
where
I
tend
to
see
that
happening.
H
Hi
john,
how
are
you
hi
hi,
so
we
got
this
multilingual
website.
We
recently
launched
it
in
an
april
this
year.
So
apart
from
english,
we
got
content
for
users
in
taiwan
and
for
users
in
china.
Now
the
issue
is
that
when
someone
is
searching
for
this
website
on
google
like
searching
for
the
brand
name,
the
site
links
that
they
they
see
in
taiwan.
H
H
H
Yes,
to
add
this,
I
think
the
domain
is
a
dot
gt
domain,
which
is
kind
of
country
called
domain
for
some
country,
but
also
used
by
gaming
in
gaming
websites,
and
we
are
also
kind
of
in
the
same
industry.
So
could
that
be?
Could
that
be
an
issue
like
we
are
using.
A
I
think
it's,
it's
probably
just
something
where
the
the
sightlines
are
suboptimal
and
if,
if
you
could
point
me
to
to
a
screenshot,
I'm
happy
to
pass
that
on
to
the
team,
but
the
these
kind
of
things
happen.
All
I
don't
know
all
the
time,
maybe
not
all.
A
H
A
I
Hey
john,
I
have
a
very
specific
question:
I'm
working
on
a
multilingual
website
and
we've
been
using
the
href
blank
tags
in
the
in
the
site-
maps
not
not
at
the
page
level
but
sidemounts
and
let's
say
the
website
is
structured
in
a
way
that
the
the
main
tool
and
the
main
site
is
is
completely
multilingual.
I
So
let's
say
we
have
the
different
blocks
of
main
main
language
page
and
all
the
alternate
tags.
So
it
was,
would
start,
let's
say,
for
the
english,
then
english
all
the
translations,
then
the
other
side
map,
let's
say
german-
would
have
then
first
the
the
german
main
version
of
the
page
plus
the
alternate.
I
So
I
wanted
to
ask
one:
is
that
a
good
practice?
Would
there
be
any
problem?
If
that's
okay
and
then
the
other
question
is
we
have
the
blogs
and
the
blogs?
There
are
some
blog
posts
that
of
course,
are
not
relevant
to
other
other
regions.
Other
languages,
so
some
will
have
a
translation.
Some
other
won't
have
a
translation.
I
So
can
we
pack
all
of
those
within
one
single
sitemap,
for
each
language
that,
for
some
cases
would
have
a
hreflang
tag
and
the
other
one
wouldn't
so
those
two
questions.
A
Now
so
you
can
structure
cycling
files,
however,
you
want
it's
like
you
can
make
them
by
language
by
site
section.
Essentially
it's
totally
up
to
you.
I
would
try
to
make
them
in
a
way
that
they
can
be
kind
of
stable
with
regards
to
the
urls
that
they
include
so
don't
like,
take
all
of
the
urls
from
your
site
and
randomly
put
them
into
sitemap
files,
but
have
some
some
kind
of
a
system
to
distribute
them.
A
So
if,
if
the
urls
were
to
jump
between
sitemap
files,
we
might
like
process
one
file
and
see
one
url
and
process
the
other
file
and
then
see
the
same
url
again
and
then
in
the
end,
we
might
miss
some
urls
because,
like
it's
randomly
swapping
between
files
but
having
it
kind
of
persistent
in
any
method
that
you
think
is
is
relevant
for
your
side.
It
can
also
be
alphabetical
or
like
whatever
you
want
to
do
is
is
kind
of
that's.
A
That's
the
important
part
with
regards
to
content,
that's
only
in
certain
languages
or
that
doesn't
have
any
atrial
flang
annotations
for
it.
That's
also
perfectly
fine
kind
of
like
I
mentioned
before,
not
having
the
annotations
doesn't
mean,
we
won't
show
it
for
other
languages,
but
having
annotations
kind
of
helps
us
to
show
the
appropriate
version.
So
if
you
just
have
one
version,
then,
like
that's
the
version
that
will
show
you
don't
need
any
hr
flying
annotations
for
that.
J
Hi
it's
international
day
today
and
another
hr
question,
I'm
afraid,
got
a
client
been
talking
to
they're
having
a
problem.
They've
got
english
in
different
regions
of
do
canada,
us,
australia,
uk
they've
got
hreflang,
which
seems
to
be
working
to
share
the
right
one,
but
rich
results.
They've
got
product
rich
markups
that
doesn't
seem
to
get
switched
out.
That
seems
to
be
sticking
to
just
the
one
version.
Is
that
the
way
it's
meant
to
work
or
have
they
got
something
else
going
on
that?
We're
not
sure
about.
J
J
Actually,
the
uk
just
seems
to
sometimes
pick
up
things
so
they'll
search
one
in
canada
sometimes
and
it
will
show
the
us
dollar
result
in
the
rich
results.
So
basically
it
appears
to
be
taking
the
markup
from
the
default
page
from
from
the
one
that
you
actually,
if
you
check
as
well,
is
the
canonical
one.
So
it's
showing
wrong
prices
because
obviously
u.s
and
canadian
dollars
different
as
are
australian
dollars,
so
it
gets
a
little
bit
confusing.
A
A
Okay,
so
what's
what's
probably
happening,
there
is
we're
seeing
these
as
duplicates
and
we're
picking
one
of
those
versions
as
a
canonical
and
then
using
ahreflang
to
swap
out
the
url.
But
since
we
have
one
as
a
canonical,
that's
the
one
where
we
pull
the
rich
result:
information
from
yeah.
So
that's
that
seems
kind
of
unfortunate
that
we
would
kind
of
like
pick
one
canonical
across
different
currencies.
J
I
mean
it
literally
is
just
the
price
and
maybe
a
number
that's
different
on
the
page,
so
it's
entirely
understandable
that
it
is
deduped,
but
it
just
makes
it
kind
of
awkward
in
this
situation.
A
To
pass
that
on
to
the
team,
because
usually
we
we
do
try
to
recognize
when
things
like
phone
numbers
or
prices
are
different
across
different
versions
and
say:
oh,
we
shouldn't
kind
of
like
de-duplicate
these
pages
because
they
are
unique.
So
if,
if
you
have
some
examples,
I'm
happy
to
look
at
that,
but
sometimes
when
it's
really
like
just
the
currency
symbol,
that's
different
across
these
pages,
then
it
can
happen
that
our
algorithms
say
well.
J
A
Just
cheers:
thanks
cool
okay,
let
me
run
through
some
of
the
submitted
questions
as
well,
so
that
we
don't
lose
track
of
any
of
that
and
we'll
have
more
time
for
your
live
questions
along
the
way
as
well.
Let's
see
we're
going
to
relaunch
migrate,
our
site.
A
few
important
pages
won't
be
ready
for
the
launch,
but
will
be
added
later.
How
should
we
handle
these
pages?
Should
we
leave
them
as
404
and
redirect
later
how
long
until
they
lose
their
seo
signals?
A
Should
we
redirect
them
to
a
less
fitting
page
and
later
to
the
right
one,
so
yeah?
I
think
this
is
kind
of
tricky
one
of
the
things
that
I
would
try
to
do
in
a
case
like
this
is
try
to
leave
the
old
page
up
as
much
as
possible
so
that
we
we
see
that
part
of
the
website
is
moving,
but
also
part
is
still
there
and
that
way
we
can
kind
of
transition.
A
A
Then
a
404
is
certainly
a
possibility,
if
you're
sure
that
the
new
page
will
be
up
within
a
day
or
so.
A
503
might
also
be
an
option
which
would
be
kind
of
like
being
almost
like,
sneaky
and
saying.
Well,
the
page
doesn't
work
at
the
moment
and
then
google
will
retry
it
a
little
bit
later
and
then
suddenly
the
page
is
there
and
ready.
A
So
that
might
be
an
option
if
you
know
that
it's
just
a
day
or
two,
a
404
can
also
work,
but
there
you
kind
of
have
a
situation
where
the
page
ends
up
dropping
out
of
the
index
completely
and
when
it
comes
back,
we
essentially
have
to
rebuild
the
information
that
we
know
about
that
page.
So
it's
not
that
the
information
will
be
lost
forever,
but
it
takes
a
bit
of
time
to
to
rebuild
again.
A
So
those
I
think
are
kind
of
the
the
options.
My
preference
would
really
be
to
try
to
at
least
keep
the
old
page
there
until
you
can
swap
it
out
against
the
new
one,
so
keep
it
out
of
200
if
you
kind
of
absolutely
need
to
keep
it,
if
you
need
to
redirect
it
and
have
the
old
version
on
the
new
site,
I
think
that's
also.
A
Okay,
if
you
need
to
kind
of
really
remove
the
page
for
a
longer
period
of
time,
the
404
is
kind
of
the
right
choice,
but
you
have
to
assume
that
it'll
take
a
bit
of
time
to
catch
up
again,
I'm
looking
at
merging
two
brand
websites
where
one
brand
will
overtake
the
other.
However,
there
will
be
a
90-day
window
where
we
run
both
sites
announcing
the
merger.
A
What's
the
best
way
to
set
up
schema
as
an
early
indicator
that
they're,
essentially
the
same
site
before
we
301
redirect
and
ensure
that
the
primary
brand
is
what's
ranked
as
the
secondary
brand
resolves
like
a
canonical
for
the
entire
website.
A
So
the
suggestion
of
a
canonical
is
probably
a
good
approach
here
with
the
rel
canonical
you're,
essentially
telling
us
that
this
content
is
equivalent
to
the
other
one,
and
you
prefer
the
other
one
to
be
the
primary
version.
A
The
I
think
the
downside
of
using
a
rel
canonical
in
this
situation
is
that
we
will
already
start
migrating
the
search
results
over
to
the
new
version
of
your
site,
so
users
would
still
be
able
to
see
both
of
those
versions,
but
in
search
the
essentially
the
new
version
or
the
primary
version
would
start
to
become
more
and
more
visible,
and
it
might
be
a
little
bit
awkward
if
someone
is
looking
explicitly
for
the
old
version,
but
with
the
real
canonical
you
can
kind
of
set
that
up
ahead
of
time
before
you
start
doing
the
move
as
well.
A
The
other
thing
to
keep
in
mind
when
you
do
end
up
doing
the
kind
of
merging
of
the
different
sites
is
that
merging
and
splitting
sites
tends
to
take
longer
than
just
a
pure
redirect
from
one
domain
to
another.
So
that's
something
where
I
would
expect
a
little
bit
of
a
longer
time,
maybe
a
couple
of
months
for
things
to
settle
down
around
search
before
it
actually
ends
up
working.
A
Does
googlebot
read
information
on
a
page
if
it's
in
a
toggle,
a
small
symbol
that
you
click
on,
which
opens
up
with
more
information?
Can
you
recommend
a
helpful,
step-by-step
video
for
beginners
to
implement
structured
data?
A
One
way
to
try
this
out
is
to
just
open
the
page
in
a
browser
and
then
look
at
the
source
code
and
see.
Is
that
information
already
loaded
or
not?
If
it's
loaded
already,
then
most
likely
it'll
be
available
for
indexing?
If
it's
already
on
the
page
and
your
website
is
already
being
shown
in
search,
a
simple
way
to
check
is
also
just
to
search
for
that
piece
of
text
in
quotes
and
to
see
if
google
can
find
it
with
regards
to
getting
started
with
structured
data.
A
I'm
not
aware
of
any
kind
of
like
simple
getting
started
guides
for
structured
data,
because
I
think
it
depends
quite
a
bit
on
the
way
that
you
have
your
website
set
up.
So
that's
something
where,
if
you're
working
with
pure
html
and
you're
doing
everything
manually
then
going
to
the
developer,
documentation
is
probably
a
good
idea.
You
can
copy
and
paste
the
examples
there
if
you're,
using
an
existing,
cms
or
hosting
system
like
wordpress
or
wix,
or
something
like
that.
A
Then
oftentimes
there'll
be
plugins
that
you
can
just
enable
for
the
site
which
make
it
a
lot
easier
for
you
to
add
structured
data
in
a
way
that
doesn't
require
you
to
actually
do
any
of
the
code
part
itself.
So
that's
kind
of
the
direction
I
would
head
where
I
I'm
going
to
assume
that
you
use
something
like
wordpress
or
wix
or
squarespace,
or
something
like
that.
A
I
would
try
to
find
a
plugin
or
a
simple
way
to
just
activate
that
within
your
website
and
then
just
fill
out
the
fields
and
let
the
the
plugin
do.
The
structured
data
for
you
can
backlinks
from
low
quality
websites,
make
a
negative
impact
on
a
blog's
seo
and
do
user
comments
have
any
effect
on
page
ranking,
so
just
backlinks
from
low
quality
websites.
Usually
we
will
just
ignore
those.
I
don't
think
that
would
have
any
negative
effect
on
on
a
website's
seo.
A
What
would
be
more
problematic
is
if
you
go
out
and
actively
buy
a
significant
amount
of
links
for
your
website
or
kind
of
do
something
else
which
essentially
goes
directly
against
our
webmaster
guidelines.
So
that's
something
where
I'd
say.
That's
problematic
and
that's
something
I
would
avoid
doing
and
clean
that
up
if
you
notice
that
happening
from
someone
who
previously
worked
on
your
website,
but
if
you're
just
seeing
random
low
quality
links
to
your
website,
I
would
totally
ignore
those
they
absolutely
have
no
effect
on
your
site
and
regarding
user
comments
and
page
ranking.
A
If
these
are
comments
on
your
website,
then
the
the
primary
effects
that
I
would
expect
from
that
is
essentially
that
sometimes
user
comments
do
provide
a
lot
of
value
additionally
to
your
content,
and
sometimes
those
comments
can
be
used
for
ranking
as
well
in
search.
So
if
someone
is
searching
for
something
that
is
only
visible
in
a
comment
on
your
website,
then
your
pages
could
be
visible
because
of
that
comment.
That
is
also
embedded
on
those
pages.
So
essentially
it's
a
way
of
having
content
that
is
available
on
your
pages.
A
It's
just
not
content
that
you
wrote
yourself,
but
at
the
same
time
it's
also
something
where,
if,
if
this,
these
user
comments
are
really
low
quality
and
they
kind
of
drag
your
site's
overall
quality
down,
then
that
is
something
that
we
would
also
see.
As
a
part
of
your
website,
where
we
say
well
like
we,
we
look
at
your
website
overall
and
we
not
really
sure
about
the
quality
of
your
website
overall
and
our
systems
wouldn't
really
differentiate
between
like
well.
This
is
content.
You
wrote,
and
this
is
content.
A
So
I
think
there
are
two
aspects
here
from
our
guidelines.
We
want
to
make
sure
that
the
structured
data
you
have
on
your
page
matches
the
primary
element
on
your
page.
So
if
you're
saying
that
you
can
add
an
faq
to
a
random
page
on
your
website,
sure
you
can
do
that,
but
is
this
faq
kind
of
the
primary
part
of
the
page
or
relevant
for
the
primary
part
of
the
page?
That's
that's
something
that
you
kind
of
need
to
figure
out.
So
that's
that's
one
aspect.
A
The
other
aspect
is
that
in
the
search
results,
some
of
the
rich
results
types
we
can
combine
and
some
of
them
we
can't
combine.
So,
for
example,
if
you
have
a
recipe
and
you
have
ratings,
then
we
can
often
combine
that
in
the
search
results
in
one
rich
results
type.
However,
if
you
have
an
faq
and
you
have
an
how-to,
then
at
least
from
from
what
I
recall
what
these
look
like-
these
are
things
that
wouldn't
be
combined
in
a
single
rich
result.
A
Type,
which
means
our
systems
would
have
to
pick
one
of
them
to
show,
and
maybe
we'll
pick
the
type
that
you
would
have
chosen.
But
or
maybe
you
would
have
a
different
preference
on
your
side
and
if
you
have
a
strong
preference
on
your
side,
I
would
just
make
that
super
clear
to
us
by
just
providing
that
kind
of
structured
data.
A
So
if
you're
saying
like
oh
the
the
faq
results,
I
really
like
those
for
these
pages
are
super
relevant
here,
but
it's
also
kind
of
an
article
and
kind
of
a
how-to.
Then
I
would
just
focus
on
your
preferred
one
kind
of
the
faq
in
that
case
or
if
you're
saying
like
the
how-to,
is
really
the
way
that
I
want
to
have
this
page
shown
in
search.
Then
I
would
focus
on
that
type.
A
As
I
understand
it,
internal
links
can
path
authority
and
that
authority
can
be
divided
or
diluted
as
more
internal
links
are
added
to
the
page.
Is
this
another
seo
myth,
or
is
this
roughly
how
it
works?
If
so,
does
that
mean
that
having
a
lot
of
internal
links
on
a
page
could
do
more
harm
than
good?
A
So,
yes,
yes
and
no,
I
I
think,
in
the
sense
that
we
we
do
use
the
internal
links
to
better
understand
the
structure
of
a
page,
and
you
can
kind
of
imagine
the
situation
where,
if
we're
trying
to
understand
the
structure
of
a
website
with
the
different
pages
that
are
out
there,
if
all
pages
are
linked
to
all
other
pages,
on
the
website,
where
you
essentially
have
like
a
complete
internal
linking
across
every
single
page,
then
there's
no
real
structure
there.
It's
like
this
one
giant
mass
of
pages
for
this
website
and
they're
all
interlinked.
A
We
can't
figure
out
which
one
is
the
most
important
one.
We
can't
figure
out
which
ones
of
these
are
related
to
each
other,
and
in
the
case
like
that
kind
of
like
having
all
of
those
internal
links.
That's
not
really
doing
your
site
that
that
much
so,
regardless
of
kind
of
pagerank
and
authority
and
passing
things
like
that,
you're,
essentially
not
providing
a
clear
structure
of
your
website,
and
that
makes
it
harder
for
search
engines
to
understand
the
context
of
the
individual
pages
within
your
page
within
your
website.
A
A
Then
it's
a
lot
easier
for
us
to
understand
that
if
someone
is
looking
for
this
category
of
product,
this
is
that
page
that
you
should
that
we
should
be
showing
in
the
search
results.
Whereas
if
everything
is
kind
of
cross-linked,
then
it's
like
well
any
of
these
pages
could
be
relevant
and
then
maybe
we'll
send
the
user
to
some
random
product.
Instead
of
to
your
category
page
when
they're,
actually
looking
for
a
category
of
products.
A
Let's
see
regarding
images
on
our
website
how
to
find
the
perfect
balance
between
maintaining
high
image
quality,
as
well
as
making
sure
that
images
are
lightweight
and
load
fast.
It
seems
that
both
of
these
things
are
encouraged
by
google,
but
they
both
take
away
from
each
other
yeah.
I
I
don't
know
if
there's
ever
a
perfect
balance,
but
especially
with
regards
to
images.
A
So
that's
something
where
the
the
whole
responsive
image
is
set
up,
I
think
is,
is
a
really
good
idea.
The
other
aspect
here
is
also
that
a
lot
of
the
modern
image
formats
that
are
out
there,
I'm
thinking
of
webp
and
eve.
I
don't
know
how
to
actually
pronounce
that,
where
you
essentially
have
ways
of
providing
really
high
quality
images
at
a
fairly
high
resolution
with,
unlike
not,
that
bite
size
prize
price
that
you
would
usually
have
for
really
large
images.
A
A
So
there's
lots
of
things
that
you
can
do
there,
and
I,
I
think
what
is
kind
of
interesting
specifically
about
images
and
seo
is
a
lot
of
the
things
that
you
can
do
are
very
technical
and
very
things
that
you
can
measure
fairly
clearly
where
you
can
take
lighthouse
or
one
of
the
testing
tools
and
just
measure
the
page
as
it
loads
and
see
like
how
large
is
that
page?
A
What's
your
vision
for
the
future
of
seo,
I
don't
know
good
question.
I
I
don't
have
like
that
five-minute
answer
on
on
the
future
of
seo.
I,
I
think
one
of
the
things
that
people
always
worry
about
is
everything
around
machine
learning
and
that
google's
algorithms
will
get
so
far
as
to
automatically
understand
every
website
and
seo
will
be
obsolete.
Nobody
will
need
to
do
that.
A
I
don't
think
that
that
will
happen,
but
rather
with
kind
of
all
of
these
new
technologies,
you'll
have
new
tools
and
new
ways
of
looking
at
your
website
and
making
it
maybe
easier
for
you
to
create
really
good
content
to
create
clear
structures
for
your
website,
similar
how
how
things
have
evolved.
I
think
the
the
last
10
20
years
as
well,
where
in
the
beginning,
you
would
write
your
own
php
code
and
craft
your
own
html,
and
it
was
a
lot
of
work
and
over
time
all
of
these
cms
has
evolved.
A
And
I
I
think
that
evolution
will
continue
in
that
there'll
be
more
and
more
tools
available
and
you'll
be
able
to
do
more
and
more
things
in
in
a
way
that
kind
of
works
fairly
well
for
search
engines.
And
it's
not
that
the
seo
work
will
go
away,
but
rather
it'll
it'll
evolve,
so
maybe,
instead
of
like
hand
tweaking
h2
tags
and
h1
tags,
you'll,
just
kind
of
like
delegate
that
to
a
cms.
A
That
makes
sure
that
the
most
important
content
is
already
included
as
a
heading
on
the
page,
since
pagerank
is
a
finite
resource.
Is
it
a
good
idea
to
preserve
page
ranking
using
prg
pattern
for
not
important
pages
like
privacy
policy
pages,
and
so
the
prg
pattern
is
kind
of
a
weird
trick
to
make
something
kind
of
work
like
a
link,
but
actually
not
be
a
link,
and
it
uses
things
like
posting
to
the
server
and
then
the
server
does
redirect
and
all
kinds
of
weird
things.
A
From
my
point
of
view,
you
absolutely
don't
need
to
use
any
of
that.
If
you
want
to
make
sure
that
a
link
does
not
pass
any
page
rank
then
use
the
rel
no
follow.
If
you
want
to
make
sure
that
a
link
works.
Well,
then
don't
use
the
rel
nofollow.
That's
essentially
my
my
perspective
on
that.
So
I
would
not
go
off
and
create
these
kind
of
fancy
constructs
where
you're
doing
things
like
posting
and
redirecting
on
a
server,
because
it
just
adds
so
much
more
complexity
and
you
get
absolutely
no
value
out
of
that.
A
So that's kind of my primary take on technologies like these. I think it's really cool to come up with this kind of setup, but it's not something that I would implement on a website on a day-to-day basis. It's terrible to maintain, and you can't use any of the existing tools on it, because you can't really crawl the website.
A
I
I
would
absolutely
kind
of
discourage
from
from
that
setup
with
regards
to
pagerank,
and
essentially
what
is
going
here
is
kind
of
like
pagerank
sculpting.
Where
he's
saying
I
don't
want
any
pagerank
going
to
my
privacy
policy
pages.
A
We
just
have
a
few
minutes
left
before
I
pause
the
recording,
so
maybe
I'll
just
go
back
to
some
live
questions
in
the
meantime,
looks
like
there's
still
a
bunch
of
hands
up.
I
also
have
a
bit
more
time
afterwards.
If
any
of
you
want
to
stick
around
a
little
bit
longer,
let's
see
akash,
I
think
you're
up.
First.
C
Hi John. So my question is regarding international targeting.
C
Actually, I was working for a client, and we have created different pages targeting different countries, and of course all the pages are in the English language. We have created a subdirectory kind of structure, like for the USA or France or Australia, and we have set one canonical URL on all the pages, and the hreflang tags are also properly implemented. But for different locations we see, sometimes, for a country...
C
Like for France, we see our UK pages getting ranked, or for Australia something else is getting ranked. So I just wanted to know: all our interlinking is in place, and we have created different sitemaps, like if the pages are for the UK, we have created a nested sitemap for them, and the breadcrumbs are also properly organized as well.
C
So what could be the possible reason? We're doing all of those things, but still Google picks something else for different locations.
A
Yeah,
so
I
I
think
they're,
two
two
or
three
things
to
mention:
one
is:
if,
if
we
don't
have
the
the
pages
all
crawled
and
indexed
already,
then
with
the
atrial
flange
annotation,
we
will
kind
of
miss
that
connection
and
we
might
show
the
wrong
version.
A
If these pages are significantly similar, then we might pick one as the canonical, and we might use that one to show in the search results. So if, for example, these are both English-language pages and the content itself is essentially the same across them, then that's something where maybe we'll say, well...
A
These
are
duplicates
we'll
just
pick
one
and
show
those-
and
I
think
last
is
also
that
with
any
setups
that
you
have,
you
need
to
assume
that
google
won't
always
get
it
right
with
regards
to
internationalization
and
you
need
to
provide
some
kind
of
a
backup
mechanism
on
your
site
as
well.
A
Something like, "Here's a link to the Indonesian version of our website." So those are kind of the approaches I would take there. On the one hand, if these pages are really duplicates of each other, then sometimes you just have to take that into account. And if we just haven't crawled and indexed those pages, sometimes it's a matter of just waiting until that happens.
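For a same-language, multi-country setup like the one described, the hreflang annotations would typically look something like the sketch below; the domain and paths are hypothetical, and each version of the page lists the full set, including a generic-English fallback:

```html
<!-- On https://example.com/en-gb/page (repeated on every variant): -->
<link rel="alternate" hreflang="en-GB" href="https://example.com/en-gb/page" />
<link rel="alternate" hreflang="en-US" href="https://example.com/en-us/page" />
<link rel="alternate" hreflang="en-AU" href="https://example.com/en-au/page" />
<!-- Generic English page for everyone else: -->
<link rel="alternate" hreflang="en" href="https://example.com/en/page" />
```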
A
K
Yes, hi John, thank you so much for doing this. So I'm working on a site that has public profiles as a component for users. However, this isn't a social network where users can follow one another, or where there's a directory-like page listing all the public profiles; it's just that each user has a profile and a unique URL to their profile.
K
And so I wanted to understand the right way to expose these to search crawlers, so that when a user types their name into Google, their public profile link from our site can also show up. I was looking at the robots.txt files for sites like LinkedIn and Twitter, and they don't expose the individual users' profile links, and they don't appear to be in the sitemap either.
A
Essentially, you don't need to do anything special in a case like that. From our point of view, these would essentially just be normal pages within your website, and ideally you would cross-link them within the website normally. So if there's, I don't know, a comment or some reference to a specific user, you would link to their profile page, and then we could crawl the website normally, essentially find our way to those profile pages, and just index them like that.
K
Okay, and if we don't have the commenting functionality, or any way for exposed pages to be linking to the users' individual profiles, is there anything that we can upload into Search Console, for example, like the links of the public profiles, that can allow search to identify them?
A
I
mean
you,
you
could
put
them
in
a
cycle
file
and
we
we
could
pick
them
up
like
that.
However,
if
they're
only
linked
in
a
scipam
file,
then
it's
it's
a
bit
hit
and
miss
if
we
would
actually
go
off
and
index
them,
because
we
don't
have
any
context
for
those
individual
links.
It's
basically
like
here
here,
a
bunch
of
pages
on
my
website,
and
we
only
give
them
to
you
in
the
cyclone
file.
Users
wouldn't
be
able
to
find
them.
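For reference, listing profile URLs in a sitemap would look like the fragment below (the domain and paths are placeholders); as John notes, on its own this gives Google no context for the pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/profile/alice</loc></url>
  <url><loc>https://example.com/profile/bob</loc></url>
</urlset>
```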
A
Then our systems would probably be a little bit reluctant to actually go off and index all of those pages. So it's really something where we need to be able to find links to those profile pages somewhere within the normal content of the site, and kind of crawl our way through that. Which could be, I mean, I don't know how these would be embedded, but on the profile pages themselves you could have things like related links, where you say, "other users from this location," and then...
G
A
...kind of cross-link them like that. But yeah, essentially we need to find normal links to those pages. The other thing maybe to keep in mind as well is that user profile pages are a super popular target for spam.
L
Let's see, Zach, I think. Hi, so my question is kind of similar to the previous one, but also kind of the opposite at the same time. So our site recently started ranking, like on page two for a fairly competitive keyword or search query, but yesterday or the day before it plummeted to page six, and I noticed it's because Google re-crawled our site and indexed like four or five hundred user profiles, but they were all private.
L
So it's my assumption that they're marking it as spam, and I guess my first instinct was to, you know, return 451 for all the pages, or 401 Unauthorized for all the pages, but the project was bootstrapped with Create React App, and that's a single-page application; there's not really a way to return any status code other than 200 on a client-served application like that. So yeah, I guess my question is: where do I go from there?
L
A
So I think, first of all, if these pages just got indexed now, then I would not expect our systems to say, "oh, the quality of the website is bad, we will not rank it that well anymore." So my assumption is that if these pages just got indexed now, it's probably unrelated to the change in ranking that you're seeing. One of the things that recently happened is that we launched another core update, I think yesterday.
A
So
if
that
kind
of
aligns
with
yesterday,
then
maybe
that's
related
to
the
core
update
and
the
the
core
updates
tend
to
be
more
on
a
broader
quality
basis
across
a
website,
so
that
would
also
take
into
account
the
indexable
pages
for
the
website.
But
if
these
pages
were
just
indexed
like
last
week
or
so,
then
that
probably
wouldn't
be
playing
a
role
there,
but
rather
kind
of
the
content
that
was
indexed
over
the
last
couple
of
months.
A
That
might
be
something
that
we
take
into
account
for
a
quality
update,
that's
kind
of
the
the
one
thing,
the
the
other
thing
with
regards
to
single
page
apps
and
pages
that
you
no
longer
want
to
have
indexed.
One
thing
you
can
do
is
add
a
no
index
to
these
pages.
You
can
do
that
with
javascript
as
well,
where,
when
you
load
the
page
and
you
kind
of
recognize,
oh,
this
is
actually
a
page
that
shouldn't
be
indexable.
A
Then
you
can
use
your
javascript
to
edit
the
dom
and
add
the
no
index
robots
meta
tag
to
that
page.
That's
one
thing
you
can
do.
You
can
also,
of
course,
redirect
to
an
unauthorized
page.
If
you
want
to
do
that
on
the
unauthorized
page,
I
would
also
use
the
no
index
instead
of
the
robot's
meta
tag
instead
of
the
robots.txt
disallow,
just
because,
with
the
no
index,
we
really
know
that
we
shouldn't
be
indexing
this
page
and
if
it's
an
error
page,
then
you
don't
want
to
have
that
index.
A
Whereas
if
you
block
the
page
with
robot's
text,
then
we
wouldn't
know
what's
on
that
page
and
then
we
would
go
off
and
probably
index
the
url
without
knowing
its
content,
and
we
might
end
up
showing
that
in
the
search
results
so
especially
for
a
single
page
app,
I
would
use
robots
meta
tag,
no
index,
not
the
robot's
text.
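A minimal sketch of the JavaScript-added noindex John describes; the route check (treating `/profile/` paths as private) and the function names are assumptions for illustration, not part of any real API:

```javascript
// Decide which robots directive a route should get.
// Treating /profile/ routes as private is an assumption for this sketch.
function robotsDirectiveFor(path) {
  return path.startsWith('/profile/') ? 'noindex' : 'index, follow';
}

// In the browser, once the SPA router has resolved the route, inject the
// robots meta tag into the DOM so Google sees it when rendering the page.
function applyRobotsMeta(doc, path) {
  const meta = doc.createElement('meta');
  meta.name = 'robots';
  meta.content = robotsDirectiveFor(path);
  doc.head.appendChild(meta);
  return meta;
}
```

In an app, `applyRobotsMeta(document, location.pathname)` would run on each route change, before the page finishes rendering.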
A
So
those
are,
I
think,
they're
the
two
things
main
things
I
I'd
mention
in
that
regard,
and
I
I
think
if,
if
you
suspect
this
is
based
on
the
core
update
from
from
a
timing
point
of
view,
I
would
go
and
look
for
the
blog
posts
that
we
did.
L
Okay, that's perfect. And I guess this is kind of a smaller question, but say I want to move my URL to a more concise, smaller, higher-quality TLD, and the new URL doesn't have any of the keywords that I'm targeting or any backlinks to the site. If I were to migrate it in Search Console, would the rank be maintained, or is it still kind of starting from scratch, because I'm losing a lot of those things I had before, I guess?
A
If you were to do that for a single-page app, the tricky part is that in Search Console we check to see if there are 301 redirects in place. So if you wanted to migrate to a different domain with a single-page app, you would need to make sure that your server generates the 301 redirects on a per-page basis, and not do that with JavaScript.