From YouTube: English Google SEO office-hours from May 7, 2021
Description
This is a recording of the Google SEO office-hours hangout from May 7, 2021. These sessions are open to anything webmaster-related, like crawling, indexing, mobile sites, internationalization, duplicate content, sitemaps, Search Console, pagination, multi-lingual/multi-regional sites, etc.
Find out more at https://goo.gle/seo-oh-en
Feel free to join us - we welcome webmasters of all levels!
A
All right, welcome everyone to today's Google Search Central SEO office-hours hangout. My name is John Mueller. I'm a Search Advocate at Google here in Switzerland, and part of what we do are these office-hours hangouts, where people can join in and ask their questions around search. It looks like I just need to mute someone here. A bunch of questions have already been submitted on YouTube, so we can go through some of those, but it looks like lots of people are rushing to ask questions here live as well, so maybe we can get started from the top here. On my list I have Sun Kun.
B
I have a complicated situation. Actually, we use abc.kr as a waypoint URL to redirect for mobile or desktop, and for 790 sites. We provide news to the number-one portal sites of Korea and Japan, and we use our waypoint URLs on the Twitter site for the destination URL. And I can find the links of the portal sites in the top linking sites report in Search Console for the destination domains.
A
I don't know; I feel I'd need to take a look at how you have it set up before I can really say if that's a good way to do that or not. So if you have some examples, maybe you can drop them into the comments.
B
I'll do it now, if I may. Okay, I have the Yahoo Japan site, and in here there's a list of waypoint URLs that have 302 redirection URLs instead of our final destination URLs. Their destination URLs are like jp.abs.com, and the waypoint URL is something like abc.kr, things like that.
A
I don't know; I feel I'd almost need to double-check the details there. So if you could maybe drop me some sample URLs here in the chat, then I can take a look at that afterwards.
A
I don't want to make a quick judgment call based on just hearing it like that, because it sounds a little bit more complicated.
C
All right, Barry, I think you're next. Hey, how are you? So I had a question about Steve the search engine. In your Google podcast, Search Off the Record, you said you pretty much use that as a way of talking about some ranking signals, because you don't like talking about Google ranking signals. And one of the things I found interesting was that you didn't mention anything about machine learning changing the weights of those ranking signals.
C
I found that very interesting because, obviously, Bing is very into saying, yeah, we have lots of ranking signals, but we don't know what the weights are at any moment, because machine learning and AI change that on the fly based on tons of factors. Does Google do a lot of that, or does it depend on the specific ranking factor or signal?
A
I think "it depends" is probably the right answer. It's something where, for some signals, I know that we do a lot of machine learning to try to figure out how we should integrate them, and for others we don't use that much. It also depends a little bit on the specifics of what we're trying to figure out, in the sense of: do we have a clear metric that we could base this machine learning system on, or are we doing something like, I don't know, training the machine learning system on clicks, where it just finds the most click-baity titles that we can show and uses those in search? You kind of have to watch out for things like that. So for some elements we definitely use machine learning; for other elements, we don't use it as much.
A
Or no, I don't know. I mean, the different weightings of different parts of the algorithms are pretty tricky to do, because you can't just manually say, oh, this is weighted ten percent, and then suddenly everything else is 10% less overall. You kind of need to watch out for the whole system. So I could imagine some parts of that are things that we evaluate with machine learning. Maybe we use the values directly from machine learning; maybe we adjust them manually.
D
Hello, John. Hi, I own a site. It was ranking well before the 23rd of March. I upgraded Yoast SEO from free to premium. After that, the site got de-indexed from Google and we lost all our keywords. We tried everything to sort out the issue. We checked the robots.txt file and sitemaps. We even used the URL removal tool, and there was a form specific for the indexing; we even tried that. We also checked whether there was any manual penalty or action from Google. Still, we could not find out what the errors with our site were. For the past few days, the site has been going in and out of the index.
A
I don't know; it sounds kind of tricky. I would say offhand it probably doesn't have to do with the updating of your plugins, but it could very well be a technical issue somewhere. Usually, when we reduce the indexing of a site, when we say we don't need to have as many URLs indexed from a website, we tend to keep the URLs that are more relevant for that site, and that tends to be something that happens over, I don't know, a longer period of time, where the indexing slowly changes. So if you're seeing something weird, like the whole site disappearing from indexing, it almost sounds like something that might be related to a technical issue, something along those lines. What I would recommend doing there is going to the webmaster help forum and posting the details of what you're seeing: the exact URLs, the queries that you're doing, so that others in the forum can take a look and give you some tips. Maybe there are some things you can do or double-check to figure out: is it a technical issue? Is it a quality issue? Is it a spam issue? Maybe the website is hacked. It's really hard to say, but the folks in the forum have a little bit of experience with these kinds of topics, and they can help you narrow that down.
D
How do you mean? Could there be a conflict of plugins within the site that could result in that?
A
It's theoretically possible. I've seen situations where one plugin shows a noindex and another one shows that the page should be indexed normally; that can happen if you have too many conflicting plugins installed. But again, the folks in the help forum have seen these kinds of issues and can tell you pretty quickly, oh, it's this plugin or that plugin.
A
Thank you. Sure. Andre?
E
Hi, John. Hi, can you hear me well? Yes. Great. So it's my first webmaster hours; I've been trying to get in, but every time something happened. But my question now is really serious, because just today, this morning, we received a security warning from Google Search Console, and it's on all of the subdomains.
E
We have like 400 subdomains, and all of them are showing a security warning, like harmful downloads, and when we actually checked all of the subdomains, none of them are showing harmful content. The only place where it's shown is the root domain. So we ran through all of the possible scenarios.
A
Okay, and what kind of warning are you seeing?
E
Harmful downloads, and no URLs are shown; no details, like zero details. It's like looking for a needle in a haystack, you know. It's only in Google Search Console, and it's just showing for the root domain and no subdomains. When we check every single individual domain in other tools, there's just nothing: no warnings, no security issues, nowhere.
E
Okay, thanks. So we should probably go ahead and submit the review request after we've run out of ideas, and then we'll wait for this to be done. Okay, thank you very much. I hope it will work.
A
Yeah, good luck! If it doesn't work, or if it tends to take too long, let me know on Twitter, and then I can double-check with the team here manually.
A
It shouldn't. Usually, that would only affect things like: if people are trying to download something from your site, they might see that warning in the download section in the browser, but it wouldn't be affecting the rankings or the visibility in search.
F
We know that it is only a hint, but when users are watching or reading the article and they hover over this link, it looks like an internal link, but when they click, they go to the affiliate site. And my question is: would Google consider this as a form of cloaking, or is it totally okay to have it this way?
A
So if you have a nofollow, or if that bounce URL that you have on your site is blocked by robots.txt, then essentially you have everything right, and it doesn't matter so much for us that when a user clicks on it they get redirected to the other site, or that the link initially looks like a link to your site. That's totally irrelevant for us. It's just that we don't want to be passing PageRank to the other site.
A
Sure. All right, Akash.
G
Hi, John. Hi. My question is regarding the Core Web Vitals and the AMP URLs. So AMP URLs are basically something like: you have abc.com and then you have your own URL, but when a user searches on Google, it shows the Google AMP cache and then the rest of the URL. So should I put my own URLs into the PageSpeed Insights tool, you know, to check what the score is all about, or should it be the AMP cache? Because what Google is going to consider is what users are actually looking at.
A
So we would take into account what users actually see, and that means if your AMP pages are valid AMP pages and we're showing them through the AMP cache, then that's what we will take into account. Of course, if users navigate within your website and they go to, I don't know, your traditional web version or your locally hosted AMP pages, then those would be taken into account there.
G
Okay, and in the case of signed exchanges, if we implement that technology, where the URLs are published under the publisher's original domain, will the AMP cache be considered, or will my original publisher URL be considered for the page experience?
A
Whatever users see. So my understanding is that with signed exchanges you would essentially be on the AMP cache, but with your URL shown in the browser, so from that point of view it's kind of the same thing as before. Sure. Thank you so much. Thank you. Sure. All right, Rahul. Hi, John.
H
Hi. So basically, my Google Search Console is not showing the page experience report. What kinds of reasons could there be?
A
Probably the reason there is that we don't have much data for your website, and that is basically based on how we collect the data for the Core Web Vitals report, which goes through the Chrome user experience data. And if we don't have a lot of data for your website, then we wouldn't be able to show anything there.
H
Okay, a couple more questions. I have a scenario: we have site A and site B. Social traffic is coming to site A, and a particular paid page on site A is linking to site B. So if some of the traffic is going to site B from that particular link, how will Google take it into consideration? Is it organic traffic, or will it be considered social traffic or referral traffic?
A
I don't think we would differentiate there. I mean, for search, we don't use traffic as a ranking factor anyway. For the Core Web Vitals, we don't really care where the traffic is coming from. If we have data for those URLs, then we can show that in the Core Web Vitals report; it doesn't matter if that's from social, from paid campaigns, from organic campaigns, or bookmarks, or whatever.
H
Okay, one more question: can you talk a bit about topic clusters, brand mentions, and social signals as ranking factors?
A
Oh wow, I don't really have much to say about all of that. That feels super vague. I mean, from my point of view, the main thing I would watch out for is that you really have, I don't know, essentially a good website that focuses on topics that you know a lot about, where you're able to show your knowledge and your experience and make it clear what you're working on. All of that kind of maps to things like topic clusters and brand mentions and all of that. But from a practical point of view, especially if you're going from a situation where your website doesn't have a lot of data yet, because there aren't a lot of users yet, I would work on just growing your website organically and making it better and better and stronger over time, and then at some point you can figure out which topic clusters you want to build out, or which topic areas, or what kind of experts you want to have on your site.
I
Good morning. I was going to ask you about international and duplicate content, but you already answered that on Twitter yesterday, and Barry posted about it this morning, so thank you both. The other major question I have, on this new site that I'm working on: it's a fairly large site, a fairly well-known company that's performing well, but the traffic year over year is flat from organic search. I think the biggest culprit is the pervasive use of JavaScript. The mega-menu main navigation is all in JavaScript; major sections of the site, like the blog and press releases and other sections, are all built with JavaScript and not crawlable. I just wanted your opinion on how much that is hurting us versus, say, Core Web Vitals, where we fail a lot of those tests right now, but we're working hard to fix that. Should we instead be focusing on, you know, making the site crawlable?
A
I think that's always kind of hard to judge without looking at the details, and it's definitely not the case that just using JavaScript, and having content pulled in with JavaScript, is automatically bad. So I wouldn't take it as, oh, we use JavaScript, therefore we need to fix that for SEO reasons. Rather, you kind of need to look at the details of what exactly is happening and figure out what's going on. And then it's more a question of what we can do to optimize that, and at that point it's less a matter of whether it's using JavaScript or not, but rather: maybe we have a complex navigation internally, and we could simplify that and make it easier for Google to find the parts of the site that we think are important. That's something you could do with JavaScript; that's something you could do with a different kind of navigation.
A
That's kind of, I'd say, the normal SEO work then. But the first thing I would try to figure out is whether there is a technical issue or not, and if there is a technical issue, then obviously I'd try to prioritize that over optimizations like Core Web Vitals.
D
Actually, before asking a question, I just want to describe a situation here. Let's say I have a parent company called example.com, which caters to e-commerce and internet marketing services, and we have a subsidiary of that, let's say test.example.com, which delivers mobile-app-related services. But we have hosted blogs for both on the main domain, that is example.com.
D
So I just want to know, in this case, what could be the best possible practices to enhance the visibility and boost the rankings for the product pages, not for the blogs, which are hosted on the subdomain, or we can say the subsidiary company?
A
How can you make it so that Google can recognize that the parts of the website you care about are also the parts that Google ends up caring about? Sometimes that's a matter of, if you have an e-commerce site, taking the products that you really care about and linking to them from the home page, so that when Google looks at the home page of your website, it says: oh, this homepage is probably the most important part of the site, and it links out to these individual products directly.
A
We should distribute that to them too. And if you say everything is just as important as everything else, then we can't recognize which parts are actually critical for your website, or which parts you care about; that could be the products you make the most money off of, or the products you're best known for, or that you're proud to offer, or where the competition is really strong, those kinds of things.
A
Totally up to you; from our point of view, it doesn't matter. I would consider what is easier for you to do, what is easier for you to track and to measure.
J
Hi. If I add 301 redirects for more than 1,000 URLs at the same time, will it affect SEO rankings? Can you repeat that once more, sorry? Sure: if I add 301 redirects for more than 1,000 URLs at the same time, will it affect SEO rankings?
A
Okay, so if you add a lot of redirects to your site, yes, that's totally fine; that's completely normal. In the case where you do a rebrand or a redesign of your website, you have to move a lot of URLs from one part to another, and the number of redirects that you do is totally irrelevant.
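A large batch like this is straightforward to declare in one place. As a hedged illustration (the domain and paths are invented, and this is not something John prescribes), an nginx `map` can hold a big one-hop 301 table:

```nginx
# one-hop 301 table; in practice this map would hold 1,000+ entries
map $uri $new_uri {
    default       "";
    /old-shop     /shop;
    /old-contact  /contact;
}

server {
    listen 80;
    server_name example.com;   # hypothetical domain

    # issue the redirect only when the table has a match
    if ($new_uri) {
        return 301 $new_uri;
    }
}
```

Keeping the whole table in one map makes it easy to verify that every source points directly at its final destination.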
A
I don't know, it's super tricky. Sometimes traffic, or the visibility in search, just goes away. Sometimes it's specific to individual countries, where maybe the situation has changed in that country, the competitive environment has changed, or maybe our algorithms think that your website is not as relevant in that particular country.
C
Thank you. I have something to add on the redirect side. I just got a random phone call a couple of days ago from a large e-commerce site, from their SEO team, and they're like, we need your help: our development team wants to make five redirects over the course of a year. So every single URL on their site would redirect, like, on day one, let's say tomorrow, be one redirect, and then a month from now another redirect, because they want to change how the structure works slowly.
A
Yeah, that's definitely something worth mentioning, in the sense that if you do restructure your site, try to restructure it in a way that moves to the final destination as quickly as possible, and not, like the situation Barry mentioned, where you move here and then here and then here, because every redirect step that you take takes a while to be processed, and you will see fluctuations in search when that happens.
A
So if you take everything and you do it all at once, then you have it over and done with, whereas if you do everything once a month, then every month you have that period of things trying to settle down and be figured out again.
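John's point about collapsing multi-step moves into a single hop can be sketched in code. This is a hypothetical helper (the URLs are made up): given a planned redirect mapping that contains chains, it resolves each source straight to its final target, so every URL redirects exactly once.

```python
def collapse_redirects(redirects: dict[str, str]) -> dict[str, str]:
    """Resolve each source URL to its final destination, so a chain
    like /a -> /b -> /c becomes /a -> /c and /b -> /c."""
    collapsed = {}
    for src in redirects:
        target = redirects[src]
        seen = {src}
        # follow the chain until we reach a URL that no longer redirects
        # (the `seen` set guards against accidental redirect loops)
        while target in redirects and target not in seen:
            seen.add(target)
            target = redirects[target]
        collapsed[src] = target
    return collapsed

# hypothetical restructuring plan with a two-step chain
plan = {
    "/old-shop": "/shop-2020",
    "/shop-2020": "/shop",
}
print(collapse_redirects(plan))  # both sources now point straight at /shop
```

Running a planned migration table through something like this before deploying avoids the month-by-month re-settling described above.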
K
Hello. My question is about something that you replied to on Twitter a few days ago. You said that a page may be experiencing some issues because it's not considered to be in a stable state of indexing. What are some of the factors that may affect that?
A
I'm trying to remember what the situation there was. As far as I recall, what was happening there was that things were just essentially settling down, in the sense that if, for example, you put a page up for the first time, it takes a bit of time for Google to find that page and get it indexed properly, depending on the current situation within Google's data centers.
A
It can happen that we index it in one data center and it's not quite available in the other data centers yet, and that's more kind of the, I don't know, organic fluctuation of how things are processed; it's less something where there's a technical issue on your side that makes it hard for us to pick it up.
A
Cool. Oh gosh, Nizana.

L
Hi. So my question; I actually have two questions, regarding one website. It's a Swiss classifieds website, and we were actually wanting to make some changes on our homepage due to page performance, because we are rendering everything server-side, and we have some JavaScript around it that takes a little bit of time to render, and we think it can be helpful for page performance if we do it otherwise. So we have a menu.
L
Then we have a banner, and then we have a few listings on the home page that are just the most recent listings, or something like that. So we were actually thinking of maybe removing that from the server side on the mobile view, and waiting for the user to do some action, like scrolling down, for it to render or appear on the website; the other solution is maybe to just render it anyway after a few seconds of inactivity.
L
So I was just wondering: is it okay? It's not an essential part of the home page, though, but is it bot-friendly, not user-friendly, obviously, but bot-friendly? How would you approach it? Okay, so that's this one, and the second question will follow.
A
Yeah, I think the first thing you have to decide is whether you need to have that content indexed or not, and it sounds like you don't necessarily need it; it would be nice, but not critical. Is that correct?
A
Yeah, so if it's a matter of it being nice to have indexed but not critical, then I wouldn't worry too much about it, because it's possible that we'll be able to pick it up, and it's possible that we'll miss it. It's not something where I would say we would penalize a website for doing something like this; it's just that, for technical reasons,
A
We
might
have
it
or
we
might
not
have
it
indexed
the
the
way
that
we
usually
recommend
doing
something
like
this
like
is
it's
essentially
the
the
lazy
loading
setup
would
be
to
use
intersection
observer,
which
is
a
chrome
api,
or
I
don't
know,
I
guess,
like
a
general
browser
api
that
essentially
lets
you
know,
which
parts
of
a
page
are
visible
at
the
moment
and
googlebot
uses
that
when
we
render
the
page-
and
we
render
the
page
in
a
way
that
we
take
a
very
long
viewport
like
like
a
really
long
monitor
and
we
try
to
load
the
page,
and
then
we
tell
the
the
page
like
this
is
your
viewport.
A
So
essentially
with
that
setup.
If
you
recognize
that
this
part
of
the
content
is
visible
and
you
show
it
then
we'd
be
able
to
index
that
that's
kind
of
the
more
guaranteed
way
of
doing
that.
The
the
other
ways
where
we
have
to
watch
out
for
a
scroll
event
or
for
someone
to
click
a
load
more
button,
something
like
that.
That's
usually
something
that
googlebot
tends
not
to
do
so.
A
Yeah, it's something like these extra, I don't know, I see them as extra functionality; they're things which cost time when we do them in rendering at scale, like if we have billions of pages to do, and our systems try to optimize the time that they need. Through that, things like clicking on elements and scrolling are sometimes done when we have enough resources on the side to do that, but they're not something we would consider as kind of the baseline of indexing.
L
Okay, thank you. And the second question, also regarding this website, is that we have a home page on .ch, and then we have, of course, the trilingual versions on /de, /fr, and /it, for German, French, and Italian. And actually the home page, German being, let's say, the default language for Switzerland, is pretty much the same as the /de version, and I have put a canonical to it, but Google doesn't seem to pick it up.
L
It doesn't seem to agree with me, and thinks that it should actually be self-canonical on the /de. So I was just wondering: for now, the home page is the same as the /de version, and I don't want to have any kind of trouble with that. I was wondering whether it's okay for now, or is there any way that I can correct this in terms of the canonical, or anything else?
A
No, I think that would probably be fine to keep. The thing that tends to happen here is: we would see that the content is the same, or mostly the same, and then we would try to pick a version based on the other signals that we have from the site, which includes things like internal links within the website, and external links.
A
We also try to figure out which URLs look nicer, usually the shorter ones. We use your sitemap file, hreflang links, all of those places where we pick up links, and try to figure out which of these URLs is the better one to show, and I think in this situation probably both of them would be okay for indexing.
L
Or should I just change the whole structure of the home page and remove the canonical to the /de?
L
Both are indexed now; that's the issue. So even though we have a canonical to /de, it gets indexed and picked up, and, you know, doesn't get recognized by Google as a canonical for the home page. So yeah, my main point was: is it harmful? I haven't seen anything going on with that; I just wanted to make sure.
A
Sure. Let me skip to some of the YouTube questions in between, just to make sure we don't lose track of them completely, and I'll have more time for all of your questions as well. It looks like an endless number of questions today, fantastic. Cool, okay. "I have a site where 1,200 pages are indexed. When I look at Core Web Vitals, especially for desktop, it shows zero URLs. What might be the reason for that?" I think we touched on that earlier as well, in that for Core Web Vitals,
we use the Chrome User Experience Report data, which is based on real-user metrics, and essentially what that means is we just don't have enough data for your site, or for the pages on your site. So that's something where, as your site grows in popularity and has more users that end up going there, we'd be able to pick up some data from that and be able to use it for the Core Web Vitals report.
A
In the meantime, I would recommend using the manual testing tools for Core Web Vitals, which is, for example, the Lighthouse report in Chrome. You can look at the Core Web Vitals also in PageSpeed Insights, which is a really nice way to look at them, and there's also webpagetest.org, which is a cool way to look at things there.
A
When testing a site, I'd recommend checking a few pages from each kind of template that you have. So you don't need to check all 1,200 pages, but maybe your home page, a category page, and one of the detail pages, and then based on that you have a little bit of an idea of how things are standing.
A
"Can a small amount of English-language text on a non-English-language page negatively affect its ranking?" Usually not. So usually what would happen here is, depending on the amount of text, we would recognize that there are multiple languages on a page, and what might happen is that we show that link saying there's something here in another language, and would you like to have it translated?
A
But essentially, that's not a ranking issue; it's more a bit of a usability thing, maybe a bit confusing for users in some cases. And from what I've seen recently, it's not the case that this would trigger just for a handful of English words, or words in another language, on a page, but really only if there's a significant amount of different-language content on a page.
"We're updating our international strategy and consolidating markets into a folder structure. There will be cases where the content will change based on the IP: the currency, and content will be slightly modified, particularly if a feature is not available in that market. When crawlers fetch the page, they may be getting slightly different versions of the page, depending on where they originate the crawl request from. Will this cause instability if the page is slightly different with every crawl?"
A
So that's kind of the primary thing to watch out for. If, on the other hand, for example, users in Switzerland just aren't able to see your content, then I might be kind of upset, but Google would still index everything, because Google is crawling from the US, and it doesn't even know that someone in another country might be seeing different content. And the same applies to your currencies as well: if you're using the same URLs and showing different currencies, then we would only see the currency based on where we crawl from.
A
Essentially, they would need to be different URLs, so that we can index them individually, and they would need to be accessible at least to Googlebot when we crawl. So if we crawl from the US and we access your Swiss page, for example, you need to let that user see the Swiss version of your page, so that it can be indexed as well.
A
So all of that is to say: it's not quite as simple as just adding IP detection and changing the prices on the fly. You really need to think about what your strategy is going to be when it comes to an international site in search. "Since April 13th, some of my blog pages have been de-indexed gradually. The site is about nine months old. How and why does this happen?"
A
I don't know what exactly is happening with your website here. In some cases it might be that there are technical issues; if this is happening gradually, then that sounds a bit like technical issues. That might be something you'd be able to see in Search Console, depending on the type of issue there. The other thing is that our index is limited.
A
So from that point of view, it's not something where I'd be able to point at something specific. For the most part, my recommendation there is: if you really are kind of lost with regard to why things are being indexed the way they are, go to the help forum and post the specifics, so: your site, some of the queries that you expect, some of the pages that you've seen go away, so that others can take a look there and give you some tips on what you might be missing.
A
So I took a quick look at the screenshot that you included in the question, and it looks offhand like you essentially have two versions of your site running: one is the client-side-rendered version that users see, and one is a server-side-rendered version that Googlebot and other bots see. In general, that's okay; that's, as you mention there, dynamic rendering. We don't recommend doing that anymore. It's something you can do, but it's not recommended anymore.
A
The main reason we don't recommend it anymore is that it's very easy to mess up and to have Googlebot index something that is not what your users actually see, and usually it happens in a way that Googlebot sees errors while your users see normal content, and you don't even recognize that an error is being shown to Googlebot. That's kind of the thing to watch out for. So overall, you could be doing something like this; I would just be very careful when you're doing it, especially with the server-side rendering of something slightly different.
A
We also don't use Lighthouse and PageSpeed Insights for search; they're really purely for you. So I would at least route those testing tools, at least partially, through the client-side-rendered versions, so that you really have a view of what users would see.
A
Then there's a question regarding a U.S. homepage and Indian sitelinks. I need to double-check what is happening there, so I don't know offhand.
A
I see some sites that are creating posts, but they're only accessible from the sitemap, so not accessible from the website itself. From a UX point of view, I believe it's a good practice, because most of the information can still be found on the other pages of the website, and this way they can keep the website well organized, with only the relevant posts on the website, so the posts are only for SEO purposes. What do you think about this? Is this okay?
A
So, first of all, I think that's a bad practice, because Google might be able to find and index those pages based on the sitemap file, but we're not going to be able to give them any weight. If they're not linked within your website, then we have no idea of the context of those pages, and we have no idea of the weight that we should give them. So it essentially means you're creating these pages, putting them on your server, and Google is probably ignoring them.
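One way to audit for such sitemap-only pages is to diff the URLs listed in the XML sitemap against the set of internally linked URLs found by a crawl. A minimal sketch, assuming you already have the crawl data; the example sitemap and URLs are made up:

```python
# Find "orphan" pages: URLs present in the sitemap but never linked
# from any page on the site itself.
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(sitemap_xml: str) -> set:
    """Extract all <loc> URLs from a sitemap document."""
    root = ET.fromstring(sitemap_xml)
    return {loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")}

def orphan_urls(sitemap_xml: str, internally_linked: set) -> set:
    """URLs in the sitemap that no internal link points to."""
    return sitemap_urls(sitemap_xml) - internally_linked

sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
  <url><loc>https://example.com/seo-only-post</loc></url>
</urlset>"""

linked = {"https://example.com/", "https://example.com/about"}
print(orphan_urls(sitemap, linked))  # {'https://example.com/seo-only-post'}
```

Any URL this flags is a page Google can discover but, per the answer above, has no internal-link context to weight.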
A
On the one hand, it's probably signaling to Google that it's an important page, so Google will value this page, but we won't have Google traffic for it. Can it harm the other links, because it's on the same hierarchy as other menu links?
A
It doesn't cause any problems. I don't know, looking back maybe five or ten years, at some point people would use nofollow links to try to discourage Google from looking at the about-us page. But we know that an about-us page is a part of a normal website, so we're not going to say, oh, it's linked from every page, therefore it must be the most important thing on your website. And it does have information about you on that page, so we can kind of crawl it.
A
So from that point of view, putting these links across your pages, where users can find them and where Googlebot can find them, is perfectly fine. Okay, wow, more questions. I don't know, maybe I'll just take one or two more here, and then I'll pause the recording, and then we can hang around a little bit longer and go through more of the questions.
A
Let me see, I'll just take the first one here. Rishba, I don't know if you're still here. Probably. Hi.
M
So my question is mostly about pop-ups. Can any type of pop-up which covers the complete screen be called an intrusive pop-up, or be, you know, harmful or bad for SEO? Or are there some pop-ups, like, you know, browser notifications, or a pop-up which helps the user get to the correct page or correct information on the page? Say, for example, we have a category page where a pop-up comes up which asks the user whether they are on the right page or not.
A
We would probably see those as intrusive interstitials. The thing that we would try to exclude from the intrusive interstitial guideline is if something is a legal interstitial, which means maybe a confirmation for cookies or confirmation of your age. That kind of thing is something we would consider a normal part of a website, and that wouldn't be problematic.
A
But if you have a full-page pop-up that is essentially providing extra functionality, like a sign-up for a newsletter, or saying you're on the wrong version of the website, that kind of thing would be something that we would probably consider an intrusive interstitial. At the very least, we would try to index that content, because we think, oh, this is the primary content of the page, therefore we should index it and show it visibly, and that's probably not what you want.
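The distinction being drawn could be summarized, very loosely, as a rule like the following. This is a toy simplification of the guideline as stated here; the purpose labels are my own assumptions, not anything Google publishes:

```python
# Toy triage of interstitials based on the description above: legally
# required dialogs (cookie consent, age checks) are treated as a normal
# part of a website, while full-screen functional pop-ups (newsletter
# sign-ups, "wrong version" notices) likely count as intrusive.
# The purpose labels are invented for this sketch.
LEGAL_PURPOSES = {"cookie-consent", "age-verification"}

def likely_intrusive(purpose: str, covers_full_screen: bool) -> bool:
    """Return True if an interstitial would plausibly fall under the
    intrusive-interstitial guideline, per the reasoning above."""
    if purpose in LEGAL_PURPOSES:
        return False  # explicitly excluded from the guideline
    return covers_full_screen
```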
M
You know, pop-ups which do not cover the complete screen but part of it, and there is no close button, so with these pop-ups you have to take an action on the pop-up. If you have not taken any action, you will not be able to see any content.
A
All right, Sakah.
D
Sir, recently one of our competitors made some changes on his website. He had content and internal links below the fold; he took them above the fold, and what we observed was that the ranking and the click share increased massively.
A
I don't think we have strong preferences in that regard. The main thing is that we want to see some content above the fold, which means a part of your page should be visible when a user goes there.
A
So, for example, if a user goes to your website and they just see a big holiday photo, and they have to scroll down a little bit to actually get content about a hotel, then that would be problematic for us. But if they go to your home page and they see a holiday photo on top and also a little bit of information about the hotel, for a hotel site, for example, that would be fine.
D
A
Both can be correct. We take into account for the Core Web Vitals what users see in their browser, and if the first point where they visit your site is on the AMP cache, then that is taken into account; likewise if the next click on your site, for a user who's browsing your website, is to the AMP version of your website locally.
A
Sure, okay. Let me pause the recording here. I'll still be here for more questions if there's anything on your side, and it looks like lots of hands are still raised, but it's always good to keep recordings to a reasonable length. Thank you all for joining, and thanks for watching this if you're watching it on YouTube. I hope you found it useful and insightful, and hopefully I'll see some of you again in one of the next hangouts. All right, let's pause now.