From YouTube: English Google SEO office-hours from May 28, 2021
Description
This is a recording of the Google SEO office-hours hangout from May 28, 2021. These sessions are open to anything webmaster-related, like crawling, indexing, mobile sites, internationalization, duplicate content, Sitemaps, Search Console, pagination, multilingual/multi-regional sites, etc.
Find out more at https://goo.gle/seo-oh-en
Feel free to join us - we welcome webmasters of all levels!
A
All right, welcome everyone to today's Google SEO Search Central office-hours hangout. My name is John Mueller; I'm a Search Advocate at Google here in Switzerland, and part of what we do are these office-hours hangouts, where people can join in and ask their questions around their website and around web search.

As always, lots of stuff has been submitted already on YouTube and on the community page, so we can go through some of that. But maybe to start off, we'll just go through some live questions from you all, and I see people are already raising their hands. So, fantastic, we'll start off with you, Promotash.

B
Hi John, how are you?

A
Hi.
B
So, John, I have a few questions, some basic questions that I have doubts about, because over various forums they talk about them differently. I have like three or four questions; can I go?

A
Sure, go for it.

B
Okay, okay. The first basic question that I want to know about is regarding web hosting and domain registration.
A
No, it's not required. In general, if you want to use geo-targeting, there are two ways to do that. One is to use the country-level top-level domain, which would be .de for Germany in that case. The other is to use a generic top-level domain and a geo-targeting setting in Search Console. That could be, for example, a .com website, or .net, or .info, or .eu, or whatever; any of those would also work.

And then you just set geo-targeting for Germany. The hosting location is also not required. In the early days, before we had the setting in Search Console, we used the hosting location as a way to guess which country the website might be targeting, but nowadays I don't think that's used at all. And with a content delivery network, if you have kind of a fancy international website, then the hosting location doesn't matter anyway, because you always have some local presence automatically.
B
A
I think that's a different problem, though. Just in general, translated content is unique content: it's different words, different letters on the page, so it's different content. Depending on how you translate it, that would be more of a quality issue. So if you use an automatic translation tool and you just translate your whole website automatically into a different language, then probably we would see that as a lower-quality website, because often the translations are not that great.

But if you take a translation tool and then you rework it with maybe translators who know the language, and you create a better version of that content, then that's perfectly fine. And I imagine over time the translation tools will get better, so that it works a little bit better. But at least for the moment, if you just automatically translate it, from a quality point of view that would be problematic.
B
So if I do it a bit with a human interface, with human intelligence... Like, this question was asked way back in 2011; it is already in Google Search Central, from August 2011. So in 10 years, and I suppose Google comes a long way even in a month, over 10 years Google Translate is maybe a million times better.

A
Yeah, yeah.

B
So it is a million times better. Then, if I do some... I will not generate the whole site, but I will take a piece of the content and generate it, because if I go to a translator, they will charge me in euros and it will be very costly. An expression like "what is your name" can be easily translated, but there are complex situations where grammar and other things are involved, say a long sentence of 15 or 20 words. Then I can machine-translate some, and then I can hire a freelancer who could look into it and then make changes.

A
Yeah, I think that would be good.

B
The content, it is like: the grammar should be good, and the sentence should be meaningful.
B
Okay, okay, can I ask a few more?

A
Sure, if nobody else is waiting, yeah.

B
Okay, sorry guys, I have some more questions. So then, again on the sitemap, I have one doubt. Okay, the sitemap: Google does not use priority. But does the Google algorithm use the lastmod attribute, the time? Suppose I have not updated the sitemap, but I keep on updating my content. My content is updated, but it is not linked to any Yoast or other plugin.
A
So we use a sitemap to help us crawl better, but it doesn't replace crawling. So if your sitemap is old and your content is updated, we will crawl your website normally, in addition to the sitemap. But if you tell us in a sitemap file that these pages are new, or these pages have changed, then we can crawl more efficiently.
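As a concrete illustration of the lastmod signal described above (a minimal sketch; the URLs and dates are hypothetical), a sitemap can be generated with Python's standard xml.etree so that each changed page carries its modification date:

```python
# Minimal sketch: build a sitemap whose <lastmod> values tell crawlers which
# pages changed recently. URLs and dates here are made up for illustration.
import xml.etree.ElementTree as ET

def build_sitemap(pages):
    """pages: list of (loc, lastmod) tuples, lastmod formatted YYYY-MM-DD."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc, lastmod in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        # lastmod is the attribute discussed above; priority is not used.
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

sitemap_xml = build_sitemap([
    ("https://example.com/", "2021-05-28"),
    ("https://example.com/blog/new-post", "2021-05-27"),
])
print(sitemap_xml)
```

The point of the sketch is simply that an accurate lastmod per URL lets a crawler prioritize changed pages; a stale sitemap doesn't prevent crawling, it just forgoes that efficiency.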
A
Let me get some of the other folks, and then I can come back to you as well, just to make sure we're not missing anything. Sure: Rajiv?

B
Thank you. Thank you.
E
Hey, thanks. John, how are you doing?

A
Pretty good. How are you?

E
All good. So I have a question: is using superscripts to give outlinks a good thing to do? For example, I have seen blogs in which people write the sentences and then attach the links as superscripts corresponding to the words. For example, there is a sentence: "I am an SEO."
A
I would try to avoid that if you can. There might be usability reasons why that makes sense in certain cases, but in general we do like to try to find the link within the context of your pages.
C
Oh yeah, hello, good morning. Thank you. How are you, John?

A
Pretty good. How are you?

C
Yeah, fine, fine. I have a question. I have a Dutch university that I'm working for, and I want to implement the hreflang tags for that university. We have a lot of Dutch pages, and English is our second language; that's what we want to target all the other countries with. The only problem that we have, or that I face, is that not all the pages have a translated version, because some pages are only in English and some pages are only in Dutch. You understand it, I think.
A
So the hreflang annotations are on a per-page basis, so you don't have to do that across the whole website. There are, I guess, two approaches that you can take there. One is to skip it on those pages. The other is just to leave them there and just have one hreflang tag: you just have the Dutch version, or you just have the English version of a page, and we will look at that and say, oh, there's no hreflang pair, so we will skip that hreflang.
A
It's kind of, depending on what your process is... I would say both of those are essentially equivalent for us.
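To make the per-page idea concrete, here is a small sketch (the university URLs and the translations map are hypothetical) that emits hreflang link elements only for pages that actually have a counterpart in the other language, and skips hreflang entirely otherwise:

```python
# Sketch: emit per-page hreflang <link> tags only when a translated version
# exists. The translations map below is hypothetical; in practice hreflang
# annotations should be reciprocal between the paired pages.
translations = {
    "/study/physics": {"nl": "/nl/studie/natuurkunde", "en": "/study/physics"},
    "/news/open-day": {"nl": "/nl/nieuws/open-dag"},  # Dutch only: no pair
}

def hreflang_tags(path, base="https://example.edu"):
    versions = translations.get(path, {})
    if len(versions) < 2:
        return []  # no translated counterpart, so skip hreflang on this page
    return [
        f'<link rel="alternate" hreflang="{lang}" href="{base}{url}" />'
        for lang, url in sorted(versions.items())
    ]

for tag in hreflang_tags("/study/physics"):
    print(tag)
```

This matches the first approach described above (skip hreflang where there is no pair); per the answer, leaving a lone tag in place would be treated essentially the same way.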
A
Sure. Cool. Saidul?
B
I have actually two questions. One of the questions we have received from our client. When someone searches for their brand name, they can see their homepage and other page links, and the organic search shows their Twitter account, their Facebook account, and one of their product-review-site accounts. And then it shows a website which is from the USA, but which has a similar brand name to our client's. Now my client's question is: they have an Instagram page which they are updating regularly, but that Instagram page link is appearing on the second page. Their question is why the US site is appearing on the first page and not the Instagram page, and why their other page, which they have not been using for the last four years, is also appearing on the first page, ahead of the Instagram page.
A
Naturally... I think there might be two things which they could do which could potentially help. One is to use structured data on the pages to say that certain other pages are the same as the home page, kind of to link to the different profiles that they have. I'm not sure exactly what the name of this structured data type is, but you can link to the different kinds of social media profiles that you have for a company from your primary website, and the effect there is that it helps us to better understand that these belong together, and then maybe we will rank them a little bit closer together.
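The structured data type John can't recall the name of is most likely the `sameAs` property on a schema.org `Organization`. A minimal sketch of the JSON-LD you would embed on the homepage (the company name and profile URLs are hypothetical):

```python
# Sketch of Organization JSON-LD that links a site to its social profiles via
# the schema.org "sameAs" property. All names and URLs are hypothetical.
import json

org = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Brand",
    "url": "https://example.com/",
    "sameAs": [
        "https://www.instagram.com/examplebrand",
        "https://twitter.com/examplebrand",
        "https://www.facebook.com/examplebrand",
    ],
}

# Embed on the homepage inside <script type="application/ld+json"> ... </script>
snippet = json.dumps(org, indent=2)
print(snippet)
```

As the answer says, this only helps the systems understand that the profiles belong together; it is not a ranking guarantee.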
I think it's always tricky, because maybe that other person who has a similar name is like, "why don't I show up for these queries more?" in comparison. That's, for example, something that I see a lot with my name. If you search for my name, you can find me, but you can also find lots of other people who are called John Mueller, and some of them are pretty famous, and I imagine they're also like, "who is this guy from Google, and what is he doing in my search results, and how can I get rid of him?" That's something where sometimes there just are different pages that have a similar name, or that have the same name, and they show up in the same search results.

The other thing that you could do is: if there's a knowledge graph entry, or a knowledge panel on the side, for the company that we show, you can verify that, and you can add the other social media profiles there as well.
B
Okay, the second question. I'm not quite sure, actually, whom to ask this question, because this is a little bit tricky for me. So what happens for our company: when we search it on Google Maps, it shows the Google Business listing automatically. There is no problem with that. But when we search it on the Google organic search engine, it shows all the results related to our company, but it does not show the Google Business listing on the right side of the results. I'm not sure whom I should be asking this question, whether it is you or whether it is the Google Business listing team. So I just need to know, yeah.
A
I don't know. I feel that's also more of a ranking question, in terms of: we can find your business and show it in the normal search results, but maybe we're not showing the Google My Business entry for a business like that, and I don't know the details of how the Google My Business profile ranking side of things works. I could imagine it also takes into account things like the location, where maybe, if your business is not actually local or nearby, then maybe we wouldn't show the Google My Business entry on the side. But I don't know. My feeling is there's nothing technical that you're doing wrong if the listing is available and it's just not ranking in the search results. But you can still ask; I think the Google My Business team has a separate help forum, specifically for companies that are listed in Google My Business.

So I would double-check with them to see if there's something that you can do; maybe there's a simple trick to make it easier to connect those two worlds. But my feeling is that probably this is just a ranking question.

B
Okay.
A
Cool. All right: Mark?

D
Hi John, my question is regarding title tags. Traditionally, I was used to just having one title tag at the top of the page, which would describe the subject matter of the page, but I've recently been using a WordPress page plugin, and it's been adding title tags to SVG images. So when I look at the source code, I end up with approximately 19 title tags, and now the designers are telling me that's perfectly okay, Google will recognize them. But I feel really uncomfortable about that, and what I'd like is some clarification.
A
So that's something where I would do things like go into Search Console, look at the top queries that people are using to find your website, then just try those searches out and look at the titles that are shown there. And if that looks okay, then I wouldn't worry about it.
D
Well, the title tags are showing correctly. But the reason why I've looked in the code is that there are pages which are being indexed by Google but not ranking where I think they should be, for the quality of the content: say, they're not ranking in the top 100 with really well-written, unique content, and I think they should be appearing somewhere. I'm looking for the reason. Do you think it's possible that they may contribute to confusion with Google?
A
I don't think so, no. It's something where, at least at the time when I first saw this and we had problems with that, it was more a matter of us accidentally taking these titles and showing them as titles in the search results.
A
That's kind of annoying, but it wouldn't cause any issues with the ranking of the pages. So just because you have multiple titles on a page, especially if they're in SVG files, that's something where we would say, like, whatever; that's perfectly fine. The other place I might watch out for this is in the Image Search results, if those SVG files are images that you want to have appear in image results.
A
Yeah, I wouldn't worry about that. That's perfectly fine.

D
Okay. Thank you, John.

A
Thank you. All right, let me go through some of the submitted questions, and if you have any questions along the way, feel free to jump in; feel free to raise your hands as well. I'll try to go through more live questions later on as well, depending on how things go with the submitted questions. All right, so the first one is also about sitemaps.
A
"We've recently made large changes to our sitemap file, both adding and removing pages, in an effort to improve our SEO. To our surprise, each change significantly reduced our impressions and our clicks. Does Google penalize, directly or indirectly, for large sitemap changes? Or, on the other hand, does Google boost more long-standing content?" No. I think for both of those questions it's a clear no. In particular, sitemap file changes are perfectly fine, and some websites have a lot of sitemap changes that they make, because they make changes to their pages a lot, and that's perfectly fine. The sitemap file helps us to crawl more efficiently.
A
Also, with regards to long-standing content: we don't kind of boost long-standing content, or evergreen content, or content that hasn't changed. It's something where I think the effect that most people see is more around what people are actually expecting to find in the search results. Sometimes you want something more like a stable reference to find more information on a topic; sometimes people want to find the newest updates on a specific topic, and that kind of intent can change over time, or usually it does change over time. For example, if you're looking at, I don't know, a vacation place, then maybe, if nothing is happening and everything is fine, you would expect to find more stable content about that location; whereas if some event took place in that location just recently, then you expect to find more news, kind of new, updated content about that location, and then over time it'll shift more towards the evergreen content again. These are things that our systems try to recognize, and they do change the rankings of these things over time, depending on what we expect that users are trying to find.
A
Let's see, a similar question: "We see a spike in traffic shortly after introducing new types of pages, followed by a tapering off, though we don't expect our users to behave any differently based on how long the content has been live. Our content isn't very time-based, nor at all newsy. Do you have any thoughts on why we might see this sort of release spike?"
A
So I guess this is almost the opposite of the previous question, where the previous person was like, "oh, when I make some new changes, then everything goes down," and this person is like, "when I make changes, everything goes up." I think probably what is happening in this particular case is that we're seeing new content for a website, and especially when it comes to new content on a website, or new websites overall, there's kind of this period where we recognize the new content, and we can crawl and index the new content, but we don't have a lot of signals for that new content yet. Then we have to make assumptions, and our systems try to make assumptions that they think are probably in line with the rest of the website. But sometimes those assumptions are on the high side, where we say, oh, this is probably fantastic content.
And sometimes the assumptions are more on the lower side, where we're a little bit more conservative, like we have to be careful with showing this new content. That's something where you'll see that sometimes new content performs particularly well for a period of time and then settles down again; sometimes it performs kind of badly initially and then settles down in a higher state.
A
This is essentially just our systems trying to figure out where this new content should fit in, before we have a lot of signals about the content. In the SEO world, this is sometimes called a sandbox, where Google is keeping things back to prevent new pages from showing up, which is not the case; or some people call it the honeymoon period, where new content comes out and Google really loves it and tries to promote it. It's again not the case that we're explicitly trying to promote new content or demote new content. It's just that we don't know, and we have to make assumptions, and then sometimes those assumptions are right and nothing really changes over time; sometimes things settle down a little bit lower, sometimes a little bit higher.
A
So we don't have a list of the IP addresses for Googlebot, because they can change over time, but we do have a method for you to verify whether or not a Googlebot request is legitimate, and we have that documented in our Help Center. So in a case like this, where you really need to block everyone else and only let Google look at the content, that's something that you can do specifically for sitemap files; we've said that that's okay.
A
I would, of course, watch out for the other search engines as well, so that you don't block them from accessing your content, because otherwise they wouldn't be able to access it. But essentially, for sitemap files, you can verify the request. Usually the way that this is done is: you look at the user agent, and if it's a Googlebot user agent, then you take the IP address that the request has, and you do a reverse lookup to find the hostname for that IP address.
A
And then you do another lookup of the hostname, to double-check that the IP address is correct. That's something that you can do at scale; you can cache these for a while, keep them like that for a bit and collect those IP addresses, if you want. I would just be careful with the assumption that one IP address will always be the same one, because we do have different data centers in different locations, and sometimes you end up with different IP addresses that are used for crawling.
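The double lookup described above can be sketched as follows. This is a sketch of the documented reverse-then-forward DNS approach, not an official implementation: the DNS calls are passed in as parameters so the logic can be tested without network access, and a real deployment would pass wrappers around `socket.gethostbyaddr` and `socket.gethostbyname`. The IP and hostname below are illustrative.

```python
# Sketch of Googlebot verification: reverse-DNS the requesting IP, check the
# hostname ends in googlebot.com or google.com, then forward-resolve the
# hostname and confirm it maps back to the same IP.
def is_verified_googlebot(ip, reverse_lookup, forward_lookup):
    try:
        hostname = reverse_lookup(ip)          # e.g. socket.gethostbyaddr(ip)[0]
    except Exception:
        return False
    if not hostname.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        return forward_lookup(hostname) == ip  # e.g. socket.gethostbyname(host)
    except Exception:
        return False

# Hypothetical lookup tables standing in for real DNS responses:
PTR = {"66.249.66.1": "crawl-66-249-66-1.googlebot.com"}
A_REC = {"crawl-66-249-66-1.googlebot.com": "66.249.66.1"}

print(is_verified_googlebot("66.249.66.1", PTR.__getitem__, A_REC.__getitem__))
```

As noted above, results can be cached for a while, but the set of crawling IPs should not be assumed to be static.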
A
"If we have a blog post about the finances of a company, which is only relevant for a few months per year until the next report comes out, is it better to update that blog post or to create a new one, in terms of SEO for Google? Does it hurt the SEO to have a page change every month or every three months?"
A
So that's, I think, something that comes up regularly. Well, "regularly", of course, only if you're creating content regularly. Our recommendations there are essentially... I mean, this is assuming that you want these financial pages to show up in search. Maybe taking a step back: if you're just like, "well, we put these pages out there, but we don't really care if people find these pages; we care about the rest of our website,"
A
Then, in a case like that, you can do whatever you want anyway. But if you do want these pages to be found, then usually the approach is to have one stable URL for this content, and usually you would want to keep the older versions of the content and move those to an archive section. So essentially, you would have the current report on a clear URL, which is, I don't know, financialreport.html, and you would take the last report that you have and move it to something like, I don't know, a 2020-financial-report page in an archive section of your site. The goal here is essentially that that primary page, the normal financial report page, is the page that collects the signals over time, where over time we will see: if someone is searching for the financial report for your company, this is the place to go. And if someone ends up searching for the financial report for 2019 for your company, then we can still dig up those archived pages as well.
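The rotation described above can be sketched like this (the path names are hypothetical, and a dict stands in for the site's filesystem or CMS): the stable URL always holds the current report, and each superseded report moves to a dated archive path.

```python
# Sketch: one stable URL keeps the current report and collects signals over
# time; the outgoing report moves to a dated archive path. Paths are made up.
def rotate_report(site, new_html, year_of_old):
    """site: dict mapping URL path -> HTML content (stands in for a filesystem)."""
    stable = "/financialreport.html"
    if stable in site:
        # The outgoing report becomes e.g. /archive/2020-financial-report.html,
        # so searches for the old year can still find it.
        site[f"/archive/{year_of_old}-financial-report.html"] = site[stable]
    site[stable] = new_html  # the stable URL stays the same across updates
    return site

site = {"/financialreport.html": "<h1>2020 report</h1>"}
rotate_report(site, "<h1>2021 report</h1>", 2020)
```

The design point is that the stable URL never changes, so inbound links and search signals accumulate on one page rather than being split across yearly URLs.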
A
So if you have an event that takes place once a year, or every, I don't know, couple of months, then you have one stable event landing page, and then from there you link to an archive section with the older versions of that event. Or if you have products, where you say, I don't know, "I have an Android Phone 27 and it replaces the Android Phone 25," or whatever, then people will be searching for "android phone" and they want to find that general, stable Android-phone landing page. So that's the approach I would take here. You just have one stable landing page for the overall topic, which you keep updating, and you move the old content into an archive section, so that if people are searching for the old content they can still find it; but most people, when they're just searching for the topic overall, can find that stable page. And over time, that stable page will just gain in importance across your website anyway. So that's the approach I would take.
Then a question with regards to iframes: "My site has a page with multiple iframes, which are updated monthly. The source is graphs that I make with Plotly in JavaScript, which I then host on the web pages, and they're not available throughout the website anywhere else. So the first question is: should I put a nofollow on those pages so that they don't get ranked?" I don't think you can do a nofollow on an iframe, so probably that doesn't matter so much.
A
What you could do is use a rel canonical on the iframed content, to point back to your general page; that makes it a little bit easier. But you can't put a nofollow on the iframe. The second question refers to PageSpeed Insights tests: "They appear to be very slow, due to having to load the Plotly library for each iframe separately instead of just caching it. Do you think I would be better off forgoing it?"
A
On the one hand, I would try to figure out what you can do to make the JavaScript faster in terms of caching. There are a lot of things that you can do with regards to how you serve content from your server, and with caching you can probably do some things to help improve the speed of those JavaScript libraries. So I wouldn't say it's absolutely necessary to get rid of all of that JavaScript, but probably you can do quite a bit to improve the speed as well.
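As one example of the caching side (the header values and file names are illustrative, not a Google recommendation), a server can hand out long-lived Cache-Control headers for versioned JavaScript assets so that a library like the Plotly bundle is fetched once and reused by every iframe:

```python
# Sketch: choose Cache-Control headers by asset type, so a versioned library
# file is downloaded once and served from cache thereafter. Values are
# illustrative; pick lifetimes that match your deployment process.
def cache_control(path):
    if path.endswith((".js", ".css")):
        # Safe to cache aggressively when the filename is versioned
        # (e.g. plotly-2.1.0.min.js), since a new release gets a new URL.
        return "public, max-age=31536000, immutable"
    if path.endswith(".html"):
        return "no-cache"  # revalidate pages so content updates show promptly
    return "public, max-age=3600"

print(cache_control("/static/plotly-2.1.0.min.js"))
```

The usual trade-off: long max-age plus versioned filenames gives repeat visitors (and each additional iframe) a free cache hit, while HTML stays revalidated.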
A
The other thing is with regards to the content of the iframes. One of the things that happens here is, if you're generating these graphs with JavaScript, we would not be able to index them as images. So if, for example, you have fancy images that you're creating, and people are actually looking for those images in Image Search, then we wouldn't be able to index those as images, because we would essentially process the JavaScript and see, oh, it does something fancy with the canvas or whatever.
A
But we wouldn't see that this is actually an image that we would be able to index. So if you're seeing that people are looking for these graphs using Google Images, then I would at least make it possible for people to view them as images. And the final thing here, with regards to speed in particular: it is possible that you can achieve a significant speed boost by swapping out the JavaScript libraries for static images.
A
It depends, I think, a little bit on your pages themselves, and whether or not that actually makes sense. If everyone is always interacting with those JavaScript graphs, then probably you want to keep them interactive by default. However, if most people are just looking at those images, then maybe you don't need to make them interactive by default, and you can just cache the image, and make it so that when people click on it, they can go to the JavaScript version and then do the interactive bit there.
A
So those are the aspects that I would look at there. But again, just with regards to the JavaScript libraries: there's a lot you can do with JavaScript, and with the serving of content from your server, that can make it fast too. And I don't know, I think there are probably also some other aspects with regards to speed and ranking that you could look at here. In particular, if this is a small part of your website, then probably the overall aggregate data for your website with regards to speed will be independent of these particular pages.
A
Let's see, a question about responsive web design and display:none under mobile-first indexing: "Can I assume that content that is in the HTML and seen on desktop, but can't be seen or accessed from a smartphone because it uses display:none in the responsive design, is indexed and evaluated equally with other visible and accessible content under mobile-first indexing?"
A
So the question has a little bit more detail on what's happening here, but in general, we would be able to index this content, and we would be able to show it in the search results.
A
I would, however, go with the caveat that you can test this yourself, and you can double-check to see if this is actually ranking well, or the way that you want it to rank in the search results, just by searching for your content yourself. In that regard, it's something where you don't have to take my word for it; rather, you can try it out and see what actually happens there.
A
So it's not the case that the mobile version has to be one-to-one exactly the same as what is visible on desktop; it can be the case that there are actual content differences there. Most sites at the moment are on mobile-first indexing already, so perhaps your site is on there as well, and then you can see exactly what's happening already. We're still struggling with some of the last sites with mobile-first indexing, so hopefully we'll be able to shift more of those over.
B
...which is appearing for the mobile device; that is one step, for indexing, and the next step is for ranking, yeah? And you have also said that you consider mobile content for ranking and indexing purposes, yeah. But we are just trying to serve that content on desktop, and not on mobile, so how will Google take this kind of infrastructure, or this kind of element, for ranking and indexing purposes?
A
The way that you can test this is by using the URL Inspection tool in Search Console, to do a live fetch of the mobile version, and to look at the HTML to see if it's actually in there. If it's not in the HTML of the mobile version, if it's only visible for desktop users, for example, then we would not be able to index it, and we would not be able to use that for ranking.
G
A
I don't know. It's something where, if you're using JavaScript on your pages which is slowing things down for your users, then we would see that as your pages being slow. So it's not the case that we would say, "oh, this is reCAPTCHA, we will ignore it, or we don't count it against your website," or "this is, I don't know, Google Ads, so we won't count it against your website." It's really...
G
We are really facing an issue with the reCAPTCHA JavaScript; it is showing as a third party. So is there anything to defer that particular JavaScript?
A
I don't know how the reCAPTCHA script is embedded. That's something where sometimes the way that you embed a third-party script can make a big difference in how the speed is seen by users. Sometimes scripts are just slow, and I don't know about the reCAPTCHA script in particular; I could imagine that it has to do some things to recognize "is this a user, or is this not an actual user," and maybe some of that is more processing-intensive and just takes time.
A
So if that's the case, then I would just think about what you can do to improve the speed of your site overall, and maybe that includes removing some of these scripts, or removing some of these scripts from pages that don't necessarily need that script.
A
I mean, I don't like to say, "oh, you should remove Google scripts from your pages," but if these things are slowing your pages down, then that's something that sometimes you have to do. And sometimes it's also something where you can contact the team that is implementing this script and say, "hey, your script is so slow, I'm going to remove it unless you make it faster," and maybe if they get this feedback, they'll be able to improve things. I don't know.
A
Let me just run through some short questions that are still submitted, and then we can get back to more live questions; I also have a bit more time afterwards, if any of you want to hang around longer. "Does just getting more Google reviews, or responding to them, increase your SEO ranking in any way?"
A
Not as far as I know. It might be that it affects something in Google My Business, in the local listings and the Maps listings, but at least from an SEO point of view, we don't look at the number of Google reviews that you have, or the back-and-forth that you have with regards to Google reviews. "How do you deal with hreflang when there's not a translation for every page?" I think we touched on this.
A
"What would be the best approach to avoid this tracking subdomain being crawled?" So I guess, first of all, if it's a tracking script and it's not necessary for your content, you can block it by robots.txt, and if you block it by robots.txt, we would not crawl that part of your site, so we wouldn't take that into account. That's the practical effect there. My guess is that this wouldn't be negatively affecting your website anyway.
A
I think it's something you could optimize, with robots.txt blocking the tracking subdomain. But if you're not absolutely sure what the tracking subdomain does (maybe there's also an API there that pulls in content, or I don't know), then I wouldn't necessarily just block it by default and say, "oh, I don't want any of this crawled," because it could have other effects as well. But if you're sure that this is just a tracking script, it doesn't affect the rest of your pages, and everything loads normally without it: then go for it.
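A minimal sketch of that robots.txt, served at the tracking subdomain's own root (the subdomain and path are hypothetical), checked here with Python's standard urllib.robotparser:

```python
# Sketch: a robots.txt served at https://tracking.example.com/robots.txt that
# blocks all crawling on that host. robots.txt is per-hostname, so this does
# not affect the main site. Subdomain and paths are hypothetical.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Crawlers honoring robots.txt will not fetch anything on this host:
print(rp.can_fetch("Googlebot", "https://tracking.example.com/pixel.gif"))
```

Because robots.txt is scoped per hostname, the main site's crawling is unaffected; only the tracking subdomain is closed off.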
"The Search Console mobile usability report says 'text too small to read'. What is the minimum font size we should maintain?" I don't know; I looked briefly at our documentation on this.
A
I don't think we have the exact font size documented. However, in the Lighthouse tool, where there's also kind of the mobile-friendliness test that you can do, it looks out for a 12-point font, so that might be something to aim at, or at least try out the Lighthouse tool and see how that works out there.
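As a rough sketch of what the Lighthouse legible-font-sizes check looks at: text rendering below roughly 12px tends to get flagged, and a missing viewport meta tag is a common underlying cause, since it makes mobile browsers render a zoomed-out desktop layout (selectors and sizes here are illustrative, not an official threshold):

```html
<!-- Without a viewport meta tag, mobile browsers render the page in a
     zoomed-out desktop layout, which is what usually makes text tiny. -->
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  /* Keep body text comfortably above 12px; 16px is a common default. */
  body { font-size: 16px; }
  /* Fine print can go smaller, but staying at or above ~12px avoids
     the "text too small to read" flag. (.caption is a hypothetical class.) */
  small, .caption { font-size: 12px; }
</style>
```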
A
So that might be something that throws things off a little bit, in particular if you're seeing this issue being flagged for individual URLs across your website and not for the whole website overall. Then it might just be that, oh, we weren't able to get the CSS file once, and next time we can get the CSS file and it'll be fine again.

I'm running a website using WordPress. I have a question about structured data for non-AMP pages that contain structured data. Is there anything else that I should be setting, besides the Search Central guidelines?
A
If I'm concerned about structured data, is it better for SEO to make my site AMP? So, AMP or not AMP is essentially independent of all structured data questions. For structured data, I would primarily focus on the aspects that you can see in the search results. So that means, if you have content that matches kind of a special format that we would show in the search results.
A
I would try to use the right structured data to be visible like that. For example, if you have recipes on your site, then using the recipe markup makes it possible for us to pick up nutritional information, to show a nice thumbnail image, all of that. If you don't have recipes on your website, then obviously there's no need to add recipe markup to your pages; that wouldn't make much sense.
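The recipe markup mentioned here is typically added as JSON-LD in the page; a minimal sketch (the property names come from the schema.org Recipe type, the values are hypothetical, and real markup would include more fields such as ingredients and instructions):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Simple Pancakes",
  "image": "https://example.com/photos/pancakes.jpg",
  "nutrition": {
    "@type": "NutritionInformation",
    "calories": "270 calories"
  }
}
</script>
```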
A
So my recommendation here would be to see the AMP-or-not-AMP question as something separate, and a lot of times using AMP is a great way to make a site very fast.
A
Cool, okay. I think maybe we'll just go to two more live questions to kind of close things out for the recording here. Let's see, Wushi Kesh, I think? I hope I got your name right.
F
Hi, so I have a website consisting of a million pages. Basically we sell automotive parts, so putting static content on every page is hard for us; we just change the part description and part number on every page. So my question is, first: does Google consider this as duplicate content? And the second thing...
F
So basically we are serving automotive parts pages only, so it's like quote pages only; we take information from the user and then we take a lead on this.
A
Okay, so it's basically a big catalog of parts that you have, and you're creating web pages for that, right? Correct. Okay, yeah, I think that's perfectly fine. It's essentially a database-driven website, which tends to be generated dynamically like that. From our side, we wouldn't necessarily differentiate between you generating the HTML pages from a database or you generating the HTML pages from static files.
A
We don't see that difference, so we essentially just look at the content that you're providing, and if the content is useful, if it matches what people are searching for, then that's perfectly fine. How you get to that content is up to you. Sometimes it makes sense to go through a database; sometimes there are separate text files.
A
That's totally up to you. I think the important part here really is that the content itself has to be useful and compelling. So if, for example, you're a reseller for a catalog of products, like, I don't know, maybe for a big car manufacturer, and you have all of their parts, and you get their database of parts and you just put that database of parts online as well, then essentially you're not providing a lot of unique value by just putting the same database online.
F
Currently we are facing indexing issues with these pages. And we are not taking this data from another site; it's in-house content only, we are just taking the part numbers and all that. So we are facing the indexing issues now; could this be a reason for that?
A
It's hard to say without knowing more about your website, but I think you're probably on the right track. The thing with indexing a large database like that is that our systems are also very dynamic, in that they will re-evaluate over time how much content we need to have indexed from individual websites, and especially with larger websites it's completely normal that sometimes a lot of content is indexed and sometimes less content is indexed. These kinds of fluctuations happen all the time.
A
What I would watch out for here is that, within your website structure, you structure it in a way that the critical pages are recognizable to Google as being critical pages. So that could be things like your category pages, or your subcategory pages, or the blog posts or content that you have written about your products.
A
And as long as those important parts are indexed, then the rest can fluctuate a bit. If you have a million products and, let's say, a thousand category pages, and those thousand category pages are all indexed, then the fluctuations across those million products, like, that's a big number of fluctuations, but it doesn't matter so much for your overall business, because people can still find the content that they're looking for.
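One common way to make those critical pages easy to discover is to list them in an XML sitemap; a minimal sketch with hypothetical URLs, assuming the product pages would be split out into separate sitemap files:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Listing the category pages helps crawlers find the URLs you
       consider important; the million product pages can live in
       additional sitemap files referenced from a sitemap index. -->
  <url><loc>https://example.com/parts/brakes/</loc></url>
  <url><loc>https://example.com/parts/filters/</loc></url>
</urlset>
```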
G
A
Yeah, I think there's a big range of things that you can look at here, so it's really hard to say what the best practices are. I would recommend, as a start, at least going through all of our documentation on JavaScript and SEO, because we have a lot of things documented.
A
We have a lot of videos about this as well, and I'd use the tools that we have to double-check your pages. It might be that your pages are all okay in that regard and you don't have to do anything special; it might be that you recognize some issues that you have to focus on. And the other thing that I would recommend doing here, if you run into specific technical issues with regards to JavaScript and rendering and SEO, is to make sure to join the office hours with Martin.
A
He does them, I think, every other week or so, and he's our expert on everything around JavaScript and SEO. If you come to him with a technical challenge, then he will be super happy to help you try to figure that out. So that's kind of the approach that I would take there.
A
It can certainly be okay. I mean, these are techniques that you can use to improve the speed of your pages, and depending on what actually happens on your pages in that regard, it can have an effect on improving the speed for users without necessarily causing issues for SEO. Deferred loading, for example, is something that essentially takes the JavaScript and processes it a little bit later.
A
If that JavaScript is not critical for your pages, if it doesn't generate any special content, then it doesn't matter for SEO purposes whether it's loaded later or earlier. However, if that JavaScript generates the primary content of your pages and you're deferring it from being loaded until after Google processes your pages, then we wouldn't have that content for indexing.
A
So that's something where you kind of have to make a judgment call on your side and say: this makes sense, or this doesn't make sense, or this improves speed but causes technical SEO issues, or maybe things improve speed and they work well for SEO as well. All of these are things that you can test out.
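That judgment call can be sketched in markup: deferring a script that isn't part of the page content is generally safe, while the script that produces the primary content shouldn't be held back until after rendering (file names here are hypothetical):

```html
<!-- Analytics is not part of the indexable content, so deferring it
     only helps page speed and is harmless for SEO. -->
<script src="/js/analytics.js" defer></script>

<!-- If render-content.js writes the primary content into #main,
     holding it back until, say, user interaction would leave the
     page empty when Googlebot renders it. -->
<script src="/js/render-content.js"></script>
<div id="main"></div>
```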
A
Yeah, yeah. I mean, a lot of sites have tons of JavaScript on them that is never used, so that's something where I suspect that by focusing on the JavaScript that you have on your site and cleaning all of that up, you can save quite a bit of time, or rendering time as well, for users. I think there are multiple approaches that you could take there; you could, for example, go into the JavaScript files and tweak things line by line.
A
Sometimes you can just take a tool and run it over your JavaScript files and it does everything for you; sometimes it's a matter of switching to a more lightweight library, and rather than including everything, just including the parts that you actually need. I don't think there's one answer that would fix it for all websites.
A
Yeah, thanks. Thanks so much. Sure. Okay, let me pause the recording here, and we can go through the rest of the questions that are still lined up. You're welcome to stick around a little bit longer, but just for the sake of having a reasonable length for the recording, I always like to pause things. Thank you all for watching, if you've been watching this on YouTube, and feel free to drop in to one of the next office hours as well.
A
If there's something on your mind that you'd like to talk about. All right, so let's pause here.