From YouTube: English Google SEO office-hours from January 8, 2021
Description
This is a recording of the Google SEO office-hours hangout from January 8, 2021. These sessions are open to anything webmaster related like crawling, indexing, mobile sites, internationalization, duplicate content, Sitemaps, Search Console, pagination, multi-lingual/multi-regional sites, etc.
Find out more at https://developers.google.com/search/events/join-office-hours
Feel free to join us - we welcome webmasters of all levels!
A: Hopefully, this year will be a little bit easier, but it's starting off with a bang, so we'll see. Hopefully, at least with these office-hours hangouts, we have something kind of regular that's not too, too crazy anyway. So we can jump through some of the questions that were submitted. But maybe, first of all, if any of you want to ask a question or have something on your mind, feel free to jump on in.
B: Hi, John, if you can hear me okay? Perfect. So I'm trying to index a site that has about a million products, and I'm trying to index it slowly, by submitting the products first and then the category pages, rather than indexing tags and everything at once. This has been going on for the last few weeks, and I've managed to index about three lakh (300,000) products.
A: I assume this is a new website?

B: Yeah, it was actually published three, four months ago.
A: Yeah, okay. So usually what happens on our side is: we try to index a bunch of content, especially from new websites, but we try to be almost on the safe side when it comes to a ton of content. In particular, it takes a lot of storage, and it takes a lot of time, to index really large websites, so we need to make sure that it's really worthwhile.
A: So one of the things that we do, in the beginning with a new website, is we might be a little bit more conservative and hold back a little bit on the indexing speed and the crawling speed, so that we don't cause problems on the server. And over time, as we see that this is a really good website that is embedded well within the rest of the web, we'll pick up again and crawl a little bit more.
A: So what you're seeing there is essentially normal and expected, and it's not that there's a simple way to just say: okay, Google, index a million pages. Rather, you have to show Google that the millions of pages you have on your site are worthwhile and important to the index, because they contain something that people are looking for, and people are actively going to your site and recommending it to other people.
B: So that was really just it, actually. You know, the quality, and, you know, from your guidelines, etc. So this is actually a parts website, you know, mechanical parts. It has a specific niche or industry that is searching for those parts, so I believe it might take much longer than for the usual website.
A: I don't know, it depends; it's possible. So one of the strategies that some sites use, especially in the beginning as they're building things up, is to concentrate on fewer pages that are really important for you, build up on those pages, and expand from there.
A: So with a parts website, one of the things you could do is focus, for example, on the categories of parts that you have, instead of focusing on all of the millions of individual parts. People will still be able to find your categories if they search specifically for the part number or the name of the part, but you'll have a lot fewer pages that you want to have indexed in the beginning, and we can focus on that first, and then expand over time.

It's always a bit challenging with a new website if you have a lot of content, especially if it's something where you'd say: well, it's normal content, it's not that you're automatically generating millions of pages. But it takes time, and kind of learning to focus first and then expand from there, I think, really helps.
C: Can I ask the next one?

A: Sure, go for it.

C: Okay, hello, John. So my question is regarding the links report in Search Console. As of now, we cannot see all of the links, only some links which are visible on the report. So is Google filtering based on some reference tag? I'm sorry, I forgot the actual name: the nofollow, sponsored, or UGC ones and so on. So does Google show all the links, all types of links, or does Google filter some types of links or attribution?
A: We don't filter by the type of links, which is sometimes a bit confusing. What we do there is show a sample of the total links. It's not that we would show all of them there, and it's not that we would filter by nofollow or by disavowed links or things like that. It's really just a sample, and we try to make it a representative sample, but sometimes you have a little bit more and sometimes you have a little bit less.
D: Thanks, John, I have a follow-up on this.

A: Sure.

D: So, John, in case there is a site migration, can we expect the links to move immediately from one property to the next property, or will it take time for Search Console to start showing links on the new property? Because we are noticing that on some websites the links report is not fetching for the new property.
A: I think it would be normal that it takes a bit of time with a site migration to update all of the data in Search Console. I don't know if there's anything different with regards to the links reports there, but it's pretty normal when you do a site migration that the data takes a bit of time.
A: Specifically with regards to links, what happens there is we track the link based on the canonical URLs on both ends. So, on the one hand, the page that it's linking from: we take the canonical URL from there, and then the canonical URL of the page that it's linking to. If you do a site migration, then, individually, the pages on your website will ideally shift the canonical to your new site, and when that canonical shifts, we count that link as being from the original source to the new canonical URL.
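To illustrate that canonical shift, here is a minimal sketch with hypothetical hosts: during a migration, the old URL starts redirecting while the new page declares itself canonical.

    <!-- The old URL https://old.example.com/widget now returns:
           HTTP/1.1 301 Moved Permanently
           Location: https://new.example.com/widget
         and the new page at https://new.example.com/widget carries a
         self-referencing canonical: -->
    <link rel="canonical" href="https://new.example.com/widget" />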
A: But I would expect that this takes maybe several months to settle down to the new domain when you're doing a migration like that. It's probably similar for the other reports, but especially with the links report, it's really something where we have to focus on the canonicals on both ends, and that especially takes a little bit longer.
A: It's based on the destination, so it's, I don't know, a little bit complicated, in the sense that for the links we have both ends, the page it's linking from and the page that it's linking to, and if either of those changes the canonical, then we will update that in our internal links. But especially with the links report, there are so many steps that lead to the data being shown there that I could imagine it just takes a little bit longer.
C: One other question I just wanted to ask. Okay, so suppose a website receives a manual action regarding unnatural links. How will a webmaster or a website property owner analyze those links? Because of those links I'm receiving the manual action, but in my manual action there are no example links given.
A: So I think the general theme is kind of like: well, if you don't show all of the links, how can I fix all of the links? In general, when it comes to manual actions, it's not something that is based on individual links. It's really based on a broader pattern, and that's essentially the broader pattern that you should look for and try to resolve, not those individual five links there and five links here.

So, for manual actions, in pretty much all cases, it would be the situation that, in the links report, you see enough information to recognize that broader trend of a problem, and based on that, you can fix that broader problem. The manual actions team is, for the most part, not going to be picky in the sense of: oh, you fixed 700 links, but you didn't fix these two links. Rather, they want to see that you understood what the problem was and that you were able to resolve it.
A: So that's kind of the general idea there, and for that you don't need to have all of the links. You need to know what the problem is, and most of the time you will know what you, or your site, did to get that manual action, and then fixing that is something that needs to be done.
A: Sometimes they just take a little bit longer to be reviewed, which means, from a practical point of view, it makes sense to really go through and try to clean up as much as possible with regards to those links. So, if you do have a manual action based on links, then finding the broader pattern is really important, but also making it really clear that you recognized it and cleaned it up is really important.
C: Again, as you said, the broader pattern. So, recently, in the last couple of months, we saw links coming from Google websites themselves, for example hotel websites, Google Hotels. From the Google Hotels website we are getting around hundreds of thousands of links to a specific single page, without nofollow, and the anchor text remains the same.
A: No, no, that should not be the case. If it's a nofollow link, definitely not. If it's something which is a natural link that is there, then also definitely not. It's really more a matter of: maybe you reached out to a thousand bloggers and tried to get a link from their sites, or maybe there's some other pattern involved, where you're paying people for links or you're doing some kind of a link exchange scheme.
A
Then
that's
that's
the
kind
of
thing
where
the
the
website
team
would
would
step
in
and
say:
okay,
we
we
need
to
take
manual
action
here
for
nofollow
links.
We
we
can
ignore
those
automatically.
That's
perfectly
fine,
just
because
it's
a
large
number
of
links
that
doesn't
matter
if
it's
a
similar
anchor
text
that
doesn't
matter,
it's
really
the
the
thought
behind
the
links.
That's
that's
important
for
us!
D: The referrer is probably, I think, a Google Chrome attribute, which is ignored by the Google crawler.
A: Yeah, okay. I mean, the one case where I might do that outside of a manual penalty is if you really know that someone working on your website did something weird in that direction, and you just want to make sure that it doesn't end up in a manual action. But for normal websites, on a day-to-day basis, you don't need to do anything with the disavow tool.
D: John, John, regarding this, I had one idea. So Google already discounts those links when a manual action happens, and we have seen a lot of times that webmasters disavow other types of links which are not worth being disavowed, and you also suggested that the webspam team doesn't expect things to be disavowed a hundred percent.
D: Then why does Google want webmasters to disavow those links, or do that exercise, when Google is already just discounting those links, and the chances are that sometimes webmasters do more damage to the website?
A: Yeah, I think at some point we'll get into the position where we can really completely handle this. But, on the one hand, it's a way for people to clean things up if they are aware of issues. The other part that I think always comes up a little bit, when it comes to links, is the general fear that a competitor might be harming your website, and that's something where I think we do a really good job of recognizing those situations and ignoring them.
A: But I know it's easy to lose sleep over something like that, because it feels like something that you have no control over. By using the disavow links tool, if you do recognize something where a competitor is doing something really weird with regards to links to your website, then you can go in there and say: okay, I just want to be 100% sure that this is not something that Google will ever consider as something that I'm doing.
A
That
kind
of
that
peace
of
mind
aspect,
I
think,
is
also
quite
quite
useful
there.
But
I
I
think
it
would
be
nice
at
some
point
to
be
able
to
move
away
from
just
like
focusing
on
links
so
much,
but
probably
not
not
zoo.
G: You know, attacks on other sites, but they're few and far between, the genuine ones. Or, I should be more specific, the genuine ones that work. I mean, it goes on all the time; our site's got a ton of junk links coming into it, but we didn't build them, so we just ignore them. But genuinely malicious, effective negative SEO attacks, I mean, you just can't find them.
A
I
have
five
brands
that
each
cover
different
non-overlapping
territories,
but
all
brands
offer
the
same
service.
I
have
a
website
for
each
brand.
The
content
for
each
website
is
the
same
other
than
that
the
territory
that's
covered.
What's
the
best
approach
for
seo
for
website
content
so
that
one
brand
doesn't
penalize
the
other
brand
with
duplicate
content.
A
So
it
sounds
like
the
content
is
in
the
same
language,
which
is
is
something
that
I
I
think
would
make
it
very
different,
because
if
the
content
were
just
translated
into
different
languages,
then
it
would
be
unique
content.
We
wouldn't
have
to
worry
about
duplicate
content,
so
that's
that's
one.
One
part
that
sometimes
comes
up,
but
it
sounds
like
maybe
in
this
case
everything
is
in
english
and
for
different
countries
that
that
speak,
english,
for
example.
A: One thing you can do here is use the hreflang attribute to tell us about these different versions and say: this is the English version for the UK, this is the English version for the US, that kind of thing. That helps us. What happens there with hreflang is that we will index these different versions in most cases, and when a user searches, we'll try to show the version that is most suited to their location.
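For illustration, a minimal hreflang sketch, assuming hypothetical brand domains; each regional page would carry the full set of annotations, including one pointing at itself:

    <link rel="alternate" hreflang="en-gb" href="https://uk.brand.example/service/" />
    <link rel="alternate" hreflang="en-us" href="https://us.brand.example/service/" />
    <link rel="alternate" hreflang="x-default" href="https://www.brand.example/service/" />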
A
So
that's
that's
one
approach
that
you
could
do
there.
The
other
approach
is
like
you
mentioned,
making
sure
that
the
content
is
slightly
different.
A
third
approach
might
also
be
to
say:
well
if
this
content
is
really
the
same
everywhere,
then
maybe
it
makes
sense
to
concentrate
it
into
one
side.
A: Usually I recommend folks try to reduce the amount of content and focus on fewer versions rather than more versions, just because it makes that single version, if you have a single version, much stronger. We can understand that all of the value of your site is concentrated there, and when someone searches for something general, we can say: well, this is a really strong and good website for this topic.
A: We can show this one. Sometimes that's not feasible when you're targeting different countries or different territories, sometimes for policy reasons or, I don't know, some other reasons. So, if you need to have different versions, then you have those options of making different content or using hreflang attributes.
A: The other thing maybe worth mentioning here is that there's no penalty for duplicate content. It's not that we will say your website is lower quality, or spammy, or problematic, if you have the same content across different websites for different audiences. What happens on a practical level is: if it's exactly the same page, then we might just index one version of that page, and we'll try to concentrate the value there, because we think it's exactly the same thing, so we put it all into one page.
A: What can also happen, if a part of the page is the same, is that we'll index the individual versions, and if someone is searching for something that is within this chunk of text that is shared across these pages, we'll just pick one of those pages to show in the search results and filter out the other ones. Usually we'll try to pick the one that best matches what the user is looking for, but sometimes that's kind of tricky, if we don't have enough information about what exactly the user is looking for.

So those are kind of the different approaches there, and I think there are pros and cons to all of those approaches. So it's not that there's one solution that will always work if you have five brands and you want to keep them kind of separate; different approaches might make sense in your particular case.
A: So I think we talked about this slightly in the beginning as well, with regards to the site that had millions of pages and is just getting started. In particular, when it comes to indexing, we don't guarantee that we will index all pages of a website, and especially for larger websites it's really normal that we don't index everything. It can be the case that maybe we just index one tenth of a website, because it's a really large website and we don't really know if it's worthwhile to index the rest.
A: Sometimes we index ninety percent and you kind of struggle with those remaining ten percent. But that balance, between trying to index as much as possible and trying to focus on the content that we think is actually going to be shown in search, is something we always have to struggle with.
A: Usually it's a little bit easier to focus on fewer pages and just make those stronger than to significantly improve the quality of a whole website overall, especially when you're talking about a really large e-commerce website. But sometimes it's worthwhile to take that time to invest and try to significantly step things up.

"Share some brief information on SSR and CSR and how they impact indexing and ranking." So SSR is server-side rendering and CSR is client-side rendering. Usually, those are the aspects of how you're using JavaScript on a website with regards to showing the content on your website. The general approach there is: if you have a JavaScript-based website and you don't do anything special, then that would be called client-side rendering, because your browser has to process the JavaScript and then render the pages.
A: Whereas if you do something special on your website with that JavaScript-based website, so that the server actually processes the JavaScript and then sends the HTML code to the browser, then that would be called server-side rendering. There are pros and cons to both of those approaches. Sometimes there's a speed aspect involved there as well, in that server-side rendering takes a bit of time, but it makes things a little bit faster for users, so you kind of have to balance things out there.
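To make the distinction concrete, here is a hedged sketch of what the initial HTML response might look like in each setup (the file name and content are hypothetical):

    <!-- Client-side rendering: the HTML arrives empty, and the browser
         (or Googlebot's renderer) must run app.js to produce the content. -->
    <div id="app"></div>
    <script src="/app.js"></script>

    <!-- Server-side rendering: the server already ran the JavaScript,
         so the content is present in the HTML itself. -->
    <div id="app"><h1>Blue Widget</h1><p>In stock, ships today.</p></div>
    <script src="/app.js"></script>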
A: I would recommend checking in with Martin. Martin also does office hours for JavaScript SEO in particular, and you could ask him questions more specific to what exactly you're looking for, because this is a really broad topic, and I don't think there's this one thing where it's like: oh, you should do exactly this and it will work well. Rather, usually you have some existing framework on your website that's based on JavaScript, and you need to figure out what you need to do there.
A
So
I
check
in
with
martin
on
that.
How
do
amp
pages
help
in
ranking
also
how?
How
does
it
make
sense
when
we
have
a
pwa
client-side
rendered
light
quick
loading
pages,
and
we
also
make
amp
for
the
same
pages.
A: So, AMP pages do not affect ranking; that's kind of the first thing, first off. It's not that you need to use AMP if you want to rank well in Search. I believe there are still some features that rely on the AMP framework with regards to how we show the content; there's also Web Stories, which is based on AMP.
A: So if you want to use anything around that, then maybe it makes sense to use AMP. But if you're just worried about speed, AMP is a great way to make really fast websites, but it's not the only way to make really fast websites.
A: So that's something where, if you do have a really fast website already, then maybe that's already fine. So, yeah, I don't know. I don't want to talk you out of using AMP, because I think it's fantastic to have a framework that makes it easy to make really fast websites. But at the same time, if you have a really fast website already, you don't need to turn everything around and change your framework and change all of that, just so that you can use AMP.

"Regarding the upcoming mobile-first indexing update: if content is similar and crawlable for Googlebot from mobile and desktop, but the mobile version is not responsive, then how can this update affect those websites? I see several blogs claim that responsive web design is the first step for this, but I don't see any information from the Webmaster Central blog for this claim." Yeah.
A: So I'm not 100% sure what exactly you're asking here. But if you have a separate mobile version and a separate desktop version, and you have those versions linked properly so that we can understand the connection between the two, then that's essentially what we need, so that we can understand that those versions are there.
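For reference, the usual linking for a separate-URL setup, sketched with hypothetical hosts: the desktop page points to its mobile alternate, and the mobile page points back with a canonical.

    <!-- On the desktop page https://www.example.com/page : -->
    <link rel="alternate" media="only screen and (max-width: 640px)"
          href="https://m.example.com/page" />
    <!-- On the mobile page https://m.example.com/page : -->
    <link rel="canonical" href="https://www.example.com/page" />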
A: When it comes to mobile-first indexing, we will only index the content from the mobile version, though. So that's something where, like you said there, if the content is similar and crawlable for mobile and desktop, then that's what we would focus on, and we would use the desktop version as a way of understanding: oh, there's a desktop URL for this page as well, and if someone on desktop is searching, we can show them the desktop URL. But otherwise, we'll focus indexing on the mobile version, and it's not the case that the mobile version has to be a responsive design. If you have separate mobile URLs, that's perfectly fine. I don't think having separate mobile URLs is the optimal approach, but if that's the way your website is set up, then, as long as the mobile URL has all of the content, that's fine.
F: May I ask a question?

A: Sure.

F: So my question is about targeting different regions in a country. One of my clients provides the same service, but for each state they have different branding, and they have separate websites for each brand. Now they want to use the same content on each website. Do you think this will be a problem as a duplicate content issue, and if we have this issue, then how can we rectify this problem?
A
I
I
think
we
just
went
through
that
question
briefly
right
before
you
joined,
so
I
I'd
check
out
the
recording
afterwards
and
see
if
that
answers
your
question,
because
that's
is
pretty
much
exactly
the
same
same
question
there,
like
the
the
same
content
for
for
different
brands,
kind
of
thing.
D
A
D
A
A: Okay, so the viewport meta tag, I think, is specific more to the responsive design setup. So that's something where, if you have a separate mobile URL and a desktop URL, that's perfectly fine for indexing. We focus on the content of the page; we don't check whether it's mobile-friendly. We essentially just use the mobile version for indexing. Mobile-friendliness is something that we use for the page experience ranking factor, which is coming in May, so it makes sense to have a mobile-friendly page.
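For context, this is the viewport meta tag in question; a responsive page typically declares it in its head:

    <meta name="viewport" content="width=device-width, initial-scale=1" />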
A: Usually we try to understand when it makes sense to show a date on a page, and we'll try to show that in the snippet of the search results itself. If there's a specific date that you want to have shown, then you can use, I think, the date fields in the Article structured data type, where you can tell us which date we should use there. So, if you do have a date, you can tell us that there.
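A minimal sketch of that markup, with hypothetical values, using the schema.org Article type's date properties:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "Example article headline",
      "datePublished": "2021-01-08",
      "dateModified": "2021-01-18"
    }
    </script>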
A: If you don't want a date to be shown, then it's not possible to suppress that date snippet from being shown, but what you could do, of course, is make sure that there's no date shown on your page. If you don't have any dates in the HTML page, then we don't really have much to pick up on to show there. So, especially when you're talking about things like a home page or a contact page, for the most part you probably don't have dates on there anyway.
A: I would love to have examples of that: in particular, a query where you're seeing that happen, and the URLs from your site where that's happening. That's something that we can take up with the team here that works on showing dates and recognizing dates on a page, or we can say: well, maybe we're accidentally recognizing a phone number as a date, and we need to be able to fix that.

"After the disavow tool is used, does the domain carry any mark that may hold it back?"
A: No, no. The disavow tool is purely a technical thing; essentially, it tells our systems to ignore these links. It's not an admission of guilt, or any kind of bad thing, to use the disavow tool. It's not the case that we would say: well, they're using the disavow tool, therefore they must be buying links. It's really just a way to say: well, I don't want these links to be taken into account. And sometimes that's for things that you've done, or someone working on your website has done, in the past.
A: Sometimes that's for things that you just don't want Google to take into account, for whatever reason, and both of those are good situations, right? You recognize there's a problem, and this is a tool that you can use to resolve it; that's not a bad thing. So it's not the case that there's any kind of a red mark, or any kind of flag, that's passed on just because a website has used the disavow tool.
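For anyone unfamiliar with the tool: a disavow file is a plain-text list uploaded through Search Console. A minimal sketch, with hypothetical hosts:

    # Ignore one specific linking URL:
    https://spammy-directory.example/listing?id=123
    # Ignore every link from an entire domain:
    domain:link-scheme.example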
A: It's like: I'm ranking below a website, and they're doing one thing badly, so why doesn't Google notice that and just remove that website completely from Search, because they're keyword stuffing, or they've bought links, or they've done some other crazy things? I don't think that this is, for the most part, useful to focus on as a site owner, because essentially you're focusing on someone else's website and trying to find problems in that other website, rather than taking the time that you have and focusing on your own website instead.
A: Overall, a website will do some things really well, some things kind of okay, and maybe some things really badly, and because of that, it's not the case that we would say: well, we'll just remove all of the websites that aren't perfect, because I think the search results would be pretty empty then. Instead, our algorithms are built in a way to try to find the overall view of that website. That could be simplified into: well, we take the average of how good the website is.
A
So
that's
kind
of
a
good
thing
for
your
situation,
but
obviously,
if
your
competitor
is
doing
something
wrong
or
just
ignoring
that
bad
part,
then
that
can
be
a
bit
frustrating.
So,
instead
of
focusing
on
the
things
that
your
competitor
is
doing
wrong,
try
to
find
ways
that
you
can
improve
your
website.
A
That
kind
of
are
more,
I
I'd
say,
sustainable
for
the
long
run
for
your
website
in
particular,
and
that
could
be
to
say
well
the
website.
The
competitor
is
doing
these
things
badly,
and
maybe
google
is
ignoring
those,
but
I
can
do
those
things
really
well,
so
that
when
google
re-evaluates
my
website
over
time,
it'll
see
well.
Actually
I
have
a
lot
more
things
that
are
lined
up
and
done.
Well,
so
that's
that's.
Definitely
one
approach.
A: So that's something where it's very easy, as an SEO who focuses on tools and numbers, to say: well, technically, my website is better. But practically, maybe that other website is just a lot better, in that it provides a lot more value to users and works really well for those users. So they might be doing things like...

Specifically for the core updates, and for the Panda update at the time, I think there were lists of questions you can ask yourself, or you can ask users, about your website. That's the kind of thing that I would take and do a user study with: maybe find 10 or 20 users of your website and go through those questions in an objective way, to really get input on where you can improve your website, not purely on a technical level, but also from a user level as well.
A: Let me see, maybe we can run through some of the other questions here. The large website with millions of products: I think we talked about that briefly. The most important ranking factors:
A: I don't think we list the most important ranking factors, so I'll have to disappoint you there.

"If I submit a blog post today and add more content 10 days from now, do I need to submit an updated sitemap for the blog post to get updated, or will it just get recrawled?" So the sitemap file definitely helps us understand when there's new and updated content.
A: If you're using a setup where you don't have kind of an automatic sitemap file, then you can use the request-indexing feature in the URL Inspection tool and let us know that way. From a practical point of view, if you're doing anything more than a website with, let's say, 10 pages, I would strongly recommend using a sitemap file and automating this, rather than doing it manually.
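A minimal sketch of such a sitemap entry, with a hypothetical URL; updating lastmod whenever the post changes is what signals that a recrawl is worthwhile:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/blog/my-post</loc>
        <lastmod>2021-01-18</lastmod>
      </url>
    </urlset>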
A: "Should a 404 page contain a self-canonical? Is it okay if I added one and then removed it?" We don't look at any content on a 404 page. So if a page is returning a 404 status code, then essentially we say: okay, that's fine. That's enough for us.
A: Cool, okay, I think we ran through those questions. What else is on your mind? What else can I help with?
H: John, John, actually that question about a date popping up reminded me of something. A friend of mine asked me a question about her site, and I thought it was a pretty good question, because it has wide appeal for others. It's about different types of content on one site.
H: So she has a website that has resources for women who are interested in stand-up comedy: where you can take classes, how to write a routine, that kind of stuff. It's very, very evergreen. But she also has a blog that's written by established comedians, tied to news events. So she asked me: the newsy articles written by these well-known comedians, or experts if you will, should they have bylines and timestamps, while the evergreen content does not?
A
It's
it's
sometimes
I
imagine
a
bit
tricky
if
you
mix
those
two
on
the
same
site
and
if
it's
hard
for
us
to
recognize,
which
part
is
the
news,
your
content,
which
part
is
the
evergreen
content,
so
something
like
separating
that
out
in
a
separate
url
structure,
I
think,
would
make
sense.
A
But
in
general,
if
you
have
something
that
is
timely,
then
I
would
certainly
go
for
making
sure
that
you
have
that
byline
there,
that
you
have
the
date
on
the
page
that
you
have
the
date
and
the
structured
data
as
well.
Just
so
that
we
can
clearly
understand
this
is
something
new
that
just
came
out.
Maybe
we
can
show
it
and
discover.
Maybe
we
can
show
it
otherwise
in
the
search
results.
E: Hi, hi. I have a question on the X-Robots-Tag. So, for a private folder we can't really use the page-level noindex meta tag, so I just went through the X-Robots-Tag. Is that a permanent solution for preventing Google from indexing, or is it something similar to robots.txt blocking? We need to block the page entirely, permanently, from Google or any other search engines. So can we use the X-Robots-Tag?

A: Sure, yeah.
E: So, the server-level configuration, or the header.php: which method would you prefer for the entire-site block?

A: Totally up to you.
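As a sketch of the server-level option, assuming an Apache server with mod_headers enabled and a hypothetical /private/ folder, the header can be set in the server configuration like this:

    <LocationMatch "^/private/">
        Header set X-Robots-Tag "noindex"
    </LocationMatch>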
B: Okay, another question, actually. There's a website, you know, which was put to noindex through a robots meta tag and was, you know, de-indexed. Ten days ago, when we realized, we removed that noindex tag, and we've just been waiting for about 10 to 12 days now to get, you know, crawled by Google.
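For reference, this is the page-level noindex robots meta tag being described; once it is removed, the page can be indexed again the next time it is recrawled:

    <meta name="robots" content="noindex" />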
A: Yeah, and how long ago did you remove the noindex?

B: It was about the 26th.
A: So even if the website was on noindex for a long time, we should try to recrawl those pages every couple of days. So it's like end of December, and now, that feels like, I don't know, maybe two weeks, almost. That seems like a time frame where we should have recrawled at least some of those pages and shown them again.
A: So that's something where I would say, if we're not indexing anything after two weeks, that feels like maybe there might be something else that is wrong. Maybe something like a removal request is still pending somewhere within the website; you can check that in Search Console. Maybe there is a manual action that's in place; you can also check that in Search Console.
A: So that's kind of one thing there, but that's really, I would say, the case if we don't index anything at all from the website after those two weeks. If we index things like the home page, but not all of the detail pages, then I think that's normal after two weeks; I think that's something that tends to take a little bit longer, especially if the whole website was on noindex for a longer time.
I: Hey again, hi. Oh, happy new year, by the way, John and everyone. So I've got one around Search Console. I've noticed in some cases that certain queries that are very low in average position, you know, 60, 70, 80, have a very, very high number of impressions. I know that, for example, with other products like Google Analytics, Google tries to filter out bots and scrapers and things like that. I wonder if, for Search Console, these high numbers of impressions at such a low position, where users wouldn't usually get to, like page seven of the search results, are, you know, either just scrapers or bots or anything like that, or might be a bug or something? How reliable are these impressions, basically?
A
We
do
filtering
for
kind
of
scrapers
and
things
like
that
on
several
levels,
but
it's
also
something
that
it's.
It
can
happen
that
it
shows
up
in
search
console.
A
So
that's
not
completely
excluded
that
we
would
never
show
that
there,
but
for
the
most
part,
I
think
we
most
of
these
things
are
filtered
before
they
reach
search
consoles.
So
my
my
guess
is,
if
you're
seeing
something
like
that,
which
is
obviously
kind
of
a
pattern
of
kind
of
scrapers
and
bots
and
the
the
usual
stuff,
then
that
might
be
kind
of
a
side
effect
of
that
that
we're
just
not
filtering
in
search
console
but
yeah.
I: The weird thing is that, so, I'm comparing two queries. For one of them, the average position is like 10, 11, 12, something like that; for the other one, the average position is 60 or 70. Both seem to have about the same search volume, give or take, and they both have, like, you know, 10-15,000 impressions. Why might the one on page two or page one have that, but the one on page seven gets the same number of impressions? That has to be something out there.
A: I don't know; it's really hard to say. Sometimes the ones I've seen that have gotten through in Search Console are things that are more periodic, in that if you look at the individual query, then you see: oh, it's like every Sunday they come, and it's like thousands of impressions.
I: Right. The only problem with that is that, if you're trying to check out the click-through rate, for example, to try to figure out: oh look, these are some queries that have a lot of impressions and a very low click-through rate, let's try to optimize for those, or something like that, then the data is not actually there; it's not actual users that are creating those impressions.
A: Yeah, I think that's always a struggle. But, I mean, on the one hand, I'm not really a fan of people scraping Search anyway, but these bots and scrapers are around all the time anyway. So, regardless of how you're tracking those metrics, you always have this level of noise that's involved there, which makes it a little bit tricky to understand: are these actual users, or are these bots acting like users? Right, okay, cool. Okay, let me pause the recording here.
A: It's been great having you all here, and it's been good doing these hangouts again, kind of something a little bit regular and normal along the way. I'll be around a little bit longer as well, so if any of you want to stick around after the recording stops, you're welcome to stick around.

Otherwise, I set up the next English office hours for next Friday in the evening, our time here, so maybe a little bit better for the US time zones, so you don't have to get up in the middle of the night. Michael, always good to have you here, though. And I think a German one is lined up as well for next week.
A
So
if
you
have
anything
that
we
still
need
to
cover
feel
free
to
jump
into
one
of
those
office
hours
or,
of
course,
drop
by
the
search
central
help
forums
where,
where
experts
like
mihai,
hang
out
and
can
help
answer
your
questions
as
well,
all
right
thanks
everyone
and
let
me
pause
the
recording
now.