From YouTube: English Google SEO office-hours from June 18, 2021
Description
This is a recording of the Google SEO office-hours hangout from June 18, 2021. These sessions are open to anything search- and website-owner-related, like crawling, indexing, mobile sites, internationalization, duplicate content, Sitemaps, Search Console, pagination, multi-lingual/multi-regional sites, etc.
Find out more at https://goo.gle/seo-oh-en
Feel free to join us - we welcome folks of all levels!
A: All right, welcome everyone to today's Google SEO Search Central office-hours hangout. My name is John Mueller. I'm a Search Advocate on the Search Relations team here at Google in Switzerland, and part of what we do are these office-hours hangouts, where people can join in and ask their questions around Search. Looks like we have a bunch of people here today, so that's pretty cool.
A bunch of questions were submitted on YouTube already, so we can take a look at some of those, but, as always, maybe we can go through a handful of live questions first to get us started. A bunch of people are already raising hands, so that was super, super quick, everyone. So good. Let's see.
B: However, since we added the review snippet, we have seen a decline in terms of click-through rate and rank. Could the addition of the review snippet have been a factor? Moreover, we added the rating number and rating value present on a third-party site manually into the website. Could this manual addition have been an issue? We have not received any warning from Google till now, but the search performance is decreasing every day. So what could be the issue, John?
A: So that wouldn't be related to a drop in rankings, but if you're adding structured data, it should apply to the primary element of the page, and it shouldn't be something that you just copy and paste across the whole site. So that's kind of the one thing there. So, if you're using the same structured data on all of the pages of your site, that would be incorrect.
The other thing, specifically with regards to reviews: they should be based on reviews that are actually published on your site. They shouldn't be kind of like hand-picked reviews from third-party sites, so that would essentially be against our guidelines, as far as I understand it. So both of those things are not really in line with our guidelines, and it's something where, if someone were to manually review that, they might flag it and apply a manual action, but it wouldn't affect your site's ranking.
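The point about page-specific review markup can be sketched in code. The snippet below is illustrative only (the product name and scores are invented, and this is not an official Google example): it derives schema.org AggregateRating JSON-LD from the reviews actually published on that one page, so the markup necessarily differs from page to page instead of being copied site-wide.

```python
import json

def aggregate_rating_jsonld(product_name, ratings):
    """Build page-specific AggregateRating JSON-LD from this page's own reviews.

    `ratings` is a list of numeric scores collected on the page itself,
    not values copied over from a third-party site.
    """
    if not ratings:
        # No reviews published on this page: emit no review markup at all.
        return None
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "Product",
        "name": product_name,
        "aggregateRating": {
            "@type": "AggregateRating",
            "ratingValue": round(sum(ratings) / len(ratings), 1),
            "reviewCount": len(ratings),
        },
    })

# The markup differs per page because it is derived from that page's reviews.
print(aggregate_rating_jsonld("Example Widget", [5, 4, 4]))
```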
C: When products run out of stock, we come to this point: we have temporary 404 pages and permanent 404 pages. For the permanent 404 pages, we know that we can use 410 codes, so we tell Google to, yeah, just remove them from the search results, but for the temporary ones we are not sure what the best approach is, and I would like to get your recommendation on that.
A: I think it doesn't matter. There are different strategies for dealing with temporary products that are out of stock, and essentially all of those can work for search. So you can just mark it as out of stock. You can use a 404. You can put a noindex on the page. It's essentially up to you, the way that you want to handle it, and some sites handle it in different ways, and that's perfectly fine. You can also say: if I don't know whether it's permanent or not, I'll just mark it as out of stock and keep it as a 200 page, and after a month, when I realize I can't get that product anymore, maybe I'll make it a 404.
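The options listed above can be summarized as a small decision table. This is one illustrative policy, not a requirement; as noted, all of these can work for search, and the state names here are made up for the sketch:

```python
def product_status(state):
    """Map a product's availability to one way of handling it in HTTP terms.

    Mirrors the options discussed above; this is one illustrative policy
    among several that all work for search.
    """
    if state == "in_stock":
        return 200, "index"       # normal, indexable page
    if state == "out_of_stock":
        return 200, "index"       # keep the page, mark it out of stock on-page
    if state == "gone_temporarily":
        return 404, None          # a temporary 404 is fine; crawlers retry later
    if state == "gone_permanently":
        return 410, None          # 410 signals permanent removal
    raise ValueError(f"unknown state: {state}")
```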
C: So the last question I have is: it's not going to somehow affect or have a bad impact on our rankings or indexing, right?
A: No, it shouldn't. I mean, especially if you're talking about individual products that come in and out of inventory, those are the kinds of fluctuations that always happen on websites. If, say, your whole website is suddenly not available anymore, then that's a bit trickier, but individual products coming and going, that's completely normal.
E: The question is about keyword placement within the page. So basically, as we know, Google is following mobile-first indexing.
So now I have a page where there is a lot of content, say 20,000 words, and my focus keyword is at the bottom of the page. So how will that impact the ranking? Because, as I know, when the real user, not Googlebot, opens the page on a smartphone, that keyword is not in the active viewport.
E
So
unless
he
scroll
down
the
focus.
Keyword
will
not
be
visible
to
him,
because
it
will.
It
will
be
not
in
active
viewport.
So
how
a
google
con
see
that
page
or
like
how
google
crawls
as
a
smartphone,
then
what
is
the
maximum
height
of
the
viewport?
Google
will
fetch
the
that
page
as
a
smartphone.
A: So don't just put that as, like, a one-word mention at the bottom of your article, but rather use it in your titles, use it in your headings, use it in your subheadings, use it in captions of images, all of these things, to make it as clear as possible for users and for Google, when they go to your page, that this page is about this topic.
So that's kind of the direction I would head there. I would not worry about whether Google can get to word number 20,000 or not, because if you're talking about word 20,000 and you're saying this is the most important keyword for my page, then you're already doing things wrong. You really need to make sure that the information that tells us what this page is about is as obvious as possible, so that when users go there, they're like: yes, I made it to the right page.
Thank you, John. Mohammed?
F: Hi John. Actually, a week or a month ago I asked you a question related to copied content, and at that time you suggested the DMCA, because some websites are copying our content consistently. But at that time we were not aware that, for some of the keywords they are ranking for, they are ranking higher than us. So at that time you suggested some factors we have to look into on our website, but I do not remember it properly.
A: I don't know; it's hard to say exactly what we talked about back then, but in general, I think the DMCA process is probably the one that you should be looking into if someone is really copying your content. The DMCA process is a legal process, so I can't give you advice about that; that's kind of one of the tricky aspects there. But that feels like the direction I would head, and with regards to why this can happen:
For example, we often see that with our blog posts, where someone like, I don't know, Barry Schwartz will write about our blog post, maybe quote some of it, and essentially he will rank his content almost better than our content. And from our point of view, it's like they're copying the content, but they're also providing additional insight, and maybe questioning some of the things that we wrote about there.

And maybe there are people commenting on the other side as well, and that provides additional value, where we have to say: well, it makes sense to show this in the search results. Additionally, maybe it even makes sense to show it above the original content. These are the kinds of situations where sometimes a copy makes a lot of sense, and it does make sense to show it separately in the search results, and sometimes even more visibly.
So that's also one of the situations you might be running into, where maybe it's worthwhile to also invest in improving the quality of your website overall. So not just that one article that apparently people like, but also the rest of your website overall.

Okay, thank you. Sure. Praveen?
G: Hi John, how are you? You are doing great? Hi. So this question came to me from a friend of mine. They have this website on a ccTLD domain, which is .in for India. Now they want to make this website a global website.

But, as you know, with a ccTLD we cannot geo-target another country. So the question was: can we, with the help of hreflang link tags, geo-target other countries on a ccTLD domain, or do they need to take a different route, like either go for a .com or, you know, something else?
A: So there's a difference between geo-targeting and kind of this language/regional targeting with hreflang. With geo-targeting, if we can tell that a user is looking for something local, then we will use geo-targeting to try to promote local sites in the search results. With hreflang, we essentially purely focus on the language and the region, and we swap out the URLs, so it doesn't change ranking. So hreflang tries to bring the correct URL into the search results, but it doesn't change ranking.
G: For example, there is a global brand which is using a .com for the US, and ccTLDs for other countries, like .co.uk for the UK and .au for Australia. And, as you know, when Google analyzes a website, it looks at the overall quality of the domain, not just a few pages. So in a setup like this, where a website is using different ccTLDs to geo-target, the .co.uk is a website on its own.
So in that case, for example, the .com is, from a quality perspective, a great website from Google's point of view, but the .co.uk is not; it's below standard. So can that UK website impact the performance of the .com in some way? Because they are interconnected somewhere through geo-targeting, and, you know, Google can figure out that they are both related.
A: Yeah, I mean, usually they would essentially be the same website, just kind of written for the local audience. So, from a quality point of view, I think it would be unexpected if a ccTLD website were significantly different from the generic top-level-domain version, for example. But essentially what happens is: we would rank those pages appropriately for the user and then swap out the URLs. So it's not that we would say this is one giant website across all of these different domains.
But rather we would say: we will rank the pages that we know about from all of these websites, and then, when we have a page that we want to show in the search results, we'll double-check the hreflang annotations and swap out the URL for whatever local one there is. So, for example, if we would never show the UK version in the search results, because it's a really terrible website, and someone in the UK were to search, then probably, from geo-targeting, we would focus on the UK version of the site and it would rank lower, and then, with hreflang, we would swap out the URLs there. So there's kind of this subtle mix across the way that would be handled. But again, it feels like something that's very theoretical, which is unlikely to really happen like that.
G: Yeah, I've seen such scenarios where different teams are working on, you know, different domains. Like, the UK team is working on the UK site and the US team is working on the US website, but with different fundamentals. So yeah, they're asking.
H: Yeah, hi John. Hi. So my question is related to Core Web Vitals. Basically, I'm checking my web pages in different Core Web Vitals tools, like web.dev/measure, PageSpeed Insights, and Lighthouse, and all of them are giving different values for the Web Vitals.
A: Whichever one works best for you. So there's one thing I think is really important: we differentiate between what users actually see and what the testing tools say, and for Search we use what users actually see. We call that the field data, or the real user metrics. The testing tools that are out there use kind of like their own tests, and that is what we would call lab data.
So that's, I think, really critical when it comes to Core Web Vitals: to understand that difference between what the testing tools show and what Google would use. What Google would use is in Search Console, in the page experience section there, and that's something you would also see in the Chrome User Experience Report, which is a separate site as well.

The different testing tools can make different assumptions with regards to what they think users might see, so you would see differences there. My recommendation is to use the real user metrics as a way of recognizing: is there a problem or is there not a problem? And if there is a problem, then to use the different testing tools to try to figure out where this problem might be coming from and what you might need to do to improve it.

So it's more like the data in Search Console is used as a directional sign, saying everything is good or everything is bad, and then you use the different testing tools to figure out what exactly you can do to improve things. Some tools are easier to use; some tools have more detailed information; some tools are faster to run. So it's something where you need to find the tools that work well for you. In general, they all go in the same direction, but you would not expect to see exactly the same number in the different measurements across all of these different tools. Okay? Okay.
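One way to apply this advice is to bucket your field data (for example, from the Chrome User Experience Report) and only reach for the lab tools when a bucket looks bad. The sketch below uses the published LCP thresholds (good at or under 2.5 seconds, poor over 4.0 seconds); treat it as a rough illustration, not part of any Google tool:

```python
def classify_lcp(seconds):
    """Classify a field-data Largest Contentful Paint value using the published
    Core Web Vitals thresholds: good <= 2.5 s, poor > 4.0 s. Lab tools may
    report different raw numbers, but the buckets are what matter directionally.
    """
    if seconds <= 2.5:
        return "good"
    if seconds <= 4.0:
        return "needs improvement"
    return "poor"
```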
A: Round now? Oh my gosh, I'm probably saying your name wrong. Sorry.
J: Okay, yeah. So I had a question about disavow lists. Back in 2014, we disavowed more than 5,000 links because of a manual action, and recently, in March, so three months ago, we submitted around 300 links again in Search Console, and somewhere around that time we started noticing a significant reduction in SEO traffic. It accounted for more than a 30% loss in traffic in 90 days. So do you think our action in March could have caused an impact on our Google rankings? If so, how would you recommend fixing this issue?
A: I think it's theoretically possible that it had an effect like that, but I think that would be rare. In particular, it could happen if you take all of the most important links to your website and submit those in the disavow file; then I could imagine that our systems say: well, there are no really good links for this website anymore.
A: No. The disavow file is purely technical for us. It's purely a sign saying: don't take these links into account, and we reprocess that when we reprocess the external pages. So it's basically just saying: use these links, or don't use these links. It's not a sign saying: I am a spammer, and here is proof that I was a spammer. It's really just purely saying: I don't want you to use these links, for whatever reason, and the reason is totally up to you. It's not something that our systems would judge.
I: Hi John, thank you. Hi. I represent a Ukrainian news agency, and our site has a monthly audience of 11.5 million users. It is licensed as a news agency; Ukrainian top officials give us interviews regularly. The content is 70% exclusive. We have about a 95 percent PageSpeed Insights score. Our media outlet always tries to adhere to recommendations regarding search engine optimization.
Since January 2021, we have fallen in the search results for the keys related to politics issues, from the top one to the third page of search results or even deeper, and our stories on politics and economics have ceased to be included in Google Discover. The rest of the content from our domain is still in the top and gets into Google Discover. We have received no notifications from Google. We have also noticed that, during the same period of time, many low-quality sites were stealing content from us.
A: Now, I don't know offhand. I think, first of all, I don't have much insight into the Google News side of things. So, if you're seeing effects that are kind of tied to the way that Google News processes a website, I don't really have much insight there. That would be something where you'd need to go through the Google News contact in, I believe, the Help Center for Google News publishers.

Otherwise, I'm happy to forward your issue along if I can get more information from you with regards to which site it is and what specifically you're seeing. I think you posted it also in the YouTube comments; my recommendation would be, if you could add a little bit more detail there, then I can pass that on to the team, and they can take a look to see if there's something specific on our side that we need to adjust.

I: Okay, so we can just post it here? Yes, in the comments. Yes, please. Okay, thank you so much, John. Sure.
A: Let me run through some of the submitted questions as well. I realize there are still lots of people with their hands raised, but just to make sure that the submitted things are also taken care of a little bit. I recognize some of the questions here, so there's a little bit of overlap, and I also have a bit more time afterwards if any of you want to stick around and still have questions that we need to get through.

Let's see: how much influence does direct traffic have in terms of Google taking it as a signal and ranking content from the website? We don't use direct traffic. We don't really have a measure for that anyway.
Yes, you can remove the HTTP version in Search Console. I don't think it would change anything, positively or negatively, for the website, and if you have the domain verified, which is kind of my recommended approach, then you can always add the HTTP version back should you ever need to do something specific with it. So, for the most part, it doesn't harm to keep the HTTP version in Search Console, but if you want to make room, then of course feel free to remove it, especially if you don't need it anymore.
So featured snippets are essentially a way that we highlight things in the search results, with kind of a little bit of a bigger snippet, and they're a part of our normal ranking systems. We rank slightly differently across the different countries, we use geo-targeting, we try to figure out which results make sense to show to users, and all of that can lead to slightly different results being shown in the search results.

So that's something where it's not that our systems would need to go in there and say: oh, this website doesn't have good English, therefore we shouldn't show it in an English-speaking country. It's more that: well, we have better results that we would show here in other countries, therefore we'll show those instead.

Can the URL structure be different from the breadcrumbs on a page? For example, if the URL is just website.com/blog-post-name, but the breadcrumbs show Home, Main topic, Subtopic, Blog post name. Would that be an issue?
That's perfectly fine. Breadcrumbs don't have to match the URL structure. In some cases you have a very flat URL structure, with everything in one file name and no subdirectories. In some cases you have a very organized, folder-based URL structure. That's perfectly fine as well; none of these have to map to the breadcrumbs on a page.

I think where some of the confusion comes from is that, if you don't use breadcrumb markup on a page, then our systems will sometimes try to guess breadcrumbs, and they try to guess that based on the URL at some point. And then it can happen that our kind of breadcrumb-guessing systems will say: blog-dash-post-name, that's kind of like a set of breadcrumbs, we should show those. But if you specify breadcrumbs with structured data, then we will use those instead of our guesses.
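The guessing behavior described above is avoided by declaring the trail explicitly. As a rough sketch (the names and URLs are placeholders), schema.org BreadcrumbList JSON-LD can be generated from an ordered list of (name, url) pairs like this:

```python
import json

def breadcrumb_jsonld(trail):
    """Build schema.org BreadcrumbList JSON-LD from ordered (name, url) pairs,
    so the declared trail is used instead of one guessed from the URL path."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": [
            {"@type": "ListItem", "position": i, "name": name, "item": url}
            for i, (name, url) in enumerate(trail, start=1)
        ],
    })

print(breadcrumb_jsonld([
    ("Home", "https://example.com/"),
    ("Main topic", "https://example.com/topic/"),
    ("Blog post name", "https://example.com/blog-post-name"),
]))
```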
Do Google My Business clicks and impressions show up in Search Console as clicks and impressions? Yes, when the Google My Business listing is shown in the normal search results. So, if you do a normal web search and you have a map with your Google My Business listing shown, then we will show that as clicks and impressions in Search Console. However, if you're looking at Google Maps and you see the Google My Business listing, then that's something that would be outside of Search, which we wouldn't show.

Is there an impact to linking twice to the same page from the menu? No. There's a little bit more detail in the question, but, in short, the answer is no: if you link to the exact same page twice in the menu, with different anchor text or something like that, that's perfectly fine. That's not going to cause any problems.
A
I've
also
seen
quotes
from
you
saying
that
google
doesn't
look
at
the
urls
in
in
the
or
the
slashes
in
the
url.
So
it's
just
preference.
Do
shorter
urls
actually
make
an
impact
compared
to
long
urls,
or
is
that
just
an
seo
method?
My guess is this is based on things like correlation studies, where people say: well, when I look at these search results, the top so-and-so results tend to have shorter URLs than the ones way at the bottom. And from that point of view, it's easy to say: well, if you want to rank high, you should have short URLs. But I think that's misleading. It just happens to be that some pages, when they tend to be very, I don't know, concentrated on a website, very important for a specific website, tend to have shorter URLs, because they're closer to the root of the website. And because of that, it looks like shorter URLs lead to higher rankings, but it's really just correlation.
So, from that point of view, shorter URLs are not going to make your website rank better. The one place in our systems where we do use the URL length is when it comes to canonicalization. In particular, when we know that there are multiple URLs on your website that show exactly the same content, then one of the factors that we take into account there is the length of the URL, where we say: well, if it's a shorter URL, then usually that looks nicer for users, so we tend to use that one. Usually that comes into play when you have things like URL parameters attached to a URL, and you tell us this version with all of these parameters is the same as the version without parameters. Then our systems will say: well, the one without parameters looks nicer, we'll just use that one instead. And canonicalization is not something that just uses one factor.
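The URL-length tiebreak described here can be illustrated with a toy function. Real canonicalization combines many signals (redirects, sitemaps, internal linking, and more), so this sketch deliberately isolates only the one factor mentioned, and the example URLs are placeholders:

```python
def pick_canonical(duplicate_urls):
    """Among URLs known to show identical content, prefer the shortest one,
    e.g. the version without query parameters. This models only the
    URL-length signal, not the whole canonicalization decision."""
    return min(duplicate_urls, key=len)

# The parameter-free version wins because it is shortest.
print(pick_canonical([
    "https://example.com/shoes?ref=nav&sort=price",
    "https://example.com/shoes",
]))
```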
The news agency question, I think we looked at. The aggregate rating, we looked at. Outdated news pages: what's the best practice to apply to old news pages? By that I mean news from last year. Would deleting these be the best option, or noindexing, or what would be the best way from an SEO point of view? Generally, I recommend keeping the old content online, maybe putting it into an archive section if you want. If you really don't want to keep it anymore, then delete it or noindex it.
So it's completely normal that sometimes we have more indexed and sometimes we have less indexed. When I look at some of my older test sites, where I have a ton of pages kind of just for test purposes in Search Console, to see how things perform with the technical reports there, it's completely normal for me to see things fluctuating quite a bit, and sometimes they fluctuate very strongly. I haven't changed anything on these sites for years, but still the number of indexed pages goes up and down by crazy amounts.

So that's something where, from my point of view, the amount that we index from a website varies over time, and it isn't something where we would say we will try to index everything from all websites, because, purely from a technical point of view, it's impossible to index everything from all websites.
K: So we have product schema right now, but we don't have it filled out, so we're getting errors for that. Would that be the reason why we are getting de-indexed?
A: No, no. The structured data is purely for understanding what else is on the page and for showing that in the rich results. So it wouldn't be a sign that we would say: we will drop these pages from the index.
Okay, thank you. Sure. Usually, the best approach that I've seen to getting more content indexed in a consistent way is to make sure that the quality of the website overall is as high as possible. So that's kind of a trick, I don't know if you'd call it a trick, but essentially you're telling Google that this is really important content, and then Google will try to actually go off and make sure that all of that is indexed.

So that's something that kind of comes into play with the crawl budget discussions. We have a blog post, I think from a couple of years back now, with regards to crawl budget, and that also includes the topic of crawl demand, which is basically how much Google wants to crawl from a website, and crawling is kind of the first step to indexing.
A: What would be the best average period to measure the impact of website changes? For example, if I remove the AMP feature from a news website and note a traffic drop in the next hour, could that be due to removing AMP?
I think this is kind of tricky, because, on the one hand, it sounds like: what if I make normal changes on my website, how long does that take? And that's something where I'd say: well, it depends. You tend to see changes within a couple of days, then the majority of the changes maybe within the next month, and then some remaining changes over, I don't know, half a year or longer even. But specifically with regards to AMP and non-AMP: it's not a normal technical change, where, if you change all of the titles on your pages to something different, it takes a while to be reprocessed. With AMP, it's something that we try to pick up as quickly as possible, and if, for example, the URLs are no longer in the AMP cache, then we would have to react fairly quickly and swap all of that out in practice.
I would expect that just removing or disabling AMP is something where you would tend to see the changes within, I don't know, a couple of days. You tend to see those changes fairly quickly, and you would specifically see that the AMP pages are no longer being shown in Search, but rather we show the normal pages again. It's not so much that you would see a ranking drop during that time, but rather a swap in the URLs that are shown, and, depending on how you're tracking things, that could look like a ranking drop. So that's kind of the effect there.

I don't think you would see an effect within an hour of removing AMP. This feels like something where it's a fast change, but it's something that would come within a couple of days. The one effect with regards to AMP that might be visible is in ranking.
That's the Top Stories carousel, where, by default, the AMP pages would be eligible to be shown, but non-AMP pages previously were not eligible to be shown there. And I think, since this week, since we started rolling out the page experience update, non-AMP pages are also eligible to be shown there, provided they're, I think, fast enough or good enough with regards to the page experience ranking factors.

I would not expect an effect within one hour, but you could probably see some effects when you remove AMP, specifically for a news website, over the course of, I don't know, a couple of days, until things settle down a little bit again. Also, usually with a news website, the website tends to rank more for the new content.
A: I have one website, yourbrand.de, in German, with language de-DE, and one with .com in English, with en-GB or en-US. One is basically a clone of the other, but properly translated. Are there any disadvantages to not using rel-alternate hreflang? Are there any advantages to applying it?

So, essentially, the hreflang links between those two pages will not change the rankings of those pages. That's kind of the first step. However, they can swap out the URLs that are being shown, so it's hard to say whether there's an advantage or a disadvantage from using it.
A
I
think
there's
definitely
no
disadvantage
to
using
it,
but
essentially
the
advantage
is
kind
of
if
we're
showing
the
wrong
language
url
in
the
search
results
then
with
hreflang,
we
can
swap
that
out.
So,
usually,
you
would
see
this
if
someone
has
a
brand
that
is
the
same
across
multiple
languages.
If
users
are
just
searching
for
that
brand
name,
then,
from
the
query
alone,
we
don't
know
which
language
content
to
show
which
language
content
to
rank
for
that
page,
and
it
can
happen
that
we
would
rank
the
wrong
version
for
that.
A
So
I
don't
know.
For
example,
if
you
search
for
google
in
germany,
we
would
not
necessarily
know
that
someone
is
looking
for
the
german
version
of
google,
so
it
could
happen
that
the
dot
com
version
is
shown
to
users
and
then
with
hreflang.
We
would
swap
out
the
url
against
the
german
version,
because
we
know
oh,
this
user
is
in
germany
and
has
their
settings
set
to
german.
So
we
should
use
the
german
version
for
their
page.
On the other hand, if users are always searching in a way that is clearly localized, where, say, you have different brands across different countries, and someone in one country searches for your brand and there's only that one page which is really relevant for that result, then probably hreflang will not change anything.

So it's something where I would double-check first: are users seeing the wrong versions? If so, then use hreflang. If not, then you probably don't need to use it.
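The URL-swapping behavior discussed here rests on rel="alternate" hreflang annotations, where each language/region version lists all of its alternates. As a rough sketch (the domains are placeholders, and x-default marks the fallback version), the link tags could be generated like this:

```python
def hreflang_links(alternates):
    """Render rel="alternate" hreflang link tags for a set of language/region
    versions of the same page. These annotations let the right URL be swapped
    in for a user's locale; they do not change ranking."""
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in alternates.items()
    )

print(hreflang_links({
    "de-DE": "https://example.de/",
    "en-US": "https://example.com/",
    "x-default": "https://example.com/",
}))
```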
L: Okay, so this is always very confusing for us, especially for branded queries, John. So, for instance, if I have an English page and I have a /de folder for Germany: in Germany, if a person is searching, obviously the brand name would remain in English, or the person can just type the brand name, but his browser language is in German. In that case, using hreflang means Google will show the German version for the branded query, right? And if his browser language setting is in English, then the English page will appear, because...
A: The other thing is that, especially when we're talking about English, a lot of times the default language on a computer is English, and these computers are distributed worldwide. If we can recognize that this is essentially a default setting, then we might say: well, we can't trust the setting to be accurate, so we will use whatever is locally relevant.

So this is particularly, I think, an issue in English, where, if we see that a browser is set to US English and everything looks default, then we might say: well, probably the user is in Germany, so probably they want German content, even though their browser is set to English, because we assume that maybe they didn't set the browser language at all, and it's just the default setting they don't know how to change. So going purely by the browser language is usually not reliable.
A
I think in general, with regards to internationalization, you should not rely on Google always getting this right, and instead have some kind of fallback mechanism on your pages, where you're saying: you landed on the German version of our homepage, but it looks like you have your browser set to French, so here's also the French version. Make it easy for people to get to the appropriate version of the site. So always have that backup set up. It particularly makes sense when you're across different countries, where people might also just enter the URL and say, well, I entered the .com because that's the main site, but I'm located in India, so I should be getting the Indian version of the site. Having a banner on top saying, like, hey, do you want to go to the Indian version of our site, click here?
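The fallback banner described here hinges on comparing the visitor's Accept-Language header with the language of the page they landed on. A minimal sketch, assuming the site keeps a simple map of language to URL (the paths are made up for illustration):

```python
def parse_accept_language(header):
    """Return language tags from an Accept-Language header, best first.
    Simplified: skips malformed parts, sorts by q-value."""
    tags = []
    for part in header.split(","):
        piece = part.strip().split(";")
        if not piece[0]:
            continue
        q = 1.0
        if len(piece) > 1 and piece[1].strip().startswith("q="):
            try:
                q = float(piece[1].strip()[2:])
            except ValueError:
                pass
        tags.append((q, piece[0].lower()))
    return [tag for q, tag in sorted(tags, key=lambda t: -t[0])]

def suggest_alternate(page_lang, accept_language, available):
    """If the visitor's preferred language differs from the page's language
    and that version exists, return its URL so a banner can link to it."""
    for tag in parse_accept_language(accept_language):
        primary = tag.split("-")[0]
        if primary == page_lang:
            return None  # visitor is already on the right version
        if primary in available:
            return available[primary]
    return None
```

So a visitor with a French browser landing on the German homepage would get a pointer to the French version, while a French visitor on the French page sees no banner at all.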
A
Cool, yeah. Thank you. Sure, let me just grab one more question from the list, a really short one: what's the best practice for rel canonical on a URL that is now returning 404? You can do whatever you want on a 404 page. If the result code is 404 or 410 or any error, then we will ignore all content on the page.
A
What you put in the robots meta tags and the rel canonical is totally irrelevant for us, because we already recognize that there's nothing there that we need to index. So that was kind of an easy one.
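In other words, the status code is evaluated before any on-page signals are even considered. A toy sketch of that crawler-side logic (an illustration, not Google's actual implementation):

```python
def indexable_signals(status_code, meta_robots, rel_canonical):
    """Toy crawler-side sketch: for an error response (404, 410, 5xx, ...),
    every on-page signal, including robots meta tags and rel canonical,
    is discarded, because there is nothing to index."""
    if status_code >= 400:
        return None  # error page: tags on the page are irrelevant
    return {"robots": meta_robots, "canonical": rel_canonical}
```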
And so let me get back to all of these millions of people who have raised hands, and we'll get, I don't know, one or two more in, and then pause the recording and kind of continue on afterwards as well. Julian?

Hey John, how's it going? Hi.
M
So my question is about best practices for a website that's a database of products that monetizes via affiliate links. The site is UGC and community driven, and readers can contribute unique content, reviews, and other cool stuff. So, just up front, I know how skeptical Google is of thin affiliate sites that offer no unique value, and I know you've talked about it a lot on here, and I totally get that. So, on this particular site.
M
So my goal is not to have Google judge the overall picture of the website as, hey, this is a thin affiliate. So I feel like, tactically, I have a chicken-and-egg problem, because you need starter pages created in the first place that may be viewed as thin through a Google lens, I think, but you kind of need them so they can accrue unique content over time. Some products just aren't going to be as popular, etc.
M
Right,
so
my
inclination
is
just
make
use
of
no
index,
which
sounds
you
know
kind
of
like
the
simplest
answer,
four
pages
that
haven't
hit
a
certain
quality
threshold
and
I
know
you've
gone
on
record
to
say
that
google
tries
to
look
at
pages
granularly.
So
if
there's
a
thousand
pages,
one
is
great.
999
are
crap,
like
it
behooves
google,
to
show
that
one
great
page,
but
you
judge
trust
based
on
larger
sections
of
the
site.
So
you
know
you
could
look
at
all
product
pages
how
trustworthy
is
the
site.
M
So
I
guess
my
two
kind
of
pointed
questions
are:
is
it
enough
to
know
index
the
ones
that
haven't
hit
the
quality
threshold,
or
should
I
go
as
far
as
to
remove
affiliate
links
from
the
not
so
great
pages?
So
broadly?
How
just?
How
do
I
get
in
google's
good
graces
for
this
type
of
a
website
which
I
know
is
kind
of
a
hot
button
issue
a
little
bit.
A
Yeah,
so
no
index
would
be
perfectly
fine.
I
I
don't
think
you
would
see
any
effect
if
you
just
removed
the
affiliate
link
there
and
left
those
pages,
because
we
we
would
just
judge
the
pages
on
their
own
and
say
he's
like
well,
there's
not
much
content
here
and
the
kind
of
the
matter
of
the
affiliate
link
is
almost
like
a
side
side.
Note
there.
Okay,
so
I
I
think
the
approach
with
no
index
makes
a
lot
of
sense
figuring
out
how
to
recognize
what
pages
are
quality
or
not.
A
That's
sometimes
a
bit
tricky,
but
it
sounds
like
you
have
some
ideas,
so
that's
kind
of
the
direction
I
would
head
there
to
really
say.
A
Okay,
but
generally,
especially
for
all
of
the
the
content
that
you
collect
over
the
years,
if
you're
saying
well,
someone
submitted
this
five
years
ago,
nobody
has
commented
on
it
since
then.
Maybe
we
should
just
no
index
this,
and
you
could
also
go
so
far
as
to
say
at
some
point
well
like
I
have
no
insects
that
for
a
year
and
nobody
has
manually
gone
there
either.
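That lifecycle, noindex a page that never accrued content or traffic, then remove it once it has sat noindexed and unvisited for a year, could be expressed roughly like this (the thresholds are illustrative choices, not Google guidance):

```python
from datetime import date, timedelta

def page_directive(review_count, views_last_year, noindexed_since=None, today=None):
    """Sketch of a noindex lifecycle for thin UGC product pages.
    The thresholds (any review, any traffic, one year) are illustrative."""
    today = today or date.today()
    # Thin page that never accrued UGC and gets no visits: keep it out of the index.
    if review_count == 0 and views_last_year == 0:
        # Noindexed for over a year and still untouched: drop it entirely.
        if noindexed_since and (today - noindexed_since) > timedelta(days=365):
            return "remove"
        return "noindex"
    return "index"
```

A batch job could run this over the product database periodically, flipping the robots meta tag or deleting pages as their directive changes.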
N
A
If we don't have data for a site at the moment, we wouldn't be able to use that for the ranking factor, but we have a lot of practice dealing with sites where we don't have the full data set. So it's something where I wouldn't necessarily worry about it, and I wouldn't see it as something like, oh, I have to build more traffic to my website in order for it to be able to rank. Rather, if your website just doesn't have a lot of traffic, then that's perfectly fine.
A
Okay,
thanks
all
right,
let
me
pause
the
recording
here
and,
like
I
mentioned,
you
all,
are
welcome
to
to
hang
around
a
little
bit
longer
afterwards
as
well.
It
looks
like
there's
still
tons
of
questions
left,
so
we'll
see
how
far
we
can
make
it
before
I
fall
over.
Thank
you
all
for
watching
this
recording,
if
you're
watching
it
on
youtube
and
feel
free
to
jump
on
in
in
one
of
the
future.
Episodes
if
you'd
like
to
join
us
live
all
right.