From YouTube: English Google SEO office-hours from December 31, 2021
Description
This is a recording of the Google SEO office-hours hangout from December 31, 2021. These sessions are open to anything search- and website-owner-related, like crawling, indexing, mobile sites, internationalization, duplicate content, Sitemaps, Search Console, pagination, multilingual/multi-regional sites, etc.
Find out more at https://goo.gle/seo-oh-en
Feel free to join us - we welcome folks of all levels!
A: All right, welcome everyone to today's Google Search Central SEO office-hours hangout. This will be the last one of this year.
A: Who knows, but it looks like a bunch of people still managed to join in, even though it's on the last day of the year — even people in weird time zones. Hi, Michael, so cool. A handful of questions were submitted on YouTube, and we have a bunch of people raising their hands here already, so maybe we'll just run through the people that have their hands raised at the moment. Let's see — Ashwani, I think you're first.
B: Yeah, hi John, very good morning. I have a question related to indexing. What I found is that our blog pages are not being indexed on a regular basis, but their tags are indexed by Google. So I don't understand why the tag is indexed first rather than the URL — why is the URL not indexed first?
B: Look, for example: say the website name is abc.com, and my page is abc.com/seo-updates. Inside that page I have added tags called "SEO updates" and "SEO news". So when I try to search whether this page is indexed by Google or not, what Google is basically saying is that your tags are indexed — it shows in the archive that your tags are indexed — but the complete URL is not indexed.
B: No, no — as Praveen said, there must be a URL for these tags, but no, there is no URL for any of the tags.
B: On Google — when I search site: followed by the URL and put that into Google, it shows the URL is not indexed, but the tag is indexed. So here I am a little bit confused about why Google is indexing the tags rather than my web page, my URL.
A
So
what
what
I
would
recommend
doing
here
is
maybe
posting
in
in
the
webmaster
help
forum,
with
with
the
exact
urls
and
the
the
searches
that
you're
doing
just
so
that
other
people
can
can
take
a
look
and
double
check
to
see
exactly
what
what
is
happening
and
maybe
give
you
some
some
input
on
what
you
could
be
doing
differently.
A: From a practical point of view, it can sometimes happen that these are indexed before the actual content. That makes sense, for example, with the home page, because we usually recrawl the home page quite often, so we might see the home page is updated and a link to your article is there, but the article itself sometimes just takes a little bit longer.
B: Okay, I will definitely share the link in the community. And the second question is related to the same — indexing and caching. Sometimes what I found is that some of our URLs are indexed properly, but we are unable to find the cached file.
A: The cached file — that can happen. That's normal, in the sense that the cached pages in the search results are handled separately from the indexing.
E: Yeah, good morning everybody. Thank you, John, for working on the 31st of December and taking our questions. I already asked the question on YouTube, so I might simply read it as it is there.
E: Let's assume that I have an established website in French that has existed for a number of years and has reasonable SEO success, and then I want to add a German language version on the same domain — so not a distinct domain, but the same domain — and the website owner unfortunately uses automated translation for the German user interface and the German content. I know that automated translation is considered automatically generated content, and Google doesn't like it.
E: So it would seem normal that Google probably doesn't appreciate the new German version so much, but my question mainly targets the established French version, which has done reasonably well so far. I wonder if this poor German language version can negatively influence the success of the more established French version. In other words, do you consider the language quality of each language version on the same domain independently?
A
The
the
main
issue
here
is
less
about
these
being
translated
versions
of
the
content,
but
more
that
for
for
some
things,
we
look
at
the
quality
of
the
site
overall
and
when
we
look
at
the
quality
of
the
site.
Overall,
if
you
have
significant
portions
that
are
lower
quality
it,
it
doesn't
matter
so
much
for
us
like
why
they
would
be
lower
quality
if
they're,
just
bad
translations
or,
if
they're,
terrible
content
or
whatever.
A
But
if
we
see
that
they're
significant
parts
that
are
lower
quality,
then
we
might
think
overall.
This
website
is
not
so
fantastic
as
we
thought
and
that
can
have
effects
in
in
different
places
across
the
website,
so
in.
In
short,
I
guess
if
you
have
a
very
low
quality
translation,
that's
also
indexed
and
that's
also
very
visible
in
search,
then
that
can
definitely
pull
down
the
good
quality
translation
as
well
or
the
the
good
quality
original
content
that
you
also
have
so.
E: Yeah, very good. Can I ask one follow-up question on this? Obviously, I'm working in an SEO agency, and what frequently happens is that clients will give us what they consider the SEO texts to translate with SEO optimization, and then they have somebody else doing the translation of the texts which don't have SEO importance — the user interface — and unfortunately, very frequently they choose low-cost translation services to do that part of the job.
A: I don't know if we have anything that specifically looks for low-quality translations. It's probably — at least the way that I understand it — more a matter of us trying to understand the quality of the website overall, and that's usually not something where there are individual things that we could just point at to say: oh, if you have five misspellings on a page, that's a sign of low quality.
A
It's
like
the
these
things
happen
individually
and
all
of
these
factors,
I
think
individually,
are
hard
to
say
that
they're
a
sign
of
something
being
low
quality,
but
rather
we
have
to
take
everything
together
and
then
figure
out
what
what
the
mix
is
together
and
that's
also
a
reason
why
sometimes,
when
you
significantly
improve
the
quality
of
a
website
overall
or
when
things
get
significantly
worse,
it
just
takes
a
lot
of
time
for
our
systems
to
figure
out
like
oh
overall.
The
view
of
our
of
this
website
is
now
better
or
worse.
A
So
from
from
that
point
of
view,
it's
not
that
we
have
anything
specific
that
we
could
point
at
the
the
best
that
I
could
do.
If
you
wanted
individual
items
is
to
look
at
the
blog
post,
we
did
on
core
updates,
I
think,
last
year,
sometime
or
the
year
before
that
which
which
has
a
bunch
of
different
questions
in
there,
which
you
could
ask
yourself
about
the
website,
which
you
could
also
go
and
look
at
together
with.
A
Maybe
I
don't
know
some
some
testers
or
some
users
to
get
external
feedback
as
well,
and
it's
not
so
much
that
we
have
algorithms
that
try
to
like
copy
that
directly.
But
it's
it's
kind
of
like
almost
almost
like
a
a
guiding
direction
where
we
say
well.
This
is
the
direction
we
want
to
go
and
then
we
will
work
to
make
our
algorithms
try
to
figure
that
out.
A: Yeah, I think that's a good way to look at it. The other thing that I think is really important is that you have objective feedback in this regard — in that, if you're working on the website and it's like your baby, then it's going to be hard to accept that people don't like the color scheme, or that the design looks outdated or not professional enough: these small subjective things.
A
So
it's
really,
I
think,
important,
to
have
someone
external
someone
who
is
objective
to
give
you
this
kind
of
feedback,
and
maybe
even
the
connection
between
seo
agency
and
client
is
almost
like
too
personal
in
that
you're
you're
very
devoted
to
that.
So
in
maybe
in
these
kind
of
situations
it
would
make
sense
to
say
well,
we
will
put
together
a
small
panel
of
people
who
are
not
involved
with
your
website
and
they
will
give
us
objective
feedback,
and
it's
not
that
we
want
you
to
make
these
changes.
F: John — sure, or Frank — if you say those pages that have been badly translated aren't that important, then do you need them indexed? Why show Google a bad version? Why not just let users read it if they need to, and just noindex it, and then the overall quality is still fine — unless I'm wrong, John? Is that not an idea, if you say they're not important — if it's like the directions page, or the terms and conditions that no one but Google reads?
E
Yeah
well,
unfortunately,
it's
not
only
such
unimportant
pages
like
the
directions.
I
don't
think
that
you
would
suggest
to
not
index
the
home
page,
for
example,
or.
E
Oh,
the
idea
would
be
to
tell
the
client
okay
to
invest
money
only
in
the
important
pages.
Well,
then,
then,
it
gets
really
subjective.
Also,
I
think,
because
they
have
a
bunch
of
product
pages
which
are
of
poor
quality,
and
so
where
do
you
draw
the
line?
What
is
an
important
product
page?
What
is
not
an
important
product
page,
so
I
appreciate
the
input
drop.
E
I
will
keep
that
in
in
mind,
but
many
times
I
think
it's
rather
that
you
know
the
the
rest,
which
is
not
seo
text
so
to
speak,
is
taken
or
is
assigned
to
a
low-cost
translation
service
or
even
automated
translation,
and
then
that's
the
case
for
all
the
rest
of
the
website.
Unfortunately,
but.
A: Yeah, I think the aspect of not just looking at the quantity of pages, but rather at their relative importance, does help a little bit, though. Especially if you have a limited translation budget, really focus on making sure that the important pages — the pages that get a lot of traffic — are actually well translated.
A
I
think
that
goes
a
long
way,
because
when
we
look
at
the
quality
of
a
website
overall,
it's
not
that
we
count
the
pages
and
we
determine
the
quality
of
each
page
and
make
kind
of
like
an
average
like
that,
it's
more
that
we
try
to
figure
out
well.
How
does
this
website
represent
itself
externally
and
that's
focused
quite
a
bit
on
the
pages
that
are
more
important:
the
pages
that
get
a
lot
of
traffic
so
kind
of
like
how
rob
mentioned
like
if
you
can
make
sure
that
the
important
pages
are
well
translated.
A
Then,
even
if
the
less
important
pages
are
low
quality
translations
you
you
can
get
almost
like.
I
don't
know
90
of
the
way
there
so
that
I
mean
it
really
depends
on
the
website
itself
and
the
budgets
and
how
it's
organized.
If
you
just
have
one
person
translating
or
one
agency
translating
everything,
then
obviously
you
can't
tell
them
like.
Oh
just
do
a
bad
job
on
these
pages.
A: Yeah, if it uses the browser permissions dialog, then I don't think we ever accept any permissions. It's something you can probably easily test, but from my understanding, when it comes to rendering, all of these permission dialogs are essentially ignored and not processed.
A: Cool. Praveen?
C: So the question is about pages that are not indexed, basically. We have a site where there are some 100 pages that are not indexed — they are tagged with a meta robots noindex tag, but they are accessible to users; users can access those pages. Now, the thing is, there are a lot of good authority sites in the industry that are actually linking back to these pages that are not indexed.
C
So,
though
we
are
getting
referral
traffic
and
all
that
stuff,
but
from
seo
benefit,
you
know,
link
benefit
point
of
view.
We
are
not
getting
anything
because
obviously
they
are
not
indexed.
So
the
question
was:
if
we,
what
if
we,
if
we
you
know,
set
a
301
redirect
for
googlebot
on
these
urls
to
some
relevant
pages?
Will
that
be?
You
know
against
google
guidelines
because
users
can
still
access
those
pages,
but
for
googlebot
there
will
be
a
301
redirect.
A
It
also
feels
like
the
kind
of
thing
where
you
might
just
use
a
real
canonical
and
leave
it
leave
it
at
that
to
kind
of
point
at
the
page
that
you
do
want
to
have
index,
because
if,
if
you're
doing
this,
this
kind
of
redirect
specifically
for
googlebot,
then
I
I
think,
on
the
one
hand,
from
a
technical
point
of
view,
it's
very
easy
to
get
that
wrong
and
to
have
something
mess
up,
and
I
mean
from
from
a
user
point
of
view.
A
I
don't
think
it
would
be
a
big
issue,
because
we
would
probably
only
index
the
the
the
target
page
anyway.
So
it's
not
that
a
user
would
click
on
a
link
in
the
search
results
and
end
up
on
one
page
that
looks
very
different
from
what
they
clicked
on
so
on.
On
the
one
hand
there
I
I
think
it
feels
like
there
are
easier
solutions
to
to
this,
with
with
a
real
canonical
to
do
essentially
the
same
thing,
but
I
don't
think
it
would
be
like
super
problematic.
C
Yeah
we
thought
about
canonical
too,
but
then
we
thought
it
might
like.
Google
might
ignore
it,
because
the
two
pages
might
not
be
exactly
same
or
even
so.
There
could
be
an
issue
because
one
page
is
more
dependent
on
javascript
thing
and
other
is
more
html.
A
Yeah,
so
I
don't
know
from
from
my
point
of
view,
I
I
would
prefer
to
try
to
use
the
rel
canonical
as
much
as
possible
just
to
make
sure
that
you
don't
have
to
set
up
any
separate
infrastructure
to
cloak
to
googlebot
and
kind
of
all
of
the
problems
associated
with
that,
because
it
feels
like
you.
You
have
to
put
a
lot
of
work
in
to
make
that
work.
The
the
way
that
you
want
it
and
the
the
other
approach
is
just
so
much
simpler
and
so
much
easier
and
less
error-prone.
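The two signals discussed here — a meta robots noindex and a rel canonical in the page head — are just static markup, which is part of why John calls this route less error-prone than a Googlebot-only redirect. As a rough illustration (the URLs are placeholders, and this is only a sketch of checking what a page declares, not an official tool), a small stdlib-only script can extract both:

```python
from html.parser import HTMLParser

class HeadSignals(HTMLParser):
    """Collects the rel=canonical target and the meta robots directives from HTML."""
    def __init__(self):
        super().__init__()
        self.canonical = None
        self.robots = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel", "").lower() == "canonical":
            self.canonical = a.get("href")
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.robots = a.get("content", "").lower()

# A page carrying both signals, as in the setup described above.
page = """
<html><head>
  <link rel="canonical" href="https://example.com/main-page">
  <meta name="robots" content="noindex">
</head><body>...</body></html>
"""

parser = HeadSignals()
parser.feed(page)
print(parser.canonical)                    # https://example.com/main-page
print("noindex" in (parser.robots or ""))  # True
```

A check like this run over the hundred affected URLs would confirm the markup is consistent before waiting on a recrawl.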
C: Yeah, so we had both options; we just wanted to confirm: if we go with the 301, will it be against Google's guidelines or not?
A
No,
it
won't
be
against
google
guys
because
the
like,
what
would
be
problematic
is
if
this
page
were
indexable
and
you
redirected
kind
of
depending
on
the
user
agent.
Then
that
could
be
seen
as
a
kind
of
a
sneaky
redirect.
But
since
we're
indexing
the
target
page,
then
it
doesn't
really
matter
what
you
do
on
the
pages
that
lead
up
to
that
page,
provided
that
they're
not
indexed.
A
And
very
happy
new
year
thanks,
you
too,
let's
see
sanjay
hi.
A: Maybe — I mean, if it's in normal HTML on the page and it's just hidden, then it's possible that we pick that up and use it for indexing.
A
So,
for
example,
one
thing
that
that
I
saw
recently
is
someone
had
an
error
message
in
in
the
part
of
the
page
that
was
actually
hidden
and
it
was
only
shown
if
there
was
actually
an
error
on
the
page,
but
it
was
always
on
the
page
and
our
systems
picked
that
up
and
thought.
Well,
this
page
is
an
error
page.
We
can
ignore
it.
D
Actually,
the
situation
is
that
on
our
product
category
pages,
we
we
have
fixed
load
for
every
product
and
in
in
that
fixed
load.
We
can
so
like
only
one
or
two
lines
of
description
product
description,
but
in
our
source
code
the
developer
have
mentioned
like
complete
descript
description
like
around
50
60
words
of
description
in
source
code,
but
this
is
not
visible
on
desktop
on
desktop.
It
is
visible
only
for
like
one
or
two
lines
and
on
mobile
devices.
D
It
only
five
or
six
word
appears
for
users,
and
there
is
no
read
more
button
as
well.
Is
there
any
situation
that
google
might
paralyze?
Also
us
like
for
cloaking
or
something
like
that.
A
I
don't
think
so
I
mean
it's.
It
sounds
like
it's
also
text,
that's
related
to
your
product
anyway.
So
from
that
point
of
view,
it's
it's
not
very
misleading.
It's
I
think,
just
sub-optimal
from
your
side
like
adding
a
read
more
button
or
some
kind
of
a
link
to
expand.
The
description
sounds
like
a
simple
solution
to
that.
I
wouldn't
worry
that
google
will
penalize
a
website
for
something
like
that.
A: Cool. Let's see — Ellen?
G
Hi
john
happy
new
year
to
you
in
advance,
thanks
for
taking
my
call,
so
we
have
a
situation
where
we
have
a
news
website
that
that's
one
of
the
leading
financial
publications
in
the
country.
They've
done
well
always
and
there's
another
sister
website
completely
unrelated.
But
again
you
know
it's
a
smaller
site,
smaller
traffic,
a
smaller
number
of
stories,
daily
smaller
number
of
stories.
Overall,
the
bigger
website,
the
financial
publication.
Over
time
we
suspected
that
previous
stakeholders,
they
did
some
nefarious
activities.
G
We
think
it's
penalized,
but
there's
no
way
for
us
to
know
that
has
been
penalized
so,
but
lately,
what
we
noticed
was
that
the
other
website
sometimes
takes
stories
from
the
financial
publication,
with
attribution
and
on
the
lesser
website
on
google
news,
it
always
seems
to
end
up
ranking
higher
and
we
realize
if
the
smaller
website
with
a
lower
page
rank
potentially
is
ranking
higher.
We
we
figured
that
something
is
a
foot
or
something
is
a
miss
and
we
don't
know
how
to
go
about
it.
G
I've
disavowed
a
lot
of
links
over
3000
links
a
lot
of
domains
as
well.
We
realized
a
lot
of
traffic
is
coming
from
blogspot.
You
know
some
of
them
were
shady.
G
We
realized
there
was
about
two
300
urls
that
were
marketing
related.
We
blocked
those
about
six
to
eight
months
ago.
We
still
are
not
able
to
rank
higher.
In
fact,
what
happens
is
on
google
news.
We'll
have
a
story
we'll
be
the
exclusive
will
be
the
first
one.
You
know
the
eat.
G
The
whole
e
thing
will
be
trustworthy,
we're
exclusive,
we'll
rank
very
high
and
as
soon
as
even
this
lesser
website
or
other
websites
publish
the
same
story,
even
if
they
do
a
spin
on
our
take
or
if
there's
a
genuine
story
or
a
copy
of-
or
you
know
extrapolating
from
our
story
when
they
do
that,
they
suddenly
start
ranking
higher
and
we'll
immediately
start
disappearing
into
oblivion,
and,
and
now
I'm
tasked
with
you
know
improving
the
seo.
G
The
editors
are
frustrated,
we
put
a
story,
it's
original
and
I'm
kind
of
feeling
out
of
options
on
what
to
do,
and
the
second
part
of
the
question
is
you
know
what
the
disavow
is.
There's
two
ways
to
get
the
list.
One
is
on
these
seo
tools
the
paid
tools
they
come
up
with
a
lot
of
urls
that
don't
show
up
in
the
search
console,
so
we're
looking
at
those
lists,
adding
them
to
our
disavow
list
and
adding
them.
A: Right. So I took a quick look at the domain beforehand, and I don't see any kind of manual action or penalty associated with that domain. I also don't see any issue with regards to the links for that domain. So my suspicion is you probably don't need to put a lot of time into the disavow work; probably even just focusing on the Search Console links — if you actually find something really problematic there — would be enough.
A
So
that's
kind
of
I
I
think,
the
from
the
overall
point
of
view
there.
I
I
don't
see
anything
really
holding
back
this
this
website
in
in
search
the
the
main
thing
I
did
notice,
though,
is
that
you
have
geo
targeting
setup
in
search
console
for
pakistan.
A: The other thing I just realized in listening to your question is that you're focusing very much on Google News, and while Google News does a lot of things similarly to Google Search, I don't know if it does everything in the same way — in particular with regards to ranking. There might be separate things that Google News does that don't apply to Google Search.
A
So
what
I
would
also
do
there
is
try
to
contact
someone
on
the
google
news
side
in
in
the
help
center
for
google
news
publishers.
I
believe,
there's
a
form
that
you
can
use
to
contact
the
folks
on
the
google
news
side
and
to
try
to
get
some
input
from
them
as
well.
So
that's
kind
of
the
direction
I
I
would
take
there.
G: All right, I'll do that — thanks very much. Just a quick question: how soon would these settings take effect? Let's say I make it international now — how soon can I expect to see the results, or the impact?
A
So
that's
kind
of
the
the
time
frame
that
that
I
suspect
it
takes
for
something
like
the
geotargeting
setting
to
change
with
regards
to
a
news
website,
I
suspect
it'll
be
a
little
bit
faster
than
usual.
Just
because
we
with
a
news
website,
we
would
focus
on
the
newer
content
and
with
the
newer
content,
we
would
see
the
setting
very
quickly
so
probably
more
on
the
shorter
side
for
news
website,
but
I
don't
know
all
of
the
systems
that
are
involved
with
geo
targeting
settings
changes.
G: And one follow-up as well: we are considering using the Indexing API, where we send you updates. The documentation says that it's relevant for jobs, but someone on the same community — I think there was another question — alluded that it can be used for news. Is it okay to use for news? We have been using it. And the second part is: sometimes when we use it, we publish a story, then we unpublish it and republish it during the editing phase, and then we finally publish it.
G
So
is
that
a
concern,
and
is
that
api
relevant
for
news.
G: We did notice, oddly enough, that when we did it, we indexed a bit faster than if we just wait for the crawler to come around based on the sitemap — and, I mean, this is anecdotal, but we did notice that minute or two, which is kind of strange. But again, if it hits the limit on the number of queries per day — the API limits — I guess it wouldn't really work for large numbers. Yeah.
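For reference, the Indexing API the asker mentions (which Google's documentation scopes to job-posting and broadcast-event pages, not general news) takes one small JSON notification per URL. A minimal sketch of building that request body — the story URL is a placeholder, and authentication via an OAuth service account is omitted entirely:

```python
import json

def build_notification(url: str, removed: bool = False) -> str:
    """Builds the JSON body for one Indexing API urlNotifications:publish call.

    URL_UPDATED announces a new or re-published URL; URL_DELETED retracts one,
    which maps onto the publish/unpublish/republish flow described above.
    """
    return json.dumps({
        "url": url,
        "type": "URL_DELETED" if removed else "URL_UPDATED",
    })

# Announce an update for a (hypothetical) story URL:
body = build_notification("https://example.com/story-123")
print(body)
```

Each such body is POSTed to the `urlNotifications:publish` endpoint; the daily quota the asker runs into is enforced per project, which is why this doesn't scale to thousands of URLs a day without a quota increase.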
A: All right — Stein, I think?
H: Hi John. I did a website migration to a new domain three months ago. I just cloned the whole website; I had all internal and external links updated and redirected to the new one. I had AMP enabled on the old one, and my old AMP articles always ranked in Google Top Stories.
H
So
now
my
new
one
is
not.
I
have
amp
disabled
right
now
on
my
new
domain
because
I
didn't
like
it
and
it
gives
too
much
trouble
just
I
don't
want
to
use
it
anymore
so,
but
amp
is
not
needed
to
include
in
top
stories
right
now.
So
my
question
is
why
I'm
not
why
my
new
domain
is
not
ranked
in
google
top
stories.
H
I
want
to
add
to
that
that
I
rank
on
the
google
page
on
my
main
keyword
on
top
like
number
two,
and
also
my
articles
are
shown
in
google
news
app
and
on
google
discovery.
So
I
have
a
quite
a
strong
authority
on
my
main
keywords,
so
I
don't
understand
why
I'm
not
displayed
in
google
top
stories.
Can
you
explain
that.
A
I
don't
know
it's
it's
hard
to
say
I,
I
think,
if
you're
doing
a
domain,
migration
and
switching
off
amp
at
the
same
time,
then
especially
with
something
like
top
stories
that
might
be
a
little
bit
confusing,
but
it
sounds
like
otherwise,
things
are
being
picked
up.
Well,
so
probably
you're
you're
on
the
right
track
there.
The
thing
with
with
top
stories
in
particular
is
it's
an
organic
search
feature
and
it's
not
something
that
a
site
just
gets
because
they
deserve
it.
A: What I would consider doing here is, on the one hand, giving it a little bit more time. The other thing is to double-check things around the page experience side, because — like we mentioned in the blog post when we turned that off — pages with a very good page experience score can essentially appear in Top Stories as well. So it's not the case that we would take any kind of page and show it in Top Stories; rather, we would use the page experience score almost as a kind of ranking factor to determine what we show within the Top Stories section.
H
In
my
my
site,
vitality,
so
all
my
urls
are
in
the
green.
H: So all that's good — 99.4 percent of my URLs are good URLs, so it's all green there, with no specific marks or whatever. So that's all in place.
A: Yeah, my suspicion is maybe it just takes longer.
A
Yeah,
maybe
with
like
the
domain
migration,
it
just
takes
a
little
bit
longer
to
to
kind
of
get
settled
in
there.
It's
it
sounds
like
everything
else
is,
is
kind
of
okay,
and
especially,
if
you're
being
shown
in
google
news
and
discover
as
well
yeah,
then
it's
like
things
are
essentially
set
up.
Well,
so
my
guess
is
it's
more
a
matter
of
time.
The
the
thing
also
with
with
top
stories
in
particular,
is
I'm
not
aware
of
anything
manual
on
our
side
where
we
would
control
that
or
we
would
say.
Oh.
A: I would say, if you don't see any change within a month, then let me know and send me the URL, and I can double-check with the team here — but especially since you're saying the migration took place, like, I don't know, three months ago.
H: No problem, okay, thank you. One other short question: I write about streaming services. I always wrote about one specific streaming service, but now I added a second one, and all of a sudden Google is adding a keyword to my titles — maybe Google thinks the title is too short — and it adds my main keyword to articles about the second streaming service, which doesn't make sense at all.
H: I'm talking about titles, yeah. It adds keywords from streaming service B to the article title for streaming service A, randomly. Google is probably used to me only writing about streaming service A, you know? So it's very confusing. I can send it to you, but yeah, it doesn't look good in the results — it's quite confusing. If you can drop a...
A
You
too,
bye-bye
bye.
Let
me
run
through
some
of
the
submitted
questions
very
briefly,
and
then
we
can
get
more
to
live
questions
as
well,
and
anything
else
we'll
see
it
looks
like
a
bunch
of
the
questions
were
already
submitted,
so
probably
similar.
The
first
one
I
have
here
is:
I
am
currently
improving
the
website
quality
after
I
have
multiple
pages
getting
discovered,
but
not
indexed.
I
wonder
in
what
time
frame
I
can
expect
google
to
pick
up
quality
changes.
A
We
we
talked
about
this
briefly
before
and
essentially
making
bigger
quality
changes
on
a
website
takes
quite
a
bit
of
time
for
google
systems
to
pick
that
up.
So
I
would
assume
that
this
is
something
more
along
the
lines
of
several
months
and
not
several
days.
A
So
that's
kind
of
the
time
frame
that
I
would
aim
for
here
and
because
it
takes
such
a
while
to
get
quality
changes
picked
up.
My
recommendation
would
be
not
to
make
like
small
changes
and
wait
and
see
if
it's
good
enough,
but
rather
really
make
sure
that,
if
you're
making
significant
quality
changes
make
sure
that
it's
like
really
good
quality
changes
and
not
that
you're
going
to
have
to
go
back
and
improve
that
again,
because
if
it
takes
a
few
months,
you
don't
want
to
wait
a
few
months
and
then
decide
oh
yeah.
E: If I may interrupt — because this ties back to the quality issue that you were discussing with the translation topic — that would rather hint towards: well, if you think now that the poor quality of the translation is the cause of maybe the entire site going down in the search results, then I understand that you would rather advocate: if you now go back and re-translate text, do it the right way for the entire site and don't make penny-pinching efforts...
E
Where
you
say,
okay,
we
will
concentrate
on
the
12
main
pages
and
then
we
might
see
if
that
works
or
not,
rather
really
do
good
work
for
the
entire
site
that
correct.
I,
I
think.
A
If
you're,
in
a
situation
where
your
site's
dropped
invisibility
and
you
think
it's
because
the
overall
quality,
then
I
would
definitely
go
and
try
to
do
as
much
as
possible
if
you're,
in
a
situation
where
you're
expanding
a
website
and
adding
translations,
then
it
feels
like
something
where
you
could
say.
We
will
initially
focus
on
the
more
important
parts.
A: Then, back to the e-commerce questions: "I sell handmade shoes. They're all produced for a specific age range, with the same material, technique, etc., but only the design is different. Would it be counted as duplicate content by Google if I write one and only one high-quality product description for all, or is it better to have unique descriptions for each one, which reduces the quality of the content?"
A
I
don't
know
if,
like
the
unique
descriptions,
would
reduce
the
quality
of
the
content.
So
from
that
point
of
view,
I
would
kind
of
argue
that
you
can
have
both
unique
descriptions
and
high
quality
descriptions,
so
that
kind
of
that
last
part
I
I
would
ignore,
but
the
the
general
question
with
regards
to
duplicate
content
is:
we
would
probably
see
this
as
duplicate
content,
but
we
would
not
demote
a
website
because
of
duplicate
content.
A
So,
from
a
practical
point
of
view,
what
would
happen
is
if
someone
is
searching
for
a
piece
of
text
that
is
within
this
duplicated
description
on
your
pages,
then
we
would
recognize
that
this.
This
piece
of
text
is
found
on
a
bunch
of
pages
on
your
website
and
we
will
try
to
just
pick,
maybe
one
or
two
pages
from
your
website
to
show.
So
it's
not
that
we
would
demote
your
website
or
kind
of
penalize
the
website
in
any
way,
because
it
has
duplicate
content.
It's
more
from
a
practical
point
of
view.
A
We
recognize
you
have
this
content
on
a
lot
of
your
pages.
So
if
someone
is
searching
specifically
for
that
content,
it
doesn't
make
sense
for
us
to
show
all
of
those
pages
and
that's
kind
of
reasonable.
When,
when
people
are
searching
for
a
piece
of
content,
they
don't
need
to
find
all
of
the
pages
within
your
website
that
have
that
piece
of
content.
A
The
the
thing
I
I
think
to
to
kind
of
watch
out
for
here
is,
if
you
don't
have
anything
in
the
textual
content
at
all
that
covers
kind
of
the
the
visual
elements
of
your
products,
then
it
makes
it
very
hard
for
us
to
actually
show
these
properly
in
the
search
results.
So,
for
example,
if
you
you
mentioned,
you
have
handmade
shoes.
If
you
have
blue
shoes
and
red
shoes-
and
you
never
mentioned
which
color
these
shoes
are
in,
then,
if
someone
is
searching
for
blue
shoes,
we
might
think
well.
A
Let's
see,
I
have
a
question
about
a
notification
from
search
console.
It's
about
my
author
archive
pages
that
are
missing
a
field
url.
I
would
like
to
no
index
my
author
archive
pages.
Will
it
solve
a
problem
or
will
it
have
an
impact
on
my
site
appearing
in
search
the
author
archive
pages?
Don't
have
any
keywords
anyway?
Are
they
important
for
eat?
Will
my
eat
score
go
down
if
I
no
index
the
author
archive
pages,
so
we
don't
have
an
eat
score.
A
The
notification
you
received
in
search
console
is
probably
about
structured
data
that
you're
using
on
these
pages,
and
if
you
don't
want
those
pages
indexed,
then
by
no
indexing
those
pages,
you
will
remove
that
notification
as
well,
if
you're,
using
some
kind
of
a
plugin
on
your
on
your
site
that
generates
the
structured
data
there,
then
maybe
you
can
disable
it
for
those
author
pages
and
it'll
be
fixed
as
well
or
maybe
you
can
fix
the
the
fields
that
the
structured
data
provides
and
that
will
also
solve
the
problem.
A
My
guess
is
that
the
structured
data
that
you're
using
on
these
pages
is
not
critical
for
your
site.
It's
not
something
that
we
would
show
in
the
search
results
directly
anyway.
So
from
that
point
of
view,
probably
you're
fine
with
either
removing
the
structured
data
from
those
pages
no
indexing
those
pages,
if
they're
not
critical
for
your
site,
all
of
that
would
be
fine.
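If the warning really is about a missing `url` field in author markup that a plugin emits, the other fix John mentions — filling in the field — is small. As a rough sketch of what complete author-page JSON-LD could look like (the name and URL are invented placeholders, not the asker's data):

```python
import json

def author_jsonld(name: str, url: str) -> str:
    """Builds minimal schema.org Person JSON-LD for an author archive page,
    including the url field that the Search Console warning flags when absent."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "Person",
        "name": name,
        "url": url,  # the field reported as missing
    }, indent=2)

# The resulting string would go in a <script type="application/ld+json"> tag.
markup = author_jsonld("Jane Doe", "https://example.com/author/jane-doe")
print(markup)
```

Whether filling the field or removing the markup is better depends, as John says, on whether the author pages matter for the site at all.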
A
I
I
think
I
I
would
see
this
slightly
differently
if
I
knew
that
this
was
a
site
that
really
focus
a
lot
on
the
authority
and
kind
of
the
knowledge
and
the
name
of
the
authors
where,
if
people
are
actively
searching
for
the
name
of
the
author,
then
your
collection
of
content
by
that
author
might
be
actually
useful
to
have
in
the
search
results.
So
for
for
those
kind
of
sites.
A
I
think
it
would
be
useful
to
keep
that
index,
but
then
you
would
probably
already
want
to
keep
that
index
because
they're
getting
traffic
from
search.
So
if
you're
not
seeing
any
traffic
at
all
to
these
author
pages
and
they're,
just
random
people
who
are
writing
for
your
blog
or
something
like
that,
then
probably
no
indexing
them
would
be
fine.
A
I
I
don't
think
any
of
these
approaches
are
really
better
or
worse,
in
the
sense
that
for
the
longest
time
we
didn't
have
any
guidance
on
pagination
and
it
seemed
to
just
work
well
for
search.
So
it
felt
like
something
where
we
don't
actually
have
to
tell
people
what
exactly
to
do.
A
If
you
link
sequentially,
what
will
generally
happen
is
the
first
page
of
your
set
will
have
a
lot
stronger
signals
in
the
sense
that
your
main
content
links
to
the
first
page
and
then
it
kind
of
like
incrementally
drops
and
drops
and
drops
as
it
read,
goes
through
the
pagination
set,
so
you're
kind
of
focusing
more
on
the
first
pages
of
the
set.
Whereas
if
you
link
to
a
bunch
of
different
pages
immediately,
then
you're
kind
of
like
spreading
it
out
a
lot
faster.
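The difference John describes can be pictured as click depth from the first page. A toy sketch (the page counts and the two linking schemes are made up purely for illustration): with sequential next-links, page N sits N-1 clicks deep, while a pagination bar that links to several pages at once keeps even the last page shallow:

```python
from collections import deque

def click_depths(links: dict[int, list[int]], start: int = 1) -> dict[int, int]:
    """Breadth-first click depth of every page reachable from `start`."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for nxt in links.get(page, []):
            if nxt not in depths:
                depths[nxt] = depths[page] + 1
                queue.append(nxt)
    return depths

pages = 10
# Sequential pagination: each page links only to the next one.
sequential = {p: [p + 1] for p in range(1, pages)}
# Pagination bar: every page links to the first five pages and the last page.
bar = {p: [1, 2, 3, 4, 5, pages] for p in range(1, pages + 1)}

print(click_depths(sequential)[pages])  # 9 — the last page is nine clicks deep
print(click_depths(bar)[pages])         # 1 — the last page is one click away
```

In the sequential scheme, whatever signal flows from the main content concentrates on the early pages, which matches the "incrementally drops" behavior described above.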
G: A quick follow-up on this: we have thousands of authors, and some of them might have hundreds of pages each. Now, when a new story comes in, page two changes, because everything cascades down. In that case, does Google recrawl everything — given that the links on the older, deeper pages have already been indexed? Are there any guidelines around that, or should we just expect Google to recrawl, because everything gets reshuffled every day across hundreds of pages?
G: So would it be all right to put a noindex, or no-crawl, on, let's say, everything after page number 100?
A: Cool — I think we're kind of at time. I have one more hand raised, and then I'll pause the recording for YouTube, and we can continue chatting a bit if any of you want to. Cheyenne, I think?
I: I know the page experience and Core Web Vitals information in Google Search Console is from the Chrome UX data set. But when I look at mobile browsers specifically, and the mobile browser shares — like, for us, Safari and other browsers have a share of 60 percent — does that mean it's still only the other 40 percent's field data which is used for Core Web Vitals and page experience, or does Google do something else to account for the other browsers' page experience and Core Web Vitals too?
A: There are also specific requirements with regards to the settings that are enabled for the user — whether they're sending performance data back or not — which are also required there. So from that point of view, it's not even all Chrome users, but just a subset of the Chrome users.
A
For
for
the
most
part,
I
I
think
it's
still
representative,
because
browsers
tend
to
be
kind
of
very
similar
with
regards
to
their
qualities
and
if
anything,
the
more
high-end
devices
which
which
sometimes
the
the
ios
devices
are
tend
to
be
a
little
bit
faster.
So
it's
something
where
it's
not
that.
I
think
we're
skewing
our
data
in
a
bad
way.
A
So
that's
at
least
as
far
as
I
understand
it.
I
I
don't
know
all
of
the
details
with
regards
to
chrome
user
experience
report
data,
but
it
would
surprise
me
if
there
were
like
approximations
that
were
kind
of
applied
there
to
say,
oh
well.
We
will
just
make
it
so
many
percent
more
because
we
think
that
other
users
might
be
like
that.
I
think
it's
really
just
based
on
the
data
that
we've
collected.
A
Okay,
so
with
that,
let
me
pause
the
recording
here.
It's
been
great,
having
you
all
here
and
even
on
the
last
day
of
the
year,
I
I
don't
think
I
will
make
more
of
these
office
hours
this
year,
but
next
year
you
can
count
on
it.
We'll
definitely
have
more
than
so.
Let
me
pause
the
recording
here
and
wish
you
all
a
fantastic
new
year
and
if
you're
watching
this
on
youtube
feel
free
to
join
in
in
one
of
the
future
episodes
where
we
can
talk
about
your
questions
as
well.