From YouTube: English Google SEO office-hours from January 7, 2022
Description
This is a recording of the Google SEO office-hours hangout from January 7, 2022. These sessions are open to anything search- and website-owner-related, like crawling, indexing, mobile sites, internationalization, duplicate content, sitemaps, Search Console, pagination, multilingual/multi-regional sites, etc.
Find out more at https://goo.gle/seo-oh-en
Feel free to join us - we welcome folks of all levels!
A: All right, welcome back everyone to today's episode of the Google Search Central SEO office-hours hangout. My name is John Mueller. I'm a Search Advocate here at Google in Switzerland, and part of what we do are these office hours where people can join in. First of all, happy new year to everyone. I hope you had a good break, had a chance to recover from whatever you needed to recover from, and that things aren't looking too bad where you are at the moment. It looks like lots of people have questions, lots of people are here, and we have a bunch of questions submitted as well. So maybe we'll just go through some of the people that have raised their hands already, and then we'll see where we end up. Let's see - Rogan, I think you're first.
B: Hi John. Hopefully a relatively quick one. We have a book website - it's a subscription product with about 800,000 books on it. When you search for a book on Google, on the right-hand side there's a "Get book" option on the book overview component.

B: When we search for some books, there are up to seven different providers shown, and we've been trying to get in as one of those providers. We've added the Book schema markup and then submitted a request to be part of that initiative multiple times, but I haven't heard anything back.
A: I think submitting with the form is essentially the right approach at the moment. My understanding is that the team is still going through those form submissions, and the plan is to move to a more open model where, whenever we discover the markup, we just start using it if we feel we can trust it. But I don't know what the current status is there at the moment.
B: Yeah, I added a comment before the call with all the information. We've submitted it maybe eight to ten times over the last nine months and haven't heard anything back, so if you can give them a nudge, that would be awesome.
A: Yeah, I can't promise that they'll turn it on automatically, but usually these kinds of nudges help us make that step towards making it a little bit more open, in the sense that we just use it whenever we can trust it.
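For context, the Book markup being discussed is JSON-LD structured data embedded on each book page, which is what the interest form asks about. A minimal sketch of what such markup might look like for a subscription service - every title, URL, and identifier below is an invented placeholder, not the asker's actual data, and the ReadAction/Offer shape only mirrors Google's documented Book actions markup in broad strokes:

```python
import json

# Hypothetical Book JSON-LD for one title on a subscription book site.
# Treat this as a sketch, not a validated feed entry.
book_jsonld = {
    "@context": "https://schema.org",
    "@type": "Book",
    "name": "Example Book Title",                       # placeholder title
    "author": {"@type": "Person", "name": "Jane Doe"},  # placeholder author
    "url": "https://books.example.com/example-book-title",
    "workExample": [
        {
            "@type": "Book",
            "isbn": "9780000000000",                    # placeholder ISBN
            "bookFormat": "https://schema.org/EBook",
            "potentialAction": {
                "@type": "ReadAction",
                "target": {
                    "@type": "EntryPoint",
                    "urlTemplate": "https://books.example.com/example-book-title/read",
                },
                "expectsAcceptanceOf": {
                    "@type": "Offer",
                    "category": "subscription",         # the model described above
                },
            },
        }
    ],
}

# This string would go inside a <script type="application/ld+json"> tag.
snippet = json.dumps(book_jsonld, indent=2)
```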
B: Okay, thanks.

A: All right, the next name I don't think I can pronounce - if you can pronounce your name, feel free to jump on in.
D: Okay, so a very happy new year and a very good evening, according to our local time. My name is pronounced like "Showing", and I'm from India.

D: So my question is about the term crawl budget. Suppose I have a blog on which I post almost one article every day, and another person has a blog on which he posts about one article a week. In terms of consistency, I am a bit more consistent than him. Will it affect the frequency with which Google crawls my website, or how Google handles my website? Suppose we both publish the same content on the same day, but he publishes less content on his blog overall - will my content rank higher on Google and his content rank lower? Does consistency have any connection with ranking?
A: I don't think so. There are a lot of factors that go into ranking, and being able to crawl and index a website is definitely one of those things. But if we're talking about one page per day or one page a week, our ability to crawl that is trivial. If we're talking about millions of pages every day, then sometimes the technical capabilities come into play, and crawl budget is a topic.

D: Okay, understood. Another question: when I used to post one article every day on my website, as I said, I saw that Google crawled my website almost every day - when I went into Search Console and looked at the sitemap, I saw that Google crawled it every day. But when I became inconsistent, I saw that Google crawled the site once every two days, or maybe less often. So is this a fact?
A: That can happen, yes. It's not so much that we crawl a website - we crawl individual pages of a website. And when it comes to crawling, we have roughly two types of crawling: one is a discovery crawl, where we try to discover new pages on your website, and the other is the refresh crawl, where we update existing pages that we know about. So for the most part, for example, we would refresh-crawl the home page - I don't know, once a day, or every couple of hours, or something like that - and if we find new links on the homepage, then we'll go off and crawl those with the discovery crawl as well. Because of that, you'll always see a mix of discovery and refresh happening with regards to crawling, and you'll see some baseline of crawling happening every day.

A: Whereas if it's a news website that updates once a month, then we should learn that we don't need to crawl every hour. And that's not a sign of quality or a sign of ranking or anything like that - it's really purely from a technical point of view: we've learned we can crawl this once a day or once a week, and that's okay.
D: Okay, my last and final question: suppose I have posted a topic on my blog and, due to some need, I have to re-publish that content on my blog again - possibly combining posts that I have already published, adding something new, and then republishing it.

A: I think that's usually fine. Our systems would recognize when there's significant duplication on a website, but for the most part those are normal things that happen on a website. For example, if you have a category page and a detail page for a blog post, then obviously the category page will contain some part of the blog post itself as well, which is duplication, but it's also fine.
D: Okay, pardon me, I have got another question. We often hear that Google likes fresh content. So if I posted some content sometime in late 2020, should I repost it or update it?

D: If I need to add something or update it, should I update it, or should I repost it?

A: I would just update it. I think that's perfectly fine.

D: Okay, thank you so much! That was all of my queries for today. Thanks, and nice to talk to you.

A: Cool. All right - Ahmed.
C: Hi John. My question is about hreflang link tags. Let's say I have one website in a specific language on its own domain, and it performs very well in that particular language, and then I decided to create an English version of that website on a new domain to target people globally. Should I add hreflang tags to connect these two separate domains, or just leave it alone and let Google figure it out itself? And can these hreflang tags impact my website's performance in a positive or negative way?
A: It's not necessary - I mean, it doesn't change the rankings, but it helps to make sure that your preferred version is shown to the user. It doesn't guarantee it, but it makes it a little bit easier for us to show the preferred language version. So if someone is searching in French, and we have your French and your English pages, we wouldn't accidentally show the English page to them.
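The annotation John describes can be sketched with a small helper that emits the hreflang link elements both domains would carry; the domain names below are placeholders for the two sites in the question:

```python
def hreflang_tags(versions):
    """Build the <link rel="alternate" hreflang="..."> block.

    Every page listed must carry the full set of annotations,
    including a reference to itself, for the links to be valid
    (hreflang annotations need to be reciprocal)."""
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in versions.items()
    )

# Hypothetical pair from the question: the original-language domain
# plus the new English/global domain, with x-default as the fallback.
versions = {
    "fr": "https://www.example.fr/",
    "en": "https://www.example.com/",
    "x-default": "https://www.example.com/",
}
tags = hreflang_tags(versions)
```

The same block would go into the head of both the French and the English homepage, adjusted per URL pair for inner pages.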
A: Cool. Let's see - I think, Vahid.

F: Hello John. Today I'm going to ask about page experience. We know that Google Search Console reports three factors for page experience: mobile-friendliness, HTTPS, and Core Web Vitals. So I want to ask whether there are more factors for page experience, or whether these three are all that you care about in this case.

A: I think the only one that is missing is intrusive interstitials, which we also have documented. I think at the moment we don't have a report specific to that, so it's not listed separately in Search Console.
F: I have another question: what data does Google Chrome collect from users for ranking? Can you please tell me?

A: I don't think we use anything from Google Chrome for ranking. The only thing that happens with Chrome is, for the page experience report, we use the Chrome User Experience Report data, which is the aggregated data of what users saw when they went to the website, with regards to the page experience specifically - and I think that's the only thing that we use from Chrome within ranking.

A: Well, I mean, we don't use that data. I think these metrics are sometimes useful for site owners to look at, but that doesn't mean that they're useful for Search to actually use.

F: Okay. Thank you, John, for your time.

A: Sure. All right, the next name I don't think I can pronounce either. Maybe if you can pronounce your name, feel free to get started.
G: Hi. My question is on international targeting. We have had a business set up for the last 10 to 15 years, and we have the .com for the United States. In the past two years we have started up for India, Australia, and Mexico: we purchased some ccTLDs and set up a different website for each country. But initially, what happened?

A: So especially if you have different country versions in the same language - English - then hreflang makes it a little bit easier for us to show the best matching version for that user. So if you're seeing the wrong version shown in the search results - like if someone is searching in Australia and they see the Indian version - then that's a pretty strong sign that hreflang would be useful. If they're mostly seeing the right version already in Search, then hreflang is probably not critical.

G: Yes, this is the scenario: our .com is ranking in the Indian region as well, but we want the .in website - that ccTLD - to rank there. So if we set up the hreflang tags right now for the .in website, do we get the same ranking for the .in website?
I: John, hi. I have just posted my question in the comment section as well, so let me tell you again - I have two questions. One is that we represent the company expertise.com, and we are a job portal for the international region.

I: As of yesterday, we received a reminder about the new job posting content policy guidelines, which mention the directApply markup that was added very recently, plus a few policy guidelines, and it's recommended that we review the updates and implement the needed changes on our site. Going through that, we hadn't added directApply, because most of the jobs come from our ATS partners and recruitment websites.

I: However, following that, we have now added the directApply property: for those which are not direct jobs we set it to false, and for direct jobs we set it to true. So in this regard we just want to make sure: is that email a matter of concern for us? Because we want to make sure that we comply with Google's guidelines. That's one of my questions.
A: ...an informational email, yeah. From what I know, usually those are situations where it's more that we want to inform you of this change in the possibilities that you have with regards to the markup. It would be different if there were explicitly a message referring to a manual action or something, because that would mean that the spam team or the policy team has taken a look and seen, oh, there's something really wrong here.
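The directApply flag the questioner describes is a boolean property on JobPosting structured data. A sketch of the two cases - direct jobs versus ATS-sourced listings - with all job details invented for illustration (a real listing needs the full set of properties the documentation requires):

```python
import json

def job_posting(title, direct):
    """Minimal JobPosting JSON-LD with directApply set per job source.

    Only a handful of JobPosting properties are shown here; this is a
    sketch of the shape, not a complete, policy-compliant listing."""
    return {
        "@context": "https://schema.org",
        "@type": "JobPosting",
        "title": title,                               # placeholder title
        "datePosted": "2022-01-07",
        "hiringOrganization": {"@type": "Organization", "name": "Example Corp"},
        "jobLocation": {
            "@type": "Place",
            "address": {"@type": "PostalAddress", "addressCountry": "US"},
        },
        # True: the application is completed on this page.
        # False: the user is sent on to an ATS/recruiter site.
        "directApply": direct,
    }

direct_job = job_posting("Backend Engineer", direct=True)
ats_job = job_posting("Data Analyst", direct=False)
payload = json.dumps([direct_job, ats_job])
```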
I: And one more question I have: we used to maintain a subfolder for each country - for example, a USA jobs folder. But then we thought the better idea was to go with a subdomain for each country.

I: So we moved from folders to subdomains, and when we did, we used 301 permanent redirects. But we've seen a big fall in traffic - we actually did this six months back, and since then we've been unable to regain the traffic we used to have on the folder version of the site. So what we did, as of now, is enable both pages: the folder page for the USA and the subdomain page, usa.expertini.

I: So should we now use the canonical link rather than the 301 we used, since that didn't work out for us? We thought the better idea was to use the canonical and keep maintaining both pages until Google fully understands that we want to prioritize the subdomains rather than the folder version.
A: You can do that - I don't see any downside to doing that. I think the main issue is that if you have those two versions, we will try to crawl both of them, and if it's a very large job site where you're really posting millions of jobs, then it might be that crawling is the limiting factor. So that's what I might watch out for: whether, on the crawling side, we're actually able to keep up.
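The cross-version canonical being proposed can be sketched as a one-line helper: both the folder URL and the subdomain URL would carry the same link element naming the preferred (subdomain) version. The host and path below are illustrative, not the site's real URLs:

```python
def canonical_tag(preferred_url):
    """Return the <link rel="canonical"> element that every duplicate
    version of the page should carry, naming the preferred URL."""
    return f'<link rel="canonical" href="{preferred_url}" />'

# Hypothetical preferred version: the country subdomain. The same tag
# is placed on the old folder URL and on the subdomain page itself.
tag = canonical_tag("https://usa.example.com/jobs/")
```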
J: All right, so happy new year first, and my question is about how to get another language version of a website into Top Stories.

J: We have a normal site, which is German, and another site - a section now - which is French, and we've published, I think, enough high-quality articles for almost half a year now, but we never made it into Top Stories, and we wonder what the problem is. I checked multiple things - for example, technically it's exactly the same; we implemented hreflang.

J: We also added it to Publisher Center with a French version, but we are kind of stuck, and we know that many competitors and other sites have the same problem with getting newly publishing websites into Top Stories. I wonder if you know something here: how long do we have to wait? How can we push it? Because we really think the content should be good enough to get at least one ranking in half a year.
A: No, I don't know. There's nothing technical from our side that really plays into that. When it comes to Top Stories, we don't require AMP pages nowadays, so I don't know if that's a factor. However, we do take into account things like the page experience metrics - things around Core Web Vitals and speed.

A: I suspect, if you're using the same platform, that you'll have similar metrics to your other website, but I don't know - that might be one thing to double-check. The other thing is that just because it's in a different language wouldn't usually mean that we would treat it differently. So if you have it in something like a subdirectory, then from our point of view that would probably be something that we just treat the same as the rest of your website.
J: Yeah, so all the metrics are basically the same: we have the same setup with AMP, we have the same page experience, and we see really good performance in Discover and also in normal search - also with AMP, sometimes, in normal search. But it's just about these Top Stories. And I read some rumors that it's harder to get in if you started a new platform after December 2019, because they might have changed something in Publisher Center. Okay, so you don't know?

A: I don't know. Yeah, thanks. Cool, let's see - Alex.
E: Hello John, and happy new year - Alex here, coming from Greece. About a year ago, Danny Sullivan posted a blog post on the Google Search blog about how autocomplete suggestions are generated, which was great: it offered a lot of information and insights about the predictions and how they work.

E: I was actually curious about the other, sort of related, feature on the SERP: the related searches at the very bottom of the SERP. I did some research and tried to find some similar information, just like what Danny Sullivan shared in that blog post.

E: I couldn't find anything, to be honest, and the only thing that I was able to find actually came from Google Trends, where they have a similar concept of related searches, and the way they define it there is that these are terms frequently searched together with the term you entered, in the same session. So the question is: does this also apply to classic Google Search? And if not, how are these related searches generated, and where do they come from?
A: I don't think we have any documentation on that specifically, so I don't have any kind of hot lead for you in that regard. Usually, things in other products are not necessarily the same across that product and Search. So it's possible that Google Trends uses something similar to what Google Search uses, but I would not assume that it's the same algorithm or the same setup - although, I mean, intuitively it sounds like it's...
A: Okay, let me run through some of the submitted questions, and I'll get to more of the people who've raised their hands later on. Let's see, the first one I have on my list - I guess that's similar to before: "Since the number of features is increasing in search results, I'm wondering if and how Google Search Console includes rankings for, for example, the Google map packs or People Also Ask in metrics like average position, clicks, etc. If not, what's the best way to see if my website is ranking in these kinds of different features?" So, for the most part, yes, we do include all of that in the performance report data in Search Console. Any time - or at least we try to do this any time - we show a URL from your website in the search results, we'll count that as an impression for that website.
A: For that query, the average position also comes into play there, and the average position is not the average position on a page, but the average top position. So if your website is visible in positions three, four, and five, for example, then we'll track three as the position for that individual query.
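The "average top position" rule can be made concrete with a small computation: per impression, only the topmost (numerically smallest) position the site occupied counts, and those top positions are then averaged. The numbers below are the ones from the example plus invented extras:

```python
def average_top_position(impressions):
    """Average the topmost position per impression.

    Each inner list holds every position the site's URLs occupied on
    one search results page; only the smallest (highest-ranked) one
    counts toward the average, mirroring the metric described above."""
    tops = [min(positions) for positions in impressions]
    return sum(tops) / len(tops)

# Visible at positions 3, 4 and 5 on one result page: counted as position 3.
single = average_top_position([[3, 4, 5]])
# Across three impressions with top positions 3, 1 and 2, the average is 2.0.
several = average_top_position([[3, 4, 5], [1, 8], [2]])
```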
A
So
all
of
that
kind
of
comes
into
there
already.
What
you
don't
see
is
for
a
lot
of
these
features,
a
breakdown
by
the
feature
type.
So
you
can't
go
in
and
say
like:
where
is
my
website
always
being
shown
within
google
business
profiles
or
within
the
map?
Searches?
We
don't
show
that,
but
we
do
count
that
as
an
impression
for
those
individual
queries.
A
But
all
of
that
should
go
come
into
play
when
we
launch
new
features
where
we
also
list
the
website.
We
do
try
to
watch
out
to
make
sure
that
we
also
include
that
in
search
console.
A: Let's see: "We see that every JavaScript string starting with a slash is interpreted as a URL and is followed by Googlebot. Sometimes the URL is not valid, and we see different crawl errors in Search Console. Is there an official recommendation on how to stop such URLs from being followed? We used to split the strings into two or more parts. Having millions of pages with such strings may negatively impact the crawl budget." So, regarding the last part of the question there, with regards to crawl budget: that's one thing you definitely don't have to worry about, because when it comes to crawling, we prioritize things in different ways, and all of these kinds of random URL discoveries that we come across - where your URL is mentioned in a text or in a JavaScript file somewhere - tend to be fairly low on the list.

A: So if we have anything important that we recognize on your website - any new pages that you link to, any new content that you've created - we'll prioritize that first, and then, if we have time, we'll also go through all of these random other URL mentions that we've discovered. So from a crawl budget point of view, this is usually a non-issue. If you're seeing that overall we're crawling too much of your website, then you can adjust the amount of crawling in Search Console with the crawl rate setting - and again, here we still prioritize things.

A: So if you set the setting to be fairly low, then we'll still try to focus on the important things first, and if we can cover the important things, then we'll try to go through the rest. So from that point of view, if you're really seeing that we're hitting your server too hard, you can just adjust that; after a day or two it should settle down at that new rate, and we should be able to keep on crawling. With regards to stopping these URLs from being followed:
A: What you can do, however, is put these URLs into a JavaScript file that is blocked by robots.txt. If the JavaScript file is blocked by robots.txt, then we won't be able to see the file, and we won't see those URLs. So if it's really a critical thing - if you're thinking, "oh, Googlebot is getting totally lost on my website" - then you could use robots.txt to block that JavaScript file. The important part there is to keep in mind that your site should still render normally with that file blocked.
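The robots.txt approach described here can be checked locally with Python's standard urllib.robotparser; the file name and URLs below are placeholders:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: block only the JavaScript file whose string
# literals keep being misread as URLs, leaving the pages themselves
# (which must still render fine without this file) fully crawlable.
robots_txt = """\
User-agent: *
Disallow: /assets/url-strings.js
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# The script file is disallowed; an ordinary page is not.
js_blocked = not rp.can_fetch("Googlebot", "https://example.com/assets/url-strings.js")
page_allowed = rp.can_fetch("Googlebot", "https://example.com/some-page")
```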
A: From my point of view, you can use either one, so I wouldn't necessarily just blindly shift from one to the other, but rather think about what makes sense for you in the long run. Sometimes tracking things is easier if everything is on the same domain, so you'd use subdirectories; sometimes, for legal reasons, you have to use different subdomains or different TLDs.
A: "For example, when I publish an article on my website, on every page where this article is mentioned, I'll just use rel=nofollow on the URL of that article." So, no: nofollow essentially tells us not to pass any PageRank to those pages, but it doesn't mean that we will never index that page. So if you really want a page to be blocked from indexing, make sure it has a noindex on it; don't rely on us not accidentally running across a random link to that page.

A: So I would not assume that those two are the same, and in particular with regards to new content on the web: I think Gary did a blog post, maybe a year or so ago, about rel=nofollow and the different types of other rel attributes, where he mentioned that we do sometimes use this for discovery of URLs as well.
A: "We published a landing page about a month ago and it hasn't been indexed yet. I tested with the live URL and requested indexing a few times. I understand indexing doesn't always happen quickly, but this is the first time a landing page on our site has not been indexed after a couple of days, so I'm wondering if there might be something I've missed. I also did all the checks I could find to make sure nothing was preventing indexing, including the Search Console tests - everything seemed fine."

A: It's really hard to say without knowing the individual URLs there. We don't index everything on the web, so it's completely common that, for most websites, we index some chunk of the website, but not absolutely everything on it - that might be what you're seeing there. With regards to the amount of content that we index from individual websites, sometimes that relies a little bit on our understanding of the quality of the website itself.

A: So if we think this is a really high-quality and important website, then maybe we'll go off and try to crawl and index that content as quickly as possible, but there's no guarantee there. So from that point of view, it's kind of tricky to see what exactly is happening here.

A: What I might do in a case like this is post in the help forum, just to make sure that there are no technical issues holding that URL back, and then otherwise give it a little bit more time - or see what you can do overall to improve the quality of the website in general, which is usually more of a long-term goal rather than something you can just quickly tweak and hope that Google will pick it up and tomorrow everything will be different.
A: For example - I don't know - if you're selling Christmas trees, then you probably expect those pages to be visible in the search results in December. So if you look in January or March at the traffic to your pages, you'd say, "oh well, these Christmas tree pages are terrible; I should just delete all my Christmas tree pages." But that's not really the right thing to do there.

A: These pages will be relevant at some point in the future, and similarly, other kinds of pages on your website might get very little traffic, but they might be really good pages - really important pieces of information on the web overall. So purely going in and saying "at this level of traffic, I will delete everything from my website" - I don't think that makes sense.

A: As a way of fine-tuning what you want to focus on, and then manually going through and saying "well, this is good, this is bad" - that's perfectly fine. But just blindly saying "everything below this traffic level is bad, and I'll delete it" - I don't think that makes sense, so I would not recommend doing that.
A: I don't know exactly what you're trying to do with the breadcrumb markup there, but it sounds like you're splitting up the breadcrumb pieces and marking some of them up with structured data and some of them with a direct link on the page, and usually that's fine. When it comes to structured data, our algorithms try to recognize what is visible on the page versus what you just have marked up for structured data, and we want to make sure that the structured data is actually visible on the page.

A: If some of these elements are linked on the page or not linked, that's usually fine. When it comes to breadcrumbs in particular, I don't think the webspam team would watch out for this and take any kind of manual action anyway, so that fear of being penalized for marking up breadcrumbs wrong - you can totally forget about that. That's not something I would worry about.

A: I would try to see how breadcrumbs work on your pages, how you can mark them up properly, test things out, see how they look in the search results - and if you're happy with how they look, that's perfectly fine. I would not worry about the webspam team taking action on a website for marking up breadcrumbs incorrectly, because there are just so many bigger problems to worry about on the web.
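For reference, a minimal BreadcrumbList of the kind being discussed, where every item in the markup corresponds to a crumb that is actually visible on the page; the names and URLs below are made up:

```python
import json

def breadcrumb_jsonld(crumbs):
    """Build BreadcrumbList JSON-LD from (name, url) pairs in the
    order they appear on the page; positions are 1-based."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": [
            {"@type": "ListItem", "position": i, "name": name, "item": url}
            for i, (name, url) in enumerate(crumbs, start=1)
        ],
    })

# Hypothetical three-level trail: Home > Books > Example Book.
trail = breadcrumb_jsonld([
    ("Home", "https://example.com/"),
    ("Books", "https://example.com/books/"),
    ("Example Book", "https://example.com/books/example-book"),
])
```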
A: "With the recent update giving a boost to extensive reviews, won't this make it difficult for new e-commerce sites to become competitive? Also, does Google have a preference for native platform review functionality, or for platforms like Yotpo?" - which I don't know. So, won't this make it difficult for new e-commerce sites?

A: I think things are always changing on the web, so it's really hard to say whether this will make it harder or easier for sites to be competitive. Users expect different things over time, there are different kinds of sites available over time, and this is always a very dynamic relationship, I think. And it's not so much that we're trying to make it harder for new websites to be visible in Search, but rather that we see user expectations going this way, so we need to make sure that we align with what users want, and that can sometimes drive what we show in the search results.

A: With regards to new e-commerce sites and being competitive: I think, as a new site, you always need to look at the bigger picture and see where you can fit in - where you can provide something that is both valued by users and perhaps missing from other websites anyway. So from that point of view, it's always going to be hard, especially if you're starting out fresh; you have to look at so many different aspects, and how things are shown in Search is one aspect there, but there are lots of different things that can come into play.

A: With regards to native platform review functionality, I honestly don't know - I don't think we would have any preference for particular providers or particular setups. If something is visible on your website in a way that users can see it and search engines can see it, then usually that's what we're looking for, and whether that's handled in the background by some third party, handled by yourself, or even maybe hand-coded, that doesn't really bother us at all. We look at the pages as they end up on the web, and we try to index them like that.
A: "Every article on my site has a specific phrase at the beginning. Through a Google search for this exact phrase, I found 3,000 doorway sites that stole my content, and within six months I had them all removed from Google's index through the DMCA. I did a great job, but it had no effect on my positions."

A: So, as far as I know, there is no aspect of our algorithms that says: oh, this is something that is very unique to this one website, and because there's something very unique here, we'll rank it higher for all kinds of other queries. So if you're selling - I don't know - a unique type of shoes, and someone is searching for shoes, then it's not that we would rank your site because it's a unique type of shoes, but rather: you have shoes, this person is looking for shoes, maybe other sites also have shoes, and we'll rank them based on the shoe content that we find there. So it's not a matter of us going through and saying, "well, there's something very unique here; therefore, we should rank it higher for this more generic term." Obviously, if you have something unique and someone is searching for that unique thing, then we will try to show your site there - that's kind of the reasoning.

A: But for the generic case, where someone is searching for something generic and you have unique things that also map into that generic category, I don't think we would rank your pages higher just because they're unique things. So from that point of view, if you're seeing that other sites are ranking for your specific thing - that unique thing that you have on your website - and you have a copyright on your content and whatever else aligns, then you can use the DMCA process for that.

A: That's a perfectly fine tool to try to help clean that up. But it's not the case that we will rank your website higher just because we've seen some unique things on your website. Next question: in the case of a software brand using companies like Medium for branded landing pages, such as brand.medium.com -
A
Should
we
be
worried
about
possible
duplicate
content
if
the
medium
domain
has
more
strength
for
a
particular
subject
or
topic,
and
the
page
lists
exact
duplicates
to
the
category
and
blog
type
content
on
the
brand
site?
Is
it
possible
that
google
would
canonicalize
to
the
second
domain
if
we
differentiate
the
content?
Is
that
enough?
Or
will
these
types
of
pages
always
be
boosted
over
bran
for
brand
results
thanks
to
the
apparent
domain
strength?
A
So when it comes to canonicalization, we use various factors to try to figure out which of these is the canonical we should show, and whether or not we should even follow a rel canonical between these pages. My understanding is that, on this platform in particular, you can set a rel canonical to your pages there.
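As a sketch of the setup John describes, the Medium-hosted copy can point back at the brand site's original with a rel canonical link element. The URLs here are hypothetical placeholders, not from the question:

```html
<!-- On the Medium-hosted copy (e.g. a post under brand.medium.com). -->
<!-- Hypothetical URLs: signals the brand-site page as the original version. -->
<link rel="canonical" href="https://www.example-brand.com/blog/post" />
```

As John notes, this is one signal among several, not a guarantee of which version gets shown.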
A
So, on the one hand, you're giving us some signals that this is the right page to show, like your original page; but on the other hand, we might be finding other signals as well pointing at the other page, kind of the medium-domain landing page that you have there. So from that point of view, it's never guaranteed that we will only show your pages or only show the other pages, and it's definitely not the case that, just because you're also hosting it on this other provider's site, their pages will always be shown.
A
So it's something where all of that kind of comes together, and it's possible that we'll show one or the other. From a practical point of view, usually this doesn't matter so much, because it's your content that's being shown, and it's not that your content would be ranked lower because of these different versions or the canonical.
A
So from that point of view, it's not that you will be penalized for having duplicate content, or that it's a bad thing to host things like this. It's essentially just not guaranteed that it will always fall one way or the other. Okay, wow, running low on time already. Let's see, maybe we'll go through some more of the hands that were raised. I think, Christian, you're next on my list.
K
Hi John, happy new year. So I also posted a question in the forum.
K
Okay, but then, yeah, so I posted it five hours ago, and the example is: we generated, like, a page which was linked from famous German newspapers and everything, and people like this content. And other competitors, they are just, like, linking with anchor text of the phone number, and somewhere in the text there's this number, but all this stuff seems to be more relevant for Google for this keyword.
K
To
as
a
publisher,
how
to
handle
business
problem.
A
I
don't
know
it,
I
don't
quite
understand
the
situation,
so
I
I
can't
load
the
questions
for
some
or
some
reason.
The
youtube
page
isn't
loading.
So
I
I
don't
know,
can
you
explain
it
a
little
bit
more?
It's
like
okay,.
K
The question is in the forum, yeah. So I can tell you a little bit more, for a better understanding — sorry about that. So it's like this: for a special search request, we are ranking. Our website is making phone numbers, and there's, like, a special phone number.
K
They
are
like
competitors
and
they
are
using
spam
techniques
like
like,
if
you
just
post
on
that
pages
on
your
university
pages,
like
with
an
anchor
link
exactly
to
this
keyword-
and
it
looks
like
that
since
december
2020,
google,
like
prefers
that
of
pages,
and
we
have
like
we
are
like
we
have
like
the
same
keyword,
but
not
exactly
because
natural
links
never
have
like
exactly
this
keyword
linking
to
this
page.
So
so
we
have
like
articles
doing
a
lot
of
insights.
A
Now, I don't know — I'd have to take a look at the examples. If you want to, maybe drop some example URLs or some queries in the chat, then I can pick that up afterwards and take a look.
L
Hey John. So we have a bunch of pages that are dedicated to sports teams that offer, you know, their upcoming schedules. So they're, like, lists of pages that show upcoming games — like, team A will play team B on January 12th, you know, coming up next week. But what I'm noticing is Google actually putting the date in the search results; for example, it's putting, like, January 12, 2011 or 2012, I'm assuming because that date is kind of in the future.
L
Google
might
just
be
pulling
a
date
that
from
a
prior
year,
ultimately,
we
don't
want
date
snippets
to
show
up
in
the
search
results.
I've
noticed
this
probably
like
a
month
ago.
This
started
to
happen
and
clicks
have
starting
to
drop
on
those
pages,
assuming
that
users
might
think
that
the
content
is
severely
outdated.
They
see
a
date
from
like
2013
or
2014.,
so
just
interested
overall
to
hear
your
thoughts
on
how
to
remove
them.
L
I
know
the
data,
the
data
no
snippet
tag,
if
we
just
you
know,
wrap
the
the
dates
around
the
or
wrap
the
no
snippet
around
the
dates,
because
you
know.
Ultimately,
we
don't
want
to
remove
the
dates
from
the
pages
itself,
so
just
interested
to
see
or
to
hear
your
thoughts
if
there's
anything
else
other
than
the
no
snippet
tag,
it's
not
showing
up
for
every
sports
page
that
we
have
probably
like
half
they're
just
interested
to
see.
If
there's
any
other
way
that
we
can
help
avoid
this
problem.
A
I think it is to let us know of either the date when you publish something — to kind of put a publish date on these pages, so that we can pick up that publish date and use that if we wanted to — or at least to notify us of the actual date of the event, so that if we do choose that event date, we'll know, oh, this is, I don't know, next February or next month, kind of thing, and we could show that in the search results. So that might also be an option.
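One way to make the event date explicit to search engines is event structured data on schema.org. This is a minimal JSON-LD sketch — the team names, venue, and date are hypothetical, and this is not necessarily what the site in question uses:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "SportsEvent",
  "name": "Team A vs. Team B",
  "startDate": "2022-01-12T19:00:00-05:00",
  "location": {
    "@type": "Place",
    "name": "Example Arena"
  }
}
</script>
```

Giving the full `startDate`, including the year, is the point here: it removes the ambiguity John describes, where only a day and month are clearly recognizable on the page.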
A
My
guess
is:
if
we're
picking
up
something
like
a
wrong
year,
then
in
some
places
on
your
pages,
you
have
the
right
kind
of
like
day
and
month,
but
there's
a
number
kind
of
like
close
by
where
we
think.
Oh,
this
is
probably
the
year,
and
you
could
also
try
to
look
at
these
pages
and
see
like.
Is
there
a
way
to
make
sure
that
you
don't
have
other
numbers
close
by,
so
that
we
can
be
a
little
bit
easier
with
regards
to
recognizing
things?
A
So
I
I
guess
those
are
kind
of
three
aspects
there.
On
the
one
hand,
the
data
and
no
snippet,
like
you,
mentioned
to
kind
of
hide
these
dates
from
the
snippet,
making
sure
that
either
the
the
publishing
date
or
the
event
date
is,
is
very
kind
of
easily
recognizable
for
us
also
with
the
right
year
and
perhaps
trying
to
make
sure
that
it's
harder
to
recognize
like
a
random
number
on
the
page
as
being
a
part
of
that
date,.
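The data-nosnippet attribute the questioner mentions is applied to an HTML element wrapping the text to keep out of snippets. A minimal sketch, with a hypothetical date and sentence:

```html
<!-- The wrapped date stays visible on the page but is excluded
     from the text Google may use for the search-result snippet. -->
<p>Next game: <span data-nosnippet>January 12</span> at home.</p>
```

Google documents data-nosnippet as valid on span, div, and section elements, so the dates themselves don't have to be removed from the page.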
L
Cool, thank you.
A
Cool. Okay, let me pause the recording here, and we can see if we still have more questions coming up. But thank you all for joining in, and thanks for submitting all the questions. Sorry, I can't see any of the remaining questions on YouTube, so if there are questions there that we missed, feel free to add them to the next one. I'll probably do another one of these next week, and usually I add them, like, a couple days ahead of time, so feel free to copy your questions over.