From YouTube: English Google SEO office-hours from September 17, 2021
Description
This is a recording of the Google SEO office-hours hangout from September 17, 2021. These sessions are open to anything search & website-owner related like crawling, indexing, mobile sites, internationalization, duplicate content, Sitemaps, Search Console, pagination, multi-lingual/multi-regional sites, etc.
Find out more at https://goo.gle/seo-oh-en
Feel free to join us - we welcome folks of all levels!
A: We have a handful of questions submitted already on YouTube. We can go through some of those, but we can also do some live ones to get started, and it looks like Christian is the first one with his hand raised, so...
B: Good morning John, morning everybody. Great to have this opportunity to ask a question here. I already sent it to you, and I thought I'd ask it again here in the webmaster hangout, because I think it may be interesting for other people. Well, so it's about an e-commerce website.
B: We have a category page with products and a huge advisory text, and the ranking for one of the most important keywords dropped after stable rankings for about a year. The interesting thing is that the ranking dropped only for the keyword in its plural form, not in its singular form, and the overall visibility of the website has even slightly increased. So it's just this keyword. And you already mentioned this could result from Google not being able to differentiate between transactional and informational content.
B: But what would interest me is: why is this only with the keyword in singular, not in plural? And do you have any advice for this?
A: It could happen that we switch it back to something where we say: oh, actually, this is more informational. Or maybe it's also seasonal, where we would say, well, at this time it's an informational query, but maybe in December, if you're searching for Christmas trees and you want to buy one, you have more of a transactional intent. And if you have one page that covers both of these intents, on the one hand that's kind of good, because you're trying to cover both sides.
B: So could it be a solution to have some kind of compressed information on the category page, like a list of bullet points with the most important advice, and to move the rest to another page which it links to?
A: Sure, I think that's good. I mean, some people have completely separate pages; other sites have something like a blog set up, where they put this informational content in the blog and then link to the category pages or individual products. However you want to combine that is, I think, generally fine.
B: Okay, and this can be different for different keywords? We have other category pages where we have the same setup, and there we don't have this problem at all. So can it be that this problem arises only for special kinds of keywords, or...?
C: Hi, can you hear me? A similar question to Christian's. I read a case study about the fluffy descriptions on the category pages of e-commerce websites. As you suggested, we shouldn't manipulate the system about the intent of the page, but one of the competitors in the market uses this same strategy with keyword stuffing, and the indexed pages are the same: they include the same products, just with different texts, but all of them are in the search results. In that case, how should we compete with them?
C: How can we compete with them? For example, we wrote 500 words for those pages, but, as I read these case studies, I decided that we should use them in blog posts and create a link to the category pages. But does that guarantee the result in that case?
A: Nothing is guaranteed, so I think that's kind of the basis. It is something where, like I mentioned with Christian, it's sometimes tricky for our systems to figure out what the intent of the page is. In some cases it's a lot easier; in some cases it's a little bit harder.
A: So having some amount of information on a category page is good, but having clear, separate pages for the different intents, I think, makes it a lot easier for our systems, and it makes it a lot easier for users as well. If they want information, then they land on an informational page and they don't have to think: oh, is this a store, or is this, I don't know, something else? And if, after going through the informational content, they feel they would like to buy something...
C: How do you mean? For example, I see that the page is informational, but I still engage with products and buy something, and Google has the data that people engage with these pages. In that case, I think that the page is worthy to rank at the top, right?
A: I think there is no limit, so I'm just not going to say yes or no. I think it's something where, especially with category pages, you need to have some information on the page so that we understand what the topic is. But that's generally very little information, and in many cases we understand that from the products that you have listed anyway, if the names of the products are clear enough for us to understand.

Oh, this is, I don't know, running shoes, or something like Nike running shoes, Adidas running shoes, different brands, whatever. Then it's clear that this is a list of running shoes; you don't need to put extra text there. But sometimes product names are a little bit hard to understand, and then it might make sense to add some additional text to give us some context there. But that's usually additional context on the order of, I don't know, maybe one, two, or three sentences.
A: Cool, okay. Gaston?
D: Yeah, hi. So I have two questions, but I'm only going to ask one, and maybe the other one if there's a little opportunity later. So I'm working on a live website with a couple of million pages. We did a redirection between two Spanish languages, one into the other, redirecting everything from one Spanish version to the other. So the migration is going well; the traffic is moving from the second path. Just to clarify, it's a redirection between two subfolders; everything is within the same domain.
D: The thing that I want to raise here is that we are not seeing any change in the coverage metrics after almost three weeks of having done that redirection. We are not seeing any change at all in the coverage metrics: no drop in valid pages, no increase in excluded pages because of redirections, or anything like that.
A: So if the content in these folders that you're merging is content that is very rarely crawled, then it's going to take a long time for that to be reflected. Whereas if it's content that is very actively being crawled, then you should see changes there within, usually, I don't know, a week or so. So those are kind of the aspects that I might take into account there.
D: Oh yeah, I'm aware of that, but my thing is: the traffic is going from one part of the site to the other, everything's going well, and we are also seeing in the log files that Google is crawling the redirection. But we are not seeing it in the coverage metrics, so we are concerned that it might be either a reporting bug or that we should just keep waiting. I don't know, are the coverage metrics updated once a month, possibly? No?
A: I think the index coverage report is updated maybe twice a week; I'm not 100% sure, but a lot of the reports in Search Console are updated once or twice a week. So that might be something where it's just a delay. But if you're seeing the traffic going to the right pages, if you're looking at the performance report and you're seeing that kind of a shift, then I think that's perfectly fine.
A: I don't think you need to watch out for the index coverage report for something like this, because usually what happens when you merge pages like this is that our systems first have to figure out that we need to find a new canonical for these pages. So you take two pages and you fold them into one; then we see this is a set of pages, we have to pick a canonical, and that process takes a little bit of time, and then it takes a bit of time for that to be reflected in the reporting.
A: Whereas if you do something like a clean site move, then we can just transfer everything; we don't have this canonicalization process that we have to figure out. So I could imagine that it just takes longer to be visible there.
E: Hey John, hi. Just a couple of questions. You've said quite a few times that you can't just work on a page or a set of pages; that won't necessarily be enough to get them ranking, or ranking well, if the rest of the website isn't doing well.
E: So my question is around, I suppose, specifically the algorithms with new pages, you know, much younger pages on the website. When Google comes along and crawls them and looks to understand them, does it then compare those pages to older, legacy pages on the site and say: okay, well, these pages are great, but these much older pages are actually rubbish, so that will then affect the quality of, you know, the newer pages and the category pages? Is that something that the algorithms do?
A: So if you add, I don't know, five pages to a website that has 10,000 pages already, then we're going to probably focus on most of the site first, and then over time we'll see how that settles down with the new content there as well. So I guess that kind of goes in the direction that you're asking.
E: Okay, cool, so it definitely... yeah, okay, it makes sense. My other question is around, I guess, backlinks. So I guess Google isn't, you know, if you get a backlink, just going to assume that it is a good-quality backlink and therefore blindly pass link equity to the site. So when Google is crawling these backlinks, does it either look at the referral traffic and play that into the algorithm, or, if it doesn't see that information, does it...
E: ...try to, I guess, assess whether there's, you know, a high propensity to click on that link? So if there is a high propensity to click on that link, then it will pass link equity, whereas if there isn't, then, say, you could literally create a blog and link out, and in that situation, in that case, Google would say: well, actually, there's no traffic, there's not really a lot going on here.
E: So why should we pass any form of link equity? So, in terms of that, does that kind of feed into whether or not link equity does get passed on to a site?
A: But if someone is referring to your site and saying, like, I'm doing this because this expert here said to do that, then people are not going to click on that link and always, like, look at your site and confirm whatever is written there. They'll see it as almost like a reference: if they needed to find out more information, they could go there, but they don't need to.
E: Yeah, so is it more on-page factors then? When the algorithms are reviewing that link and whether or not equity should be passed on, are they assessing that website as a whole and then saying: yes or no, we should pass it on, or actually we don't trust the site, so we won't?
A: A little bit. I mean, some of that you will see if you look into how PageRank works. It's a bit different nowadays, I think, how we handle that, but essentially the idea behind it, or in general with PageRank, is that you set up some value for the individual pages, and then you pass a fraction of that value on through the links there.
A: So essentially, if a page is seen as being very high value, then the links from there will be treated with a little bit more weight compared to some random page on the internet. And, as far as I know, we don't do it exactly the same way as things were in the beginning with regards to PageRank.
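The value-passing idea described here can be sketched as a classic PageRank iteration. This is only an illustrative sketch of the original algorithm, not how Google handles it today (as noted above, the handling has changed); the four-page graph, the page names, and the damping factor are all assumptions.

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # every page starts with equal value
    for _ in range(iterations):
        # Each page keeps a small base value, then receives shares from inlinks.
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if not outgoing:
                # A page with no links spreads its value over all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                # Pass a fraction of this page's value through each link.
                share = damping * rank[page] / len(outgoing)
                for target in outgoing:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical four-page site: "home" is linked from everywhere,
# "new-page" is linked from nowhere.
graph = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
    "new-page": ["home"],
}
ranks = pagerank(graph)
```

Links from the highly linked "home" page end up carrying more weight than links from "new-page", which is the intuition in the answer above.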
A: Sure, cool. Okay, let me go through some of the submitted questions, and then I'll get back to the folks that have their hands raised here as well. Don't worry, we'll find time.
A: "We added the new property, all of these things, but we can't get the change of address to work. It has a validation error saying that there's a 301 redirect where it can't fetch the old domain. How do we pass the validation for the change of address tool?"
A: So I think, first of all, the most important thing to keep in mind is that the change of address tool is just one extra signal that we use with regards to migrations; it's not a requirement. So if, for whatever reason, you can't get the change of address tool to work for your website, if you have the redirects set up properly and all of those things, then you should be set.
A: It's not something that you absolutely need to do. I imagine most sites, when they do these kinds of moves, don't actually use this tool; it's more those who know about Search Console and all of these kinds of fancy things who might be using it. With regards to why it might be failing, it's really hard to say without knowing your site's name or the URLs that you're testing there.
A: One thing that I have seen is, if you do something like, if you have a www version and a non-www version of your website, and you redirect step by step through that. So, for example, you redirect to the non-www version first, and then you redirect to the new domain. That can throw us off.
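One way to catch this kind of step-by-step redirecting is to trace each old URL's full hop sequence and flag anything longer than a single hop. A minimal sketch, assuming the redirect targets have already been collected (for example from your server config or from HTTP responses); the hostnames here are hypothetical.

```python
def redirect_chain(redirects, url, max_hops=10):
    """Follow a URL through a redirect map and return the list of hops."""
    chain = [url]
    while url in redirects and len(chain) <= max_hops:
        url = redirects[url]
        if url in chain:  # guard against redirect loops
            break
        chain.append(url)
    return chain

# Hypothetical mapping: www redirects to non-www, which redirects
# to the new domain, i.e. the step-by-step case described above.
redirects = {
    "http://www.old-site.example/": "http://old-site.example/",
    "http://old-site.example/": "https://new-site.example/",
}

chain = redirect_chain(redirects, "http://www.old-site.example/")
hops = len(chain) - 1  # more than 1 hop: consider redirecting directly
```

A chain with more than one hop is a candidate for pointing the start URL straight at the final destination.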
A: ...if you submit the change of address on the version of the site that is not the primary version. So that might be one thing to double-check with regards to the change of address tool: are you submitting the version that is, or was, currently indexed, or are you submitting maybe the alternate version in Search Console? But otherwise I'm not aware of any general issues with that tool.
A: "Can I add multiple schema types to one page? If yes, what would be the best way to combine both FAQ schema and recipe schema on one page?" You can put as much structured data on your page as you want, but in most cases, when it comes to the rich results that we show in search results, we tend to pick just one kind of structured data, or one rich result type, and focus on that.
A: So if you have multiple types of structured data on your page, then there's a very high chance that we just pick one of these types and show that. So if you want any particular type to be shown in the search results, and you see that there are no combined uses when you look at the search results otherwise, then I would try to focus on the type that you want and not just combine it with other things.
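For reference, both types can technically live on the same page as separate JSON-LD blocks; the point above is just that Google will usually surface only one rich result type. A sketch with made-up FAQ and recipe values, serialized the way each block would appear in its own `<script type="application/ld+json">` tag:

```python
import json

# Hypothetical page content; FAQPage and Recipe are both schema.org types.
faq = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [{
        "@type": "Question",
        "name": "Can I freeze this soup?",
        "acceptedAnswer": {"@type": "Answer", "text": "Yes, for up to three months."},
    }],
}
recipe = {
    "@context": "https://schema.org",
    "@type": "Recipe",
    "name": "Simple Tomato Soup",
    "recipeIngredient": ["4 tomatoes", "1 onion"],
}

# One JSON-LD block per type; both can be embedded in the same HTML page,
# but only one rich result type is likely to be shown.
json_ld_blocks = [json.dumps(block, indent=2) for block in (faq, recipe)]
```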
A: So, ideally, pick one that you really want and focus on that. "Our Search Console crawl stats report is showing a steady increase of 404 pages that are not a part of our site. They don't exist in our sitemap, nor are they generated by internal search."
A: So I think, first of all, we don't make up URLs. It's not that we would take Google searches and then make up URLs on your website.
A: This is not something that you have to take care of. If they're 404, if they don't exist, then that's fine; that's the way it should be. Usually what happens with these kinds of links is that we try to figure out, overall for your website, which URLs we need to be crawling, and at which frequency, and then we take into account, after we've worked out what we absolutely need to do, what we can do additionally. And that additional bucket is also, I think, like a graded set of URLs.
A: So from that point of view, it's not that these 404s would be causing issues with the crawling of your website. It's almost more a sign that, well, we have enough capacity for your website, and if you happened to have more content actually linked within your website, we would probably crawl and index that too.
A: So essentially, it's almost like a good sign, and you definitely don't need to block these by robots.txt. It's not something that you need to suppress; it's just super common on the web that sites link in a random way, or maybe they have broken HTML, and then we end up discovering a bunch of URLs like this.
A: "We have a service website operating in France only, and we have a lot of traffic coming from other countries which have really bad bandwidth, which causes our Core Web Vitals scores to go down. Since we're not operating outside of France, we don't have any use for traffic outside of it. Is it recommended to block traffic from other countries?" I would try to avoid blocking traffic from other countries. I think it's something where, ultimately, it's up to you; it's your website.
F: Can I add something on this?

A: Sure.

F: I have a similar question, actually. We have a live TV page for our news channel, which we are not providing in the U.S., but, as you said, the site is crawled from the U.S. only. So we show the information that this live TV is not available in the U.S., the U.S. crawler crawled that, and now in the Google results, if I type the query into Google, it says: hey, this live TV is not available in your location. Even if I search from India as well. That's why...
F: So, in that case, can we use cloaking techniques for our live page? So we show the India version of the live TV to the Google crawler, but the user will see the "live TV not available" text page, so that the issue is fixed automatically. Just to avoid this issue.
A: So that's kind of, I think, the baseline situation there. There is, I think, now that you mention it, especially with regards to news sites, one way that might work, I don't know: the use of the paywall structured data.
A: Yeah, if you can't provide it even behind the paywall, then you wouldn't be able to provide it to Google. I think it's kind of an awkward situation, and, I think, a little bit unfair towards people who are not in the U.S., or not in the location where sites are being crawled from, because the other way around is, of course, possible: if you have a website that is based in the U.S. and you block India, then that wouldn't be affecting the search results, because we crawl from the U.S.
A: Yes, so I think it's a little bit unfair, but from a technical point of view, it's just how we have to deal with crawling. We can't crawl from all countries, and the policies, at least at the moment, are such that you should treat Googlebot the same as other users from that country. It might be that at some point we change those policies, but they've been like that for a very long time.
A: So another approach that I've seen for other kinds of content is to provide some level of information that you can provide in the U.S. For example, I think I've seen this with maybe casino websites, I'm not sure, where sometimes the content is illegal in the U.S.: they have kind of a simplified version of the content which they can provide to the U.S., which is more like descriptive information about the content.
F: So earlier, I think it was maybe Gary or someone else from Google who said that in some conditions you can use cloaking. I don't know which condition that is, but I think my condition is the same, so we can use that, okay?
A: Our policies are pretty clear in that regard: it should really be what users see in that country. And, I mean, there's an aspect of: you might be willing to take a risk and say, well, I assume the web spam team will not take action on my website, because I'm a legitimate business, and if they check, they will see what I'm doing. But that's something where you're essentially breaking the guidelines and hoping that it comes out well, and for normal businesses...
A: So it's kind of tricky, I think. I think this question also goes into the Core Web Vitals aspect, where maybe you want to block countries where connectivity is slow. From that point of view, I mean, in this case blocking other countries would be a problem with regards to search, first of all; and with regards to Core Web Vitals, it's more something like: well, if you want to block other countries, you can do that.
A: "This has brought the average LCP for these pages up to 3.4 seconds, despite the fact that our product pages were averaging 2.5 seconds before the regrouping. We're working to get these pages to 2.5 seconds, below the threshold, but our tactics now seem too insignificant to get us to the score we need to hit."
A: So that might be something where you're seeing these kinds of differences. If a lot of people are going to your home page and not so many to individual products, then it might be that that home page is weighted a little bit higher, just because we have more data there.
A: I would tend to look at things like your Google Analytics, or whatever other analytics you have, to figure out which pages or which page types are getting a lot of traffic, and then, by optimizing those pages, you're essentially trying to improve the user experience for the users on your website. And that's something that we would try to pick up for the Core Web Vitals scoring: so essentially less averaging across the number of pages, and more averaging across the traffic, across what people actually see when they come to your website.
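The difference between averaging across pages and averaging across traffic can be sketched with hypothetical numbers (the page groups, LCP values, and traffic shares below are all made up):

```python
# Hypothetical page groups: (LCP in seconds, share of total page views).
pages = {
    "home":     (2.0, 0.70),  # fast, and gets most of the traffic
    "category": (3.4, 0.20),
    "product":  (3.8, 0.10),  # slow, but rarely seen
}

# Naive average: every page group counts equally.
per_page_avg = sum(lcp for lcp, _ in pages.values()) / len(pages)

# Traffic-weighted average: each group counts by how often users see it.
traffic_weighted = sum(lcp * share for lcp, share in pages.values())

# per_page_avg is about 3.07 s; traffic_weighted is 2.46 s. Weighting by
# what users actually experience rewards optimizing high-traffic pages first.
```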
A: I think that also goes into the second question: "What logic does Google use to group these pages?" Essentially, first of all, we have to have data, and if we don't have field data that we can use there, then we can't group pages in a more fine-grained way. "We're having multiple issues on our site where Search Console says we're ranking for keywords, but when we check the rankings, we don't rank at all. What might be the cause of this, and which is right?" So: Search Console does not invent rankings.
A: If it's consistently at that level, then it feels like there's a chance that, if I search with those settings, I could see that as well. It's not always guaranteed, but there's at least a small chance there.
A: In particular, when you're looking at the normal web search results in the performance report, we also include things like the images one-box, or the knowledge panel on the side, and the maps one-box, for example, if that's visible somewhere. All of these things can count as impressions as well.
A: So you really need to take a look at the whole page, and especially with the images one-box, it's sometimes a little bit surprising when you finally find the link to your site in there. But from our point of view, that's also an impression on the search results page.
A: The other thing that can happen is that sometimes the search results are very personalized, and it can happen that, for a query that usually has a fairly high search volume, maybe a really small portion of those people see something that would show your website. That's usually a little bit harder to figure out, but you can kind of guess at it if you look at the queries and you think: oh, this is probably a very popular query, but my website is getting very few impressions for it.
A: Let's see. "With the advent of the MUM algorithm, which is kind of a machine-learning algorithm that we use in Search now, will search results be a response to multiple sources?"
A: For that topic: I don't think that applies to most queries, but sometimes that's something that we do try to take into account with all of these machine-learning algorithms.
A: I think it's always a little bit tricky, because we tend to, I don't know, present them more from the user side in our presentations, and then turning that around and saying, well, how can I optimize for this machine-learning algorithm, is always a little bit trickier. At the end of September, we have another Search On event lined up, and they also mentioned something in the announcement around machine learning and artificial intelligence.
A: I could imagine that there might be something similar to MUM being mentioned there as well. I don't have any insight, but it always feels like that's a time where we kind of show off the new research and new technologies that are being worked on. But it's very easy to take those announcements that are made for the user side and try to turn them around and say: oh, as an SEO, that means I have to do things like this. And often that kind of inverse relationship is not so straightforward.
A: "How does Google treat subdomains versus folders? Is authority passed through the subdomains?" Oh my gosh, I don't know if someone is trying to troll us again. From our point of view, when we talk with the search quality team, they say subdomains and subdirectories are essentially equivalent.
A
You
can
put
your
content,
however,
you
want
some
people
in
the
seo.
World
have
very
strong
opinions,
and
if
you,
if
we
say
it
doesn't
matter
and
someone
else
says
you
should
do
it
like
this-
and
you
want
to
follow
their
advice,
then
it's
like
go
ahead
and
do
that
because,
like
from
our
point
of
view
like
we,
we
say
it
doesn't
matter.
A: I think there are a few aspects that play a role there which are less around SEO and more around things like reporting: how easy is it to set these sites up? Do you want to track their performance separately, on separate host names, or together in one host name? Those kinds of things, with some kinds of websites.
A: So those things can all play a role there. In practice, what I would say is: try to focus on your infrastructure first, see what makes sense for you, and then work on that. And if you're working together with an SEO who has a very strong opinion, and everyone else is like, whatever you want, then I don't mind if you follow their opinion either, because that should work too.
A: Let's see. "When doing a site migration, on the day we pull the trigger, there's a robots.txt lock on both domains."
A: So from our point of view, these are all separate situations, and it's not the case that we would say, well, this is a site move with this variation. Rather, if you're blocking things, if things are broken, then we would first of all see that as something that is broken, and if at a later stage we see that they're actually redirects, then we would say, well, now the site is redirecting, and we would treat those as separate states.
A: So from that point of view, I would treat these as separate states and, of course, try to fix any broken state as quickly as you can, and essentially move on to the migration as quickly as you can after that. Oh wow, still so many more questions. Okay, maybe I'll switch to a few more live questions here. I saw some in the chat as well; maybe I'll just grab those quickly, and I have a bit more time afterwards as well, so we can go through more things live too.
A: "Gambling website promotion policy for publishers." I don't know what the policies are from the publisher side, if you're talking about the ad side of things; from the search side of things, SEO-wise, publish whatever you want. "Would you mind taking a look at my website at the end?" Okay, that sounds like something for the end. Okay, and let's see, how do I find the raised hands again? Oh my gosh. All right, this...
G: Does it do something to the quality of the site overall, or to crawl budget? It's a site with, like, three million pages on it. So does it do something to that?
A: I don't think you would get a lot of value out of removing just old news, so it's also not something I would recommend to news websites, because sometimes the old information is still useful. So from that point of view, I wouldn't do this for SEO reasons.
G: Okay, clear, thanks. And another question: I was just really curious, it's about Googlebot. I was working on my personal site a few months ago, and the very first thing I did as an SEO was verify Google Search Console and Google Analytics and all that stuff. So I was developing for a few months, and when I was ready to go live, I did a crawl request for the home page in Search Console.
G: I saw that Googlebot had already tried to crawl pages, and I was wondering: how does that happen, and why does it happen? Because it's a new domain; there are no links to it. I was just wondering how Googlebot still finds this domain and tries to crawl it.
A: I think it's something that we do when you verify a website in Search Console, where we kind of say, well, it looks like you're trying to get this website into Search; we don't have any information on it; we will take a look. And I don't think it's something that we do for policy reasons or anything; it's basically just: oh, it looks like you're trying to go into Search, we will help you, so that you don't have to do every step yourself.
A: All right, thanks. Gaston?
D: So, I don't know how to word this question. The thing is, we have a sitemap with, let's say, the 200 most important pages for our site, and this sitemap is static; it's been the same sitemap for, let's say, three years. But we have one page where the last time it was crawled was two years ago, and what does Google need in order to re-crawl that page? We added internal links, we added that page to another sitemap, we added external links.
A: I don't know, it sounds kind of weird. I mean, if you can drop the link into the chat here, I can take a look afterwards to see if there's something very obvious.
H: Hi John, hello. So my question is about duplicate content across a manufacturer and, let's say, reseller or aggregator e-commerce sites. Suppose there is a website selling thousands of products from a variety of different brands, and the commercial teams have been working hard to get all the official information from the brands, and then the data entry people are constantly uploading it.
H: So my question is just a small part of this: would it be a good idea to put an external link from the e-commerce product sheets to the official product sheet from the manufacturer? Ideally, the best would be both linking to each other. But, assuming Googlebot is already aware of the duplication, would it be an improvement for the e-commerce site to say: okay, this is where I took the content from, those pages are strictly related? Or will this result in a downgrade of the e-commerce site?
A: I don't think it would change anything; we don't need to have a link to the original source. So there are two aspects here when it comes to duplicate content. First of all, you don't get a penalty for duplicate content; that's, I think, worth saying even before you look into it too much. The only time we would have something like a penalty, or an algorithmic action or a manual action, is when the whole website is purely duplicated content.
A
So
if,
if
it's
one
website
that
is
scraping
other
websites,
for
example,
if
these
are
ecommerce
sites-
and
you
have
the
description-
that's
the
same
and
the
rest
of
your
website
is
different-
that's
that's
perfectly
fine.
You
don't
need
to
worry
about
any
kind
of
demotion
or
dropping
and
ranking
or
anything
with
with
duplicate
content.
We
we
have
essentially
two
roughly
different
things
that
we
look
at.
A: But that also means that if someone is searching for something generic that is in the description, and we can tell that they want to buy it, and maybe you're the best source or the local source of that product, or you have it in stock, or whatever, then we will show your pages and not the other one. And all of that is independent of you marking up where you took the description from. Essentially, we have this description...
A: ...we want to show it in Search, and we'll pick the best page that we can show for that description. So from that point of view, I think it's always a good practice to have unique descriptions on your pages, but if you have a lot of products, that's not always possible, and it's also the case that we would not penalize a website for having duplicate descriptions on their products.
A: Cool, okay. Let me take a break here with the recording, and I'll be around a little bit longer as well, so we can go through more questions too. Thank you all for watching, if you watched this on YouTube. If you want to join these in the future, I always have the links in the community part of our channel; feel free to join in there, or add more questions there if you'd like. And otherwise, I wish you all a great weekend, and maybe see you next time.