From YouTube: English Google SEO office-hours from October 1, 2021
Description
This is a recording of the Google SEO office-hours hangout from October 1, 2021. These sessions are open to anything search- and website-owner-related, like crawling, indexing, mobile sites, internationalization, duplicate content, Sitemaps, Search Console, pagination, multi-lingual/multi-regional sites, etc.
Find out more at https://goo.gle/seo-oh-en
Feel free to join us - we welcome folks of all levels!
A
All right, welcome everyone to today's Google Search Central SEO office-hours hangout. My name is John Mueller. I'm a Search Advocate on the Search Relations team here at Google in Switzerland, and part of what we do are these office hours, where people can jump in and ask their questions around search, and we can try to find some answers. It looks like we have one person raising their hand, so we'll start off with some live questions and then we'll go off and look at the submitted ones. Daniel?
B
Do you want to go ahead? Yeah? I can't believe I was the first one as well. After that, mine's just a quick one actually, and I think I know the answer, I just wanted to double check.
B
So if we've got internal URLs as relative URLs, and they're pointing to the canonical, and the canonical is the absolute URL, is that fine for Google? The reason why I ask is because in Screaming Frog we struggle to crawl relative URLs, because it has to have the domain, it has to have the protocol in there, and so we can only crawl it if we then change the configuration to add the XML sitemap.
B
I've noticed, yeah, it's mainly products. I've seen it a couple of times, and it might be that we just need to change the configuration, but that was something that I'd noticed. And I know, I think it is maybe, I suppose, a bigger issue if the canonical was the relative URL, because then you don't know what the canonical is. But I just thought I'd double check if it was the other way around, if there was an issue there.

A
Should be fine, should be fine.

B
Awesome, that was it for me. Thank you, appreciate it.
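In practice, the distinction Daniel is asking about comes down to URL resolution: a crawler resolves a relative href against the page's own URL before fetching it, while an absolute canonical needs no resolution at all. A minimal sketch using Python's standard library (the URLs here are made up for illustration):

```python
from urllib.parse import urljoin

# Hypothetical page being crawled.
page_url = "https://example.com/products/widgets/"

# A root-relative internal link and an absolute canonical, as in the question.
relative_href = "/gadgets/item-42"
canonical_href = "https://example.com/gadgets/item-42"

# Crawlers resolve relative URLs against the page URL before fetching.
resolved = urljoin(page_url, relative_href)
print(resolved)  # https://example.com/gadgets/item-42

# The absolute canonical needs no resolution; both point at the same URL.
print(resolved == canonical_href)  # True
```

This is also why tools that crawl HTML fragments in isolation need the domain and protocol configured: without a base URL there is nothing to resolve relative hrefs against.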
C
Hi, so a couple of years ago Google noted that, when it came to ranking results, BERT would better understand and impact about ten percent of searches in the US. So my question is twofold: one is, has that percentage changed for BERT? And, you probably can anticipate this one: what percentage is MUM expected to better understand and impact searches?
A
I have no idea. I'm pretty sure that the percentage changed since then, because everything is changing, but I don't know if we have a fixed number that goes for BERT, or that goes for MUM. MUM, as far as I understand it, is more like a multi-purpose machine learning library anyway, so it's something that can be applied in lots of different parts of search.
A
So it's not so much that you would isolate it to just ranking, but rather you might be able to use it for understanding things on a very fine-grained level, and then that's kind of interwoven in a lot of different kinds of search results. But I don't think we have any fixed numbers.
A
I don't know, we'll see. I think it's always tricky to look at the marketing around machine learning algorithms, because it's very easy to find, I don't know, very exceptional examples, but that doesn't mean that everything is as flashy as that. But in talking with some of these search quality folks, they're really, really happy with the way that these kinds of machine learning models are working.

C
Cool, thank you.

A
Cool. Julian?
D
Hey John, thank you for doing the office hours. I have a question about a message in Search Console: "Indexed, though blocked by robots.txt". For a certain page type we have about 20,000 valid indexed pages, which looks about right. However, we're alarmed that we have a warning for about 40,000 pages that says "Indexed, though blocked by robots.txt".
D
When we inspect these, it turns out they are auxiliary pages that are linked to from a main page that we want to rank for, and these URLs are either not accessible by users who are not logged in, or they're just kind of thin informational pages. So we did indeed add a disallow for these in robots.txt, and I guess Google must have hit these a while back before we got a chance to disallow them, so it knows about them. And what's alarming is that Search Console says that they're indexed, right?
D
So I wanted to ask you: does this actually mean that they are? And, more importantly, are they counted toward my site's quality? Because when I try to pull them up using a site: query, I only get about one in five to show up, so I'm a little confused. If you could shed some light on that message, please.
A
Yeah, so I would see that mostly as something of a warning for a situation where you're not aware of what's happening. So if you're certain that these should be blocked by robots.txt, then that's pretty much fine. But if you weren't aware that they were blocked by robots.txt and you actually wanted to have them indexed and ranking, well, then that's something you could take action on.
A
So from that point of view, I wouldn't see this as something that's alarming. It's mostly just: well, Google found these pages and they're blocked by robots.txt, and, by the way, if you care about these pages, you might want to take some action. If you don't care about them, just leave them the way they are, and it's not going to count against your website. It can happen that they appear in the search results, but usually that's only for very artificial queries, like a site: query.

A
The main reason it's mostly those kinds of queries is because, for all of your normal queries that are going to your pages, you almost certainly have reasonable content that is actually indexable and crawlable and that will rank instead. So if we have the choice between a robotted page, where we don't really know what's on there, and a reasonable page on your website, and that reasonable page has those keywords on it, then we'll just show your normal pages.
D
Okay, yeah, our main concern is more about it reflecting on the quality of our website. So if you're saying that it doesn't, then I guess no harm done. Yeah, that does it.
A
Google is perfectly fine with it. I think what you could do, if you wanted to make sure, is look in the performance report in Search Console and see what kind of queries they show up for. And if you can't find any, then it's like: well, we're not showing them for any queries that you care about, so it's not that we're going to take them into account for that kind of query.
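For auditing a situation like this locally, Python's standard library can replay robots.txt rules against specific URLs. A small sketch, with the rules and URLs invented for illustration (note that a disallowed URL can still end up "Indexed, though blocked by robots.txt" if Google learned about it from links before or despite the block):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules like the disallow described in the question.
rules = """\
User-agent: *
Disallow: /members/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A blocked auxiliary page: Google cannot crawl it, but may still index the
# bare URL if it is linked from an indexable main page.
print(parser.can_fetch("Googlebot", "https://example.com/members/profile/1"))  # False

# A normal, crawlable page that would rank instead for real queries.
print(parser.can_fetch("Googlebot", "https://example.com/products/"))  # True
```

Running every URL from the warning report through a check like this is a quick way to confirm the blocks are intentional, which is all the warning is really asking.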
D
Okay. While I'm here, a question about one of those URLs in robots.txt: it's not really accessible to the bot, because it's behind authentication, so you have to be a logged-in user. Is there a preference for Googlebot on those types of pages regarding the status code? Because I think the convention, from what I've seen, is a lot of 302s to a login page.
A
No, no, that's usually fine. If it's blocked by robots.txt, we don't see the status code anyway, so from that point of view we wouldn't see it. The one situation where I would look into login pages a little bit more is if your primary content is behind a login page and you want to have at least that login placeholder kind of appearing in the search results. We have that, for example, with Search Console.
A
I just happen to see that more often there: a lot of the tools are essentially behind a login, and if you search for one of those tool names, we know it's behind this URL, but we can't find any content there. So we kind of need to at least be able to index some kind of a templated login page for that query.
D
Okay, perfect. And sorry, one more quick one: noindexing pages. I've heard you say in past office hours that for a large site, processing a lot of pages that are now noindex could take up to a month or so, and we've had to do that in order to address some site quality issues. As we monitor Search Console, is it normal to see the indexed pages kind of gradually decline? I guess that would be my guess, as they're crawled, or is it more like one day there's a sharp drop-off?
A
Thanks, Brandon.
F
Hey John, so I'm going to set the stage with some context, and this is actually relevant to the noindex discussion. So let's say you consider a site with 100 million pages, and there are maybe 10 million pages that we believe are low quality.
F
Maybe 5 million of those are actually indexed, and let's suppose that we want to add a noindex tag to these 10 million, as well as reduce the amount of internal links that they receive. So over a hypothetical four months, let's say Google crawls and removes two million from the index while the other three million remain, and maybe a few months down the road we determine that a few hundred thousand are actually of decent quality and we want to reinstate them.
F
Essentially, we remove the noindex tag and we start adding meaningful internal links back. Do you foresee or think the history here, of being indexed, then applying a noindex tag, and then trying to get it indexed again at a later time, would be detrimental?
F
You know, would Google be reluctant to crawl these URLs again, knowing that they previously were noindexed? And do you think that there would be any issues with getting them indexed again after they were crawled?
A
I think in the long run there definitely wouldn't be any kind of long-term effect from that. I think what you might see initially is that, if we've seen a noindex for a longer period of time, we might not crawl that URL as frequently as otherwise. But if you're saying this is kind of in the process of: we added a noindex, it dropped out, and then we changed our mind and we removed it again...
A
...that falls within the normal crawl rate anyway, and when you're talking about millions of pages, we have to spread that out anyway. So that's something where, I don't know, depending on these pages, I would imagine you might see us try to crawl those pages individually every couple of weeks, or every month or so. So I don't see that kind of crawl rate dropping off to completely zero.
A
Maybe it'll go from one month to two months if we see a noindex for a longer period of time, but I don't see it going completely to zero. So that kind of "we added a noindex and we changed our mind at some point later on"...
A
...usually that's perfectly fine. And this is especially the case if you're working on internal linking at the same time, where you add more internal links to those pages; then of course we're going to crawl those pages a little bit more, because we see, oh, it's freshly linked from this and this place within your website, therefore maybe there's something we actually do need to index.
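While waiting for that kind of reprocessing, it can help to audit which pages still carry the noindex. A rough sketch of detecting a robots meta tag in fetched HTML with Python's standard library (the HTML snippet and class name here are made up for illustration):

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collect the content of any <meta name="robots"> tag on a page."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives.append(a.get("content", ""))

# A hypothetical page that was marked noindex during the cleanup.
html = '<html><head><meta name="robots" content="noindex"></head><body>...</body></html>'
finder = RobotsMetaFinder()
finder.feed(html)
print("noindex" in ",".join(finder.directives))  # True
```

Running a crawl of the reinstated URLs through a check like this confirms the tag really was removed everywhere before waiting on Google to recrawl.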
F
Yeah. John, do you think it might actually be better to reinstate these URLs under a different URL, so that there is none of that historical context, and potentially the likelihood that it gets crawled would be higher? Because it seems like, over the long run, let's say after six months, Google knows: oh, this URL, we noindexed it before, so it may be less likely to crawl that URL, despite increased internal linking, as opposed to a brand-new URL that has the same amount of internal links.
A
No. I mean, at least from my point of view, it's not something where I would spend a lot of time trying to figure that level out. Because if you're talking about a really small website, where kind of one page's indexing is a significant portion of the website, then that kind of optimization, I think, makes sense. But if you're talking about a website with millions of pages, then I think any optimization you do there is at such a small level that you actually can't measure it.
F
And when it comes to the noindexing of content, should we anticipate that impacting the overall assessment of the site? Where, you know, the majority of the content actually is low quality, so accelerating its de-indexing might fast-track us to improving our site quality in the eyes of Google. Does that sound about right?
A
Yeah, so sometimes that can help. I think with a large website it sometimes makes sense to make bigger changes like this, but I think the thing to keep in mind is that our quality evaluations tend to take quite a bit of time.
A
So if you go off and just de-index, I don't know, ten percent of your website, then it's not going to be the case that, like, next week we will pick that up and re-evaluate the quality of your website overall; rather, we need to do that over time, and that's something that'll take a couple of months anyway. And with regards to just removing a lot of content or not, I think that's something that depends quite a bit on the website itself, and oftentimes just removing content doesn't make the website quality overall much better.
G
Yeah, hey John. I've been working with a client for like two months now, and I found in my research that they were cloaking internal links.
G
Yeah, but then I checked with the Wayback Machine and I found out they have been doing this since they made a certain template change. It's about footer links, and these footer links were there back in January.
G
But when I check Search Console, I don't really see anything happening with a penalty or such. So I wonder, how long does it take? Because I want to advise him, of course: hey, you need to take action before you get a penalty. But I can imagine the response: hey, we've been doing this for apparently nine months, should we be afraid?
G
It's display: none, with CSS. And there is no way to make these links appear there, yeah, with a link or such.
A
Yeah, I think it's something where, theoretically, we don't like that, but I don't see the webspam team taking action on it, because especially when it comes to internal linking like that, it has quite a subtle effect within the website, and you're essentially just shuffling things around within your own website.
A
I think it would be trickier if they were, I don't know, buying links somewhere else and then hiding them; that would be problematic. That might be something that our algorithms pick up on, or that even the webspam team at some point might manually look at. But if it's within the same website, if it's set to display: none, then I don't think it's a great practice. If you think it's an important link, then kind of make it visible to people.
G
Okay, so in this case you would say leave it as it is?
A
Kind of. Like, if you think this is an important link to an important page, then just be straightforward about it, because users are going to use it too. Or maybe, if users don't care about it, maybe it isn't actually an important link. But I wouldn't see it as something where it's like "drop everything, we need to fix this this week" kind of thing.
A
Cool. Okay, let me go through some of the submitted questions, and then I'll get back to all of you with your raised hands as well. Some of your questions are always useful along the way as well, because not everyone can join these sessions live.
A
Let's see, the first one I have here is a longer question about product pages, and I'll just try to paraphrase it a little bit. There are two aspects there.
A
On the one hand, they have kind of promotional products, where they have samples, essentially, that are also shown, so individual companies who are using these promotional products. And the other part is that some of these are shown behind a "view more" button. I took a look at the example and at the website a little bit, to try to figure out what is actually happening there, and I think part of the question is like, well...
A
How do we get all of these sample pages indexed as well? And the thing there, I think, is that it's probably not necessary to get all of these sample pages indexed, because essentially these samples are promotional material for other companies, and it's not that you need to have those pages indexed so that your page is findable. Essentially those pages would be findable for some other person's company plus "folder", or plus "cups", or whatever, and from that point of view it's not that these are pages that would be driving traffic to your website, because nobody would be looking for them.
A
Obviously, there might be individual situations where it does make sense to have those indexable and findable, but at least the ones that I looked at seemed like the kind of pages where you don't really need to worry about them too much. The other part is the JavaScript "view more" button that you have on some of these pages, with kind of collections that you're showing there too.
A
I took a quick look at how you have this implemented, and essentially you load a small subset of your collection, and then the JavaScript "view more" button loads more of these items onto the page. Googlebot doesn't use "view more" buttons, so we would not be able to see the additional content that you're loading there. So instead of using "view more", we usually recommend using some form of pagination, as in our e-commerce docs, which we just published this week.
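The recommendation boils down to exposing each slice of the collection behind a plain link rather than a button. A minimal sketch of generating crawlable pagination links (the URL pattern and page size are assumptions for illustration, not anything from the docs):

```python
def pagination_links(base_url: str, total_items: int, page_size: int) -> list[str]:
    """Build plain <a href> pagination links that a crawler can follow.

    Unlike a JavaScript "view more" button, each slice of the collection
    gets its own URL, so every item is reachable without clicking anything.
    """
    pages = (total_items + page_size - 1) // page_size  # ceiling division
    return [f'<a href="{base_url}?page={n}">Page {n}</a>' for n in range(1, pages + 1)]

for link in pagination_links("https://example.com/collection", total_items=45, page_size=20):
    print(link)
# <a href="https://example.com/collection?page=1">Page 1</a>
# <a href="https://example.com/collection?page=2">Page 2</a>
# <a href="https://example.com/collection?page=3">Page 3</a>
```

A "view more" button can still enhance the experience for users on top of this, as long as the same content is also reachable through the paginated URLs.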
A
Let's see: official movie sites we oversee were recently redesigned. Some content was just reorganized, but the new site is a better user experience. The ranking dropped afterwards; reverting the sites brought back the ranking. We concluded that there are no JavaScript technical issues. This started just after the June spam updates, with the same findings for multiple movies and domains. Could the new sites be getting hit by a spam penalty and not the old site? We can provide some examples.
A
But I doubt that this is the case. Usually when you do a bigger reorg of a website, you do see some fluctuations, so especially in the short term you will see things kind of shuffle around a little bit until they settle down again. Those kinds of fluctuations, I think, are kind of expected, and the kind of longer-term changes, that's something where it almost depends a little bit more on what you're actually doing on the site. And I think you're here, Andrew, yeah?
H
So there's something going on with the new template, and really, these movie sites are really just the home page. We have sub-pages, but, you know, the synopsis page and even the gallery page, those have been moved to the home page, and that's really the only change.
A
Yeah, I wouldn't expect the spam update to have any effect there, because usually the type of spam that we pick up is more along the lines of really bad spammy content, or really bad spammy linking. And if you have a site and you do a redesign, even if it's seen as, oh, maybe there's keyword stuffing in here or something like that, then that's not something that these kinds of spam updates would be looking for.
H
No, no, they're almost the same. I mean, I can look at the Google Search Console render and it sees the content all as it should be. You know, the HTML that Search Console outputs is there, and the render looks great. So we have no idea why this is happening, and the spam update is the only reason we can think of, because it really happened at those spam updates.
H
And the only reason why we're thinking spam is just because, you know, this site hosts very similar content to a lot of other sites. The synopsis content is pretty much boilerplate across other entertainment sites across the industry, but we're dealing with the official site, you know? So we're trying to figure out why this is happening.
A
Yeah, if you can drop some examples in the chat, I can take a look at that afterwards. One thing that I could imagine that maybe our algorithms are running into is: we see a lot of sites that essentially offer, I don't know, illegal downloads of movie content or things like that, and it might be that that combination of lots of movie content, and maybe a different template, is throwing us off in that direction.
A
I don't know exactly what direction you're headed with this question, so it's hard to say. In general, I would recommend maybe focusing on things like keyword research, to try to understand what people are actually searching for, and if you want to match that one-to-one, then that's kind of up to you. I always find it a little bit risky to try to match these queries one-to-one, because those queries can change fairly quickly.
A
So that's something where I wouldn't focus so much on "well, Google is understanding this content right away, therefore it's good", but more thinking about what is the long-term thing that you want to achieve with your pages, and how you want them to show up in search. And sometimes that means having some question-and-answer content, but not always.
A
How does Google evaluate interactive content, like a quiz or questionnaire that helps users figure out which product or thing they need? Would rankings still be based on the static content that appears on the page?
A
Yes, the ranking essentially is based on the static content on these pages. So if you have something like a full page of questions, we will essentially rank that page based on those questions. And that means, if you have products that are kind of findable after going through those questions, you should also make sure that you have internal linking to those products without needing to go through that questionnaire.
A
So it's something along the lines of having a normal category setup on the website, as well as something like, I don't know, a product wizard or whatever you have, to help the users make those decisions. I think that's really important.
A
The questionnaire pages can still be useful in search if you recognize that people are searching in a way where they're not sure which particular product matches their needs, and you can kind of address that in the text on the questionnaire; then those questionnaires can also appear in search as well. But it's not the case that Googlebot goes in and tries to fill out a questionnaire and sees what's kind of happening there.

Does giving a dofollow link to a trusted, authoritative site, is that good for SEO?
A
I think this is something that people used to do way in the beginning, where they would create a spammy website and on the bottom they'd have a link to Wikipedia and CNN, and then hope that search engines look at that and say, like, oh, this must be a legitimate website. But, like I said, people did this way in the beginning, and it was almost a really kind of traditional spam technique, and I don't know if this ever actually worked.
A
So from that point of view, I would say no, this doesn't make any sense. Obviously, if you have good content within your website and part of that references existing other content, then kind of that whole structure makes a little bit more sense, and means that your website overall is a good thing. But just having a link to some authoritative page, that doesn't change anything from my point of view.
A
I'd like to do basic research regarding what foreigners google the most with regards to Taiwanese culture. What are the most relevant keywords with Taiwan on Google Search? It would be great if I could generate a ranking list of it; having acquired that information, I could further design a campaign for certain products.
A
I don't have that information, so I can't give that to you. But essentially what you're looking for is probably everything along the lines of keyword research, and there's lots of content written up on how to do keyword research. There are some tools from Google that you can use to help you figure that out, and there are a lot of third-party tools as well, and I have no insight into what all is involved there.
A
I was having two languages, like Hindi and English, on the same page, and I was ranking well on Google, but after the December core update I lost ranking, mostly for Hindi keywords.
A
So from that point of view, I think that configuration of having, like, I don't know, Hindi on one side and English on the other side of a single page is something that can be problematic on its own. So I would try to avoid that setup, and instead make pages that are clearly in Hindi and clearly in English. By having separate pages like that, it's a lot easier for us to say: oh, someone is searching in Hindi for this keyword.
A
So that's kind of the one thing. The other thing is, with regards to core updates: we have a lot of blog posts around core updates, and I would go through those as well. Because if you're seeing this kind of a change happening together with a core update, it might be due to kind of two languages on a page, but probably it's more likely due to just general core update changes that we've made.
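If the content is split into separate Hindi and English pages as suggested, hreflang annotations can tell search engines which language version is which. A small sketch of generating those link tags (the URLs here are hypothetical):

```python
def hreflang_links(alternates: dict[str, str]) -> list[str]:
    """Build hreflang link tags pointing each language version at the others.

    Each page in the set should carry the full list, including itself.
    """
    return [
        f'<link rel="alternate" hreflang="{lang}" href="{url}">'
        for lang, url in alternates.items()
    ]

# Hypothetical URLs for the Hindi/English split described above.
for tag in hreflang_links({"hi": "https://example.com/hi/page",
                           "en": "https://example.com/en/page"}):
    print(tag)
# <link rel="alternate" hreflang="hi" href="https://example.com/hi/page">
# <link rel="alternate" hreflang="en" href="https://example.com/en/page">
```

The same set of tags goes into the head of both language versions, so either page can be matched to a searcher's language.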
A
Yeah, I don't know about this specific situation. Usually we would call things doorway pages if they essentially lead to the same funnel afterwards, where essentially you're taking a lot of different keywords and you're guiding people to exactly the same content in the end. In the case of something like an encyclopedia, it's not the same content; it's essentially, I don't know, very unique pieces of content on there, and just because it covers a lot of keywords doesn't necessarily mean that it's a doorway page.
A
So without digging into this specific site in detail, my guess is that it would not be considered a doorway page. But a doorway page might be something where, I don't know...
A
If you have a cactus page on your website and you're saying, like, cactuses in all cities nearby, and you make individual city pages where all of the traffic is essentially funneled in the same direction on your website, then that would be considered a doorway page, where you're kind of creating all of these small doorways, but they all lead to the same house.

A question related to classified websites: I have ad listings on search results pages that I allow to be crawled and indexed.
A
If I have no ad listings for some time, should I disallow indexing, or should I let Google decide if the search results don't have ad listings? And would excluding those pages from the sitemap also be a good practice? So I think, just for the sake of clarity, the search results that this person means are the search results within their own website. So if someone is searching for a specific kind of content, then the website pulls together all the ads that it knows, and it's those search results, not Google's.
A
That's kind of the ideal situation, because what we want to avoid is to have a page like that in our index where it's basically saying: oh, someone is searching for a blue car of this model and make, and you have this page on your website, but it says, "I don't know of any people selling this kind of a car". Then sending people to your website for that kind of a query would be a really bad user experience.
A
So essentially that's kind of the direction to go there. If you can recognize it ahead of time, we would generally prefer having a noindex directly from your side. If you can't recognize it ahead of time, then using JavaScript to add a noindex might be an option. With regards to the sitemap or not...
A
The sitemap file only helps us with additional crawling within a website. It doesn't prevent us from crawling these pages, so removing these pages from the sitemap file would not result in us dropping them from search, and would not result in us recognizing that they actually don't have any content there.
A
So removing something from a sitemap file wouldn't negatively affect the natural crawling and indexing that we do for individual pages. So I think those are kind of the two aspects: if you can recognize it's an empty search results page, put a noindex on it; removing it from a sitemap file is not going to remove it from our index.
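The "recognize it ahead of time" case can be handled server-side when the page is rendered: if the internal search returns zero listings, emit a noindex. A minimal sketch of that decision (the helper name and tag values are illustrative, not from any particular framework):

```python
def results_page_robots_tag(result_count: int) -> str:
    """Choose the robots meta tag for an internal search results page.

    Empty result pages get noindex, so searchers are never sent from Google
    to a page with no listings; pages with results stay indexable.
    """
    if result_count == 0:
        return '<meta name="robots" content="noindex">'
    return '<meta name="robots" content="all">'

print(results_page_robots_tag(0))   # <meta name="robots" content="noindex">
print(results_page_robots_tag(12))  # <meta name="robots" content="all">
```

When the listing count is only known client-side, the same tag can instead be injected with JavaScript before rendering completes, per the fallback mentioned above.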
A
We made sure that all critical resources and main content are available to the bot and get properly rendered. It seems like we are missing something, and we can't pin down exactly why this is happening. Any suggestions? I would probably need to take a look at some examples, so I don't know if you're here... oh.
I
Could I share a URL with you, maybe?

A
Yeah, if you can drop them in the chat, I can pick them up afterwards.
A
Right
yeah,
it's
it's
always
tricky
to
debug
them
live,
but
it
sends
me
a
chat
log
afterwards
and
I
I
always
go
through
that
for
examples.
I
Yeah, I'll send you a couple of them. Okay, since I submitted my question, right, on the 21st of September: we just can't wrap our heads around what's happening, because I use the URL inspection tool in Search Console, and all the main resources are there. So it's difficult to understand what's happening, really. Yeah. Okay, should I wait for an answer from you?
A
Usually,
what
what
happens
with
these
kind
of
things
is,
I
I
double
check
them
briefly,
and
then
I
forward
them
on
to
the
team
so
that
they
can
take
whatever
action
that
they
need
to
there
if,
if
these
are
really
kind
of
just
normal
pages
and
we're
recognizing
them
properly,
it
sounds
like
something
we
have
to
fix,
not
something
you
have
to
pay.
Okay,
fantastic,
okay,
thanks
a
lot
really
appreciate
you
out
cool.
A
My
question
is
that
google
is
not
indexing
websites
even
fresh
sites
and
also
not
updating
data
and
search
console.
Is
there
any
hidden
update
going
on?
I
mean
there
are
always
updates
going
on.
So
that's
that's
kind
of
hard
to
say
I
don't
think,
there's
anything
explicitly
hidden
going
on
what
what
I
do
sometimes
see
is
that
because
it's
so
much
easier
to
create
websites.
A
Nowadays,
people
create
websites
with
a
large
number
of
pages,
and
then
they
focus
more
on
the
technical
aspect
of
getting,
I
don't
know
million
pages
up
and
they
disregard
a
lot
of
the
quality
aspects
and
then,
because
of
the
way
that
search
console
tries
to
provide
more
and
more
information
about
the
indexing
process.
It's
a
lot
easier
to
recognize
that
well,
google
is
not
actually
indexing
everything
from
this
website
and
then
the
assumption
is
is
often
there
that.
A
So
if
you
give
us
a
million
pages
and
the
pages
that
we
we
end
up,
showing
initially
don't
convince
us,
then
we're
not
going
to
spend
time
to
actually
get
all
of
those
millions
of
pages
indexed
we're
going
to
kind
of
hold
off
and
keep
a
small
subset.
And
if,
over
time
we
see
that
the
subset
is
doing
really
well
and
has
all
the
signs
that
we
look
at
with
regards
to
quality.
Then
we
will
go
off
and
try
to
crawl
more.
A
But
just
because
there
are
a
lot
of
pages
on
a
website
does
not
mean
that
we're
going
to
crawl
and
index
a
lot
of
pages.
From
from
that
website,
cool
okay,
we
still
have
a
bunch
of
hands
raised
ice
wow
that
is
getting
longer
and
longer.
I
I
have
a
bit
more
time
afterwards,
so
we
can
continue
on
as
well,
but
maybe
I'll
just
go
through
some
of
these
that
have
their
hands.
Where
he's
now,
let's
see,
I
think
I
trend
greece.
A
So we need to be able to see the old site, and to see that there's a redirect from the old one to the new one. We can't just process a site move without looking at that.
J
Okay, one more question: for how much time will your bots, you know, try to access, let's say, itrend.gr, even if it's dead?
A
How long? I don't think we have a fixed time, so it depends a little bit on how often we try to crawl. For example, the home page is something that we will try to crawl every couple of days, and probably that will disappear fairly quickly, because we try a lot and we can't get to it; but the lower-level pages of the website...
J

A
Pages in Search Console: yes, you can submit them for recrawling as well, but usually, if the website goes from not being accessible at all to accessible again, then we will crawl that anyway. So you can help a little bit, but it's not critical.
A
All right. Marc, hi.
A

E
Thanks for doing this. So basically, I made a small website for my mom's business using a CMS tool called Squarespace.
E
I know that they automatically submit a sitemap once you create a new page, and we decided to add the ecommerce functionality about two weeks ago, and the site was password protected. So my first question would be: does Google penalize you in a way if the user can't access the page, if it's basically just password protected? And the second would be: yeah, the site and the pages were basically indexed and shown really nicely before, but after editing all those products and different pages...

E
I looked it up in Google Search Console for crawling, but basically my mom was giving me a hard time about when these pages are going to be shown again. So I'm kind of looking for answers.
A
Yeah, I think, so there's no penalty for having password protection, but it means that we can't access the content behind the password. So probably that is the initial step that happened with regards to kind of turning on almost like an ecommerce site, or shop section, on your website. We actually have a whole article on that now in our Search documentation, specifically for ecommerce sites.
A

K
Thanks. My question is regarding the performance measurement of Google Discover. So basically, I have a client whose site has been up for, I think, seven or eight years, and they have a...
K
...over this website. And what I am saying is: for a particular day and for a particular page, if Search Console is showing you, let's say, 100 clicks, and you check that particular page and particular date in Analytics, it shows you a discrepancy between the sessions and the clicks; there it is showing around 20 sessions, I mean in Analytics. So there is a lot of discrepancy I saw. So what is the good way to measure Google Discover performance?
A
I think the only way to measure it is in Search Console, because in Analytics in particular, the traffic from Discover is almost always folded into Google Search, and then you can't separate that out. So it's only in Search Console that you see kind of the bigger picture.
K
All right, okay. So there is another question. Okay, so we should look at Search Console, basically, as you say. But a lot of people say this traffic goes to the direct channel medium. So I mean, why is that? Is there any particular reason for that?
A
I don't know. I think it's something that the product team just decided to do like that in the beginning. So in particular, Discover we see as a part of Google Search, so we don't separate it out. And in Analytics, it might be that... I don't know exactly how they would even track that there, or in which category, but at least the Discover team, they basically decided: well, Discover is a part of Search, so we're not going to separate it out.
K

L
Hey John, good morning. Yes, I had a question in regards to internal search pages. So we're allowing indexation of on-site searches: sometimes someone does a search on our site, we create a page for that, and now that's gone a bit out of control, so we have hundreds of millions of these pages.
A
I think, for the most part, it does make sense to clean that up, because it makes crawling a lot harder. So the direction I would look at there is to think about which pages you actually do want to have crawled and indexed, and to help our systems to focus on that.
A
Not
so
much
that,
like
you,
should
get
rid
of
all
internal
search
pages.
Some
of
these
might
be
perfectly
fine
to
show
in
search
but
really
try
to
avoid
the
situation
where
anyone
can
just
go
off
and
create
a
million
new
pages
on
your
website
by
linking
to
random
urls
or
words
that
you
might
have
on
your
pages,
so
to
kind
of
take
it
and
say:
well,
you
have
control,
of
which
pages
you
want
to
have
crawled
in
index
rather
than
like,
whatever
randomly
happens
on
the
internet,.
L
And for those pages, then, so we control this, and the pages aren't linked any longer, they're kind of these orphan URLs, but they're still going to be crawled and probably indexed.
A
So one thing you could do is create a list of search queries where you think we have reasonable content, and just check against that list: if a query is on the list, let its page be indexed; if it's not on the list, then add a noindex to those pages, something along those lines. And finding out which of these pages are actually reasonable is sometimes a bit tricky, because it's easy to take simple metrics and just say: oh, I'll just take the top 10 pages by traffic.
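The allowlist approach described here can be sketched roughly like this: keep a set of internal-search queries known to return reasonable results, and emit a robots noindex meta tag on every other search-results page. The query set and function name below are made up for illustration; the real decision logic would live in whatever renders the search-results template.

```python
# Hypothetical allowlist of internal search queries worth indexing.
ALLOWED_QUERIES = {"red shoes", "running shoes"}

def robots_meta_for_search(query: str) -> str:
    """Return the robots meta tag to emit on an internal search-results
    page: indexable only when the query is on the allowlist."""
    if query.strip().lower() in ALLOWED_QUERIES:
        return '<meta name="robots" content="index, follow">'
    return '<meta name="robots" content="noindex">'

print(robots_meta_for_search("Red Shoes"))      # on the list -> indexable
print(robots_meta_for_search("zzz gibberish"))  # off the list -> noindex
```

This keeps control on the site's side: arbitrary queries that visitors (or random external links) generate can no longer mint indexable pages.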
A

L
That's continually changing too, so yeah, you might get pages that suddenly have demand and then don't have demand. So it's a tricky thing. Yeah. Is there any kind of benefit then, do you think, longer term? So if we do this cleanup, obviously the business would argue, you know: if we're going to do this work as an investment from our side, what is the return on this?
A
So, on the one hand, you have the crawling side, which will be affected.

A
But it really depends on the specific situation, and it's not the case that you can say: well, I'll clean up my internal search results and then Google will think my website is suddenly significantly higher quality. It can be like a small change, and sometimes a small change is relatively large for a website, but I wouldn't rely on that aspect. I would really focus more on the crawling aspect.
L
Great, thanks. One final question: so these pages are linked from the footer of the site, across most of the pages of the site. In Google's eyes, is there any benefit to having these links in the navigation, so they're more seen, from a kind of user's benefit to navigate through the site, than having them kind of in the footer?
A

E

A
Okay, thanks. Sure. Okay, let me take a break here with regards to the recording, so that we have a reasonably long session. And thank you all for all of your questions so far; it's been great having you all here. And if you're watching this on YouTube, you're, of course, welcome to join these sessions live, or at least drop your questions into the community section on YouTube, where we always have the next ones lined up. And with that, let me pause here.