From YouTube: English Google SEO office-hours from August 20, 2021
Description
This is a recording of the Google SEO office-hours hangout from August 20, 2021. These sessions are open to anything search & website-owner related, like crawling, indexing, mobile sites, internationalization, duplicate content, Sitemaps, Search Console, pagination, multi-lingual/multi-regional sites, etc.
Find out more at https://goo.gle/seo-oh-en
Feel free to join us - we welcome folks of all levels!
A: All right, welcome everyone to today's Google SEO Search Central office-hours hangout. My name is John Mueller. I'm a Search Advocate on the Search Relations team here at Google in Switzerland, and part of what we do are these office-hours hangouts, where people can join in and ask their questions around their websites and web search.
B: Hi, so we have a site that has a hub-and-spoke architecture. A hub page might be Eric Clapton, and the spokes are things like what guitars he uses, and each of those pages is relatively small. The value from them comes from embedded videos or pictures, with relatively little unique text content. Over time, those pages have become the majority of our indexed pages, well over a hundred thousand, but only a third of those have gotten traffic through search in the past.
B: I've heard you say that this can affect your website's quality score, so we were considering de-indexing those pages, that is, the pages that are not getting traffic. However, we were also considering canonicalizing them instead, so I was curious how Google would treat that from a quality-score perspective.
A: So, we don't really have a quality score in that sense. I think that's something that comes from the Ads side, so that's one thing to keep in mind there. Let's see, I think there are multiple things to think about here. On the one hand, I would consider taking some action if you feel that these pages are low quality, and taking action could be something like removing those pages, improving those pages, or combining those kinds of pages together.
A: Anything along those lines could be something you could do if these are low-quality pages. If these are just pages that tend not to get a lot of traffic but are actually useful on their own, then I wouldn't necessarily see them as low quality.
A: So that's one thing to keep in mind: on some websites, pages that get low traffic are often almost correlated with low quality as well, but that doesn't have to be the case on other websites. It might just be that a lot of traffic goes through the head pages, and the tail pages are just as useful, but they're useful for a much smaller audience, so they get barely any traffic. From our point of view, those pages are still useful and it's still high-quality content.
A: I wouldn't remove it just because it doesn't get traffic. With regard to the different approaches: if you have one page, for example, about Eric Clapton's guitars, and another page about Eric Clapton's shoes, and you say the guitar page is the canonical for the shoes page, then we wouldn't have that shoes page, or any of its content, in our index anymore. We would essentially just focus on the guitars. So if someone were searching for Eric Clapton shoes, they wouldn't be able to find those pages at all. That's something to keep in mind with the different approaches.
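For reference, the cross-page canonicalization being discussed is usually declared with a rel=canonical link element in the head of the page being consolidated; a minimal sketch, with made-up URLs:

```html
<!-- On the hypothetical "shoes" page, declaring the "guitars" page as canonical -->
<link rel="canonical" href="https://example.com/eric-clapton-guitars" />
```

With this hint in place, Google typically folds the declaring page into the canonical one, which produces exactly the behavior described above: the shoes page's own content is no longer findable in search.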
B: Okay, thank you, John.

A: Sure. Let's see, Praveen?
C: So we have a situation where we need, you know, some clarity from Google's perspective on our website. Please don't mind if I sound stupid asking this question, but this is the kind of situation we are in. We have a site that is, you know, creating a lot of user-generated content, and on that site we have a lot of pages that we feel are not...
C: ...you know, relevant to search users. You can consider them as user profile pages, where all those pages are basically blank pages with just a username and a little bit of detail, so they don't make sense for search users, and we want to keep them out of search. So what we did is we added a meta robots tag with noindex on all such pages, but the number of such URLs is, you know, very high, like five or six thousand URLs. So, to quickly kick off the process...
C: We just used the removals tool in Search Console and removed all those URLs from search. Now I was advised: since these URLs are out of the search results, why don't we also block their crawling in robots.txt, so that Google doesn't crawl them? But I think the confusion here is: what happens when we remove those URLs from search?
C: I mean, they are still part of the search index, if I'm not wrong. So we want to know: if those URLs are removed from search, does Google still try to crawl them during those six months, or is it not crawling them? Because if it's not crawling them, then it won't be able to see the noindex tag.
A: Okay, so the removal tool in Search Console basically just hides the page in the search results. It doesn't change anything with the indexing or with the crawling of the page. So during the time that the removal is active, we will essentially recrawl and re-index that page normally.
A: If you block it by robots.txt, in many cases we would drop the page from the index as well, and in the cases where we don't drop it, it's usually a matter of: we might keep the URL indexed, but we wouldn't have any of the content indexed. So that means that if someone is searching for a piece of content which exists somewhere across the rest of your website, and we also know this roboted page has that piece of content, then we would probably just show the rest of your website.
A: Whereas if someone is explicitly looking for that exact URL, then we would still show that URL in the search results. So in practice, if you're talking about a bigger website that has a lot of different kinds of content, we would probably just have something else to show, and a robots.txt disallow would be very similar to a noindex on a page.
A: It's a little bit different if you're disallowing an important page on your website and you don't have any equivalent content. So, for example, if you disallow the crawling of your home page and someone searches for your website by name, then your home page is still the most important page that we could show. So we would show that even if it's roboted, whereas if it had a noindex on it, we wouldn't show it.
A: So those are the subtle differences there. My feeling is that in your case it doesn't matter either way. I would personally try to use the noindex rather than robots.txt, especially if you're talking about a relatively small number of pages, like a couple thousand, because that's something that can be picked up with normal crawling, and we can just process it on a per-page basis. Anything in the robots.txt file is sometimes really hard to diagnose and debug, in terms of what it might be causing.
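The key interaction John describes, that a robots.txt block prevents Googlebot from ever fetching the page and therefore from seeing its noindex tag, can be sanity-checked with a small script. This is only a sketch using the Python standard library, with illustrative markup:

```python
# Detect a meta robots "noindex" in a page's HTML using only the stdlib.
# Note: this tag only takes effect if crawlers are allowed to fetch the
# page; a robots.txt disallow would hide the tag from Googlebot entirely.
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        a = dict(attrs)
        # Looks for <meta name="robots" content="noindex, ...">
        if a.get("name", "").lower() == "robots" and \
           "noindex" in a.get("content", "").lower():
            self.noindex = True

def has_noindex(html: str) -> bool:
    parser = RobotsMetaParser()
    parser.feed(html)
    return parser.noindex
```

Running this against a fetched profile page would confirm the noindex is actually served, which is the per-page processing John prefers over a robots.txt disallow.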
C: Okay, thank you. Just a follow-up question, which is very similar to this one. For example, there are a few pages that we removed through the removals tool, and we are not using any noindex meta tag on them. During those six months, does Google still use those URLs for, you know, quality assessment of the website? Will it consider even those pages?
A: Yeah, from our point of view those pages would still be indexed if they don't have a noindex or any kind of block; we just don't show them in the search results. So that's something where we would still take them into account. Usually, if you're talking about individual pages, that's not going to skew the overall assessment of the quality of your website.
A: Okay, Julian?

D: Hey John, my question is about the value of backlinks pointing to a page that is noindexed, kind of like the user profiles Praveen was just talking about. In my case, when a user signs up for my site, they get a user profile page, and that page can be useful enough that users end up linking to it from around the web.
A: Maybe... it's a bit tricky, because of the way that the indexing systems work on a technical level. When it comes to links, we have to understand the link as being between two canonical pages, so on the one side there's that external page, where it's linking from the canonical.
A: However, for the most part, when we recognize that a noindex is a persistent state on a page, then we will just say: well, we don't have to do anything with this page, we will ignore it completely, we don't need to keep it in our index. And in those cases, essentially, those links pointing to the noindex page go nowhere, and then we drop them.
A: So if it's critical to your website that you use these noindex pages to forward signals, or PageRank, or what have you, within your website, then I would see if there's a way to make them indexable instead, just to make sure you don't have this ambiguity of: well, maybe Google picks it up, maybe Google doesn't pick it up. That's, I don't know, something I would take into account there.
A: ...the quality of the page, but it's kind of uncertain whether the links would forward from there to the rest of your website.

D: Okay, okay, cool! Thank you.

A: Sure. Okay, Francis?
E: I have four questions for you, so before I go ahead, I'll just brief you on the history of the website. We are a tech repair service company; we fix iPhones, iPads, you know, Apple products. Our website is a WordPress website. We launched it back in 2011 as a single-page website, then we completely revamped it in 2012 with about 150 pages, then again in 2014 with 500 pages. That was a major upgrade, and it has been growing steadily.
E: Now, after February 2018 I observed a gradual decline in organic traffic. So in June 2019 we decided to revamp the whole website again, and I used, you know, the same website agency who had helped me create new pages and refresh the content in 2016.
E: So now the majority of the responsibility was given to this outside agency, all right. Now, they didn't work keeping SEO in mind, so what they did is they deleted all the pages, all of them, and created new pages and new internal links, without redirecting the old ones, and this is a bad sign, yeah. So I discovered this problem and fixed it within a week, but, you know, I couldn't put the old URLs back, so instead I redirected the old URLs...
E: ...to the new ones. And I finished the contract with the SEO agency, started working on the website on my own, and started learning SEO. Now, right in the middle of the December 2020 core update, I discovered a major technical problem with the website: Search Console was reporting close to 4,000 URLs which were indexed but not submitted. When I looked into those pages, they were all images.
E: So basically it was thin content: just one image with a header and footer. In fact, at that time I only had close to 450 or 500 pages, you know, good-quality pages. So I realized, okay, this is a big issue, and I fixed the problem. It was an SEO plugin I was using with my WordPress site, and there was a bug which was reported back in 2018.
E: It was called the media attachment bug. What happened was, when they updated the plugin, it enabled an option in the plugin which creates unique links for every single image on the website, like icons, images, everything. So I had close to 4,000 images.
E: So I fixed this problem and used the removal tool in Search Console to remove all the indexed pages from search, right. And after that, I've been, you know, working hard on the website, creating new content. The new content has been new pages, plus, you know, user-friendly from the customer's point of view, and I've drastically improved the page speed as well. Now, my first question: does this six-months-or-more timeline apply in my case?
A: So, I took a quick look at the website beforehand, and I think, first of all, those kinds of media pages would not be seen as a problem from our side.
A: So that's not something where I would say it would have triggered any severe issues on your website. That's just as an aside.
A: I also took a quick look at your site in our internal tools, just to double-check some things, and it doesn't look like we consider your site to be low quality, or problematic, or anything like that. So with that in mind, my initial assumption is that this change since 2018 is more of a natural change over time.
A: I think that was kind of confirmed, and I would see this as something where, well, the web has evolved, some user needs might have changed over time, and with that it's important that you keep updating your website, keep working on it, keep improving it, to make sure that you remain relevant. Because if you don't do anything on a website over time, then it just becomes less and less relevant. It's not that your website becomes worse.
A: It's just not as relevant as it might have been in the past. So my recommendation would be: all of those old things, leave them by the side, don't worry about them, and instead focus on your website as it is now, focus on the future, and keep working to improve things there.
E: Right, and I'm even writing new blogs about new topics related to this industry, so I hope that helps. Now, the second question, which I'm confused about, is about duplicate content on the website. I have individual pages, all right, for each model and their faults. For example, we fix, you know, the iPhone 6 screen and also the iPhone 7, so the content for the iPhone 6 and 7 is almost the same, except that the name is different and the heading is different.
A: I don't think we would see it as purely duplicate content, but what is probably happening is that you're competing with yourself. You're getting into a situation where someone is searching for, maybe, iPhone screen repair, or something like that, and instead of having one really strong page on this topic, you have multiple pages on that topic, and that means all of those multiple pages have to rank on their own. Which means that if the competition is very strong for some of those queries, then your pages will have a hard time, whereas if you have one really strong page on this topic, where you say, well, this is for these five models of iPhone, and along the way in the page...
A: So that's usually the direction I would head: try to combine the content into stronger pieces so that you can compete better in that area, especially if this is an area where the competition is very strong. If it's something where you're by far the only website talking about this, then maybe it makes sense to have multiple pages on the topic.
E: I understand. I do have a strong page; for example, I have a page for iPhone screen replacement, where I have described the services, and I've also mentioned screen-replacement services for each particular model on that page itself. So let's suppose a customer wants to look into a particular problem for their particular device. For example, if the customer has an iPhone X, the customer will go to that particular model, look for their problem, and they can dive deeper into, you know...
E: All right, so I have a third question. Is it okay to place, you know, schema markup of Google My Business reviews in the footer of my site? Because I'm using a plugin which I can add as a widget, okay, and I can place it in the footer for Google My Business reviews. Is there a problem with this implementation? Will Google think it's duplicate content, you know, because the schema is on all the pages?
A: It's not so much that it's duplicate content; it's more that the reviews are not collected directly on your website, and that would be against the review policies. So that's the angle I would watch out for there. With the structured-data types and the rich results that we show in search, I would double-check the policies that we have in our documentation, and really make sure that any plugin you use complies with those policies.
E: I understand, all right. So now the fourth and last question. The average response time on my website, when I look in Google Search Console, is around 800 milliseconds, yeah. So I want to understand where Googlebot crawls from, because my server is hosted in Bahrain, in the Middle East, and my users are in the UAE, which is very close, and when I check the server response time with a tool, it shows only 20 milliseconds. So I want to understand: what could be the cause of the slow response time?
A: So, we crawl for the most part from the US, and that's probably where I would assume we're crawling your website from as well. The response time there is purely a matter of us accessing your server and crawling the data. It's not what we would take into account for ranking purposes; for the Core Web Vitals, the speed ranking factors, we take into account what users actually see.
A: I think in your case it wouldn't make any difference at all. If you had a website with hundreds of millions of pages, and we had trouble crawling it because of your server location, then I would move it. But if you're talking about, I don't know, a couple of thousand pages, tens of thousands, 20,000, 30,000, 100,000 pages, I wouldn't worry about that.
A: Okay, cool. I see there are still more hands raised, but let me run through some of the submitted questions, just to make sure that we have that side covered as well. Let's see, the first one is about content silos, which is basically: does it make sense to create these content silos, with focused internal linking on specific topics, and things like that?
A: From my point of view, I would not see this primarily as an SEO move, but more as something you might do for users, in the sense that if it's clear to users that your site is really strong on a specific topic, that makes it a lot easier for them to understand the context of the individual things you have there. And indirectly, we understand things a little bit better for SEO as well. But it's not something where I would say it's like...
A: ...well, those internal links are coming from this theme page and going to that theme page, that should match exactly, and then Google will rank our pages better. I wouldn't focus on that level. I would really focus more on creating a strong part of your website, or your whole website, that is focused on a specific area, because that makes it easier for us to understand what the website is about.
A: It's less about tweaking internal links, and more about actually making it clear what your content is about.

A: "In terms of page quality, what is better in Google's eyes: a page that ranks for a single keyword with high traffic, or a page that ranks for lots of keywords but gets little to no traffic?" I don't think we would care about this at all when it comes to understanding the quality of a page.
A: It's really almost more a strategic question on your side. Some websites like to focus on head queries, where you have few keywords going to these pages, but a lot of traffic. Other websites like to focus more on the long-tail side of things, where you have longer queries going to those pages, and generally less individual traffic per query, but overall it might still be interesting to you. And that's something where you can focus on things...
A: ...the way that you want. And sometimes it also makes sense to think about the competitive environment you're active in. If there are lots of other sites already active and focusing on the head queries, then it's going to be hard for you to get in there and say, well, my page is also really relevant for "online bookstore", because there are lots of really strong online bookstores out there already, and maybe it makes sense to focus more on long-tail queries first, and to build your website up over time.
A: But it's not that one or the other is better; it's just a different way of doing things online.

A: "Despite having cleared all possible caches, the old version of my website keeps showing in PageSpeed Insights mobile. Very rarely do I get the right, updated mobile version. I just noticed that PageSpeed Insights is pulling the Google-cached version of the site most of the time."
A: That sounds very much like something is temporarily serving a cached version.
A: One thing that I've used in some situations, and that has helped to break through this kind of caching where you don't know what's happening, is just to add some query parameters to the URL. So instead of just checking your home page, check your home page plus a question mark plus some random number. In general, servers will ignore everything after the question mark if they don't need it, but the networking systems along the way will recognize...
A: ...this is a unique URL, and they'll try to request the most recent version of it from your server, and by doing that, you circumvent any of the caching that is happening along the way. It doesn't work a hundred percent of the time; sometimes the caching along the way does something smart and prevents that, but for the most part it works, and it should be a way to test individual pages.
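The cache-busting trick John describes can be sketched in a few lines. This is only an illustration using the Python standard library; the parameter name `cb` and the URLs are arbitrary:

```python
# Append a random query parameter so that intermediate caches treat the
# URL as new and fetch a fresh copy from the origin server.
import random
from urllib.parse import urlsplit, urlunsplit

def cache_busted(url: str) -> str:
    parts = urlsplit(url)
    token = f"cb={random.randrange(10**9)}"
    # Preserve any existing query string, then add the random token.
    query = f"{parts.query}&{token}" if parts.query else token
    return urlunsplit((parts.scheme, parts.netloc, parts.path, query, parts.fragment))
```

Pasting the resulting URL into PageSpeed Insights should, in most cases, bypass the stale cached copy, with the caveat John mentions that some caching layers normalize query strings and defeat this.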
A: Then, three more questions. Let's see, localizations: "We use separate websites with brandable domain names for each language, using hreflang. Is there something like too many localizations for this method, specifically 15 languages?"
A: There's nothing like too many localizations, but what happens here is similar to what I mentioned before, in that you're creating a lot of different pages, and essentially you have all of these pages that are maybe a little bit lower quality (well, not lower quality, but lower strength overall), and then they also end up competing with each other a little bit.
A: So usually there's a balance you need to find between having different country or language versions and having really strong versions of your website, to make sure you're not diluting things too much. I mean, technically it's possible to take any website and say, I will create a version for every country in the world, but usually that's overkill and results in us really having trouble understanding your website at all.
A: And if you have an international website, the other alternative is to say, I will just have one website, and probably that's not that great either, because you're not focusing on individual markets. Somewhere in between is the sweet spot, and that sweet spot depends a lot on your website and your users. So it's not something where there's one clear number we can give, like you should have three country versions or 15 country versions, because it really depends on the website.
A: "Webmaster Tools, or I guess Search Console (wow, still using 'Webmaster Tools'), often shows 'clickable elements too close' for some of our mobile pages. These are HTML5 games, so the HTML elements are close together on purpose. Is there any way to signal that the elements are close on purpose, or maybe exclude the pages from the check? Users love the games and the UX metrics are great."
A: "Will it offset this warning when the algorithm decides about mobile ranking?" So, there's no way to exclude those pages from the mobile-friendliness check; that's the first thing there. With regard to mobile ranking, I don't actually know how this would work out, because it's possible, depending on the setup that you have, that we would see these pages as not being mobile-friendly anymore, and then it's possible that we would say: well, if they're not mobile-friendly, and we have other pages on the web that are equivalent but are mobile-friendly, then maybe on mobile we should be showing those other pages instead.
A: So I don't have a clear answer for you there. If you have some examples, I'm happy to take a look with the team. You can send them to me, maybe on Twitter, or if you're here, drop them in the chat and I can pick them up as well. And then, a question about mobile interstitials: "In 2017, interstitials were a big no-no, and even today, monetizing with AdSense vignettes is capped to showing only once per hour."
A: "However, for our use case, again, interstitials are a far better solution than banner ads, because banner ads take up precious screen space. Interstitial ads are very common in mobile apps. Is there any change to Google's stance on interstitials? Will it affect our ranking if we do use interstitial ads on our mobile website?"
A: ...then, from our point of view, that looks bad. But if people can go to your website and do something there, they can start to play a game, and at the start of the next level you show an interstitial ad, then from our point of view that's perfectly fine. That's essentially something between you and the user, and finding the balance of how many ads you show, or how you present those ads during the session on the site, is essentially up to you.
A: ...then you wouldn't be republishing the same content. Whereas if you did want to keep some of the content, because you think that content is actually good, then I would just delete the bad content and focus on building up a website with the good content instead, and not do this thing where you delete everything, wait a couple of months, and then hope that Google doesn't realize that you're actually just republishing the same thing as before.
A: So I would shy away from this approach, and try to think of it more as building up something of value over time. And if, over time, you realize there are some things in there that weren't that great, then just improve those things, or remove them, and focus on continuously building up the overall value of the website, especially compared to these almost slash-and-burn strategies that spammers sometimes use, where they just build up a big website...
A: ...and if it doesn't work, they tear it down and start over completely again. It's just so much work, and you lose so much energy on this continuous building up and tearing down, rather than just focusing on building something really strong and really good and improving that over time.
A: "I do a query on Google and the results are not good. The first result only has one video and thousands of comments. Why does that one site rank? I ended up writing quality content on the same topic, and not only me; other results that are also great are still below that result. I looked up the website and found they are quite authoritative in their niche, but as far as I know, Google's priority is on content and relevancy. The query was 'programming'."
A: So my recommendation here, especially if you're starting out: don't focus on queries like "programming". Instead, focus on something where you are really strong, something that you can do really well, and something that doesn't have as much competition, or doesn't have as much other content out there already, so that you can build up some experience over time, understand how things work, and understand how users actually react to your content.
A: Understand which kind of content works well for search, and which kind of content works well for users, and over time keep building that up. That can result, in the end, in you focusing on things that are more head terms, so shorter queries that users search for a lot. But it also gives you a little bit of a foundation to build on, where you know that...
A: ...well, I get a lot of questions on this specific aspect of programming, and that's why I have a lot of great content on it, and that's where I rank really well. And then, over time, maybe it makes sense to expand from there into a broader topic, or maybe you find other topic areas where it's equally the case that there's actually not a lot of content out there, but there are enough people searching for this information that it makes it worth your while to actually create that content and maintain it.
A: Let's see: "I have a question about branded names and the site-diversity update. In Turkey, we have a case about this: one website's name in Turkish means 'directly from the owner', or I guess it's a Turkish word, and we can say that it's a generic keyword in our sector. So our problem: when searching for a generic and critical word like this, the pages of our companies operating in the used-car-listing and real-estate sectors do not appear in Google's search results for that query."
A: "On the same page, Google argues that users can only get the best service when they can access very different content from a variety of sources. However, this situation is in contradiction with the fact that dozens of results belonging to a single site are shown for queries made with this word." I don't know, it sounds like something very specific, where I would probably need to see some of the example search-results pages, but in general we do try to show a variety of sources in the search results.
A: It's also really hard for me to judge this because, on the one hand, I don't know Turkish, and I don't know what the results could be for some of those queries. So it's tricky for me to, I don't know, figure out what the best approach here would be. My recommendation would be: if you can send me some examples, maybe on Twitter or somewhere else, of what exactly you're seeing here and why you think this is bad...
A: ...but this isn't something where we would take individual queries and individual websites and hand-tune them; that just doesn't work at scale. But it is something where, sometimes, this kind of feedback helps us improve our algorithms in the long run.

A: "We deleted the disavow file because it was made in the wrong way. How long will it take to affect the site's results?"
A: There is no fixed time for deleting and reprocessing disavow files. In practice, what happens is that we pick up the disavow file immediately, and over time, as we recrawl the links that used to be in there, we double-check the disavow file, recognize, oh, these links are no longer blocked by the disavow, and then we take those into account.
A: So my recommendation here would be not to delete the file, wait until things are clean, and then upload a new file, but rather to just update the disavow file to what you want taken into account, work from there, and see it as something that will be taken into account over time. Cool, okay, we're kind of running into the last 10 minutes, so maybe I'll switch back to questions from you all. I also have a bit more time afterwards.
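For reference, the disavow file being discussed is a plain-text list with one entry per line, either a full URL or a `domain:` prefix, and `#` comments. A minimal sketch with made-up entries, in the spirit of updating the file in place rather than deleting it:

```
# Ignore links from this whole domain (illustrative)
domain:spammy-directory.example
# Ignore links from one specific page (illustrative)
https://another-site.example/paid-links-page.html
```

Uploading a corrected version of this file through the disavow tool replaces the previous one, which is the "just update it" workflow John recommends.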
A: So if we can't get through everything, we can do more of that later on. Let's see, Michael, you're up next.
F
Good morning. I've got three questions. One following on from your disavow one: if somebody had put up a disavow file and since then they've passed away and stuff like that, and we've taken over the website, but we don't have access to that Search Console account. If we upload a blank or a different disavow file, would this one supersede the previous one?
A
F
Okay, okay, all right, I'll have a look for that one. Okay, the next question is one on Web Vitals. We've got static websites done years ago, and they perform really well in Search Console; we've got good URLs all across the board, 152 of them. But on Lighthouse it comes up with a poor performance score now.
F
Normally, if somebody comes to this website, they're browsing generally about seven pages. They're looking at limousines, so they're looking at different cars and different events, and then they convert, maybe after going through a couple of pages. So it's not just the initial page, but obviously when they first land, all the relevant assets load, the CSS and any JavaScript and that. So is it: tell the customers, look, don't worry about the headline figures on Lighthouse, and just look at your Search Console for the 152 good URLs, or...
A
A bit of both. So, let's see: essentially, what you're looking at is the different kinds of data that we have around speed. There's the field data, so what users are actually seeing, which is in Search Console, and the lab test, which is what Lighthouse would be showing. The lab tests try to approximate what users would see, and they're very useful for things like recognizing low-hanging fruit, things that you can improve on your website, and they're useful for this kind of iterative debugging and optimization process.
A
But ultimately, what we take into account is what users actually see. If users are seeing good results and Lighthouse is saying, oh, you could improve some things, then you could still improve those things, but users are already seeing something good, so essentially you're already in a good state. And because Lighthouse tries to approximate what users see, sometimes that doesn't match what your actual users see.
A
So in particular, we make certain assumptions with regard to device connections and the capacity of the devices that people are using, and it might be that your users don't match those assumptions. Because of that, maybe they're seeing a really good experience versus what our lab tools say they might be seeing.
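The field-versus-lab split described here is visible side by side in the PageSpeed Insights v5 API response, which returns both a loadingExperience section (field data from real Chrome users) and a lighthouseResult section (the lab run). The sketch below assumes that response shape; the sample numbers and the compare_lcp helper name are invented for illustration:

```python
def compare_lcp(psi_response):
    """Pull Largest Contentful Paint out of a PageSpeed Insights
    v5-style response: the field percentile value (in ms) and the
    lab value from the Lighthouse audit (also in ms)."""
    field = (psi_response.get("loadingExperience", {})
                         .get("metrics", {})
                         .get("LARGEST_CONTENTFUL_PAINT_MS", {})
                         .get("percentile"))
    lab = (psi_response.get("lighthouseResult", {})
                       .get("audits", {})
                       .get("largest-contentful-paint", {})
                       .get("numericValue"))
    return field, lab

# Invented sample: real users are fast even though the lab run looks
# slow, the "good field data, poor Lighthouse score" case from the
# question above.
sample = {
    "loadingExperience": {
        "metrics": {"LARGEST_CONTENTFUL_PAINT_MS": {"percentile": 1800}},
    },
    "lighthouseResult": {
        "audits": {"largest-contentful-paint": {"numericValue": 4200.0}},
    },
}

field_ms, lab_ms = compare_lcp(sample)
print(field_ms, lab_ms)  # 1800 4200.0
```

As the answer notes, the field number reflects what users actually experience; the lab number is mainly a debugging aid.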
F
Okay, look, okay. And the last question is: we're reworking a number plate registration site. So these aren't number plates you make up; these are pretty old number plates. It's got 500 number plates, so the actual... Now, we want people to find those. The only problem is each number plate is obviously on its own page and all that kind of stuff, but the only thing different is the number, and you can't write, you know, 1,500 words about a registration plate.
A
I think that's tricky, yeah. On the one hand, it makes it really hard to make interesting pages, and on the other hand, I assume lots of other sites already have similar information.
A
So we see that, for example, with all of the phone number sites, where essentially people write a script and iterate through all possible numbers, and they just create pages for those. On the one hand, there's a lot of competition, because those numbers are in different places on the web.
A
On the other hand, there's very little actual content on the individual pages. So you have to find a balance between making something that is actually useful for users there and making something that, I don't know, has enough value that it can stand on its own. So maybe it would make sense to combine some of these and say, like, I don't know, 500-odd number plates or 600-odd number plates, something like that.
F
Those, look... yeah. So these are numbers going back, since they started in the UK; they're not made up, so these are previously owned. So, for example, A1 on a number plate can't be purchased anywhere else. So if you wanted A1 on your car, the only place to get it from is this particular website, and that's it. So if somebody's going to search for it, that page, that number, needs to come up somewhere.
A
G
We run an ecommerce platform where anybody can come along and create products and stores; they have a small blog area and so on, and all of this is under our domain. So think Shopify, but with all their stores under one domain. We have about 100k pages indexed, and we think these pages are best described as user-generated content, or UGC. We've watched previous office hours of yours, and we learned that UGC and our own content are treated the same, and that we should filter out low-quality content, and we're doing our best to do that.
G
My question is, and hopefully it's not a silly one: we'd like to rank higher for our own target keywords, but we're worried that all this varied content and all this noise is holding us back. Is there something else we should be doing to kind of differentiate our content? We have our own blog and our own marketing pages, but then there's a ton more ecommerce pages.
A
But when you essentially offer individual websites that people can create and host on your platform, then it makes more sense to put that into something like a subdomain, so that each user has their own subdomain, or maybe even make it easy for them to put it on their own domain, and for you to take your kind of infrastructure information and also put that on a separate site.

G
Got it.
A
So, essentially, something like, I don't know, your provider site as a separate website from all of the hosted content that you have. Using subdomains is sometimes an easy way to do that; sometimes it makes more sense to separate that out even more. Kind of similar, I guess, to how Blogger does it, in that you go to blogger.com for the Blogger information, and the subdomains on blogspot.com are the individual user sites.
A
I think that kind of model makes it a lot easier. It also helps to isolate things a little bit when individual providers post a lot of low-quality content, because then we can recognize: actually, this subdomain is low quality, not the website overall. Or sometimes we run into situations where someone includes malware or spam or something on their site, and if we see that all as a part of your whole site itself, then we'd say, well, actually, this whole site has a lot of spam or malware elements attached to it.
G
Got it, that makes sense. It's interesting you mentioned the malware, because there was actually a second part to my question. In Search Console, we've been getting "uncommon downloads" as a warning, and what we've done with that is: we've scanned our website, and it's easy for us to destroy infrastructure and recreate it, and then we submit a review. But it's been coming back once a month; we get a success message, but it comes back again. I guess it's really hard, also, to ask for the specific URL that's causing that issue.
G
I don't know what we could do with that. Yeah.
A
I mean, usually the uncommon downloads warning comes from something where you're hosting something that is almost, like, unique per user. That's kind of the general situation where, for example, you can sign up and you can download some software tool, and it generates a zip file or an executable specifically for that user. And then, because those files are unique to each user, we can't scan those files, and that's why we flag them as uncommon downloads.
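A way to see the "unique per user" problem concretely: if every user receives byte-identical download bytes, the file has one stable checksum and can be scanned once; personalizing the bytes per user produces a different file every time. A small illustration of the hashing idea (not a description of Google's actual scanning, and the names are made up):

```python
import hashlib

def artifact_digest(data: bytes) -> str:
    """Checksum of a download: identical bytes give an identical digest,
    so one scan verdict can cover every user fetching the same file."""
    return hashlib.sha256(data).hexdigest()

installer = b"installer-v1.2.3 bytes"

# Two downloads of the same artifact share a digest...
assert artifact_digest(installer) == artifact_digest(installer)

# ...but embedding a per-user token changes the bytes, and hence the
# digest, which is the "unique per user" pattern described above.
personalized = installer + b"|user=42"
print(artifact_digest(installer) == artifact_digest(personalized))  # False
```

Serving one canonical artifact and handling any per-user licensing out of band is one way to avoid generating a new, never-before-seen file per download.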
A
But if you do the review, we will double-check, and we will let you know if there's anything that was actually problematic. And if these are files that individual users are uploading, then it's kind of tricky, and that's also one of those cases where isolating the individual users from each other makes a lot of sense, I think, because then it's just one subdomain or one domain that is getting this warning, and they have to figure it out; it's not something that you have to figure out for everyone.
G
Got it, got it. And just to clarify the subdomain: do you mean a subdomain on our domain? Because wouldn't that still fall under our property?
A
Okay, time-wise, I think we're at time, but like I mentioned, I still have more time for the people that still have raised hands. Thank you all for your questions so far; it's been really interesting, lots of good questions, and I hope you found this useful. If you're watching this on YouTube, feel free to join in on one of the future office hours.