From YouTube: English Google SEO office-hours from July 16, 2021
Description
This is a recording of the Google SEO office-hours hangout from July 16, 2021. These sessions are open to anything search & website-owner related like crawling, indexing, mobile sites, internationalization, duplicate content, Sitemaps, Search Console, pagination, multi-lingual/multi-regional sites, etc.
Find out more at https://goo.gle/seo-oh-en
Feel free to join us - we welcome folks of all levels!
A
All right, welcome everyone to today's Google SEO Search Central office hours hangout. My name is John Mueller. I'm a Search Advocate on the Search Relations team here at Google, and part of what we do are these office hours hangouts, where people can join in and ask their questions around their website and web search. Looks like a bunch of you have raised your hands already, and I think, Michael, you were the first one. Somehow your hand disappeared. If you want to get started, sure.
B
So I came in to work on a site where certain sections didn't seem to have the upward momentum in organic search that I would expect. I diagnosed what I believe is the correct diagnosis: for that section of the site, the blog, I think the entire blog page is coded in JavaScript, in a JavaScript app with uncrawlable links.
B
So I've taken your advice from previous office hours to confirm that, and this week we rolled out a temporary band-aid, because it's going to take a long time to recode that entire section. The temporary band-aid we did was to put some noscript code in there, so that if JavaScript was disabled, they would see links to all the blog posts.
B
Is that going to hurt us on that main blog page, by having 600 links on there? And so, generally, I would assume that we would have a positive impact by doing this fix, but I just wanted to confirm with you, if you saw any red flags there.
A
Okay, so I think, first of all, we wouldn't consider that cloaking, because it's essentially just providing something similar to what you have on the JavaScript side. So I wouldn't worry about that part in general. We do try to ignore things in a noscript tag, though, so that might be something to watch out for, to double-check if it's actually working as expected.
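One way to double-check the noscript fallback John is cautioning about is to parse the page's HTML and list which links actually appear inside `<noscript>` blocks. This is a minimal sketch using only the Python standard library; the sample markup and blog URLs are invented for illustration.

```python
from html.parser import HTMLParser

class NoscriptLinkExtractor(HTMLParser):
    """Collects href values of <a> tags that appear inside <noscript> blocks."""
    def __init__(self):
        super().__init__()
        self.in_noscript = 0
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "noscript":
            self.in_noscript += 1
        elif tag == "a" and self.in_noscript:
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

    def handle_endtag(self, tag):
        if tag == "noscript" and self.in_noscript:
            self.in_noscript -= 1

# Hypothetical markup, along the lines of the blog page described above.
html = """
<div id="app">JS-rendered blog list goes here</div>
<noscript>
  <a href="/blog/post-1">Post 1</a>
  <a href="/blog/post-2">Post 2</a>
</noscript>
"""
parser = NoscriptLinkExtractor()
parser.feed(html)
print(parser.links)  # ['/blog/post-1', '/blog/post-2']
```

Running this against the served HTML (e.g. fetched with a plain HTTP client, no JavaScript execution) shows what a non-JS client can see, which is the situation the noscript band-aid targets.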
A
Okay, so I'd kind of watch out for that. And the 600 links by themselves, that shouldn't be problematic, but it does sound like this is more of a temporary solution, kind of a stop-gap measure to try to get all of these pages back into the index, something like that. And for a temporary solution, I don't see any problems with that.
A
I think, for the long run, it would be worthwhile to figure out a way to make sure that you have a clear structure within the website that's crawlable, but as kind of a temporary step to at least make those pages findable, sure.

B
Okay, thank you.

A
Sure. Let me just grab you, Lindsay. I think you were somewhere in the list, I forgot where, but we'll see.
C
Great, thanks. I hope I'm not cutting anybody else off. My question, I don't know if it's quite applicable for this conversation or not, but I thought I would give it a shot. It's relevant to the top stories carousel and how it's sort of approached and curated in different countries.
C
I work in international SEO, so I spend a lot of time looking at SERPs in multiple regions, and I've just noticed a lot of differences by region, some of which are obviously good and relevant, because they're targeting different audiences with different search habits. But in Canada I've noticed some differences that seem to have a generally kind of detrimental effect on the quality of our SERPs. I posted something about this in the channel, so apologies if this is redundant, but wire stories are syndicated content, for example, so it's not unusual.
C
It's actually quite common for the same identical story to appear multiple times in top stories, side by side by side. So a user might search for a news topic and see identical content across publishers, which has always sort of struck me as unusual, and not something I ever see in U.S. search results.
C
Similarly, there's often more stale content. So for the same query, right at the same time, in, say, Canada, the UK and the US, the US often has much fresher results, even for a story that is not U.S.-specific in nature, whereas results in Canada or the UK might be, you know, four or eight hours, or a day, earlier. So obviously, in the news space...
A
Now, I don't really have a lot of insight into that, but in general we try to treat these kinds of search features as a part of the organic search results, and that means we do apply everything around geotargeting there as well. And I could imagine there might be situations in some countries where we just don't have enough content to make sure that we reach the, I don't know, high quality bar, essentially, for the top stories, and I could imagine that you might be seeing something like that.
A
At least it's not the case that we kind of purposely have different requirements in different countries. So that's kind of the direction I would suspect there. I think what would be useful, from my point of view, would be to have some screenshots and some examples from you. So if you run into this and you're like, oh, this looks terrible, then send them on my way, and I can pass those on to the team that works on these features, to give them feedback.
A
Usually these kinds of changes aren't things that they'd be able to do from one day to the next, but sometimes having some examples of where things are really bad makes it easier for them to say, oh, we need to prioritize working on international search quality, or whatever it is.
C
I've been tracking for about a year and have more examples than you would ever want, so I'll happily send those your way. Is there a preferred address or a place to send them?

A
Let me just drop my email address here in the chat.

C
Thank you.

A
Sure. All right, Akash.
D
Hi John. So actually I was working for a client on internationalization, and what I found is that for a few countries, it could be, say, France or the UK, with the proper hreflang tags we are able to rank the pages the way the scenario should be, but for a few countries we are unable to, you know, get the country-specific URLs in there. So we have added the proper hreflang link tags and sitemaps; we have separated them.
D
We make sure that the home page and the internal parts of those particular sections are built properly. But, you know, while a few countries have been able to get their proper URLs, for some other countries we are not able to achieve that. I've been working for more than two to three weeks on what the possible issues could be and how to figure it out, but I'm not able to find those things.
A
So, for example, if we think one of your pages should be ranked number five, and a user is searching in the UK, and you have one page that is marked up with hreflang for the UK, then we'll try to show that page. So it's not that it will change the rankings; it'll just try to find the best page, based on the set of pages that you have marked up with hreflang. So that's kind of the first thing. For ranking internationally, geotargeting would be the correct approach.
A
That's something you can do either with country-code top-level domains, or with a generic top-level domain and then either subdomains or subdirectories, and you can specify geotargeting in Search Console. Geotargeting changes rankings slightly when we think that the user is looking for something local. It wouldn't change anything if we think the user is looking for something completely global, but if we suspect they would like to have local results, we'll try to use geotargeting to improve that. That's the other thing. And I think the final thing also worth saying, specifically because hreflang doesn't change rankings: you need to kind of make sure that your website's different country-specific versions are properly known. That means they're properly linked within your website.
A
That means that you properly work to promote them externally as well, so that we know that these individual pages should be shown in the search results in individual countries. So those are kind of the aspects there. Hreflang is a good approach, but it doesn't change rankings; geotargeting would change rankings when someone is looking for something local; and with international sites, you always need to make sure that your pages are also kind of supported, so that we can show them in search.
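As a concrete illustration of the hreflang setup John describes, here is a small sketch (Python, standard library only; the domain and locale list are made-up examples) that generates the reciprocal link tags every country version of a page would carry. The key properties are that each version lists every other version, includes a self-reference, and ideally declares an x-default:

```python
# Hypothetical country versions of the same page; on a real site these
# would come from your CMS or URL routing.
versions = {
    "en-us": "https://example.com/us/widgets",
    "en-gb": "https://example.com/uk/widgets",
    "fr-fr": "https://example.com/fr/widgets",
}

def hreflang_tags(versions, x_default=None):
    """Build the <link rel="alternate"> cluster shared by ALL versions.

    Each version of the page embeds the same full set of tags, including
    itself; hreflang annotations must be reciprocal to be usable.
    """
    tags = [
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in sorted(versions.items())
    ]
    if x_default:
        tags.append(
            f'<link rel="alternate" hreflang="x-default" href="{x_default}" />'
        )
    return tags

for tag in hreflang_tags(versions, x_default="https://example.com/us/widgets"):
    print(tag)
```

The same set of tags can equivalently be expressed in an XML sitemap, as the asker mentions doing; the reciprocity requirement is identical either way.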
D
Okay, one more thing, John. How about, you know, telling a user that this is the proper URL for their country, rather than going to the more specific, like, canonical version? Because I see that in some other countries we have a canonical URL, the one which is, you know, the same for all these pages, and that's what's been getting ranked, rather than, like, the US getting the US version. So only for some countries has it been properly visible.
A
Usually it does take time, yes, but I'm not 100% sure if you're doing it right with the rel canonical in particular. If you have different country versions, you should not pick one country as the canonical, so that's one thing to watch out for: these pages should be separate, and they should be able to rank separately.
A
But the other thing that you mentioned, kind of, like, telling the user which is the right version: I would recommend using a JavaScript banner to tell the user, hey, we have a version that we think is a local version for you, and you can click on it here. If you use a redirect, that always makes it a little bit tricky, because we can't crawl through that properly, because we always crawl from one location.
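A rough sketch of the banner approach John recommends, as opposed to a redirect: the site only suggests the local version and never forcibly redirects, so Googlebot crawling from one location still reaches every country version. This is an illustrative Python sketch with made-up URLs, and the Accept-Language parsing is deliberately simplified (a real implementation would honor quality weights properly):

```python
LOCAL_VERSIONS = {
    "en-gb": "https://example.com/uk/",
    "fr": "https://example.com/fr/",
    "en-us": "https://example.com/us/",
}

def suggest_local_url(accept_language, current_url):
    """Return a URL to suggest in a banner, or None. Never redirects."""
    for part in accept_language.lower().split(","):
        lang = part.split(";")[0].strip()       # drop ";q=..." weights
        for key in (lang, lang.split("-")[0]):  # try "en-gb", then "en"
            url = LOCAL_VERSIONS.get(key)
            if url and url != current_url:
                return url
    return None

def banner_html(suggested_url):
    # Rendered into the page (e.g. by client-side JS); the user decides.
    return (f'<div class="geo-banner">We have a local version of this page: '
            f'<a href="{suggested_url}">switch</a></div>')

url = suggest_local_url("fr-FR,fr;q=0.9,en;q=0.8", "https://example.com/us/")
print(url)  # https://example.com/fr/
```

The important design point is in the last step: the suggestion is a visible link the user can click, not a server-side redirect that would hide the other versions from a crawler based in one country.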
E
Hi, so my first question is related to a product that we have, and this product is something that is available in different countries with different names. So, for example, in the US it's available as "product US", and in India it's "product India". So both these products have similar features and specifications, not exactly the same.
E
But similar. Now, what we were looking for is: if someone in India is searching for "product US", ideally they should see "product India" in the search results. We'll provide them with an option to go to "product US" if they want to, but in search they should see "product India". So do you think the hreflang tag is a good solution to the problem, or do we need something else?
A
I think hreflang is the right solution for that, because, like I mentioned before, it doesn't change rankings, but it swaps out the URL. So in a case like that, even if you have different product names, if someone searches for one of those product names, then we will still use hreflang to try to swap out the local version. So that would help someone searching for the US name, if they're located in India, to get the page from the Indian site instead.
E
Yeah, sure, sure. And just one more question; it's related to Search Console. I just want to understand: are the queries that we see in Google Search Console the exact queries that users are typing while searching, or is it something where Google is clubbing different queries with the same intent together into, you know, one query, and that's how it shows them?
A
We don't try to mix different intents together, but we do try to simplify some things in Search Console, and I think what happens is everything is lowercased and we probably remove some punctuation. I don't know the exact details with the punctuation, but it's not that we interpret the query and try to make a simpler version of it; we just do very, very light canonicalization of those queries.
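To illustrate the kind of very light normalization John is describing (this is a guess at the general idea based on his answer, not Google's actual implementation), a sketch might lowercase the query and strip punctuation while leaving the words themselves untouched, with no stemming or intent merging:

```python
import string

def lightly_canonicalize(query):
    """Lowercase and drop ASCII punctuation; nothing else changes."""
    lowered = query.lower()
    stripped = "".join(ch for ch in lowered if ch not in string.punctuation)
    return stripped.strip()

print(lightly_canonicalize("Hume AI!"))         # hume ai
print(lightly_canonicalize("what's hreflang?")) # whats hreflang
```

The practical takeaway for report readers is that two raw queries differing only in case or punctuation can show up as one row, while queries with genuinely different wording stay separate.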
G
Hi John. So I think my question is hopefully simpler than everyone else's, but I work at a startup and we just launched our website, and unfortunately, when you search our company name on Google, which is Hume AI, so h-u-m-e, space, ai, the meta description for the home page has really odd punctuation and formatting. We use a content management system called Contentful to update the meta descriptions, and we've tried to do that.
G
We've tried to force Google to reindex and recrawl the site, but that meta description never gets changed, and so it's kind of a sad and stressful thing on our part, because we worked so hard on the site, but, you know, it kind of looks janky when you search it on Google. So I'm not sure if you can sort of help me troubleshoot that, or if you know anybody I can contact. I kind of dropped in on the office hours because I couldn't find any other help out there.
A
Yeah, so I would definitely drop by the Search Central help forum, because folks there can probably help out. But I think there are a couple of things. On the one hand, we do sometimes update the description based on what people search for. So it's not always guaranteed that we'll use your meta description, even if you have a clean one; we try to match the description to what people search for. So if you have...
A
Okay, that does sound weird. Okay.
G
Yeah, like, if you just look up "hume space ai" on Google right now, literally, it's quite odd. So maybe we can take this conversation offline, but I also tried the search forum, and unfortunately there wasn't anything like this online, and nobody was able to help, so now this is like my last resort.
A
I mean, sometimes the description takes a while to update, so that's another thing. With regards to, I don't know, weird characters in the description...
A
I suspect that's something that would go away when we start reprocessing the meta description for your pages, and, yeah, that's just a matter of time.

G
It's been like two months, I think, so it's pretty long.

A
Yeah. And the other thing to watch out for, especially with titles and descriptions, is...
A
You should ideally make sure that they're unique per page, and that they don't come across as being too spammy. So not like a collection of keywords, but rather clear sentences where you're saying, our company is this and this and this.

G
But it is, yeah.

A
Okay. Can you drop the link to your site, maybe in the chat here, and I can take a look just afterwards, double-check if there's anything happening on our side.
G
That's fine. Or just, if you know of anybody who can respond, or who would be the point person for me to contact, that would be fantastic.
A
Cool. Let's see... Miguel.
H
My question is related to soft 404 errors. So our site is having an increased number of these errors on what look like valid pages, and we've been having these errors for several months. Before, we managed to solve it for a lot of pages by updating the text, a text that had been there for a while, because it contained something like "no results found". But now we are having this type of errors again, and, yeah, actually, we are struggling with how to fix it.
H
So I would like to ask if there is any specific text that might trigger these soft 404 errors, or any other issue. And also, we see that the Google index is saying that the page is a soft 404, but when doing the live test it's okay, and when visiting the page with a browser it's also okay. So, yeah, if you can give us some...
A
Hints? I would have to take a look at some of the pages. So if you have a thread in the help forum, with maybe some links from your site where you're seeing this, I'd love to be able to pass that on. So if you could maybe drop a link to the forum thread, or to some sample URLs, here in the chat, then I can pass it on to the team. Okay. So I think the text that you had in there before, where it's something like "no results found"...
A
That's less a matter of, like, how do I change that text so that it doesn't look like an empty search results page; rather, it should just not be an empty search results page, because empty search results pages are always tricky for us, because we don't really know if there's anything worth indexing there.
A
I would just make sure that you have significant content on these pages, so that it's not, like, a really short page that looks like an error page.
A
I would make sure that you're using a clear URL structure, so that it's clear to us that these pages are a part of your normal website, and not that these pages look like, let's say, search results pages, where you have, like, a parameter for searching and then some keywords, something like that. So those are kind of the tips there. But in general, if these are normal content pages on your website, then I feel our systems should be able to pick that up properly.
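As a rough way to audit for the kind of pages John is warning about, here's a sketch that flags pages whose visible text is very short or contains empty-results phrasing. The heuristics and threshold are entirely invented for illustration; Google's actual soft-404 detection is not public.

```python
# Phrases that commonly appear on empty-results pages (illustrative list).
EMPTY_PHRASES = ("no results found", "nothing found", "0 results")
MIN_WORDS = 80  # arbitrary threshold for "significant content"

def looks_like_soft_404(visible_text):
    """Heuristically flag pages that might be treated as soft 404s."""
    text = visible_text.lower()
    if any(phrase in text for phrase in EMPTY_PHRASES):
        return True
    return len(text.split()) < MIN_WORDS

print(looks_like_soft_404("Sorry, no results found for your search."))  # True
print(looks_like_soft_404("word " * 200))                               # False
```

Running something like this over a crawl of your own site can surface pages worth beefing up before Search Console flags them, which matches the "significant content, not an error-looking page" advice above.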
J
Okay, but that's actually not what I'm here for. I just wanted to take it back to, like, yeah, we had to think about how we can make this look like a profile list and not like a "results not found" page.
J
Anyway, the reason I'm coming in today is because we've been flying a little bit blind with our Search Console, because we have, like, a very large number of "crawled, but not indexed" pages listed under Excluded, but then, when we click into them, most of these seem to have converted into indexed pages. So we're really unable to accurately track how improvements to our site are impacting which pages are being indexed, and I was curious, I guess, about the timeline, and we're concerned it's impacting our crawl budget.
A
Okay. I doubt it would be affecting your crawl budget.
A
Kind of as a side note, that's good. It's something where I've recently seen some threads like this on Twitter as well, where people saw URLs that were flagged as not being indexed in Search Console, and then, when you check them individually, they are actually indexed, and I don't know exactly what is happening there yet. But if you have some examples, if you can send them to me, that would be fantastic.
A
Just so I can take a look. My suspicion is it's more a matter of timing, in that we show them in the Search Console report, and then they get indexed over time, and then at some point they would drop out of the report again, and for whatever reason that dropping out is taking a little bit longer than it should. That's kind of my guess there. One way to verify that is to see if these pages actually show up for normal searches.
J
I think we're just trying to really keep an eye out for actual problems that are preventing indexing as they come up, so it's, I think, a little bit difficult with those. But then, on my other question about being concerned about it impacting crawling: we did submit a Googlebot report asking about our numbers being consistently low despite improvements. Do you know what the timeline would be for hearing back from that team, roughly?
A
It's something where, if they find issues with regards to crawling, they just fix it and then kind of set things up normally. So, for the most part, you wouldn't get a response from that team. It's more like you're reporting it, and then they take a look and say, oh, it's an issue, or it's not an issue, and then they deal with that.
J
I see. Well, thank you so much. Do you have any other advice for getting insight into, like, the current crawl budget? Just because I feel like we've really been trying to make improvements, but haven't seen a jump in, like, pages per day crawled.

A
How big is your site?

J
Our site is in the hundreds of thousands of pages, and we've seen maybe around 2,000 pages per day being crawled, even though there's a backlog of, like, 60,000 discovered but not yet crawled or indexed pages.
A
So, in practice, I see two main reasons why that happens. On the one hand, if the server is significantly slow, which is kind of the response time; I think you see that in the crawl stats report as well.
A
That's one area where, if I had to give you a number, I'd say aim for something below, like, I don't know, 300 to 400 milliseconds on average, something like that, because that allows us to crawl pretty much as much as we need. It's not the same as the page speed kind of thing. So that's one thing to watch out for. The other big reason why we don't crawl a lot from websites is because we're not convinced about the overall quality of the site.
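To make the response-time budget concrete (the 300-400 ms figure is John's rough guideline, and the sample timings below are invented), here's a trivial sketch that checks whether server response times, like those shown in the Search Console crawl stats report, stay within budget on average:

```python
def within_crawl_budget(response_times_ms, budget_ms=400):
    """True if the average server response time is under the budget.

    Note this is server response time per request, not the full
    user-facing page-speed metric, as John distinguishes above.
    """
    avg = sum(response_times_ms) / len(response_times_ms)
    return avg < budget_ms

samples = [220, 310, 180, 450, 290]   # hypothetical per-request timings
print(within_crawl_budget(samples))   # True (average is 290 ms)
```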
J
Yeah, because we were, I think, in that camp of, like, too many lower-quality pages, and we've adapted over time, like, here are the pages that should be listed and should be crawled. But I feel like we're probably still in the lower-quality camp, and we don't know how to move up. But I guess it takes time.
A
Instead of saying you have a hundred thousand pages, you say, well, I have 20,000 pages that I want Google to crawl and index, and these are our 20,000 best pages. And by doing that, it's a lot easier for us, on the one hand, to say, well, we can crawl and index 20,000 pages, that's fine, and we can look at these pages and we can see, oh, these are actually performing really well, these are really good pages, and then, from there, over time, we can kind of expand to the rest of the site.
A
I would assume that takes a couple of months, maybe half a year, something along those lines, because we really need to take the time to understand the essentially new site that we find like that. Some things we'll pick up fairly quickly, but when we're talking about the overall quality of a website, that does take a bit of time.
J
Okay, thank you.

A
All right. Let me just take you, Mark, and then I'll switch to some of the submitted questions, so that we don't lose track of those either.
I
Sure, thanks, John. Yeah, I was wondering if Google has any updated guidance on what are called white-label providers, or those that take advantage of leasing subdomains or subfolders from high-authority sites without actually creating the content themselves, specifically after the latest core updates. We see many, I'll say, news sites doing this, and they're jumping up in rankings pretty quickly, taking maybe five or six of the top 10 search results for some high-volume terms in our industry.
A
Yeah, it would be "treated appropriately", that magic word. Yeah, I don't think we have any updated guidance on that. Essentially, we still feel the same way. The idea is essentially, on our side, to say, well, these are not a part of the main news website, but maybe we can show and rank them kind of independently as well. And that's kind of an approach that I think is reasonably fair, but I could imagine that over time there are some things that we need to reconsider.
A
Just so that we can pass it on to the team and say, look, we worked on this a couple of years ago, but maybe it's time to work on it again a bit.
I
Okay, yeah. Just trying to understand if that's, like, an acceptable model, so that we understand if that's something we'll see long term. So I'll definitely flag it on Twitter. And one other thing, real quick: after the latest core updates, we see People Also Ask coming up higher in terms. I was wondering if that changes with core updates, or how Google determines that, if it's, like, per search, or if it's kind of per industry, where they appear in the results.
A
I don't think it's per industry, but usually what happens with these kinds of elements in the search results is that we try to figure out what is the most appropriate thing to show, and with a core update, that does change some things in kind of the inner core algorithms, and it can result in kind of a different mix of elements that we show in the search results. So sometimes that's the People Also Ask; sometimes that's with regards to featured snippets or rich results.
A
Okay, let me run through some of the submitted questions, so that we have some of that covered. I think some of these overlap with what we talked about already. "How can a website related to other niches, like ringtones, handle high-quality content? When I search for 'high-quality content', everyone talks about writing high-quality articles, but a few niches, like the ones I mentioned, are completely different. So what can I do?"
A
Then it's really hard to go in there and say, I will provide something of higher quality. Usually my recommendation there is to look at this more as a business problem than an SEO problem, and if you want to start a business in a very competitive area with, essentially, a commodity product, then you're going to have a really bad time, or you're going to have to have, I don't know, a long, long time available to kind of work and improve things over time.
A
Okay. Wow, the next one, too, perfect. Yeah: "especially on large websites with a lot of noindex pages and a lot of internal links to those pages, do you think this can save crawl budget and link juice?" I don't know, I'm torn about this. I think, to some extent, if you're linking to things like category pages or different filtered pages on an e-commerce site, different sorting parameters, those kinds of things, I could imagine that using nofollow helps us to focus on the primary content.
A
But if you're talking about things like a privacy policy page that you just don't want to link to, I don't see how using nofollow on those links will change anything.
A
Yeah, I don't think it's something that has, like, a clear binary answer, where you should never do it. I could imagine in some situations it makes sense. Like I mentioned, if you have categories or filters and sorting parameters, things like that, I could imagine it helps us a little bit. But in general, even without the noindex there, without the nofollow there...
A
It's something that our systems tend to learn, though. So that's kind of, I think, the tricky aspect here, in that, when it comes to crawling websites, we try to understand which pages need to be crawled and how often, and if we see that a page is a noindex, then we'll say, well, we don't have to look at this that often, because it's essentially a soft 404, or something like that.
Yeah, so I don't think it's terrible to put nofollow on these pages, and it might have a positive effect in that regard. So that's something I would, in your case at least, try out, and you can test it out and see what happens. It's not the case that if you put nofollow on internal links, something will explode; it's just that, for normal pages, you wouldn't need to do that.
A
But in a case like yours, I could imagine maybe it makes sense. The other approach that you might take is to make sure that these pages are in a separate URL section, something like a separate folder, or a separate subdomain, or something like that, so that we can easily recognize that we don't need to handle them as often. If that's possible, then I suspect our systems would learn that over time as well. But I also don't see any problems with trying it.
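To make the category/filter case concrete, here is a sketch that decides which internal links might get rel="nofollow". The URL patterns are invented for illustration, and, per John's answer, whether this helps at all is a "try it and see" matter rather than a guaranteed win:

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical query parameters used only for filtering/sorting listings.
FACET_PARAMS = {"sort", "filter", "color", "price_min", "price_max"}

def should_nofollow(url):
    """Nofollow internal links that only vary by filter/sort parameters."""
    params = parse_qs(urlparse(url).query)
    return any(p in FACET_PARAMS for p in params)

print(should_nofollow("https://example.com/shoes?sort=price"))  # True
print(should_nofollow("https://example.com/shoes"))             # False
# A privacy-policy link gains nothing from nofollow, as noted above:
print(should_nofollow("https://example.com/privacy"))           # False
```

Putting such pages under a distinct URL section (a separate folder or subdomain), as John also suggests, makes the same distinction recognizable without touching individual links.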
F
Okay, thank you very much. So you do recommend to at least try it.
A
A question on schema markup for aggregate ratings: "if we have multiple services, but we collect the reviews in one generic place, like Trustpilot, how should we mark up the separate product pages? As I understand it, the advice is not to mark up each page, so at the moment I only have one marked up. We have about one thousand two hundred reviews, and our rating is approximately four point seven, so we're keen to shout about this."
A
So if some third party is collecting the reviews, and you just kind of take that aggregate rating and copy it onto your website, we would not consider that to be valid aggregate review markup, and it might be that we don't show it. It might even be that the webspam team says, oh, this is so problematic that we will suppress the reviews completely for this website.
A
So that's, I think, the bigger issue that you need to think about. Essentially, from our point of view, these third-party pages where you're collecting the reviews, those are the ones that would be able to mark up the reviews, but not your website, because you can kind of see it as essentially curating the reviews: you're taking a number from one website and you're manually copying it into another, or maybe you're using an API. But essentially, we have to trust you that you're taking the sum of the reviews, and not, like, a sample that you like. So that's kind of the reasoning behind that. "Is there any benefit to using schema markup outside of the featured snippet possibilities, for example using CollectionPage markup on category pages, to make a clear distinction of the entity?"
A
But the effect is not something that you would probably see in most cases, and it wouldn't be visible, because it's not a part of a rich results type, so we wouldn't be able to show that. So, if you're focusing on kind of the return on investment for implementing different structured data features, I would focus on things that are immediately visible in the search results, because that's something where you know that there is an effect, and the effect might not be a ranking effect.
A
But at least you see a visible effect of the work that you did, whereas if you add structured data to your pages and you have no idea if Google or anyone does anything with it, then that feels a little bit futile, and kind of like, well, you're trusting that some magic will happen along the way, in particular with something like CollectionPage markup, where you're saying, oh, this is a collection of different things. It would be slightly different if you're marking up things like individual locations, or individual entities and names and companies, things like that. That, I imagine, might be useful to us, because we can look at the text, and the text might be ambiguous, but your structured data tells us, oh, this is a location and not a Spanish word, or whatever.
A
"I got an algorithmic penalty from Google between the 31st of March and the 1st of April. I didn't get any manual action email from Google. After a few days, pages got indexed. What can I do to recover?"
A
That feels like something where there's a significant quality issue that you need to work on, and quality issues like that are not things where you need to flag to us that you've completed the work; rather, over time, we will kind of re-evaluate your site, and if things are okay, then we will start showing it differently again.
A
Structured data is always useful, in particular because, less and less, it feels like we're showing URLs in the search results; instead, we try to show these kinds of breadcrumb links, and if we have to figure out the breadcrumb links ourselves, then that's sometimes hit and miss. If you can specify those yourself with structured data, that's usually a lot better, so I would definitely aim for that.
A
"What do you recommend to young websites that can't outrank old websites, because those have many links?" Yeah, I don't know. This is almost like one of those business questions again: you're starting off with a new business, and what can you do to compete against these old businesses?
A
It's like, oh, some new site is ranking ahead of us. And essentially, from our point of view, we don't have any preference with regards to new websites or old websites; rather, we try to evaluate what makes the most sense to show to people, and it's not just based on links, so it's not just kind of blindly saying, well, old websites have more links.
A
But it is something where, like I mentioned before, if you're going into a market where there is a lot of really strong competition, then you should expect some headwind, and it's going to be hard. Usually it's a lot easier to start off with something where you're kind of ahead of everyone else, where you're providing some content that's really fantastic.
A
That doesn't compete with existing content, where you're doing something different from everyone else, rather than just saying, well, I'm writing about the news in a little bit of a better way. Essentially, if you're providing something different, then it's a whole different game, because then the old websites might not have anything that competes with that.
K
Hey John, how's it going? Hi, hey. So our question is: we're ranking in some competitive niches and we've been exploring different link building strategies. One of the ones that has given us the most interest is actually Google property stacking, and I just want to know if it's still acceptable to do with the new core update.
K
So during our research we found out that some people were using Google properties, like, say, for example, Google Slides, Google Sheets, Google Docs, creating them within a Google Site, and then linking that Google Site into their website to try to boost authority. Is that an acceptable strategy, or what do you think about that?
A
I don't think that would work at all, because, given the way that all of these user-generated content sites are handled, I don't think that would change anything. So kind of going with the mindset that, oh, it's hosted on a google.com domain, therefore it'll be valuable for my website: I don't think that really works.
A
So, I don't know, that sounds very questionable. I mean, creating Google Docs and hosting them on your website, like, why not, but I wouldn't expect anything magical to happen out of that. I think in general, with regards to link building, I would recommend being super cautious and really thinking about what Google is trying to do with regards to links, because we want to see essentially natural links, in that you're providing something fantastic and people are saying, oh, look at this fantastic thing.
K
Right, well, we found this issue when we were looking through, doing some backlinking research, and we found that in the city of Austin there was this Google Site that just had iframes all over, like Google Calendar, Google Docs, and it was ranking number one, and it's been ranking number one for well over six months. And we're just like, man, this looks terrible, but how is it number one? What's going on there? So we need to come up with some sort of solution for that, yeah.
A
No, I mean, it's definitely not because of all of those embeds. That would be crazy; I don't think anything like that would happen. I mean, in the early days there was also talk that links from Gmail, for example, would help your website to rank well, and it just doesn't make any sense.
A
So that's something where I suspect someone is just trying everything out, and one thing might be working, one thing might be okay, and all of the rest is essentially ignored. And then, when you look at it, you say, oh, they're doing all of these things, maybe all of these things are working, and actually it's something completely unrelated. We often see that also with small business sites, in that they'll do a lot of things wrong, and our systems are essentially tuned to ignoring a lot of these wrong things.
A
All right, Barry.
L
Good morning, John. Hi, hi. I've got another soft 404 issue. I've got a site I'm working on; they have about 40,000 products.
L
I've posted this in the Google Search Console support forum, and they don't see an issue with the sample page. In Google Search Console, it's like each time there's a crawl, there are more pages added to the soft 404 list, and if I take a product that's showing as fine and then revalidate it, it will then give a soft 404 error.
A
So if you can maybe copy a link to the forum thread here in the chat, then I can pass those examples on to the team to make sure that we pick that up properly. I mean, it's something where I would say we shouldn't be recognizing a random line in JavaScript as being a reason for a soft 404, and maybe you can fix it, but we still shouldn't be recognizing that.
B
Hi, just a very, very quick follow-up on my first question about putting in all those noscript links: the result of that is the HTML file went from like 89 kilobytes to about 200 kilobytes.
A
The only thing I would watch out for is the general speed of the page itself, which you can test with all of the Core Web Vitals tools, and my assumption is that this kind of content would compress fairly well. So it wouldn't be that it slows down the Largest Contentful Paint or anything like that.
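That assumption, that a long, repetitive list of fallback links compresses well, is easy to check. A minimal sketch in Python; the URL pattern and the count of 600 posts are stand-ins for the blog discussed earlier, not real data:

```python
import gzip

# Hypothetical noscript fallback: 600 nearly identical <li><a> entries,
# mimicking the list of links to every blog post.
links = "".join(
    f'<li><a href="/blog/post-{i}/">Blog post {i}</a></li>\n'
    for i in range(600)
)
raw = links.encode("utf-8")
compressed = gzip.compress(raw)

# Repetitive markup like this shrinks dramatically under gzip, which is
# what servers typically send over the wire when compression is enabled.
print(f"raw: {len(raw)} bytes, gzipped: {len(compressed)} bytes")
```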
A
Cool. Okay, let me take a break here and pause the recording, and I'll be around a little bit longer. I have to leave in like half an hour, but until then we can hang around, and I can finish with the people who have raised hands as well. If you're watching this on YouTube, thanks for watching, and hopefully you can join in one of the future ones. Also, for all of you here as well as those watching:
A
We have a survey running on YouTube at the moment, and I would love to have your input there. Essentially, we're trying to figure out what we should tweak on our YouTube channel and our YouTube strategy, and if there are any things on your mind that we should do differently, or that we're doing well, let us know about that. All right, so thank you for watching, and maybe see you all next time. And pause the recording.