From YouTube: English Google SEO office-hours from November 5, 2021
Description
This is a recording of the Google SEO office-hours hangout from November 5, 2021. These sessions are open to anything search & website-owner related like crawling, indexing, mobile sites, internationalization, duplicate content, Sitemaps, Search Console, pagination, multi-lingual/multi-regional sites, etc.
Find out more at https://goo.gle/seo-oh-en
Feel free to join us - we welcome folks of all levels!
A: Lots of questions were already submitted on YouTube, but it also looks like we have a bunch of people with their hands raised, ready to go, so maybe we'll just go through some of those. First, let's see: Abraham, I think you're first.
B: Just specifically for that shop where we got spammy backlinks, I tried a lot to disavow them with Google's disavow links tool, but at the same time, for that shop, called Vitality and Sauna Store, we increased like double in traffic and visibility. So I'm really not sure if this was because of just the spammy backlinks; I assume so. So I would like to get your recommendation on that. What should I do? What can we do? We're trying to figure things out.
A: Okay, it's hard for me to say without digging into the details, but in general, with the core updates, if you're seeing changes there, usually that's more related to trying to figure out what the relevance of a site is overall, and less related to things like spammy links. So that's something where I wouldn't expect any reaction in a core update based on random spammy links that go to your website. And also with core updates, you can make incremental changes to improve your site over time with regards to the overall quality, and that will incrementally help there. But if it was a really strong adjustment with a core update, then you probably need to wait until the next core update to see those changes. And I don't think it would be related to the spammy backlinks.

And I imagine it's tricky if you have multiple shops that are fairly similar, in that it's probably not the case that one of them is really bad and the other ones are really good. But it might still be something where you can use maybe user studies to figure out what the differences are, what things you could do to make it clear that this site is particularly relevant. And I think, especially with regards to websites like pharmacies, it is something where our algorithms probably try to be a little bit more critical, just because there's just so much more involved. It's not a random website that has a story and a funny picture; it's people's health that's involved.
B: Thank you. Thank you so much. So we need to work more on the quality of the content, it seems, because we have kind of the same content on all shops, and because of restrictions we cannot always change that. And also we are solving, and have already solved, a lot of 404 pages in the sitemap. But thank you so much, yeah.
A: Yeah, especially with regards to things like 404 pages and technical issues, that would not be related to core updates. Core updates are really more about understanding your site's overall quality and its relevance, and less about technical issues and less about spam. Good luck. Let's see, Angie?

C: Hi John, I have a question about JavaScript SEO and getting JavaScript content indexed. So, I work on a blog website, first of all, and for Core Web Vitals we have a feature where we put a YouTube video at the top, so that became the LCP element, and it was heavier than when we just had a regular image. So we're trying a method where we're dynamically injecting it. I saw a web.dev article that suggested lazy loading with a facade.

Since it's not below-the-fold content, we don't lazy load it, but we're using a facade, and then the iframe is dynamically injected when the user clicks the play button. I'm realizing now that the articles are not being indexed with the video content on the page.

Essentially, if I search for the page and I go to video search, it doesn't appear there. I'm wondering what the best way to get that content indexed with the page is. I mean, I know I can do things like submit a video sitemap and stuff, but for the video to basically be seen as related to the web page, the way that it was indexed before, is it something like noscript or structured data?
A: Yeah, depending on the way that you set up the facade that you mentioned there, where you click on an image essentially, or a div or something, and then it loads the video in the background, it can definitely be the case that we don't automatically pick it up as a video when we view the page. And I have had feedback from the video search team telling us, like, we shouldn't tell people to do this because it causes problems like that.

Essentially, the best approaches there are at least to make sure that, with structured data, we can tell that there's still a video there. So I believe there's a kind of structured data specifically for videos that you can add.

So from that point of view, I'm kind of torn: if the video team tells me you should put it in directly, and the other team says you should make things fast, then it's hard to find the middle ground. But I think at least making sure that we can recognize that the video is there, that's really important.
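The video structured data John refers to is schema.org VideoObject markup, emitted as JSON-LD in the page's head. A minimal sketch of what that could look like, with all titles, dates and URLs as made-up placeholders rather than anything from the session:

```python
import json

# Minimal VideoObject structured data (schema.org), so search engines can
# recognize a video that a click-to-load facade hides from the raw HTML.
# Every value below is an illustrative placeholder.
video_markup = {
    "@context": "https://schema.org",
    "@type": "VideoObject",
    "name": "Example video title",
    "description": "Short description of the embedded video.",
    "thumbnailUrl": ["https://example.com/thumb.jpg"],
    "uploadDate": "2021-11-05",
    "embedUrl": "https://www.youtube.com/embed/VIDEO_ID",
}

# This string would be placed inside a <script type="application/ld+json"> tag.
json_ld = json.dumps(video_markup, indent=2)
print(json_ld)
```

The point of the markup is exactly what the answer describes: even when the iframe is only injected on click, the structured data tells crawlers a video is present on the page.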
C: Okay, so for the VideoObject structured data: I think there's a guideline, a general structured data guideline, that you can't mark up content that isn't visible on the page. So could it potentially be seen as misleading, because Google can't actually see the video?
A: I think that's fine. No, I don't think you have to worry about that. That guideline is specifically for when there is no actual video on the page; that would be a problem. And specifically with text-based structured data, that's something we can try to figure out automatically, but when it comes to videos, there are lots of different ways of embedding videos, and some of the ways to embed videos we just don't pick up automatically, and because of that we have the structured data. So from that point of view, that should be fine.
C: Okay, and then, I know there are a few different options here, so I wanted some clarification on the noscript tag. I know that's recommended for users who don't have JavaScript on. How does Google handle that, and is it recommended to also include that just for when the page doesn't get rendered?
A: We generally ignore the content in the noscript, so I don't think that would be a workaround if you're trying to include something for indexing. So from that point of view, I would use the other methods more.
D: Thank you. Thank you. So I have a question: I'm testing with a plugin in WordPress that generates dynamic content, but then, when you check the HTML code, you actually don't see any of the words that we're trying to use within the text, within the subheadings. And I was wondering if Google would have a problem understanding what's, well, in the end shown to the end user. There's really nothing, nothing in the HTML code.

It just says something like, you know, the tags; it says "dynamic content", but you can't really read the words that we're actually showing to the user. So I'm not sure if that's also something that Google would be able to see in the end, and whether we can perhaps rank for the keywords we want to rank for, or what's going to happen there.
A: I don't know that plugin, so it's hard for me to say. One thing you can do — there are so many different things — so one thing you can do is use the Inspect URL tool in Search Console to check that page, and you can check the HTML there. That HTML is based on rendering, so on when we kind of try to process the page like in a browser. And if, in that HTML, you find the content that you want to have shown, then we can index it. But if it's not in the Inspect URL HTML, then we're not seeing the content.
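The check John describes — does the content appear in the HTML Google ends up with, or only after JavaScript runs — can be sketched with a simple substring test. The two HTML strings below are stand-ins for a real server response and the rendered result, not anything from the session:

```python
# Rough illustration of raw vs. rendered HTML: if a phrase is missing from
# the HTML the server returns, search engines only see it after rendering,
# which you can verify with the Inspect URL tool in Search Console.
raw_html = "<html><body><div id='content'>dynamic content</div></body></html>"
rendered_html = (
    "<html><body><div id='content'>Our actual subheading text</div></body></html>"
)

def contains_phrase(html: str, phrase: str) -> bool:
    """Case-insensitive substring check; a real check might parse the DOM."""
    return phrase.lower() in html.lower()

# The phrase only exists after rendering, so indexing it depends on
# the page being rendered successfully.
print(contains_phrase(raw_html, "actual subheading"))       # False
print(contains_phrase(rendered_html, "actual subheading"))  # True
```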
F: Hey John, super quick one from me. I've sent a lot of applications to the book schema rich snippet application tool, and we've done everything on our site to add the documentation.

So, basically, when you apply for the book featured, like, rich snippet, on the schema documentation there's a Google form to apply to have the website that you look after be included within that. And so you kind of submit it, and then nothing happens. And I just wondered if there was an internal team that looks after the application process, or if that inbox is kind of just not checked. I was just kind of wondering; we spoke about it a couple of weeks back.
A: For books, I can check afterwards, but usually I pass these on. Or, if you can drop the details into the chat here, I can copy and paste it out and make sure that they have it.

F: That'd be amazing.
G: Hey, hey John. So I'm working on — actually, I'm a content creator that creates blog posts for an e-commerce site, and we're selling, like, super technical products. It's metal profiles, and we have many different types of those profiles, and we have a lot of thin content because we have the same variations.

Basically, if we have one U-shaped profile, for example, we have the U-shaped profile in green, in red and in blue, let's say, and when we click green, red and blue, those are all three different URLs. And within this URL we have, like, thousands of variations and combinations and lengths and thicknesses and all that. And I just don't know how to handle this, because I don't know, should I canonicalize those? But then again, we're linking to them.

So I don't want to create bad in-links or whatever, or bad quality links inside of my page. Or if I should, I don't know, noindex them or block them from robots. I really don't know; I don't have experience there.

Everything on the internet seems to be very — nobody seems to agree, and some people handle it one way and other people another way, so I'm not quite sure what the best thing to do is. And yeah, we don't need people to come to those variation pages; we don't want that. We just want to create good quality on the category pages.
A: But if you don't care about those individual URLs, if you care more about the higher-level categories, or if you have something like a broader product or category that essentially is the most important way of finding the content, then you can canonical to that page. You can noindex the other versions if you want; you can essentially do whatever you want there, to kind of make it so that we focus all of our signals on that main page that you do care about.
A: That's fine, that's fine. Like, if you link to those variations and you don't want those variations findable, that's perfectly fine. And this is something where I suspect some of the confusion online is just based on people having different opinions on what they consider to be important.

So if you're selling shoes, then probably a specific shoe model is the product that you care about. If you're selling shoes for, maybe, extremely large feet, then you want the shoe size to be indexable as well. So it's something where people have different, I don't know, priorities, and you can kind of pick and choose what you want to care about.

Like, I don't know, if you're selling a phone, you might have the different phone colors all on one phone product page. If one of those colors is gold, and it's, like, super fancy, then maybe you want to have a separate gold page and just all the other colors together on one page. Everyone does it a little bit differently, and I imagine with industrial products like you mentioned, where you have so many different dimensions and variations...
A: You don't need to have the same content. The thing to keep in mind with setting a canonical is that we will try to index the canonical page that you mentioned. So if there's anything unique on the non-canonical pages, then we wouldn't be able to find that. So, essentially, anything that's critical, make sure it's also mentioned on the canonical page.
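The variant setup described here — color and size variants that all point their canonical at one main product URL — can be sketched like this. The URL scheme is made up for illustration; it simply assumes variants differ only by query string:

```python
from urllib.parse import urlsplit

def canonical_for(variant_url: str) -> str:
    """Strip the query string so every variant points at the base product URL.

    This assumes variants are expressed purely as query parameters
    (e.g. ?color=green&len=200), which is just one possible URL scheme.
    """
    parts = urlsplit(variant_url)
    return f"{parts.scheme}://{parts.netloc}{parts.path}"

def canonical_link_tag(variant_url: str) -> str:
    """The <link rel="canonical"> tag to emit in the variant page's <head>."""
    return f'<link rel="canonical" href="{canonical_for(variant_url)}">'

print(canonical_link_tag("https://example.com/u-profile?color=green&len=200"))
```

As the answer notes, anything unique that only exists on a variant page won't be indexed once it canonicalizes elsewhere, so critical details belong on the canonical page itself.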
G: Okay, all right, thank you very much.

A: Sure. Christian?
H: Yeah, hi, hi John. This time I have a question regarding Discover. In Discover we have, yeah, more or less two different, let's say, result pages: we have the first page, and then you can click on "more results".

And then you have the second page. And my question is: is there any kind of ranking involved in this, so that some article ends up on the second page? Or is it more a matter of time, of when it is updated, or something?
A: There is probably a sense of ranking, but I don't think it's the same as traditional web ranking, in that Discover is just so personalized. So it's not something where I think it would make sense to have kind of the traditional notion of, oh, you opened a Discover page and you're number five, and maybe next time you're number four, or something like that.

So there is a sense, internally within the product, of trying to figure out what is most important for you, or most relevant for you, now, when you're browsing Discover, but I don't think any of that is exposed externally. So it's, I don't know, it's basically a feed, the way that I see it; it just keeps going.
H: So this would be kind of a personal ranking, which only involves my personal interests? Or are there other aspects, like, let's say, the strength or popularity or anything else of a website, which would also take effect on this ranking?
H: Okay, so the best way to cope with it would be to follow the recommendations there are for being listed in Discover, which have been published by Google some time before?

A: Yeah, yeah, I would follow those recommendations, in particular watch out for the aspects where we say "don't do this", those kinds of things. And I would also look around externally: on Twitter there are a handful of people who are really almost specialized on Discover, and they have some really cool ideas. They've written some good blog posts on what they've seen, the kind of content that works well on Discover and the kind of content that doesn't work well on Discover.
A: Cool. Let me jump to some of the submitted questions, and I'll get to all of you with raised hands as well later on, but just to make sure that the submitted questions also get covered a little bit. Let's see, a question regarding 301 redirects and cache-control headers.
A: The whole crawling and indexing system is essentially different from browsers, in the sense that all of the network side of things is optimized for different things. So in a browser it makes a lot more sense to cache things, to cache things longer, but essentially, from our point of view on the crawling and indexing side, we have different things we need to optimize for, so we don't treat crawling and indexing the same as a browser.
A: And then the second question is: will Google accept 301 redirects with Cache-Control: no-cache, or a time, in the headers, in order for us to get the best of both worlds? Yes, that's perfectly fine. If it's a 301 redirect, we treat it as a 301 redirect; it doesn't matter what kind of cache headers you also add on top of that.
A: So from that point of view, if this is a solution that works well for your dev team and for yourself, why not? I think that's perfectly fine. The other thing is, 302 redirects might also be an option, if that works better for your dev team. 302 redirects have a bad reputation among SEOs, which I think is incorrect, because they do work the same as normal redirects as well.
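The "best of both worlds" setup from the question — a 301 whose Cache-Control header keeps browsers from caching the redirect indefinitely, which per the answer Googlebot still treats as a plain 301 — can be sketched framework-agnostically. The handler shape below is invented for illustration, not any particular web framework's API:

```python
# Sketch of a redirect response carrying both a redirect status and a
# Cache-Control header. Per the session, Google's crawler treats the 301/302
# status as-is regardless of the cache headers; browsers honor the caching.
def redirect_response(location: str, permanent: bool = True):
    status = 301 if permanent else 302
    headers = {
        "Location": location,
        # Stops browsers from caching the redirect forever; ignored by
        # Googlebot for redirect handling.
        "Cache-Control": "no-cache",
    }
    return status, headers

status, headers = redirect_response("https://example.com/new-page")
print(status, headers["Location"])  # 301 https://example.com/new-page
```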
A: A question regarding images. I have portfolio-style sites with pages each showing 10 to 30 thumbnails, displayed in a gallery and in full size when clicked. Depending on the CMS plugin used, the full-size image is or is not included in the page HTML. So, for instance, we have a kitchen URL showing a portfolio.
The plugin has a feature where you can load kitchen?image=123, and it'll load exactly the same page, but JavaScript will open a full-size version of the image, providing a sort of unique URL for that image.

In reality, there's only one page containing the entire image set, and it's only JavaScript that is kind of making that work. Do you advise on considering each image as having its own page and listing it in a sitemap, or should I canonicalize them all to the main kitchen page and list only kitchen in the sitemap, as in reality there's only one HTML page there?

So, in general, I would treat these as unique pages if, when the page is loaded and rendered in the browser, it shows something unique.
A: A landing page is something that really helps when it comes to image search, because otherwise, if we just have this portfolio page with, like, 30 small thumbnails on it and someone was searching for an image, it's hard for us to say, well, you'll find the information you need on this big landing page with lots of images on it. Because chances are, the image is somewhere where they don't see it offhand, and they'd feel kind of confused if we sent them there after searching for that image.
A: The thing with image search is that it works kind of separately from normal web search, and not all sites care about it. It's not the case that if you have good performance in image search, you'll have better performance in web search. You can essentially treat these as separate things, and from that point of view, if you only care about web search, if you don't really need these images to be visible in image search, then this might not even be something that you need to care about.
A: We run a small website with fewer than 200 URLs, and performed a major redesign to a headless CMS to improve the quality of the user experience a few weeks ago. Our traffic and rankings dropped significantly a few days later. How long will it take for Google to reassess the new design and return to the previous search position?
A: Our URLs appear to be crawled post-update in Search Console. The migration used the same content, URL structure, metadata and navigation, with only a few technical issues; for example, the h2 headers were not shown in the HTML, which may have caused some confusion, but which were resolved two weeks later.
A: So it's hard to say what all was involved, but in general, with bigger site relaunches, it is important to watch out that, as much as possible, you reuse the old things, if you really want to be seen as a new iteration of the old website. So using the same URLs — it sounds like you're doing that. Things like internal linking matter; the text on the pages, the headings, the overall structure of the pages.
A: All of that matters a little bit as well. And just from looking through this question, because I don't know the website, my feeling is that there's more of an almost technical issue involved here that's causing these problems, rather than something subtle from the redesign that you launched. Especially if you're saying you have the same URLs, the URL structure is the same, the text is mostly the same, and you're seeing a 60% drop in traffic from search — to me, that would point more at some kind of foundational technical issue that maybe popped up together with the redesign. Which could be, I don't know: maybe we can't crawl your site at all anymore, or maybe we can't reach your server properly, or maybe the hosting setup that you're now using is detecting Googlebot as a rogue bot and blocking Googlebot.
A: Those kinds of things are elements that I would look at here. If this were a redesign where you just changed everything — new URL structure, you set up some redirects, and it's essentially a completely different website — then I would expect this kind of a drop or a change. But if essentially everything is the same and only small things have changed, then I wouldn't expect that big of a drop, and it would be more of a technical thing.
A: Let's see, the second part of the question: if improving the quality of pages is important for Google, with the page experience update, why does it take Google so long to recognize these improvements? It seems counterintuitive for SEOs to commit to making a real difference to improve the web experience if they have to sacrifice losing search position and traffic for several months.

Yeah, I think if you make a bigger change on your website, then sometimes you do see fluctuations, but that's not something where we would say those fluctuations are there because you improved your website. It's just that we have to re-understand a website when you make really big changes across it. But from my point of view, a lot of the restructurings that you can do across a website, depending on how you set them up and how you execute on them, you can do in a way that essentially has a very smooth transition when it comes to search, so that it doesn't cause your whole website to disappear.
A: If you're seeing this, or if you're here — I don't know, I don't see you offhand here — but if you want to drop me a URL, maybe on Twitter, to double-check, I'm happy to take a look.

Does having multiple pages not indexed due to quality affect the overall site crawlability?
A: But if you have, I don't know, a normal ratio of indexable to non-indexable pages, where we can essentially find the indexable pages very quickly and there are some non-indexable pages on the edge, essentially, I don't see that causing any issue at all with regards to crawlability. And this is not something that is due to kind of quality reasons, where Google says, oh, noindex pages are bad. It's purely a technical thing: if we have to crawl a million URLs, we have to crawl a million URLs to see what is there.
A: We've heard that temporary 302 redirects don't pass page link equity. Is our understanding of that accurate? More in general, we've heard that using 302s causes significant SEO issues, which has caused us to wonder if we should avoid these at all costs, or if there are specific instances where we should use them.

Good question, kind of similar to the previous one on redirects, and the answer is a clear no: there is no negative SEO effect from 302 redirects. I think the whole feeling of "you lose PageRank when you do 302 redirects" is false, and it comes up every now and then. I think the main reason why this comes up is because 302 redirects are, by definition, different, in the sense that with a 301 redirect you're changing the address, and you want Google's systems to pick up the destination page, and with a 302 redirect you're saying, well, this is temporarily somewhere else, but you want Google's systems to keep the original URL.
A: So if you're purely tracking ranking of individual URLs, then of course a 301 will cause the destination page to be indexed and ranking, and a 302 redirect will keep the original page indexed and ranking, but there's no loss of PageRank or any signals assigned there. It's purely, well, which of these two URLs is actually indexed and shown in search. So sometimes 302 redirects are the right thing to do; sometimes 301 redirects are the right thing to do. And if we spot 302 redirects in place for a longer period of time, where we think, well, this looks permanent, we'll treat them that way.
A: I hope you can help me with this. Is it normal for a URL to rank high on the first page for its chosen keyword, and to be findable with a site: query, yet it can't be found by doing a simple URL search in Google? I have lots of URLs like this across many sites, and I don't know if my sites are having any filter applied to them.
A: I noticed that previews in Publisher Center don't show thumbnails for WebP image formats. There seems to be little to no information on the web. Since there is a constant urge to improve metrics and user experience, could you clarify if the format is currently supported, or offer advice on adoption with regards to Discover and Google News?
A: A simple way to double-check would be to see what these pages show up as in search directly, and if they look okay to you there, then probably it's just a bug in Publisher Center. What I would do here, though, is pass that feedback on to the Publisher Center team.
A: Let's see, another redirect question, I think. I work with a developer who's merging two sites for a mutual client. I would like to see site A redeveloped to include valuable content from site B, and then specific 301s from site B to site A, with site B remaining hosted until I see the re-indexing.
A: Site B will actually be replaced with site A and no longer exist. So my question is: is there any difference between keeping site B hosted with 301 redirects versus renaming site B to become site A? Is that confusing? Yes. To me, it feels like a bad idea, but I can't find anything specifically talking about the difference between having the content hosted and 301ing, versus redirecting the domain and there being no actual website files anymore.
A: I would try to find a way to do that in one step, or find a way to incrementally move things from site B to site A and then build the new structure directly on site A, incrementally. That's kind of the first thing.

The other thing, with regards to redirects or just hosting everything on the new site: my suspicion is it's exactly the same thing, from a kind of external point of view.
I: Maybe I'll ask a few more afterwards. So, a few of them are related to the topic of one of the previous questions, regarding multiple versions of a given product and canonicalizing them. So I'm curious: in cases where you have, like, unique products, but their only difference is a very slight variation, there's literally the same content on every page, and maybe the image or the color or something like that is different, but not something that people would actually search for.
I: However, let's say you have, like, 10 products, and eight of them you cannot canonicalize. Would that mean that Google only sees, like, a category with two products, and maybe figures out that, well, this category only has two products, so maybe we won't rank it as well versus other sites that maybe have more?

A: I don't think that would be bad. I don't think the number of products on a category page is a ranking factor, from that point of view, so I wouldn't see that as problematic. Also, on a category page, even if you have only two pages that are indexable linked from there, you still have things like the thumbnails and kind of product descriptions and things like that that are also listed on the category page. So kind of having the category page with 10 products where only two of them are indexable...
I: And so, in terms of maybe internal linking, would that make any difference? Like, maybe on the product pages you have breadcrumbs that lead back to the category page, with the canonicalizing of so many products. Would that make an impact, or does this basically sum up to the same result?
A: I suspect it's probably all the same.

I: Probably all the same, right. Okay, one more question regarding this scenario: if you have, like, eight variations of that product, and all of them canonicalize to one of them, but the one that's the canonical version goes out of stock, while you do have the other ones that are in stock, would it be okay to move the canonical to one of the products that's in stock?
I: Okay, yeah, I have a few more questions, but I'll just leave it to...
J: Hi John, nice to see you, nice that I can talk about one problem I have with a client. It's, I think, a standard problem: many URLs blocked by robots.txt.

Also, they have an HTTP header that is set to noindex. So I said, open up the robots.txt so the URLs can be deindexed, but the client is afraid that the server won't be able to process all the requests and will fail. So I said, well, if you see the user agent is a bot, you can just give out a blank HTML body, or maybe another kind of page. So my question is: is there any risk of getting penalized because of cloaking, if the HTTP header is set to noindex?
A: No, I don't see any problem with that. In particular, if you're showing search engines less than you would show users, then that's less of an issue with regards to cloaking. The part of cloaking that is more problematic for us is if you show us, like, a really big and interesting page, and when users get there, they see something really tiny or slightly different.

But if you're showing us essentially an empty page and saying, oh, there's nothing here, you shouldn't index this page, and we drop it out of the index, then we don't care if users see something else. Because, from our point of view, what we want to avoid is that we promise users something that they can't find.

So if we drop a page from our index, we can't recommend that page, because we don't have it anymore. But the other way around: if we recommended a page to people for a specific query, and they go there and they can't find that content, then they're frustrated and they think we did a bad job, and that's where our cloaking concern comes from.
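The setup being blessed here — serve bots a noindex signal with a minimal body once robots.txt stops blocking the URLs, so they drop out of the index without load on the server — can be sketched as follows. The user-agent check is deliberately simplified; in practice Googlebot should be verified by reverse DNS rather than by trusting the header, and the handler shape is invented for illustration:

```python
# Sketch: bots get an X-Robots-Tag: noindex header and an empty body;
# users get the full page. Showing bots *less* than users is, per the
# session, not the problematic direction of cloaking.
BOT_TOKENS = ("googlebot", "bingbot")

def response_for(user_agent: str):
    is_bot = any(token in user_agent.lower() for token in BOT_TOKENS)
    headers = {"Content-Type": "text/html"}
    if is_bot:
        # Header-level noindex works even with an empty body, but only if
        # robots.txt allows the URL to be crawled so the header is seen.
        headers["X-Robots-Tag"] = "noindex"
        body = ""
    else:
        body = "<html><body>Full page for users</body></html>"
    return headers, body

headers, body = response_for("Mozilla/5.0 (compatible; Googlebot/2.1)")
print(headers.get("X-Robots-Tag"), len(body))  # noindex 0
```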
K: My question today is related — it goes back to the first question asked today, about the spammy backlinks. I've noticed — I'm noticing, this period — I'm using one of the Google tools for alerts, Google Alerts.

It gives me some tips about who is talking about our websites, and, let's see, yeah, I'm seeing a lot of URLs coming from kind of AI-generated content. But those sites are not indexed in Google; they are clearly fake sites, so it's okay, Google doesn't see them. And I don't see them in Semrush or in Ahrefs either.
K: The only place where I see these links is in Google Alerts, so I'm a little bit worried about why I get this type of information from Google when these sites are clearly not relevant. And this takes me to another thought, about the use of artificial intelligence, especially at Google: with the rise of the MUM technologies and all this kind of stuff, we're seeing a lot of black-hat SEO, and some sites using AI-generated content to try to trick the Google algorithm.

I know you are probably aware of this, and I'm curious: how are you dealing with it? I mean, what can we do to stay out of all of this spammy backlinking?
A: Google Alerts is essentially trying to find content as quickly as possible and alert you to it, and because of that, my assumption is that it picks up things that we see for search before we do all of the complete spam filtering. Because of that, we can end up sending out alerts for things where actually, when we take a second look for indexing, we say, oh, this is junk, we can get rid of it. And I see that with Alerts as well.
A: So if they're not being indexed, if they don't show up in the other tools, I would just ignore them. That's kind of the one thing.

With regards to automatically generated content, I feel like this is a topic that has actually been around for a really long time, because there have just been so many different ways of generating content automatically, and some of it is used in useful ways, and some of it is used in really spammy ways.
A
I've made test sites with auto-generated content since, I don't know, before I joined Google; what was that, maybe 13 or 14 years ago? It's something where these technologies exist, people will use them, and search engines will index some of that content. But for the most part we'll look at it and say: oh, this is really low-quality content. We'll see that these pages don't collect signals the way that normal pages do, and because of that we'll basically say...
A
But my feeling is that at some point that is going to shift a little bit, in the sense that we'll focus more on the quality rather than on how it was generated. That's something where I could imagine that there might be some setups for automatically generated content where you can actually create something fairly useful, where, based on the input data that comes in, it's actually something that is useful for people.
A
So I know, for example, that in the US some of the news sites use feeds from the different, I don't know, geological institutes for earthquake detection, and they will automatically generate a news article if they see that one of these feeds says: oh, there was an earthquake in this big city. They will take this automatically generated content and publish it initially, because the point is to get the information out there as soon as possible.
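The feed-to-article workflow described above could be sketched roughly like this. This is a minimal illustration only: the entry shape loosely follows the public USGS GeoJSON earthquake feed, and the function name and template are hypothetical, not any news site's actual pipeline.

```python
def earthquake_article(feature):
    """Render a minimal templated news stub from one feed entry.

    `feature` is assumed to be shaped like a USGS GeoJSON "feature":
    {"properties": {"mag": <float>, "place": <str>, ...}}
    """
    props = feature["properties"]
    magnitude = props["mag"]
    place = props["place"]
    headline = f"Magnitude {magnitude} earthquake reported {place}"
    body = (
        f"A magnitude {magnitude} earthquake was detected {place}, "
        "according to an automated seismic feed. This report was "
        "generated automatically and will be updated as reporters "
        "confirm details."
    )
    return headline, body

if __name__ == "__main__":
    # Example entry, illustrative values only.
    quake = {"properties": {"mag": 5.8, "place": "near Tokyo, Japan"}}
    headline, body = earthquake_article(quake)
    print(headline)
```

The last sentence of the template is the part John describes next: the stub goes out immediately, and humans layer reporting on top of it afterwards.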
A
But they'll also have people who go in and actually create something useful on top of that. So some mix of automatically generated content and human-curated content, I imagine, will become normal. But we'll continue to have the really low-effort automatically generated content as well, where people just say: oh, I want to target these keywords.
A
Give me five paragraphs of text and make it look like it's written in English. A normal person looks at it, reads through it, and says: this doesn't make much sense, or these sentences don't fit together. This kind of low-effort content, I think, will continue to be something that our systems try to recognize as low quality, maybe spam, and treat appropriately. In the end, if it's low-quality content, it doesn't matter whether it was written by a person or by a machine.
K
Okay, that is clear. It's interesting that you said that maybe one day we will treat computer-generated content as even more important than human content.
A
I don't think more important, no.
A
Similar to, I don't know, you have a spreadsheet in your browser nowadays, and if you want to create graphs, you put the numbers in and it makes the graphs for you. Maybe at some point you'll have a system that helps you with writing: you start your sentences off and it helps you finish them, or it watches out for consistency, so that what you're writing is consistent or the style stays the same, all of these things. So, using it as a tool.
A
All right, maybe I'll just take Michael and then we can pause the recording. I will see how long your question is.
L
I mean, everything else really, objectively, looked great, but it was one of those things where it was like Where's Waldo, meaning: where's the content? It was just so cluttered with ads, and you just want to go crazy and say: please, with ads, sometimes less is more, and fewer ads will actually convert to more traffic and revenue and the other things I'm sure they're concerned about. So I felt like that was really it; everything else was really, really good. It was kind of disheartening to see them hurting themselves.
L
Cool, my manager's-
M
May I just pop in here? Michael, when there are so many ads, have you checked the Core Web Vitals? I think the page could be very heavy and loading very slowly, something like that, probably.
L
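The Core Web Vitals check suggested here boils down to comparing field measurements against Google's published thresholds. A small sketch, using the documented thresholds as of late 2021 (LCP, FID, CLS); the function name and data layout are illustrative, not any official API:

```python
# Published Core Web Vitals thresholds (late 2021):
# metric: (good_upper_bound, needs_improvement_upper_bound)
THRESHOLDS = {
    "lcp_s": (2.5, 4.0),    # Largest Contentful Paint, seconds
    "fid_ms": (100, 300),   # First Input Delay, milliseconds
    "cls": (0.1, 0.25),     # Cumulative Layout Shift, unitless
}

def rate(metric, value):
    """Classify one measurement as good / needs improvement / poor."""
    good, needs_improvement = THRESHOLDS[metric]
    if value <= good:
        return "good"
    if value <= needs_improvement:
        return "needs improvement"
    return "poor"
```

A heavy, ad-cluttered page of the kind Michael describes would typically show up here as a poor LCP and a poor CLS, since late-loading ads push content around.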
A
Cool. Okay, let me pause here for the recording. I have more time afterwards, so all of you with your raised hands will hopefully get through and get your questions in as well. If you're watching this on YouTube, feel free to join us for one of the future sessions, or drop your questions in the next office-hours one. Usually I open them up a couple of days before we go live, so you can add your questions on YouTube in the community section. All right, with that.