From YouTube: English Google SEO office-hours from October 22, 2021
Description
This is a recording of the Google SEO office-hours hangout from October 22, 2021. These sessions are open to anything search & website-owner related like crawling, indexing, mobile sites, internationalization, duplicate content, Sitemaps, Search Console, pagination, multi-lingual/multi-regional sites, etc.
Find out more at https://goo.gle/seo-oh-en
Feel free to join us - we welcome folks of all levels!
A
All right, welcome everyone to today's Google Search Central SEO office-hours hangout. My name is John Mueller. I am a Search Advocate on the Search Relations team here at Google, and part of what we do are these office-hours sessions, where people can jump in and ask their questions around SEO and their website, and we can try to find some answers.
A
As always, a bunch of questions were submitted on YouTube already, so we have some of those to go through, but we also have lots of people here live. So maybe we'll go through some of the live questions first. Sure, then let's get started. Let's see, Christian, I think you're first.
B
Hi John, how are you? So nice to see you again. Yeah, I've got a question regarding Core Web Vitals, and I would like to know if the weight of the Core Web Vitals, or the page experience, can depend on the kind of website you are looking at. So let's say you have a website which is more sensitive to loading time, like online gaming or something in that direction, versus some plain-text website.
A
No, not really. So it's not that we would evaluate the kind of website and say, oh, speed is more important here. It's really like: if we can take it into account, we'll take it into account. I think the tricky part is that in some search results the competition is quite strong, and everyone is very similarly strong, and then it can look like it has a stronger weight.
C
Yes, hello, John. Hi. Yes, so we are trying to improve the page experience on one of our websites. Currently, we are working on the CLS metric. So the background of the problem is that the images on our website use lazy loading, but before loading we don't know their sizes.
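A common way to address exactly that CLS problem is to reserve the image's space in the markup before it loads. A minimal sketch, assuming at least approximate dimensions can be written into the HTML server-side; the file names and sizes here are placeholders:

```html
<!-- Reserving space for a lazy-loaded image so the layout does not shift when it loads.
     width/height (or a CSS aspect-ratio) let the browser size the box before the image arrives. -->
<img src="/images/product-123.jpg" alt="Product 123"
     width="800" height="600" loading="lazy">

<!-- If only the aspect ratio is known up front, CSS can reserve the box instead: -->
<style>
  .product-thumb { width: 100%; height: auto; aspect-ratio: 4 / 3; }
</style>
<img class="product-thumb" src="/images/product-123.jpg" alt="Product 123" loading="lazy">
```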
A
With regards to indexing, what is important is that we have information about the images on those pages. So what you can do, if you're not sure whether your lazy loading is recognized by Google, is use the image structured data. So, kind of on the pages themselves, give us the structured data for those images, and then we can definitely pick that up.
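A minimal sketch of that kind of image structured data, with placeholder URLs and dimensions; which structured data type fits best depends on the page, so treat this as illustrative rather than as the exact markup being discussed:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "ImageObject",
  "contentUrl": "https://www.example.com/images/product-123.jpg",
  "width": 800,
  "height": 600
}
</script>
```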
A
So, to be quite honest, I don't think that for web search we look at the specific images on the page and say, oh, this is a nice image and this is a boring image.
A
I don't think we use it by default, but if it loads with lazy loading as an image element on the page, then we can pick that up, and if you use the structured data or an image sitemap, we can also pick that up. So it's not that, by default, we would use the data-src; we use what ends up being rendered on the page.
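As an illustration of that distinction (the attribute names on the second example are typical of lazy-loading libraries, not anything specific mentioned here):

```html
<!-- Picked up: a normal img element whose src ends up rendered on the page,
     using native lazy loading. -->
<img src="/images/hero.jpg" alt="Hero" width="1200" height="600" loading="lazy">

<!-- Risky on its own: only a data-src that a script swaps in later.
     If that swap never happens during rendering, there is no rendered image to use. -->
<img data-src="/images/hero.jpg" alt="Hero" class="lazyload">
```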
D
Hi John, I have a question about reputation research. Does Google collect Android and iOS application reviews in the process of extensive reputation research? Will this type of review affect our company website's quality scores?
A
So I could imagine that, indirectly, if these reviews are published on the web somewhere, we might pick them up and we might index them. But if they're like in an app store somewhere, I don't think we even see them for web search. For other kinds of searches, I don't know. Okay, thank you.
E
John, may I add something to this question? Sure, if you don't mind. Of course, yeah. I also don't think that Google is able to see these stores, because they are kind of closed ecosystems, especially the App Store. So if you don't have an account, you cannot access them. In other words, it is almost the same as with websites: if you have something that is locked behind a username and password, it's not possible for it to be publicly accessed by software like the crawler in this situation, yeah.
F
Hi John, nice to have you here. So my first question is: recently the crawl requests on my company's site have dropped by nearly 90%, and we have checked all aspects according to the official Google docs, like robots.txt and so on, and we want to know about any other possible technical factors that can cause a sudden drop of crawl requests in a day.
A
It sounds to me like our systems have trouble accessing your content quickly enough. So when it comes to the number of requests that we make on a website, we have two kinds of things that we balance. On the one hand, there's the crawl demand, which is how much we want to crawl from a website, and assuming this is a reasonable website, the crawl demand usually stays pretty stable.
A
The other side is the crawl capacity. This is how much we think the server can support in terms of crawling without causing any problems, and this is something that we evaluate on a daily basis, and it can react quite quickly if we think that there's a critical problem on the website. For critical problems, we think of things like server errors.
A
If we see a lot of server errors, if we can't access the website properly, or if the server speed goes down significantly (so not the time to render a page, but the time to access the HTML files directly), those are kind of the three aspects that play into that. And if, for example, the speed goes down significantly, you would see that in the Crawl Stats report in Search Console, yeah.
A
And that's something where, if we think we're causing problems by crawling too much, we will scale that back fairly quickly.
F
Oh, I see. So the response time is highly relevant, and it is highly related to the crawl requests, right? Yes, correct. Also, do you think that response codes like 5xx and 4xx may also reduce the crawl rate?
A
5xx errors, definitely. Those are server errors, which we would see as being potentially problematic. 400 errors are less problematic, because it basically means the content doesn't exist, so we can crawl normally. So if a page disappears, that's no problem. If it has a server error, that is a problem.
F
So at what level of 5xx codes would you see it as potentially causing a server issue? Like, if we have 100 pages and maybe 15 of those pages return 5xx due to some technical issue, will Googlebot think our server may be overloaded? Like, what percentage would make a difference?
A
I don't think we have a fixed number. So what I would check, if the speed recently dropped, is: I would go into Search Console, into the Crawl Stats section.
F
Oh, I see. So assuming that we find the critical problem, and maybe it was because of the response time or maybe the server errors: if we fix this problem, like maybe in one or two weeks, when will we see the crawl rate back to the normal number of requests?
G
Hey, hi John, how are you? Hi, we're good. I don't know if you remember, I joined the office-hours session like a few weeks ago, and I asked you about some soft 404s before. I sent you a couple of examples, and you said you would send them over to the team to check what exactly the problem was, because, as I described back then, the URL inspection tool showed us that everything was pretty much there, and then you said maybe there is something additional.
A
It's possible. I don't have offhand the escalations that we did there, so yeah, I don't know. I think if you posted the URLs, then I definitely passed them on to the team, but I don't know what changes they would make based on that, yeah.
G
Is there a way to understand if the issue is on our side or on your side? Because, obviously, if it's on your side, we don't have to worry about that. But if there is something we are doing wrong, we would like to fix it. At the moment, for us, it's a bit difficult to understand what we are doing wrong. Yeah.
A
So if we see things like an error page on a website, or if we see, I don't know, noindex pages, or something that mentions errors in the text on the page, then that's something we could potentially classify as a soft 404. I don't remember the actual URLs that you had, so it's really hard for me to say, but I imagine these are normal pages on your side.
A
Then that would probably be something that just our systems have to pick up over time, and with the escalations that I do for the soft-404 team, usually they take that into account for future algorithms. Okay, so it's not that they would manually tweak it and say, oh, this one is not a soft 404, but rather they fix the code and improve it so that it doesn't catch those kinds of pages.
H
John, hi. All right, I've got a question on query strings and faceted navigation, and how to deal with this. So, we've got our site.
H
We load our faceted nav on our category pages with JavaScript; there are no <a href> links on them, so Google is not going to find those, and we don't want those indexed either. So, the way we kind of have it set up, the only way these seem to be accessible to Google is if someone externally links to them, and so to prevent that indexation we've got canonicals set up.
H
We decided not to do it through robots.txt, but you know, I was thinking about this, and I know the canonical is more of a suggestion. So I get a little paranoid sometimes; I'm like, I don't want all of these things to get indexed and stuff like that.
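For illustration, the setup being described might look roughly like this on a filtered category URL; the URL and parameters are made up:

```html
<!-- On https://www.example.com/shoes?color=red&size=9, a faceted variant
     that should not be indexed, pointing back at the main category page: -->
<link rel="canonical" href="https://www.example.com/shoes/">
```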
A
So we use that tool as a way of recognizing pages that we shouldn't index, and for picking better canonical choices. So basically, what would happen there is: if we've never seen that page before, then nothing will get indexed anyway.
A
If we've seen that page before, and we've noticed it has a rel canonical on it, and we're unsure what we should do with this rel canonical, because it's not a 100% clear signal, then having it listed in that tool basically skews that decision and basically says, oh, we don't want this indexed. Then we just won't index that, and we've seen a rel canonical here, so we'll just follow that rel canonical.
A
Yeah, usually what happens is the crawling still happens, but it doesn't happen at the same speed, just because we recognize ahead of time that actually this is probably not something that we need.
H
Okay, that makes sense, because that kind of takes care of the next question. Because then I was thinking about, you know, there are the options for whether it's just tracking-based or whether it actually changes the page, and the tracking one allows you to kind of follow, you know, only the representative URL, which in my head was like, oh yeah, that sounds great. But I guess if the other one does the same thing, then we're good, yeah.
A
All right, let me run through some of the submitted questions, and I'll get back to the rest of you with your raised hands as well, don't worry, and we'll probably have some time afterwards as well for more questions if needed. Let's see. The first question I have on the list here is: in Search Console, we can see for a keyword, after we have filtered by country, that every month on the 27th there's an impression increase of 300.
A
It can definitely be the case that bots or scrapers, or things like that, are triggering these impressions. We filter and block bots at different levels in the search results, and it can certainly happen that some of these get through into Search Console as well. So it's almost like a weird situation: if you're running these scrapers to see what your position or ranking would be on these pages, then you're kind of getting some metrics, but you're skewing the other metrics.
A
So I kind of discourage that, if you're checking these things yourself. If this is not something that you're running, and you just see this appearing in Search Console, it's fine to flag this with the feedback function, which it sounds like you already did, and it's also fine to completely ignore it. Because it's something where, like I said, we filter at different positions in our search systems, and all of the ones that I've escalated that were like this ended up being filtered just a little bit after Search Console.
A
Let's see: my question is related to internal links, specifically on the homepage. Assuming the homepage has a certain value, will having lots of internal links going from the homepage to other pages affect the value of the homepage, therefore affecting its performance and ability to rank for a specific query? Interesting question. So, basically, what you're doing with the internal linking is spreading the value that you're getting for your pages relative to the rest of your website.
A
So if, for example, all of the external links go to your homepage, and that's where all of the signals get collected, and your homepage has absolutely no links, then we can focus purely on the homepage. As soon as it has other links as well, then we kind of distribute that out across all of these links, depending on the way that you have your internal linking set up.
A
That can mean that there are certain places within your website that are, relatively speaking, more important, based on your internal linking structure, and that's kind of how we pick things up. So it's not a one-to-one mapping of your internal linking to your ranking, but it does give us a sense of relative importance within your website. And from that point of view, it definitely makes sense to link to important and new things from your homepage, where you're saying this is really important, or this is really new.
A
People should take notice of this. Then we can pick that up a little bit faster, and we can give it a little bit more weight in the search results as well. It doesn't automatically mean it will rank better, but it does mean that we recognize, oh, you're saying this is relatively important, so we will kind of take that on as feedback and try to treat it appropriately.
A
Let's see, several questions: we would like to know more about the possible technical factors that can cause a sudden drop of crawl requests in a day. I think we touched upon this fairly briefly before: the response time, like I mentioned, and the frequency at which Google recalculates the crawl budget.
A
That's essentially once a day that we try to look at that. Also, in Search Console, if you're seeing that we're crawling too much, which doesn't sound like the problem here, but if you're seeing that and you adjust the maximum crawl rate in Search Console, that's also taken into account about once a day. So that once-a-day factor is kind of at play there.
A
I just want to pause there and say I don't think this is absolutely true. It's something where both the technical side of SEO and the content side of SEO are very important, and they play a big role in how search engines understand your content, and I would not say that technical SEO is less important than the quality side.
A
I think they both have to be good. And the other thing I just want to mention while I'm here is that when it comes to the quality of the content, we don't mean just the text of your articles; it's really the quality of your overall website, and that includes everything from the layout to the design, like how you have things presented on your pages, how you integrate images, how you work with speed. All of those factors kind of come into play there.
A
So it's not the case that we would look at purely the text of the article and ignore everything else around it and say, oh, this is high-quality text. We really want to look at the website overall. So I think that applies to all of your questions, so I hope it doesn't ruin the rest. Let's see, with that in mind: is there a point where technical improvements are still extremely valuable? Yes, like I mentioned, it's definitely still important.
A
Let's see, the question goes on: when LCP is over 20 seconds and blocking time is over 10 seconds? Or are there some more general guidelines? So that, I think, is specific to the page experience ranking change that we made, and those numbers sound pretty bad and sound like a pretty slow page. So that's definitely something I would work on improving.
A
I think if you want to pick apart the individual factors of how a page is ranking, then you can't just blindly isolate one element there, which in this case would be speed, or the page experience ranking factor, and say, I will fix this one issue and then the ranking will improve. Because maybe the ranking is stuck because of this one issue, maybe the ranking is stuck because of 500 other issues that you have on the website.
A
So that's something where you can't just blindly isolate that. The question goes on: I've been treating improvements that cut off about half a second or more of LCP as valuable, and we've seen jumps in organic traffic corresponding to the weeks we've made these improvements, but we're not certain that these increases are tied to the changes, and we're nearing a point where it seems that we may be better served working on something else. Yeah, but like I mentioned, it's hard to say if it's really specific to this one change.
A
One thing maybe worth mentioning specifically about speed is that for the Core Web Vitals we take into account data that is, I think, delayed by 28 days or something like that, so about a month back. Which means, if you make significant speed changes on your website that affect the Core Web Vitals, and accordingly the page experience ranking factor, then I would expect it to take about a month to be visible in the search results.
A
So if you're seeing changes in search that happen, like, on the next day, then probably that wouldn't be related to the speed changes that you made. And similarly, if you make bigger speed changes, then you need to take into account that it's going to take about a month before you even see any effects from that, yeah.
A
I think it's also one of the hardest parts of SEO: finding out which parts of this big catalog of SEO things that you could be doing are currently the most critical for a website. That's something where experience comes into play, and understanding the competitive environment there and the queries that you're trying to rank for comes into play. And there's essentially no tool that will just go out and say, here are 500 things that are wrong, and this is the one thing that you should be doing.
A
I think that's fine. I would use the appropriate structured data testing tools just to make sure that you can pull out that data, but from what I understand from the structured data guidelines for those elements, these don't necessarily have to be nested. So that's probably okay, but you can test with the testing tools. The testing tools essentially do what we would use for indexing, and they'll tell you right away if it's good or not. Does a website which includes great content improve in trust with Google, or is that only determined through links?
A
I don't think we have a trust factor that we can look at and say, oh, trust is at, I don't know, 9 out of 12, or whatever numbers you would have there. So that's kind of, I don't know, almost like a philosophical question at that point. It's like: does improving the quality of your content overall make a website more trustworthy with regards to Google? And, well, I don't know, there are no metrics specifically for that.
A
I think improving the quality of your content is always a good idea, but yeah, I don't know, lots of things are involved there, and when it comes to trust, it's definitely not a matter of just the links that are pointing at a website. Will delaying the loading of non-critical JavaScript until the first user interaction affect SEO ranking in any way? This will not affect UX and UI. Is there anything we should be concerned about, such as affecting the Chrome User Experience Report data?
A
So I think, from what I understand from the question, it's basically: you're lazy loading the functionality that takes place when a user starts to interact with the page, and not lazy loading the actual content. If that's the case, then that sounds perfectly fine to me.
A
That's something similar to what I believe they call hydration, when it comes to JavaScript-based sites, where you essentially load the content from HTML as a static HTML page, and then you add the JavaScript functionality on top of that. From our point of view, if the content is visible for indexing, then we can take that into account; we can just use that. It's not the case that Googlebot will go off and click on different things. We essentially just need the content to be there.
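A minimal sketch of that pattern, assuming a hypothetical non-critical script such as a chat widget; the content itself is already in the static HTML, and only the extra functionality is deferred until the first interaction:

```html
<script>
  // Load a non-critical script only on the first user interaction.
  // The indexable content is already in the HTML, so indexing does not
  // depend on this script ever running.
  var widgetLoaded = false;
  function loadChatWidget() {
    if (widgetLoaded) return;
    widgetLoaded = true;
    var s = document.createElement('script');
    s.src = '/js/chat-widget.js'; // hypothetical non-critical script
    document.head.appendChild(s);
  }
  ['scroll', 'click', 'keydown', 'touchstart'].forEach(function (type) {
    window.addEventListener(type, loadChatWidget, { once: true, passive: true });
  });
</script>
```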
A
The one thing where kind of clicking on different things might come into play is with regards to links on a page. If those links are not loaded as <a> elements, then we won't be able to recognize them as being links. And similarly, one of the questions, I think early on, was with regards to lazy loading of images on a page.
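As a sketch of the distinction for links (the URLs are placeholders):

```html
<!-- Recognized as a link: a real a element with an href. -->
<a href="/category/shoes/">Shoes</a>

<!-- Not recognized as a link: navigation that only happens via JavaScript. -->
<span onclick="location.href='/category/shoes/'">Shoes</span>
```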
A
I think you have your hand raised as well, so I just wanted to check: does that work for you?
I
It does, yes, thank you for answering that. I only want to add, first of all, yeah, the <a> elements, so the links are there, so we're good with that. And what we're trying to address here is the page speed, for which we're using this CDN, which does this part of the lazy loading, let's say of JS and CSS. So yes, we're trying to increase the speed. This did improve the web vitals as well, by a lot, more than 50 percent.
A
I think that's perfectly fine to do like that. With regards to CSS in particular (so, I don't know, JavaScript probably less so, but CSS in particular), since it affects the layout of the page, that could affect the CLS score, and it could affect the understanding of whether or not a page is mobile friendly. So that might be something to watch out for. But otherwise, if you're using lazy loading and you're making your pages much faster for users, then, good job, yeah.
A
Let's see: my client is having a rather unique issue that is relevant to the times. Ordinarily, I would recommend that if a product is out of season or out of stock, they keep the URL live and put a note to the user as to when the product will be available. With supply chain shortages, this company is experiencing an inordinate amount of temporarily out-of-stock products, and this creates a negative user experience.
A
Yeah, I don't know. I think this is always a bit challenging, specifically around out-of-stock and temporarily out-of-stock content. It is something where what works best for us is if you can keep the URL online for things that are really temporary, in the sense that the URL remains indexable, and with structured data you tell us that this product is currently not available.
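A minimal sketch of what that structured data could look like for a temporarily unavailable product, with placeholder values:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "image": "https://www.example.com/images/widget.jpg",
  "offers": {
    "@type": "Offer",
    "price": "49.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/OutOfStock"
  }
}
</script>
```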
A
So especially if you add a product back, and then suddenly it has internal links again, that helps us to pick that up again. You can speed this up a little bit by being a bit deliberate with your internal linking. So in particular, like I think one of the questions earlier mentioned, when things are linked from the homepage, we think that they're a little bit more important, and we go off and try them out fairly quickly.
A
If you add those products back, and if you add a link on your homepage saying, oh, these things are now in stock again, then we can take that and say, oh, this seems to be an important thing; we will double-check these pages a bit quicker, and we'll see if they're actually now in stock or not. So that's kind of the direction I would go there. I think also, with regards to in stock and not in stock, especially when it comes to products:
A
One thing you can also do is kind of hedge your SEO together with product search. So if you submit a Merchant Center feed, then we can also show those products within, I don't know what it's called, the product search sidebar thing, or Product Listing Ads I think it used to be called; I don't know what the current name is.
A
After doing a Lighthouse report on our site, we noticed a common JavaScript library we use was flagged as having two security vulnerabilities. Do these vulnerabilities have any effect on SEO, or would you say this is more just to let us know? So, Lighthouse is, I think, a tool within Chrome and also a standalone tool; I'm not sure if it's just in Chrome, but it's basically from the Chrome side.
A
It's not, by definition, an SEO tool, but it does have a lot of things that you can use for SEO. And specifically, security vulnerabilities are not something that we would flag as an SEO issue. But if these are real vulnerabilities in scripts that you're using, and that means that your website ends up getting hacked, then the hacked state of your website would be a problem for SEO. But just the possibility that it might be hacked, that's not an issue with regards to SEO.
A
So I think this kind of comes from everything around the quality rater guidelines and the information that we've published around E-A-T, which is expertise, authoritativeness, trustworthiness. And this basically applies to sites that are really critical, essentially sites where you're giving medical information or where you're giving financial information. For those kinds of really critical situations, you want to make sure, as a user, that this is actually written by someone who's trustworthy or who's an authority on the topic. But when it comes to theater news, for example, or SEO news, or anything kind of random on the web, that's not necessarily something where the trustworthiness of the author is a big issue.
A
So that's something where I would say it's less critical how you frame this. I think having 'admin' as an author seems kind of, I don't know, too generic with regards to any business, where maybe you're better off just saying, well, there is no author for this specific piece of information, or it's just written by our website. But essentially, that's more up to you.
A
The one place where, I think, the author name does come into play is that some types of structured data also have information for the author, and in that case it might be something that is shown in the rich results for a page. So from that point of view, you might want to make sure that at least there's a reasonable name there, or that you explicitly don't use that kind of structured data. But otherwise, for, I think, a generic news and reviews and tickets website, I wouldn't necessarily worry about the name of the author that you specify there.
there
recently
we
added
post
section
to
the
home
page
as
most
sites
have,
and
that
section
has
been
constantly
changing
and
we
don't
really
want
to
rank
these
pages
for
any
specific
query,
meaning
we
don't
want
to
favor
them
in
any
way.
A
So for the most part, I don't think it makes sense to try to fine-tune the internal linking with the rel nofollow attribute on links. I don't think that's something that's very easily done.
A
Either you want to have those pages indexed, in which case you have normal internal links to them, or you don't want them indexed, in which case you don't have internal links to them, or you could just put a noindex on those pages. But I wouldn't try to tweak things with the internal linking of this kind of content with the rel nofollow.
A
So that's something where I would try to figure out, on your side, like, this is totally up to you, and this is more of a strategic question than an SEO question: do you want these pages indexed, or do you not want them indexed? If you do want them indexed, then link to them normally and make them indexable. If you don't want them indexed, then don't link to them, or put a noindex on those pages, and decide kind of based on that.
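If you go the noindex route, a minimal sketch of what that looks like on those pages:

```html
<!-- On the pages you do not want indexed: -->
<meta name="robots" content="noindex">
```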
A
Can we add a view of a week and a month in the search report in Search Console? I don't know; I would submit feedback for that, so that the Search Console team knows what you would like to do there.
A
I mean, one thing you could do is just export the data and do it in a spreadsheet, so that might be an option for you. How does the Google search engine differentiate between an e-commerce website and an informational website? Is it from the plugins installed, or from the CMS, or from the website structure?
A
We want to use the WooCommerce plugin for a specific purpose other than e-commerce; we don't want to sell any products. Will the WooCommerce installation classify our website as an e-commerce website? So, we don't classify websites based on the infrastructure that they use. I sometimes get this question with regards to blogs; it's like, I want to publish an article and I do it in WordPress.
A
Does this mean that Google thinks that my article is a blog post? We don't have this notion of trying to understand what type of content something is just based on the infrastructure that it uses. You can essentially use an e-commerce site setup for posting your blog posts if you want to; it's probably not very suitable for things like that, but you could if you wanted to. And if we see that, we will say, well, this page has a lot of content on this specific topic.
A
We will rank this page based on this content. So from that point of view, if you want to use a specific infrastructure which you think works well for your website, then go for it. It's not going to be the case that we will say, oh, your website uses this infrastructure, so it must be this kind of a website. That's not the case. We essentially look at the pages that we see there, and based on the content that we find on those pages.
A
We try to classify these individual pages based on that, and even then it's not the case that we would say, oh, this is an e-commerce page, or this is an article page, but rather we would try to figure out what intent this page covers and how that maps to the search results. Like, if someone is looking to buy something specific, and we see it's a product page, then we might say, well, this covers this kind of 'user wants to buy something' intent fairly well, therefore we'll rank it.
A
Okay, let me see, we have lots of raised hands and a bunch of questions left. Maybe I'll just go back to the raised hands. Adriana, I think you're next.
A
Okay, there's a lot about that. So, we have a help center page on what impressions, clicks and positions are (I think it's roughly called that), which has a ton of details on those impressions. So that's something that I would look at first; it's something I usually look at when I get this question, so that would be my recommendation.
A
First of all, with regards to infinite scroll, or kind of this continuous scroll setup that we're trying out, I think it's a little bit tricky, because it's hard to determine what exactly is happening from an SEO point of view. But essentially, from our side, we're still loading the search results in groups of 10, and as a user scrolls down on the page, we kind of dynamically load the next set of 10 results there.
A
When that set of 10 results is loaded, that counts as an impression. So that basically means that, as you scroll down and you start seeing page two of the search results, what we would see is, well, this is page two now, and it now has impressions, similar to if someone were to just click on page two directly in the links.
A
So from that point of view, not much really changes there. What I think will change a little bit is that users will probably scroll a little more easily to page two, page three or four, and based on that, the number of impressions that a website can get in the search results will probably go up a little bit.
A
Just because it's easier to reach page two in the search results. And the number of clicks, I suspect, will remain similar, because people will kind of scroll up and down and look at the results on a page, and they'll click on one of them. So probably what will happen is: impressions go up a little bit, clicks stay the same. That means the click-through rate tends to go down a little bit, and if you're focusing purely on click-through rate for SEO, then I suspect that will be a bit of a, I don't know, kind of weird situation.
A
Because it's hard to determine: did the click-through rate drop because this page was shown in this continuous scroll environment, or did it drop because users saw it but didn't want to click on it as much anymore? So that's, I think, kind of tricky there. One thing that, I don't know, helps a little bit, I think, is that we do this continuous scrolling just for the first four pages, and then afterwards you click on kind of the 'load more' or 'next page'.
A
I don't know what exactly it's called. So it's not something that you would see if you're ranking on page five or six, at least not at the moment. And, like, nobody knows how these things will evolve, and at the moment it's only for English results in the US. I suspect, if they see that this works well for users, they'll roll it out in more countries and more regions, but, as always, who knows, and it's hard to determine what the timeline there will be.
A
So I think the short version is: probably, over time, the number of impressions for a lot of these pages will go up a little bit, and at some point that will stabilize again, when people are used to this new UI and when it's rolled out everywhere.
A
But my guess is that the click-through rate number, if you're just purely looking at that, will go down slightly, and if you see that kind of a downward change in the click-through rate, I wouldn't assume that it's something that you're doing wrong with the search results. It might just be that the impressions have gone up.
K
Can you hear me? Yes, yes. Perfect, sorry. My question kind of links to Neil's question, and it's about Your Money, Your Life. Would you consider texts about energy supply for the household, or telecommunications supply for the household, to be part of the Your Money, Your Life structure as well?
A
I don't know. So I think, for things like this, on the one hand, we don't have this, like, hard-coded in our algorithms, where we try to look at these specific kinds of pieces of content. So it's not something where we would have an absolute answer. But what I would do is look at the quality rater guidelines and see what kinds of sites they're talking about there, and then based on that, try to estimate.
A
But it's also something where you can ask users directly, or you can do like a small user study, or you could maybe do A/B testing on your website, where you have some of your content with an author and some of it without an author, and you just kind of see, from almost a natural point of view, do users react differently to this content, and based on that, make your decision.
L
Sure, Rafael. Hey John, so my question is: I have a customer that has a very high average response time, two seconds to be precise, and I'm having a very hard time getting buy-in from them to improve that. So I would like to know how the average response time affects rankings in search, or, by improving that, what benefits this website would get, for Googlebot or from an SEO point of view. So this is question one, and then question two is on a different subject; that is another customer.
L
They just redesigned their website; it's still the same CMS and content, and after that, all of our FAQ schemas stopped being displayed in Google search results. This is three months old now, and the FAQs still are not there, even with everything passing the rich results test and the URL inspection too. So, like, why did that happen, and when should I expect to see the FAQs being displayed again, if they ever will be? So, thank you.
A
Okay, so, average response time. I assume you're looking at the server response time in the crawl stats, or things like that.
A
So two seconds seems pretty long to me. Usually, I recommend, like, we don't have any fixed numbers, but I recommend aiming for somewhere along the lines of a maximum of 200 milliseconds, and two seconds is a lot more than that. What this affects is how quickly we can crawl.
L
The third-party scripts: do you take third-party scripts into consideration when measuring that, or is it purely my server?
A
That answers my question, thank you so much. And that doesn't affect the rankings; it's purely, just from a technical point of view, how much we can crawl. And the other question, I think, was about the FAQ schemas. There, I think there are two things that might have happened; it's hard to say offhand.
A
One is that we might have reevaluated the quality of your website overall at about the same time that you made those changes. It's probably more of a coincidence, if that were the case, but it could be that we are kind of not that convinced about this website anymore, and if we're not convinced about the website, then usually we don't show any rich results, and that would include the FAQs. So there is one way to kind of double-check that.
A
If you do a site: query for these individual pages, do the rich results show up there or not? If they do show up there, then that means we can technically recognize them, but we don't want to show them, so that's kind of a hint that maybe, from a quality point of view, you need to improve things. If they don't show up with a site: query, then that means more that there's still something technical which is broken with regards to that.
A
So it's not that there's a fixed delay after a restructuring of a website before we start showing them again; it's more like, maybe there was coincidentally weird timing, or maybe there's a technical issue.
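If it looks like the technical side, it can help to compare the markup on the redesigned pages against a known-good FAQPage block; a minimal sketch for comparison, with placeholder text:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "Do you ship internationally?",
    "acceptedAnswer": { "@type": "Answer", "text": "Yes, we ship to most countries." }
  }]
}
</script>
```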
A
Cool, let me take a break here with the recording, so that we have kind of a shorter recording for the channel, and I can answer more questions afterwards with you all as well. If you're watching this on YouTube, thanks for watching; feel free to submit more questions on YouTube for the next office hours, or join us in person if you'd like to. And with that, let me pause here.