From YouTube: English Google SEO office-hours from November 12, 2021
Description
This is a recording of the Google SEO office-hours hangout from November 12, 2021. These sessions are open to anything search & website-owner related, like crawling, indexing, mobile sites, internationalization, duplicate content, Sitemaps, Search Console, pagination, multi-lingual/multi-regional sites, etc.
Find out more at https://goo.gle/seo-oh-en
Feel free to join us - we welcome folks of all levels!
A
All right, welcome everyone to today's Google Search Central SEO office-hours hangout. My name is John Mueller. I'm a Search Advocate here at Google in Switzerland, and part of what we do are these office-hours hangouts, where people can join in and ask their questions around their website and web search. Looks like a bunch of things have already been submitted on YouTube, lots of long questions too. So we'll see how far we get there but, as always, I think it's nice to start off with some live questions.
B
Okay, so I'm first. So I recently built a new Chrome extension and I'm waiting for it to be indexed, and hopefully people can find it.
B
Meanwhile, I also created a dedicated website pointing to it. Let's say my Chrome extension is called ABCD; then I also created abcd.com, which shows very similar content. As the store page says, ABCD is a Chrome extension that can do this and this for you; if you want to try it out, click this link to install. And that link, of course, links back to the store page, and that's all for this entire domain. It just does this one thing, and I didn't create this domain because I need to show additional content.
A
I think it's impossible to say for the long run, because it depends quite a bit on how it's interconnected with the rest of the web. I imagine in the beginning it might be easier for your own domain to be visible in search, just because you have a little bit more control over that: you can define exactly what you want to have presented, and you can submit that to Google to be indexed directly. In the long run, I don't know.
A
I don't think there's any kind of automatic bias in that regard.
A
I mean, it's a new extension too, so I don't think we have anything where we say: well, if it's hosted in the Chrome extension store, then it automatically shows up in search. I think that would be something that we explicitly try to avoid happening.
B
I have an issue with backlinks. So I may create my own backlinks, and maybe I want to post it to some places to say: hey, I built this new thing, check it out. And I need to pick one of these two, and I think if I knew which one is going to do better in the future, I would backlink the winner right now, because I want the winner to win more. But I don't know. And one of my thoughts is that I can't control how other people link in the future; maybe my users find it useful and they create backlinks, and maybe half of them will link to one and half to the other.
A
So I think, first of all, you need to be careful when you're going and creating links for your own sites. I would double-check the webmaster guidelines with regards to which ways would be acceptable and which ways would not be acceptable. But I think, in the future, it's definitely a situation where you can't control how other people are linking to your site, and they might link to the extension directly, or...
A
Your own website, specifically, is something where you have full control over that, where you could put background information, or information about you as a developer, whatever kind of extra information you want there. And sometimes that extra information means that this is a better place for people to link to, because it's like: here is, I don't know, almost like the company that is making this extension. And obviously some people will also link directly to the extension and say just click here, but...
B
Yes. Do you think that's because some people will always link directly to the store? Do you think that's possibly splitting my precious backlinks?
A
I think it does split kind of the value from the links, but the advantage of having two places that could be visible in search results, I think that's quite high. And also the advantage of having one place that you fully control, I think that is super important. You control the URLs, you control the full content that is shown there.
A
You can create more things, and that continues to grow within your website, whereas the Chrome extension side, well, that could just be one part out of many in the end. And if you only focus on that, then suddenly you have all of these small islands, like the individual extensions or the individual other things that you work on. So I think having one clear home that is really fully under your control, that is super valuable.
B
Okay, and another thought is about mobile search. I'm thinking, maybe because a mobile user cannot install a Chrome extension on their phone, Google will automatically hide that store page from search results on mobile, right?
A
All right, Wouter.
D
Hey, hello John, good morning. I have a small Search Console related question. So, for reference, it's about an e-commerce website of two or three thousand pages in total, and next to the product pages it also contained a couple of course pages. Now, when these were developed, they were wrongly set to noindex.
D
This was fixed a couple of weeks, or even a couple of months, ago. But in Search Console most of these pages, and we're really talking about only a handful of pages, like maybe 10 pages, but like 80% of these pages, are still set to noindex in Search Console, and I can see that Googlebot last crawled them months ago, back in June. We fixed it in development, so it's been fixed for a couple of months.
D
The noindex meta tag is gone, everything can be indexed, and they are indexed in other search engines as well. So there are no real technical issues, but if we check Search Console, it still says: well, Googlebot last visited this page in June, and it still contains a noindex tag, which isn't the case. So we tried to request indexing via Search Console.
A
I don't think there are any known issues in that regard, but sometimes we're a little bit conservative with regards to submit-to-indexing requests, which is probably partially what you're seeing there.
A
I think there might be a couple of things. On the one hand, if we see that a page is noindex for a longer period of time, then we usually slow down with crawling of that. So I think that's what you're seeing there. That also means that when the page becomes indexable again, we'll pick up crawling again, so it's essentially that one kind of push that you need to do.
A
The other thing is that, since Search Console reports on essentially the URLs that we know for the website, it might be that the picture looks worse than it actually is. That might be something that you could look at by, for example, looking in the performance report and filtering for that section of the website, or those URL patterns, to see if that high number of noindex pages in Search Console is basically reporting on pages that weren't really important, and the important pages from those sections are actually indexed.
A
The other thing that you could do: I think the sitemap is essentially a good start, but another thing that you could do is make it clear with internal linking that these pages are very important for the website, so that we crawl them a little bit faster. And that can be temporary internal linking, where you say: for, I don't know, a couple of weeks, we link to individual products from our home page, and we call it, I don't know, whatever section you want to call it, like "special products" or whatever. And essentially, when we find that the internal linking has significantly changed, usually we go off and double-check those pages too.
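As a sketch of that kind of temporary internal-linking block (the section name and product URLs here are invented, not from the discussion), the home page template might get something like:

```html
<!-- Temporary section on the home page; name and URLs are illustrative -->
<section>
  <h2>Special products</h2>
  <ul>
    <li><a href="/products/example-product-1">Example product 1</a></li>
    <li><a href="/products/example-product-2">Example product 2</a></li>
    <li><a href="/products/example-product-3">Example product 3</a></li>
  </ul>
</section>
```

Once the pages have been recrawled and indexed, the temporary block can be removed again.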
A
Links like that, I think that would work too. I think it's usually better, or best, if we can find it on really important pages on the website, usually like on your home page or something, where we can tell: okay, this is something, and you're saying that this is important for you; therefore, we'll go double-check that page.
A
I don't know. I would guess it might for some, and it might not for others. So I think, overall, I would try to keep the old URLs unless there's really a severe problem with the old URLs, and just kind of try to push them through that way. Changing the URLs is something where you lose any of the information that was associated with the old URL, and it's like, the advantage is...
E
So I don't have any problems yet; regarding my question, I would just like to verify. I'm using a WordPress website and I'm using two plugins. One is Yoast SEO, which automatically adds a canonical link tag to every page, but I also use a translator plugin, WPML I think, and that one adds an alternate hreflang link to every page. So my question is: is it logical that it says, for that URL, it's canonical but it's also an alternate? Does it conflict somewhere in the crawler?
A
No. I mean, I don't know exactly what these two plugins do, so this is kind of just from an overall point of view. If you have a page that has a rel canonical on it, you're essentially, with the canonical, saying: the link that is mentioned there is the preferred URL that I want. And if it's the same page, then that's perfect, because then it kind of gives us confirmation that this page is the one that you want to have indexed. And the rel alternates basically mean there are alternate versions of this page as well, with different languages, for example. If you have one page in, I don't know, English, one page in French, you would have the rel alternate link between those two language versions, and it's not saying that the page where that link is on is the alternate, but rather it's like...
A
The place to watch out for a little bit is that the canonical should not be across languages. So it shouldn't be that, on your French page, you have a canonical set to the English version, because they're different pages, essentially. But the French page can be canonical and the English page can be canonical, and you have the alternate links between the two, and that's essentially a good setup.
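As a minimal sketch of the setup John describes (URLs invented), each language version declares itself canonical and both versions list each other as alternates; hreflang annotations should be reciprocal and include the page itself:

```html
<!-- On https://example.com/en/page (English version) -->
<link rel="canonical" href="https://example.com/en/page">
<link rel="alternate" hreflang="en" href="https://example.com/en/page">
<link rel="alternate" hreflang="fr" href="https://example.com/fr/page">

<!-- On https://example.com/fr/page (French version) -->
<link rel="canonical" href="https://example.com/fr/page">
<link rel="alternate" hreflang="en" href="https://example.com/en/page">
<link rel="alternate" hreflang="fr" href="https://example.com/fr/page">
```

Note that neither canonical points across languages; the cross-language relationship is carried only by the alternate links.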
F
David. Hey John, hey everybody. I already talked to you last week, where I asked you this question; it was about noindexing versus canonicalization, or robots.txt file blocking. So, basically, what's happening is: we have a website with an e-commerce store with a lot of product variations that have thin content, or even duplicate content sometimes. So this week...
F
I made a list of all the URLs we want to keep, or we want to have indexed, the best URLs that we have. And then I made a list of all the URLs that we have but don't want to have indexed. And the more I worked on it, the more I asked this question to myself: canonicalization or noindexing? And still, it doesn't add up.
A
Yeah, I don't recall your exact situation, so it might be tricky, but I think the general question of "should I use noindex or rel canonical to another page" is something where there probably isn't an absolute answer. So that's kind of just offhand: if you're struggling with that, you're not the only person wondering which one to use.
A
That also usually means that both of these options can be okay. So usually what I would look at there is what your really strong preference is. And if the strong preference is that you really don't want this content to be shown at all in search, then I would use noindex. If your preference is more: I really want everything combined in one page, and if individual ones show up, whatever, but most of them should be combined, then I would use a rel canonical. And ultimately the effect is similar, in that likely the page that you're looking at won't be shown in search; but with a noindex it's definitely not shown, and with a rel canonical it's more likely not shown.
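For reference, the two options being compared look like this in a page's head (the URL is illustrative):

```html
<!-- Option 1: keep this variation out of search entirely -->
<meta name="robots" content="noindex">

<!-- Option 2: consolidate onto the preferred version; this page will
     then likely, but not definitely, be dropped from search in favor
     of the canonical URL -->
<link rel="canonical" href="https://example.com/products/main-product">
```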
F
Forget this thought; that's all right. Thank you very much, you answered my question. Thank you.
A
I mean, you can also do both of them. It's something where, if external links, for example, are pointing at this page, then having both of them there kind of helps us to figure out: well, you don't want this page indexed, but you also specified another one, so maybe some of the signals we can just forward along.
F
Okay, thank you.
G
Okay, so I accidentally left my meeting, sorry. So the first question, one of my first questions, is: I don't know if you remember, in our former meeting I mentioned that our crawl rate had dropped domestically by over 90%, and we assumed that it's because of response time. And now we have done many actions to improve our response time, and now...
G
The response time in the back end of GSC has been restored to the former level, around 400 to 500 milliseconds. But the crawl requests...
G
The crawl rate is restoring much more slowly than the response time. I remember that you said, if we fix our response time quickly, the crawl rate will be restored in several days, right? So right now the crawl rate didn't restore like what we expected, so we are wondering: why is that? And maybe it's because our response time isn't restored to maybe a lower level, or something like that.
A
Yeah. Do you know when the response time improved, or how long that has been?
G
The response time? Several days. It's been restored to like 500 milliseconds for several days, and now we couldn't see any improvement, or the improvement is much slower, in GSC.
A
Yeah, my guess is it probably just takes longer. So usually the system that we have is very responsive in slowing down, to make sure that we don't cause problems, but it's a little bit slower in ramping back up again. So I suspect, if it's been a few days since you improved that, then I would probably give it maybe a week or longer to catch up again. What you can also do, in the Help Center, in Search Console...
A
You can tell them: we've improved our server significantly, you can crawl again with, I don't know, 2 QPS, or whatever crawl rate you think is appropriate. And the Googlebot team sometimes has time to take action on those, and kind of adjust the crawl rate up manually.
A
I can look for the link later on, or it should be in the Help Center. I think it's called "Report a problem with Googlebot", or "Report a problem with crawling", something like that. It's a form where you specify the IP addresses of the Googlebot that you're seeing, and it's usually meant to report issues where Googlebot is crawling too much. But you can also report issues where Googlebot is not catching up with crawling yet.
G
Okay, okay, thank you. So, as our response time is back to the former level, around 500 milliseconds, do you think maybe lower is better, or should we...
A
I think 500 is pretty good; that's usually in the range that I suggest for crawling. I think if you have a very large website, with, I don't know, millions and millions of URLs, then having an even faster response time is useful. But for normally sized, or even medium-large, websites, I think 500 milliseconds should be okay.
A
It shouldn't affect the indexing, because if we just recrawl pages less frequently, then we still keep them in the index. It's not that the pages expire after a certain time, so that would not be related to the crawl rate. Unless the crawling had issues where we received a 404 instead of content; but if it's just slower, then that just means we crawl less, but we don't remove things from the index just because we crawl less.
G
Okay, that's weird. So, do you know of any reasons that may cause the indexed pages to drop?
A
I mean, there are lots of reasons why they could drop, but it's tricky to say: this is what you should be looking for. The main thing that I see online with regards to indexed pages is more a matter of quality: our systems kind of understand that the relevance or the quality of the website has gone down, and then, because of that, they decide to index less, for larger e-commerce sites.
G
Okay, and one more question is about mobile-first indexing. So we all know about mobile-first indexing, and we optimize our site accordingly. As for the configuration, the Google doc recommends two ways of doing it: the first is responsive web design, and the second is dynamic serving. And because the first way is a little difficult for us to achieve through our tech environment...
G
Yep, the m-dot, and then we redirect the m-dot to the main domain, and we also use dynamic serving, yes.
A
Thanks. Let me go through some of the submitted questions, and I'll get back to all of the raised hands, so many raised hands, and we can see how far we can make it. Let's see, the first one is about a website that has a regional section that targets Latin America: we use personalization to change some elements on a page to match the country's specifics, like currency or hero image. There's one product, email marketing, that offers different features based on the market.
A
This is how the vendor has set it up, and we can't do much about it. Through personalization, the page is basically ninety percent different from what Googlebot sees. We're not trying to be shady; it's just that this one product is different, the rest are the same.
A
The site has one subdomain that targets Latin America, and only one URL for this product. Would this be an issue? In practice, this shouldn't be an issue. However, what you need to keep in mind is that Google can only index what Googlebot sees, and since Google usually crawls from the US, that means whatever would be visible to users in the US, users without cookies (Googlebot crawls without cookies), whatever content would be visible there, is what we would index, and what your website would be findable for.
A
Another question about Latin American websites: we consolidated our Latin markets, Mexico, Argentina, Colombia and so on, into one market. We also target Spain with a different folder. For hreflang, we have selected es-es for Spain, and referenced the Latin URLs for every Spanish-speaking market, so the same URL for es-mx, et cetera. Google uses es-419 to target the Latin market, but this no longer seems to be supported. Is it okay, the way that we did it, referencing one URL for multiple countries?
A
It's perfectly fine, the way that you have set it up. In fact, it's the way that we recommend doing it. So it's fine having one version for one country and another version for lots of other countries; listing the individual countries is a perfect way to do this.
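As a sketch of that recommendation (domain and paths invented), the shared Latin American URL is listed once per country code, alongside the Spain-specific URL:

```html
<!-- Included on both versions of the page -->
<link rel="alternate" hreflang="es-es" href="https://example.com/es/page">
<link rel="alternate" hreflang="es-mx" href="https://example.com/latam/page">
<link rel="alternate" hreflang="es-ar" href="https://example.com/latam/page">
<link rel="alternate" hreflang="es-co" href="https://example.com/latam/page">
<!-- Optional language-only fallback for all other Spanish speakers -->
<link rel="alternate" hreflang="es" href="https://example.com/latam/page">
```

The same target URL may appear as many times as needed; each entry just pairs one language-country code with the page that should serve it.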
A
With regards to es-419, I think that's an error in our systems. In particular, some of our own marketing pages also use hreflang, and for whatever reason we sometimes specify es-419 for, I think, Latin American Spanish, and somehow, because it's in our systems internally, the hreflang links are set up for that as well.
A
So, basically, those links don't work, and if you're just copying our markup, then that's not going to work for you. And I see this kind of mistake across a lot of websites, where they have some internal country or region coding that they do for their content, and then they kind of map that automatically to hreflang as well. And if those are not valid country and region values, then essentially those individual hreflang links are ignored in practice.
A
This means the other hreflang links continue to work, and it's just this one country link that doesn't work. So I wouldn't copy what large sites do unless you're sure that it's actually correct. And then, I think, Hazel's questions we went through. When you say "improve the quality of your website", is this quality something that is quantifiable, or is it simply a term used to determine how multiple algorithms look at your website?
A
I don't think it's quantifiable, in the sense that we have kind of like a quality score, like you might have for ads, when it comes to web search. We have lots of different algorithms that try to understand the quality of a website, so it's not just one number, or anything like that.
A
But then that's not the quality metric that we actually use for search, so it's kind of almost misleading. And if we were to show exactly the quality metric that we use, then, on the one hand, that opens things up a little bit for abuse, and on the other hand, it makes it a lot harder for the teams internally to work on improving this metric. So that's kind of the tricky balance there. I don't know, at some point maybe we'll still have some measure of quality in Search Console.
A
Is there any relation or impact on rankings between websites made with normal HTML, CSS and JavaScript, and another one with a PWA? Just because one of our main competitors has recently adopted it, and we noticed a huge jump in their rankings.
A
So these are essentially different ways of making a website, and you can make a website with lots of different frameworks and formats, and for the most part we see these as normal HTML pages. So if it's a JavaScript-based website, we will render it and then process it like a normal HTML page.
A
If it's HTML already in the beginning, we can do that directly. The different frameworks and CMSs behind it, usually we basically ignore that and just say: well, here's an HTML page, and we can process it.
A
So, just the fact that one of your competitors has moved from one framework to another and has seen an improvement in search: that framework change, from my point of view, wouldn't be responsible for that. But rather, maybe they have a newer website now, together with that framework change; maybe the newer website has different internal linking, different content.
A
Maybe internally it's significantly faster, or significantly slower; users really like it, or they did, I don't know, a marketing campaign together with the website launch. All of these things kind of play in there, and these are all things that are not limited to the framework that you're using. Then, two questions on PageSpeed Insights. The lab data in PageSpeed Insights: are the results the same as Lighthouse results in my Chrome browser? Do they use the same formula?
A
I don't know 100%, but they are done completely differently, in the sense that if you use PageSpeed Insights, that is run in a data center somewhere, with essentially emulated devices, where we try to act like a normal computer, and we have restrictions in place that make it kind of like a little bit slower, a normal internet connection, those kinds of things. And Lighthouse basically runs on your computer, with your internet connection, and I think Lighthouse within Chrome also has some restrictions that it applies to make it...
A
...I don't know, maybe a little bit slower than what your computer might be able to do, just to make sure that it's comparable. But essentially these run in completely different environments, and that's why you would often see different numbers there. If you test with PageSpeed Insights, you see one number; if you test in Lighthouse in your browser, you might see different numbers.
A
So even if the formulas are the same, the whole environment around these systems is very different. And the second question is whether PageSpeed Insights, the online tool, is using our computers and network. No; like I mentioned, the online tool runs online, in a data center somewhere, and Lighthouse runs locally, on your computer.
A
The main reason, I think, we have so many different options here is that websites have different needs with regards to understanding how speed works, and especially with things like Lighthouse, which you can run locally, that means you can run them in your test environment. So you can set up a test website internally.
A
Someone was arguing that bolding important parts in a paragraph can boost your SEO. Does it? Yeah, I think this is something that comes up every now and then. I double-checked before the session, actually, and Matt Cutts did a video, I think in like 2012 or something around that, about bolding and strong on pages.
A
So usually we do try to understand what the content is about on a web page, and we look at different things to try to figure out what is actually being emphasized here. That includes things like headings on a page, but it also includes things like what is actually bolded, or emphasized, within the text on the page. So, to some extent, that does have a little bit of extra value, in that it's a clear sign that you think this page, or this paragraph, is about this.
A
The other thing is that this is, to a large extent, relative within the web page. So if you go off and say: well, I will just make my whole page bold, and then Google will think my page is the most important one; then, by making everything bold, essentially nothing is bold, because it's all the same. Whereas if you take a handful of sentences or words within your full page, where you say this is really important for me, and you bold those, then it's a lot easier for us to say...
A
Well, here's a lot of text, and this is potentially one of the most important points of this page, and we can give that a little bit more value. And essentially what that kind of goes into is everything around semantic HTML, where you're giving a little bit more meaning to a page by using the proper markup for the page. And from our point of view, that's good: it helps us to understand the page a little bit better.
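As a small sketch of that point (the sentence is invented), emphasis carries the most meaning when it's used selectively on the phrases that actually matter:

```html
<!-- Selective emphasis: one key phrase stands out -->
<p>
  Our return policy allows refunds within
  <strong>30 days of purchase</strong>; after that,
  items can only be exchanged.
</p>

<!-- Anti-pattern: if everything is bold, nothing is emphasized -->
<p><b>Our return policy allows refunds within 30 days of purchase;
after that, items can only be exchanged.</b></p>
```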
A
So, if you want to kind of simplify it to a one-word answer, does bolding important points in a paragraph help the SEO? Yes, it does. It does help us to better understand that paragraph, or that page.
A
Does the Google crawler get confused when it sees a URL that has been canonicalized, but that URL has an hreflang alternate? I think we talked about this briefly before. We have an e-commerce site with over 3,500 products.
A
I don't think we do anything special for menu links; they're essentially normal links on a page. So, from that point of view, I wouldn't say you have to do anything unique for the menu itself. There is, I think, some practical limit with regards to the number of links on a page that we can process.
A
I don't know how high that is at the moment; I would imagine it's, I don't know, well over 10,000 links on a page. So if you're cross-linking 35,000 products on every page, then I could imagine that maybe you are kind of over that limit of the number of links that we process for a page. If your website is still being indexed properly, then we can probably find all of those pages, but it gets really hard for us to understand the context of individual pages.
A
So it usually makes more sense to provide some kind of hierarchical structure on the website, where you're saying: these are my main categories, these are subcategories, and from there you link to the individual products, rather than linking between all 35,000 products.
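A hierarchy like the one described might look like this in outline (category names and paths are invented); each level links only one step down, instead of every page linking to all 35,000 products:

```html
<!-- Home page: link only to main categories -->
<nav>
  <a href="/categories/clothing">Clothing</a>
  <a href="/categories/footwear">Footwear</a>
</nav>

<!-- Category page (/categories/clothing): link to subcategories -->
<nav>
  <a href="/categories/clothing/jackets">Jackets</a>
  <a href="/categories/clothing/shirts">Shirts</a>
</nav>

<!-- Subcategory page (/categories/clothing/jackets): link to products -->
<ul>
  <li><a href="/products/example-jacket-1">Example jacket 1</a></li>
  <li><a href="/products/example-jacket-2">Example jacket 2</a></li>
</ul>
```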
F
May I say something? This question came from me, actually.
A
Sure, sure.
F
Yeah, so that's kind of, I don't know, I felt stupid even asking this question, but it's really like: the menu is following us everywhere throughout the page, obviously, and I've never asked myself this question, actually, how is this treated? Because if I use Screaming Frog, I see that we have the exact same amount of inlinks and outlinks on everything, on every URL. So it's really...
A
I think it might be something where you could double-check how the menu is implemented, depending on how it's set up; and if you're using things like JavaScript to load content when people navigate the menu, maybe that's fine. One thing you can do to check is just open one of these pages in the browser, use view-source, and then search for, I don't know, the link tag, kind of like the angle bracket and an "a", and just see how many references it brings up.
A
The total number of links on the page is one thing, but you should kind of see: is it in the order of, I don't know, a couple hundred to a thousand, or is it really thirty-five thousand links on every page? Because if it really is 35,000 links on every page, then I think, from the page size, from the speed...
A
Let me see, maybe we can switch to more live questions, since we have so many people raising hands. Dorade? I'm probably mispronouncing your name, sorry.
H
Yeah, it's George, okay. Can you hear me okay? (Yes.) Okay, I was wondering: we have noticed a big problem with Google Discover on our website. In two days, traffic dropped by some 70%, and it dropped to a certain level and has been stable on that level for some time. So we're wondering if we did something wrong. Was it an update, or something? Can you clarify what exactly happened, since it's such a drastic drop?
A
Now, I don't know specifically with regards to your website, but I do totally get reports from a lot of people that Discover traffic is very much either on or off, in the sense that there's very little kind of room in between. In that, if our algorithms determine we're not going to show much content from this website in Discover at the moment, then, basically, all of that traffic disappears; and the same in the other direction.
A
The things people are talking about are, on the one hand, quality issues, where maybe the quality of the website is not so good, and then the individual policies that we have for Discover. So, in particular for Discover, we have some policies that are different from web search, and recommendations that are a bit different, with regards to, I think, things like adult content, clickbaity content, things like that.
A
That's all mentioned in the Help Center page that we have for Discover. And sometimes I suspect, like, I imagine a lot of websites have a little bit of a mix of all of these kinds of things, and I suspect sometimes our algorithms just find a little bit too much, and then they say: oh, we have to be careful now with this website.
A
There are a number of SEOs externally that have looked into Discover a little bit more and tried to figure out what kind of content performs well, or which content shows up at all in Discover, and they've written a bunch of blog posts and done presentations on that. I would try to search for some of that and see...
A
...what other people have found. Because, from our point of view, Discover is something where we try to show a stream of information to people, and because of that we tend not to have a lot of detailed information on what exactly you need to provide there to perform really well. So sometimes it makes sense to look at what other people have figured out.
H
Okay, thanks. I was wondering also if you could point us to, tell us, what would be a good response time for a news site, news media.
A
Well, I mean, the response time is something that plays into our ability to figure out how much crawling a server can take, and usually the response time, from a practical point of view, limits or plays into how many parallel connections would be required to crawl.
A
So if we want to crawl, I don't know, 1,000 URLs from a website, then the response time to spread that out over the course of a day can be pretty large, whereas if we want to crawl a million URLs from a website and the response time is high, then that means we end up with a lot of parallel connections to the server, and I think we have some limit there, with regards to not wanting to cause issues on the server.
A
So that's, I don't know, kind of the angle I would look at there. It can be that for a news website we crawl 10,000 pages a day and all of the important news articles are covered. It might be that we have to crawl millions of articles a day, because we always have to refresh the archive or things like that; then, obviously, the response time and the crawl rate look different.
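The relationship John describes can be sketched as simple arithmetic: the slower each response, the more parallel connections are needed to fetch a given number of URLs in a day. This is only a back-of-the-envelope illustration under assumed numbers, not Googlebot's actual crawl-scheduling logic.

```python
# Rough sketch: average concurrent connections needed to crawl a given
# number of URLs per day at a given per-request response time.
# The 2-second response time below is an assumed value for illustration.

SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

def avg_parallel_connections(urls_per_day: int, response_time_sec: float) -> float:
    """Average number of simultaneous connections required if every
    request takes response_time_sec and crawling is spread evenly
    over the day."""
    return urls_per_day * response_time_sec / SECONDS_PER_DAY

# 1,000 URLs/day at a slow 2 s response: well under one connection on average.
print(avg_parallel_connections(1_000, 2.0))      # ~0.02

# 1,000,000 URLs/day at the same 2 s response: ~23 connections in parallel,
# which is where a crawler starts worrying about overloading the server.
print(avg_parallel_connections(1_000_000, 2.0))  # ~23.1
```

Halving the response time halves the required parallelism, which is why faster servers can sustain a higher crawl rate before any connection limit kicks in.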
C
The internal linking structure was improved, and all of those factors could be a cause of the ranking changes this year. But the intent behind my question is: has the overall game shifted toward UI and UX, toward whatever happens on the user-facing front end when websites are opened and rendered in the browser?
C
You know, the structure, back end and front end. We noticed our competitor adopted a PWA in the last three or four months, and obviously they have minimized some of the HTML elements of their front pages and some of the internal linking structure as well. So we were wondering: is that possible?
C
If I'm thinking in the right direction, what could be the possible factor, and is it fine to go with a PWA, or should we maintain our current tech stack? Thank you. Yeah.
A
It can improve your rankings if you make a better website, but you also have a lot of other things that you need to think about. So essentially, from my point of view, it's more a question of: is moving to a new website going to improve my rankings? And my guess is, if your website is really 10, 15 years old and has kind of grown organically since then, probably yes, probably yes; moving to any kind of newer framework and a cleaner website, a faster website, one that works better for users.
A
In your market, maybe that's really valuable, so that might be something where you say: well, it's extra work to make a JavaScript-based website work in search, but we don't have to create a separate app, or we have all of these extra features as well, and it's worthwhile from a financial point of view to do that. But that's something where the SEO aspect is almost secondary.
A
Especially if you have a large, old website that's 10, 15 years old and you're moving to something new, I'm almost certain that you will see significant changes, just because it'll be a lot cleaner, it'll be a lot faster, the structure will be a lot better, it'll be better for users, and it'll be easier for search engines to understand. All of the HTML improvements that have happened since then, they play in your favor.
C
All right, so what I'm getting from your explanation is two points. First point: a PWA does not directly relate to the SERPs or act as a ranking factor in itself. Second point: obviously, your product website needs to be updated with the latest tech stack, which is going to be helpful for cleaning up the overall structure and the HTML elements that are rendered on the front end.
A
Okay, good luck! Cool, okay. Let me take a break here with the recording, and I'll get back to more of your questions afterwards as well.
It looks like lots of people still have their hands up, wow, so good. Okay, if you're watching this on YouTube, thanks for sticking around to the end. If you'd like to join one of these in the future, feel free to watch out for the link; we post them in the community section on the channel. And yeah, thanks for joining.