From YouTube: English Google SEO office-hours from September 2022
Description
This is an audio-only recording of the experimental Google SEO office-hours from September 2022. These sessions cover submitted topics around anything search- and website-owner-related, like crawling, indexing, mobile sites, internationalization, duplicate content, sitemaps, Search Console, pagination, multi-lingual/multi-regional sites, etc. The answers are compiled by the Google Search Relations team.
Let us know in the comments how you like this format!
B: Hi everyone, John Mueller here. We're all on the Google Search Relations team, based here in Switzerland. Like Lizzi mentioned, we're doing this office-hours session a bit differently. The goal of this format is to make it a bit easier for everyone to participate, including on our side. We'd like to experiment with the format a little, so we may try variations over time.
A: So the first question that we've got here is from Demo, a site that connects seekers and providers of household-related services: "Matching of listings is based on zip code, but our users come from all over Germany. The best value for users is to get local matches. Is there some kind of schema markup code to tell the Google algorithm to show my site also in the local pack? Please note we do not have local businesses and cannot utilize the LocalBusiness markup code."
A: The site is, and I'm going to redact that. Yes, so Demo, as you noted, LocalBusiness markup is for businesses with physical locations, and that means there's typically one physical location in the world for that place, so it should only show up for that city. And on your other question: currently there's no rich result feature for online services that you could use structured data for, but you can claim your business profile with Google Business Profile Manager and specify a service area there. So I think that could help.
C: Sindra asks: Google Search Console shows excellent Core Web Vitals scores for our site, but I understand that it only shows details for Chrome users; a majority of our users browse using Safari. Is there a way to measure page experience or the Core Web Vitals on Safari browsers, and is there a way to improve the experience?
B: For the next question, I'll paraphrase: how can I best switch from one domain to a new one? Should I clone all the content or just use 80% of the content? What is the fastest way to tell Google that they're both my sites? We call this process a site migration, and it's fairly well documented, so I would look up the details in our documentation. To simplify and leave out a lot of details:
B: Ideally, you'd move the whole website one-to-one to the new domain name and use permanent 301 redirects from the old domain to the new one. This is the easiest for our systems to process; we can transfer everything directly. If you do other things, like removing content, changing the page URLs, restructuring, or using a different design on the new domain, that all adds complexity and generally makes the process a little bit slower.
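The "one-to-one" move described above can be sketched as a path-preserving URL mapping: each old URL should answer with a 301 redirect to the same path and query on the new host. A minimal sketch, using hypothetical domain names:

```python
from urllib.parse import urlsplit, urlunsplit

def migrated_url(old_url: str, new_host: str) -> str:
    """Map an old-domain URL one-to-one onto the new domain.

    Path and query stay identical; only the host changes. The old URL
    should answer with a permanent 301 redirect to this value.
    """
    scheme, _, path, query, _ = urlsplit(old_url)
    return urlunsplit((scheme, new_host, path, query, ""))

# Hypothetical domains, for illustration only:
print(migrated_url("https://old-shop.example/products/lamp?color=red", "new-shop.example"))
# -> https://new-shop.example/products/lamp?color=red
```

Generating the redirect map this way, rather than hand-writing per-page rules, is what keeps the migration "one to one" and easy for crawlers to process.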
A
And
our
next
question
is
from
indiequest1:
do
you
support
the
use
and
the
full
range
of
schema.org
entities
when
trying
to
understand
the
content
of
a
page
outside
of
use
cases
such
as
rich
snippets?
Can
you
talk
about
any
limitations
that
might
exist
that
might
be
relevant
for
developers
looking
to
make
a
deeper
use
of
the
standard?
A: So to answer your question: no, Google does not support all of the entities that are available on schema.org. We have the search gallery, which provides a full list of what we do support for rich snippets, like you mentioned, in Google Search results. But not all of those things are visual. We do talk about certain properties that might be more metadata-like, that aren't necessarily visible as a rich result but that still help Google to understand things like authors or other metadata information about a page. So we are leveraging that kind of thing.
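The "metadata-like" properties mentioned above (such as author) can be sketched in JSON-LD. The markup below is a hypothetical Article example with placeholder values, built as a Python dict only so the JSON stays valid; it is not a documented Google example:

```python
import json

# Hypothetical Article markup. `author` produces no visual rich result of
# its own, but it is the kind of metadata-like property the answer mentions.
article_markup = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Watering schedules for succulents",  # placeholder headline
    "author": {"@type": "Person", "name": "Alex Example"},  # placeholder author
    "datePublished": "2022-09-15",
}
print(json.dumps(article_markup, indent=2))
```

On a real page this JSON would sit in a `<script type="application/ld+json">` block in the HTML.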
D: Anton Littau is asking: in Search Console I get the message "Sitemap could not be read" in the sitemap report; no other information is provided. What could be the reason that the sitemap cannot be read by Googlebot? Good question! The "Sitemap could not be read" message in Search Console may be caused by a number of issues, some of them technical, some of them related to the content quality of the site itself.
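One of the technical causes worth ruling out first is malformed sitemap XML. A minimal sketch of that check with the standard library, assuming only the sitemaps.org protocol namespace (this covers one possible cause, not all of them):

```python
import xml.etree.ElementTree as ET

# Namespace defined by the sitemaps.org protocol.
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_locs(xml_text: str) -> list:
    """Parse a sitemap and return its <loc> entries.

    ET.fromstring raises ParseError on malformed XML, one technical
    problem that can make a sitemap unreadable to a crawler.
    """
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.iter(NS + "loc")]

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
</urlset>"""
print(sitemap_locs(sample))
# -> ['https://example.com/', 'https://example.com/about']
```

If this parse fails on your own sitemap file, fixing the XML is the place to start before digging into quality-related causes.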
A: We've got guides and tips that are illustrated on our website, and they're not performing well in the SERP. We tried to be unique, using some types of illustrations and personas to make our readers happy. Do you think we did not do it right? I don't know, because I don't think I've ever seen your cartoons, but I can speak to how to improve your cartoon illustrations in the SERP.
D
Kibusor
lawrence
is
asking:
does
posting
one
content
daily
increase
rankings,
no
posting
daily
or
at
any
specific
frequency,
for
that
matter
doesn't
help
with
ranking
better
in
google
search
results.
However,
the
more
pages
you
have
in
the
google
index,
the
more
your
content,
may
show
up
in
search
results.
A
Okay-
and
the
next
question
is
from
suresh
about
the
helpful
content
update
that
only
10
right,
quality
content
and
the
rest
90,
don't
write
quality
content,
lengthy
content,
but
how
should
they
write
quality
content?
Does
google
agree
with
the
word
count
or
not?
Well,
nope
content
can
still
be
helpful,
whether
it's
short
or
long.
It
just
depends
on
the
context
and
what
that
person
is
looking
for.
It
doesn't
matter
how
many
words
if
it's
500
a
thousand,
if
it's
answering
the
user
intent,
then
it's
fine,
it
can
be
helpful.
These
are
not
synonymous
things.
B: Well, thanks for asking. Words in URLs only play a tiny role for Google Search, so I would recommend not overthinking it. Use URLs that can last over time, avoid changing them too often, and try to make them useful for users. Whether you include stop words in them or not, or decide to use numeric IDs, that's totally up to you.
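The "stop words or not, up to you" point can be illustrated with a small slug helper. This is a hypothetical helper, not anything Google recommends, and it assumes simple ASCII titles:

```python
import re

def slugify(title: str, drop_stop_words: bool = False) -> str:
    """Build a stable, readable URL slug from a page title.

    Whether stop words stay in is a matter of taste, as the answer says;
    what matters more is picking a slug once and not changing it later.
    """
    stop_words = {"a", "an", "the", "of", "and", "or", "in", "to"}
    words = re.findall(r"[a-z0-9]+", title.lower())
    if drop_stop_words:
        words = [w for w in words if w not in stop_words]
    return "-".join(words)

print(slugify("A Guide to the Care of Cacti"))                        # a-guide-to-the-care-of-cacti
print(slugify("A Guide to the Care of Cacti", drop_stop_words=True))  # guide-care-cacti
```

Both outputs are readable, lasting URLs; either choice is fine for search.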
D: Sanjay Sunwal is asking: do different bot types, image and desktop, share crawl budget, and what about different hosts? Fantastic question! The short answer is yes: Googlebot and its friends share a single crawl budget. What this means for your site is that if you have lots of images, for example, Googlebot Images may use up some of the crawl budget that otherwise could have been used by Googlebot.
D
In
reality,
this
is
not
a
concern
for
the
vast
majority
of
the
sites,
so
unless
you
have
millions
of
pages
and
images
or
video,
I
wouldn't
worry
about
it.
It's
worth
noting
that
cruel
budget
is
per
host,
so,
for
example,
if
you
have
subdomain.example.com
and
you
have
other
subdomain.example.com,
they
have
different
cruel
budgets.
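"Per host" here means per hostname, which you can see by grouping a site's URLs by their netloc. A small sketch with hypothetical URLs:

```python
from collections import defaultdict
from urllib.parse import urlsplit

def urls_by_host(urls):
    """Group URLs by hostname; each group draws on its own crawl budget."""
    groups = defaultdict(list)
    for url in urls:
        groups[urlsplit(url).netloc].append(url)
    return dict(groups)

# Hypothetical site with two subdomains:
pages = [
    "https://www.example.com/a",
    "https://www.example.com/b",
    "https://img.example.com/logo.png",
]
print(urls_by_host(pages))
```

Here www.example.com and img.example.com end up in separate groups, matching the answer: their crawl budgets are tracked separately even though they share a registrable domain.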
B: Christopher asks: we've sold the German subdirectory of our website to another company, and they've requested us to 301-redirect the subdirectory to their new German site. Would you advise against it? Would it hurt us? Well, on the one hand, it feels kind of weird to sell just one language version of a website to someone else. On the other hand, why not? I don't see any problems with redirecting from there to a different website.
A: Well, to answer this question specifically: there's no guaranteed time by which you'll be able to see a specific rich result in Google Search after adding structured data. But I think what you're asking about here is for a specific thing to be added to Search Console, and we'll have to check with the team on the timeline for that; we don't pre-announce when certain things will be added to Search Console.
B: Roberto asks: we're planning to share the same back end and front end for our two brands, and we're ranking quite well with both of them in Google. How big is the risk of penalization if we use the same HTML structure, same components, layout, and same look and feel between the different brands? What would be different are the logos, fonts, and colors. Or would you suggest migrating to the same front end but keeping a different experience between the two brands?
B
That's
it
if
the
urls
and
the
page
content
is
the
same
across
these
two
websites,
then
what
can
happen
for
identical
pages
is
that
our
systems
may
pick
one
of
the
pages
as
a
canonical
page.
This
means
we
would
focus
our
crawling
indexing
and
ranking
on
that
canonical
page
for
pages
that
aren't
identical.
We
generally
index
both
of
them.
For
example,
if
you
have
the
same
document
on
both
websites,
we'd
pick
one
and
only
show
that
one
in
search
in
practice.
C
Anna
giacinto
asks
javascript
seo
what
to
avoid
along
with
javascript
links.
Well,
the
thing
with
the
links
is
that
you
want
to
have
a
proper
link,
so
avoid
anything
that
isn't
a
proper
link.
What
is
a
proper
link?
Most
importantly,
it's
an
a
html
tag.
It
has
an
href
that
lists
a
url
that
is
resolvable,
so
not
like
a
javascript
colon
url
and
that's
pretty
much
it.
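The proper-link rule above can be sketched as a filter: keep only `<a>` tags whose href is a resolvable URL, and reject javascript: pseudo-URLs and click handlers on non-link elements. A minimal sketch with the standard-library HTML parser; the sample markup is hypothetical:

```python
from html.parser import HTMLParser

class CrawlableLinks(HTMLParser):
    """Collect hrefs that qualify as proper links: <a> tags whose
    href is a resolvable URL, not a javascript: pseudo-URL."""

    def __init__(self):
        super().__init__()
        self.found = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return  # onclick handlers on spans/divs are not links at all
        href = dict(attrs).get("href")
        if href and not href.lower().startswith("javascript:"):
            self.found.append(href)

parser = CrawlableLinks()
parser.feed(
    '<a href="/products">good</a>'
    '<a href="javascript:void(0)">bad: not resolvable</a>'
    '<span onclick="goTo(\'/products\')">bad: not an a tag</span>'
)
print(parser.found)  # -> ['/products']
```

Only the first element survives the filter, which is exactly the kind of link the answer says crawlers can follow.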
A
Our
next
question
is
from
sakshi
singh.
Let's
say
I
research
on
a
keyword
which
has
no
volume
or
keyword
density,
but
we
are
appearing
for
those
keywords
on
the
first
page.
Should
we
target
that
keyword?
Well,
sakshi,
you
can
optimize
for
whatever
keywords
you
want
it
and
it's
not
always
about
the
keywords
that
have
the
most
volume.
I
would
think
about
how
people
should
find
your
page
and
target
those
keywords.
D
Kim
on
a
is
asking
hello:
you
previously
advised
that
there
are
no
seo
benefits
to
audio
versions
of
text
content
and
that
audio
specific
content
doesn't
rank
separately
like
video
content.
However,
given
you
also
said,
it
might
be
that
there
are
indirect
effects
like
if
users
find
this
page
more
useful
and
they
recommend
it
more,
that's
something
that
could
have
an
effect
will
audio
content,
be
given
more
priority
and
independent
ranking
following
the
helpful
content
algorithm
update.
C: Someone asked: is it okay to fetch meta contents through JavaScript? I think that means: is it okay to update meta tag data with JavaScript? While that is possible to do, it is best not to do that. It may give Google Search mixed signals, and some features may not pick up the changes: some specific search result type might not work the way you expect, or it might have incorrect information, or it might miss something. So I would suggest not doing that.
D: No, the named updates that we publish on the ranking updates page on Search Central are not penalties in any shape or form. They are adjustments to our ranking algorithms, so they surface even higher-quality and more relevant results to search users. If your site has dropped in rankings after an update, follow our general guidelines for content.
B
Ion
asks
when
would
be
the
next
possible
update
for
the
search
results.
Well,
on
our
how
search
works
site
we
mentioned
that
we
did
over
4
thousand
updates
in
2021.
That's
a
lot
of
updates.
Personally,
I
think
it's
critical
to
keep
working
on
things
that
a
lot
of
people
use
our
users
and
your
users
expect
to
find
things
that
they
consider
to
be
useful
and
relevant
and
what
that
means
can
change
over
time.
Many
of
these
changes
tend
to
be
smaller
and
are
not
announced.
A: So while the stars are more visual and eye-catching, structured data in and of itself is not a ranking signal, and it isn't guaranteed that these rich results will show up all the time. The Google algorithm looks at many things when it's creating what it thinks is the best search experience for someone, and that can depend on a lot of things, like the location, language, and device type.
B
Christian
asks
I
have
set
the
rel
canonical
together
with
a
no
index
meta
tag
when
google
does
not
accept
the
canonical
at
all,
all
internal
links
are
dropped.
When
I
don't
set
a
real
canonical,
then
I
can
see
the
internal
links
in
search
console
in
the
links
report.
Is
this
normal?
Well,
this
is
a
complex
question,
since
it
mixes
somewhat
unrelated
things,
a
no
index
says
to
drop
everything
and
the
rel
canonical
hints
that
everything
should
be
forwarded.
B
So
what
does
using
both
mean?
Well,
it's
essentially
undefined
our
systems
will
try
to
make
do
the
best.
They
can
in
a
conflicting
case
like
this,
but
a
specific
outcome
is
not
guaranteed.
If
that's
fine
with
you,
for
example,
if
you
need
to
use
this
setup
for
other
search
engines,
then
that's
fine
with
us
too.
If
you
want
something
specific
to
happen,
then
be
as
clear
as
possible
for
all
search
engines.
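One way to "be as clear as possible" is to audit pages for the conflicting combination the answer describes. A minimal sketch with the standard-library HTML parser, using a hypothetical page head; it flags pages that send both a noindex and a rel=canonical:

```python
from html.parser import HTMLParser

class HeadSignals(HTMLParser):
    """Record two indexing signals: a robots noindex and a rel=canonical."""

    def __init__(self):
        super().__init__()
        self.noindex = False
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.noindex = "noindex" in a.get("content", "").lower()
        elif tag == "link" and a.get("rel", "").lower() == "canonical":
            self.canonical = a.get("href")

def conflicting(html: str) -> bool:
    """True for the undefined noindex + rel=canonical combination."""
    p = HeadSignals()
    p.feed(html)
    return p.noindex and p.canonical is not None

# Hypothetical page head with both signals:
page = '<meta name="robots" content="noindex"><link rel="canonical" href="https://example.com/x">'
print(conflicting(page))  # -> True
```

Pages where this returns True are the ones sending "drop everything" and "forward everything" at the same time; picking one signal per page removes the ambiguity for every search engine.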
A
Well
yes,
just
because
something's
indexed
doesn't
mean
that
there's
no
opportunity
to
improve
how
it
appears
structured
data
helps
google
understand
more
about
your
video
like
what
it's
about
the
title.
Interaction
statistic:
that
kind
of
stuff
and
adding
structured
data
can
make
your
videos
eligible
for
other
video
features
like
key
moments.
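The properties mentioned above (title, what the video is about, the interaction statistic) can be sketched as VideoObject markup. All values here are hypothetical placeholders, and the dict is built in Python only so the emitted JSON stays valid:

```python
import json

# Hypothetical VideoObject markup with placeholder values, not a real video.
video_markup = {
    "@context": "https://schema.org",
    "@type": "VideoObject",
    "name": "How to repot a cactus",            # the title
    "description": "A short repotting guide.",  # what it's about
    "thumbnailUrl": "https://example.com/thumb.jpg",
    "uploadDate": "2022-09-01",
    "interactionStatistic": {                   # the interaction statistic
        "@type": "InteractionCounter",
        "interactionType": {"@type": "WatchAction"},
        "userInteractionCount": 12345,
    },
}
print(json.dumps(video_markup, indent=2))
```

On the page itself this JSON would go in a `<script type="application/ld+json">` block alongside the video.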
C: I think this is not cloaking, as what users see when they click on the search result roughly matches what Googlebot sees, and if you have a "load more" button, users will click that if they don't see the product they're expecting there. So I don't think this is cloaking, and that's a solution that I think works from a crawling point of view.
B
And
that
was
it
for
this
episode.
I
hope
you
found
the
questions
and
answers
useful
if
there's
anything
you
submitted,
which
didn't
get
covered
here,
I'd
recommend
posting
in
the
search
central
help
community.
There
are
lots
of
passionate
experts
active
there,
who
can
often
help
you
to
narrow
things
down.
Let
us
know
on
twitter
how
you
find
this.
We
hope
to
line
up
more
of
these
for
the
future.
In
the
meantime,
may
your
site's
traffic
go
up
and
your
crawl
errors
go
down
catch
you
next
time
for
more.