From YouTube: English Google SEO office-hours from December 10, 2021
Description
This is a recording of the Google SEO office-hours hangout from December 10, 2021. These sessions are open to anything search & website-owner related, like crawling, indexing, mobile sites, internationalization, duplicate content, Sitemaps, Search Console, pagination, multi-lingual/multi-regional sites, etc.
Find out more at https://goo.gle/seo-oh-en
Feel free to join us - we welcome folks of all levels!
A: All right, welcome everyone to today's Google Search Central SEO office-hours hangout. My name is John Mueller. I'm a Search Advocate at Google in Switzerland, and part of what I do are these office-hours hangouts, where people can join in and ask their questions around their website and web search, and we can try to find some answers. It looks like a lot of stuff was submitted on YouTube as well, but a bunch of people are already jumping in and raising their hands, so let's get started with some live questions.
B: I want to know, for this Chrome user experience data that Google uses for Core Web Vitals: does Google only consider users coming from Google organic search, or all Chrome users, irrespective of their traffic channel or source?
A: It's not tied to people clicking from Google Search. Basically, it's not all Chrome users either, because of the way that the Chrome User Experience Report collects the data, but it's not tied to any specific source of traffic for your website.
A: I don't know offhand; I think it has to do with the settings that you have set up in Chrome, whether you're submitting user feedback through Chrome, those kinds of things. But I don't know the exact details offhand.
B: Cool.

A: All right, Christian.
C: Hi, good morning, John. I've got a question. I have this website I'm looking after, and we have the situation that this website hasn't been migrated to mobile-first indexing. There is a kind of mixed state: we have some pages which have a separate mobile version, some other pages are responsive, and the link structure is kind of unclear. So I suspect that this might be the reason for the website not being migrated to mobile-first yet, and I just wanted to confirm if this is true, or if there might be other reasons for that.
A: As far as I know, there's still kind of a batch of sites that are just not migrated over to mobile-first indexing, and it's something where we're working through some of the bigger sites there step by step, to make sure that they can all be migrated cleanly, and then, when we have the bigger issues resolved, we'll migrate the rest as well. So my guess is the website is kind of in this holding position, where it's not necessarily that they're doing anything wrong.
C: Okay, so you would say it's good to just wait a little bit longer and see what happens.
E: Hi John. One of my clients' websites will be down for a week or two. The reason is that there are a few bugs on the website, and that's why they said they wanted to take the whole website down for a week or two. My question is: although there will be traffic loss and so on, how can I tell Google that this is a temporary situation? It will be only for two weeks, and then our website will be live again. Is there anything I could do, like display a page showing that the website is temporarily down and will be back, maybe with a countdown on the page, saying it's, you know...
E: ...two weeks or ten days, something like that. Can I tell Google that this website is currently down, but it will be going back live again within a week or two? And will there be no ranking loss, or could there be some minimal ranking loss, something like that?
A
I
I
would
assume
that
at
least
temporarily,
you
will
have
really
strong
fluctuations
and
it's
it's
going
to
take
a
little
bit
of
time
to
get
back
in
it's
it's
not
impossible,
because
these
things
happen
sometimes.
But
if,
if
there's
anything
that
you
can
do
to
avoid
this
kind
of
outage,
I
would
try
to
do
that,
and
that
could
be
something
like
setting
up
a
static
version
of
the
website
somewhere
and
just
showing
that
to
users
for
the
time
being
but
like
especially,
if
you're
doing
this
in
a
planned
way.
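On the technical side, one widely documented way to tell crawlers that an outage is temporary (the answer above focuses on avoiding the outage instead) is to answer every request with HTTP 503 plus a Retry-After header. A minimal sketch, with a placeholder retry interval:

```python
import threading
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class MaintenanceHandler(BaseHTTPRequestHandler):
    """Serve every request with 503 + Retry-After during a planned outage."""

    def do_GET(self):
        body = b"<h1>Temporarily down for maintenance</h1>"
        self.send_response(503)                   # temporary, unlike 404/410
        self.send_header("Retry-After", "86400")  # hint: retry in ~1 day
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

# Quick local check: a crawler-style request sees 503 + Retry-After.
status, retry_after = None, None
server = HTTPServer(("127.0.0.1", 0), MaintenanceHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
try:
    urllib.request.urlopen(f"http://127.0.0.1:{server.server_port}/")
except urllib.error.HTTPError as err:
    status, retry_after = err.code, err.headers["Retry-After"]
server.shutdown()
print(status, retry_after)  # 503 86400
```

A 503 signals "temporarily unavailable, try again later", whereas serving the maintenance page with a normal 200 risks the maintenance content being indexed in place of the real pages.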
D: Hi John, good morning, it's me again. We want to ask some questions about Google AdsBot. Do you think disallowing a URL for Googlebot in the robots.txt will affect our Google Ads, and maybe cause the ads not to work normally?
A
So,
as
as
far
as
I
know,
the
the
ad
spot
doesn't
follow
the
normal
robots
text
directives.
You
have
to
use
the
user
agent
directly
for
it,
but
I.
A: Sure, sure, that's perfectly fine. Blocking the ads landing pages for Googlebot is fine. If, again, you block the ads pages for AdsBot, then that's something you'd have to check with the Ads folks.
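To illustrate the point about naming the user agent directly: per Google's crawler documentation, AdsBot ignores the global `User-agent: *` group, so blocking it requires its own group. A sketch using Python's standard robots.txt parser (domain and paths are made up); the explicit `AdsBot-Google` group is what makes the intent unambiguous:

```python
import urllib.robotparser

# Hypothetical robots.txt: the /ads/ landing pages are blocked both for
# ordinary crawlers (*) and, explicitly, for AdsBot-Google, which does
# not honor the wildcard group on its own.
ROBOTS_TXT = """\
User-agent: *
Disallow: /ads/

User-agent: AdsBot-Google
Disallow: /ads/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

print(rp.can_fetch("AdsBot-Google", "https://example.com/ads/landing"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/ads/landing"))      # False
print(rp.can_fetch("Googlebot", "https://example.com/products/widget"))  # True
```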
D
Yeah
that
that's
not
what
we
do
so
also
about
the
google
posts,
does
google
or
does
xbox
share
a
same
core
budget
with
google
vote.
A: Yes.

D: So that means maybe the crawl requests from AdsBot increase, and maybe Googlebot crawls less?
A: Yes, it can happen; usually it settles down fairly quickly. Again, I don't know the details from the Ads side, but my understanding is that when you submit new ads campaigns, AdsBot checks all of those pages, and once that's checked, they don't need to be checked as often.
D: Okay, I see. So if we're blocking the ads URLs for Googlebot, but Googlebot maybe still crawls too many ads URLs, what can we do? Maybe some measures to handle this situation, to let Googlebot crawl more product pages?
A: So I think, if you're blocking the ads landing pages for Googlebot specifically, then we shouldn't be crawling the ads landing pages with Googlebot; it would really only be AdsBot that's crawling them.
A: It should never be the case that, if we've recognized the robots.txt file, we still crawl the URL anyway with Googlebot. So that feels like either something we would have to know as soon as possible, or maybe some mistake with things like upper and lower case in the URLs, or not exactly the right path being included in the robots.txt file.
A
Yeah
and
if
you
see
situations
where
the
ad
spot
crawling
like
severely
drowns
out
everything
else,
I
would
use
that
contact
form
that
we
have
in
the
help
center
to
report
problems
with
googlebot.
I
think
you
you've
used
it
before
in
the
past
as
well.
D: Sure, sure; if we notice something unusual, we'll definitely report it to Google. Another question is about the 304 response code. Do you think the 304 response code affects crawling? Because, logically, if Googlebot checks a URL with the same content and it returns a 304 code the first time, maybe there's a possibility that Googlebot may reduce the crawling for that same URL, because it returns the 304 code.
A
It
so
so
I
think
there
are
two
things
so
the
304
is,
I
think,
in
response
to
the,
if
modified
sense,
requests
where
googlebot
tries
to
see
like
if
this
page
has
changed,
and
my
understanding
is
that
a
304
response
code
would
not
apply
to
the
crawl
budget
side
of
things,
so
that
basically
means
for
us
like
we.
We
can
reuse
that
request
and
crawl,
some
something
else
on
the
website.
A
So
there
there's
that
aspect
and
the
other
aspect
with
regards
to
crawling
that
specific
url
less
I
I
don't
think
that
would
be
the
case,
but
we
do
try
to
figure
out
how
often
pages
change
and
we
try
to
recrawl
pages,
based
on
kind
of
the
assumed
page
frequency
or
update
frequency
that
we
have.
So
it's
not
so
much
that
that
particular
url
would
get
crawled
less
frequently.
It's
more
that.
Oh,
we
understand
a
bit
better.
How
often
these
pages
change.
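The mechanics described here can be sketched server-side: a 304 is the bodyless answer to a conditional If-Modified-Since request when nothing has changed. A minimal sketch (the helper name and dates are invented):

```python
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime
from typing import Optional

def conditional_status(last_modified: datetime,
                       if_modified_since: Optional[str]) -> int:
    """Return 304 when the crawler's cached copy is still fresh, else 200."""
    if if_modified_since:
        try:
            cached = parsedate_to_datetime(if_modified_since)
        except (TypeError, ValueError):
            return 200  # unparseable header: fall back to a full response
        if last_modified <= cached:
            # Unchanged since the crawler last saw it: no body needed,
            # and the saved work can go toward crawling something else.
            return 304
    return 200

LAST_EDIT = datetime(2021, 11, 1, tzinfo=timezone.utc)
print(conditional_status(LAST_EDIT, "Wed, 01 Dec 2021 00:00:00 GMT"))  # 304
print(conditional_status(LAST_EDIT, "Fri, 01 Oct 2021 00:00:00 GMT"))  # 200
print(conditional_status(LAST_EDIT, None))                             # 200
```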
D: So if most of the pages on the site return a 304, maybe that's a signal for Googlebot that this site has no new, updated content, and maybe it reduces the crawl rate?
D: Sure, sure. Also about the crawling issues: since our crawl rate is back to normal, we noticed that our requests from the smartphone crawler are recovering much faster than desktop.
D: So the last question is a more general one, about 301 redirects. We are taking down some useful pages and doing 301 redirects for them, and we're wondering whether a huge amount of 301 redirects can do harm to a site.
A: Thanks. Let's see, Rashmi.
H: Hi, this is actually the first time I'm attending this session. It's regarding our ecommerce website. This year we launched an e-commerce website, and for 70-plus countries we have created hreflang and URL structures for the different countries. For example, we're established in the UAE, so for the UAE we have something like en_ae, and when it comes to the US, our website structure is /us. We have created it like this for 70-plus countries, and the language is English for all the regions.
H: Our content is the same, except that for some regions we have mentioned a few cities; for the UAE, for example, we have mentioned cities like the emirates, like Dubai, and similarly for the US. But overall the content is the same, we have more than 800,000 products on our website, and we only launched this year. For other countries, where our language is not English, we have created locales like bg_bg for Bulgaria, and so on. Initially we submitted the sitemaps for all these regions as well, and at first we saw that Google started crawling the website and almost 350,000 URLs got indexed. Then we started experiencing deindexing of the URLs. We first thought it could be the speed of the website, so we increased the speed, and then again we saw URLs get indexed, and then it stopped again. Then what we did...
H: Now we are experiencing deindexing of the URLs, and when we submitted the website structure like /en, we saw that all the indexed URLs are going to the excluded list, and we don't know the exact reason behind that. What can be the reason? Is it because the language is English and we have created all these different URL structures? We don't know exactly; the indexed URL count showing now is only 160,000.
A: Yeah, okay. I don't know your website, so it's really hard to say, but overall, from what you've mentioned, it sounds like you have a lot of URLs, and you have those 70 different country and language versions on top of that, so everything is multiplied by 70. My offhand assumption is that this is just too much.
A
So
it's
it's
not
that
from
from
a
technical
point
of
view
that
we
we
can't
handle
that,
but
essentially
you're
submitting
so
much
content,
and
you
have
everything
multiplied
by
70
essentially,
and
that
means
for
us
that
we
like,
we
start
somewhere
and
we
start
indexing
some
things.
But
there's
there's
almost
no
chance
for
us
to
to
get
through
to
everything.
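As a back-of-the-envelope check on "everything multiplied by 70", using the numbers from the question:

```python
# Numbers taken from the question: ~800,000 products, 70+ locales,
# ~350,000 URLs indexed at the peak.
products = 800_000
locales = 70
total_urls = products * locales
peak_indexed = 350_000

print(f"{total_urls:,} candidate URLs")  # 56,000,000 candidate URLs
print(f"{peak_indexed / total_urls:.1%} of them indexed at the peak")
```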
A
So
usually,
what
I
recommend
in
these
kind
of
situations
is
start
off
with
with
a
very
small
number
of
different
country
and
language
versions,
and
make
sure
that
they're
working
well
and
then
expand
incrementally
from
there
and
in
particular,
if
you
have
a
lot
of
like
english
variations,
then
try
to
make
sure
that
you
just
have
maybe
one
english
landing
or
english
site,
rather
than
like
all
of
the
different
english
country
versions,
which
are
essentially
the
same
thing,
because
if
you
have
fewer
versions
of
the
the
english
content,
it's
a
lot
easier
for
us
to
focus
on
that
for
indexing
and
to
kind
of
treat
that
a
little
bit
better
when
it
comes
to
ranking.
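The consolidation suggested here can be sketched as hreflang generation for one shared English version plus genuinely distinct locales, rather than ~70 English country duplicates. Domain, paths, and locale codes are made up for illustration:

```python
# Hypothetical sketch: emit hreflang alternates for one consolidated
# English version plus genuinely distinct locales.
def hreflang_tags(path, locales):
    base = "https://www.example.com"  # placeholder domain
    tags = [
        f'<link rel="alternate" hreflang="{loc}" href="{base}/{loc}{path}" />'
        for loc in locales
    ]
    # x-default catches every visitor not matched by a listed locale.
    tags.append(
        f'<link rel="alternate" hreflang="x-default" href="{base}/en{path}" />'
    )
    return tags

for tag in hreflang_tags("/product/123", ["en", "bg-bg", "ar-ae"]):
    print(tag)
```

Three locales produce four link tags per page here; 70 mostly identical English variants would instead multiply every product URL by 70.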
A: For the most part, we will treat it as duplicates, and then it's kind of hit and miss which version actually ends up being indexed. So I think a large part of the issue there is that we just see all of these different copies of the same thing, and we don't know what you really want us to do. We have to focus on a small part of it; we can't do everything at once. So having all of these different versions ends up causing almost more problems for your website.
A
Probably,
or
moving
too
fast,
I
don't
know
it's
it's
something
where
it's
very
easy
to
take
a
a
really
large
website.
That's
doing
well
and
say:
oh,
we
will
just
copy
their
system
and
take
all
of
the
different
english
versions
and
language
versions
and
do
the
same
thing.
But
if
you're
not
in
the
situation
of
that
that
large
well-known
website
already,
then
it's
very
different
for
you
and
then
it
may
it's
it's
a
lot
harder
to
actually
get
to
to
kind
of
expanding.
H
Yeah
because
since
we
want
to
target
worldwide
because
it's
an
e-commerce
website
and
we
want
to
target
worldwide-
and
we
want
to
improve
our
seo-
we
just
targeted
in
this
way
creating
different
urls
and
locating
them
in
different
countries
like
that.
So
I
think
we
can
focus
only
on
the
main
domain
instead
of
like
creating
locality
right.
H
And
no
need
of
going
for
the
en
I
mean
the
url
structure
should
not.
It
should
be
like
www.xyz.com
instead
of
the
locale
right.
A: You can have the locale in there; it doesn't really matter. But again, I would make it so that you primarily just have one English version, rather than all of the other variations of English as well.
A: Sure. Let's see, I think...
F: Hi, I'm new here, so bear with me if my question isn't clear enough. We have a website with about, say, 20-30,000 pages, and it's like a directory service, but we have unique content.
F: So we put noindex on probably about two-thirds of those pages, because we wanted to concentrate what Google looks at, but after a month or so, only less than one percent of those pages are indexed, and we're getting "discovered, currently not indexed" for almost all the pages. So we're kind of scratching our heads about what we can do. We made a lot of tweaks, and it looks like the internal linking is all right. We might be restricting ourselves a little bit with the canonical settings, but they don't look wrong.
F: It's data, and it looks like a lot of the quality is assessed by the sort of things that are readable by a human being, but our service is really focused on a number that is very valuable, and we're just wondering what we can do. Should we try to squeeze in text which isn't actually central to what we do, or are there things we can do to tell Google that this is actually valuable to the people who visit our site?
A: It's hard for me to estimate what kind of site it is, with regards to it being, as you mentioned, a directory-type site. From a quality point of view, I do think that matters quite a bit in terms of how much we actually index from your website.
A
So
that's
something
where
I
would
probably
still
focus
on
that.
If
you're
saying
the
site
is
like
25
000
pages
large,
it
feels
like
something
that
we
should
be
able
to
index.
If
it's
a
something
that
we
would
consider
to
be
reasonable
quality
website
and
when
it
comes
to
kind
of
the
the
quality
assessment.
A
F: Right. Just as a follow-up: we're getting "discovered, not indexed" rather than "crawled, not indexed" for those 99 percent. Should we differentiate between those two? Because our site is not that big, and this can't be a crawl-budget issue, in my view. In that case, are those two designations pretty much the same, in that it's just a quality issue?
A: It's really less a matter of Google not being able to go off and crawl that many URLs, because again, with like 25,000 pages, most servers that are reasonably sized can easily allow that kind of crawling on a regular basis. It's probably really more a matter of our understanding of the overall website quality, especially with the "discovered, not indexed" report.
F
Right,
so
do
you
think
we
should
try
to
add
text
in
there?
What
we
show
is
really
a
directory
of
companies
and
we
show
what
a
stock
price
means
in
terms
of
future
growth
of
that
company.
So
it's
a
number
but
there's
not
a
whole
lot
of
sort
of
readable
text
that
goes
with
it.
It's
just
you
know
a
number
does
this
20
stock
mean
10
growth
or
15
growth
is
what's
valuable,
we
can
add
it.
F
A: So from that point of view, it's something where, if you see the text affecting how users look at your pages and are able to interact with your pages, then sure. But that's more a matter of trying to figure out what users are actually looking for, and where you can provide unique value to your users. Just adding text to pages, I don't think, would affect how we would crawl and index those pages.
A
If,
if
it's
something
where
you're
providing
numbers,
like,
I
don't
know
like
the
stock
numbers
there,
that's
something
where
I
would
also
try
to
figure
out
what
what
you
can
do
to
make
sure
that
what
you're
providing
is
unique
and
provides
value
to
users
and
do
something
maybe
along
the
lines
of
a
user
study,
to
figure
out
what.
What
is
it
that
we
can
do
to
make
our
website
such
that
users
recommend
it
to
other
people
as
well
and
kind
of
that
it
builds
up
almost
like.
A
A
A
A: Let me run through some of the submitted questions, and then we'll get back to the giant number of people who have their hands raised. Wow, so many today. Let's see: does Google have any trouble indexing sites that have a mobile version on a subdomain, for example example.com and m.example.com? And then the question goes into how they're seeing a lot of pages indexed without content.
A
So
from
from
our
point
of
view,
we
don't
have
well,
at
least
as
far
as
I
know,
we
don't
have
any
problems
with
mdot's
domains
in
general,
in
the
sense
that
this
is
one
of
the
supported
formats
that
we
have
for
for
mobile
websites.
We
don't
recommend
the
mdot
setup.
So
if
you're
setting
up
a
new
website,
I
would
try
to
avoid
that
as
much
as
possible
and
instead
use
a
responsive
setup,
but
it
it
is
something
that
can
work.
A
So,
if
you're
seeing
this
regularly
with
your
website
that
we're
not
able
to
index
your
mobile
content
properly,
then
to
me
that
would
point
more
at
an
issue
on
your
website
where,
when
mobile
googlebot
is
trying
to
crawl,
it's
not
able
to
access
everything
as
expected,
so
that's
kind
of
the
direction.
I
would
head
there
to
try
to
clean
that
up.
A: So you also need to watch out for not only redirecting mobile users from the desktop to the mobile version, but also redirecting desktop users from the mobile to the desktop version. And again, if you have a responsive design setup, that's something you don't have to worry about, so it's like another reason to go responsive if possible.
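The bidirectional redirects described above can be sketched as a single mapping rule. Hostnames are placeholders, and device detection itself is out of scope here:

```python
# Sketch of the bidirectional m-dot mapping: mobile visitors on desktop
# URLs go to m.example.com, desktop visitors on m-dot URLs go back, and
# visitors already on the right version are left alone.
def redirect_target(host, path, is_mobile):
    if is_mobile and host == "example.com":
        return f"https://m.example.com{path}"
    if not is_mobile and host == "m.example.com":
        return f"https://example.com{path}"
    return None  # already on the matching version: no redirect

print(redirect_target("example.com", "/page", True))     # https://m.example.com/page
print(redirect_target("m.example.com", "/page", False))  # https://example.com/page
print(redirect_target("example.com", "/page", False))    # None
```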
A
Would
I
be
penalized
if
I
don't
set
the
rel
sponsored
or
well
no
follow
for
my
affiliate
links,
probably
not
so
from
from
our
point
of
view,
affiliate
links,
kind
of
fall
into
that
category
of
something
financially
attached
to
the
link.
So
we
really
strongly
recommend
to
use
this
setup,
but
for
for
the
most
part,
if
you're,
not
if
it
doesn't
come
across
as
you
selling
links,
then
it's
not
going
to
be
the
case
that
we
would
manually
penalize
a
website
for
having
affiliate
links
and
not
marking
them
up.
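A sketch of the recommended markup: outgoing links whose host is on a (made-up) affiliate allowlist get a qualifying rel value, everything else is left alone:

```python
from urllib.parse import urlparse

# AFFILIATE_HOSTS is a hypothetical allowlist of affiliate-network hosts.
AFFILIATE_HOSTS = {"affiliate.example.net", "partners.example.org"}

def rel_for(href):
    """Return the rel value an outgoing link should carry, or None."""
    host = urlparse(href).hostname or ""
    if host in AFFILIATE_HOSTS:
        return "sponsored nofollow"  # qualify financially attached links
    return None

print(rel_for("https://affiliate.example.net/deal?id=1"))  # sponsored nofollow
print(rel_for("https://example.com/about"))                # None
```

In the HTML, that value ends up as, for example, `<a href="..." rel="sponsored nofollow">`.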
A: A few months ago we made a CMS migration, and our URL structure completely changed; all of our URLs got 301 redirects to the new addresses, and we have improved technical indicators. After that, we realized our organic traffic dropped a lot, immediately after the URLs changed. Is that what we should expect?
A
What
time
does
it
take
so
it
when,
when
you
move
from
one
domain
to
another,
it's
very
easy
for
us
to
just
transfer
everything
from
that
domain
to
the
new
one
and
that's
something
that
can
be
processed
within
almost
like
a
week
or
so
in
many
cases.
However,
if
you
change
the
internal
urls
within
your
website,
then
that's
something
that
does
take
quite
a
bit
of
time,
because
essentially,
we
can't
just
transfer
the
whole
website
in
one
direction.
A: The other thing is that when you change your CMS, oftentimes the things that are associated with the CMS also change, and that includes a lot of things around internal linking, for example, and also, in many cases, the way that your pages are structured. Changing all of those things can also result in fluctuations. It can be that the final state is higher or better than it was before; it can also be that the final state is less strong than it was before.
A
So
that's
something
where,
like
changing
a
cms
and
changing
all
of
the
internal
linking
on
a
website
changing
the
internal
urls
on
our
website
changing
the
design
of
these
pages.
These
are
all
individually
things
which
can
cause
fluctuations
and
can
cause
drops
and
perhaps
even
rises
over
time,
but
doing
that
all
together
means
that
it's
going
to
be
messy
for
a
while.
So
that's
something
where,
on
the
one
hand,
waiting
to
see
what
what
happens
is
is
something
you
almost
have
to
do.
A
Archive.Org
is
a
great
way
of
looking
at
your
website
kind
of
in
the
past,
so
I
I
would
use
that
to
try
to
figure
out
like
what
things
looked
like
before,
how
the
internal
linking
was
set
up.
What
anchor
texts
were
there?
What
headings?
What
kind
of
text
was
there?
All
of
that?
The
another
thing
to
watch
out
for
with
these
kind
of
migrations
is
oftentimes.
There's
embedded
content
that
you
don't
think
about
directly,
because
it's
not
an
html
page
and
a
really
common
one
is
images.
A
So
if
your
site
was
getting
a
lot
of
image
search
traffic
that
can
also
have
a
significant
effect
and
setting
up
those
redirects
probably
still
makes
sense,
even
if
you've
moved
after,
I
don't
know
a
month
or
so
I
don't
know
what
the
timing
here
was,
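The migration advice above can be sketched as a minimal redirect map: one permanent, single-hop redirect per old URL, with image URLs included, since image search traffic is easy to forget. All paths here are invented:

```python
# Hypothetical slice of a CMS-migration redirect map.
REDIRECTS = {
    "/old-blog/post-1.html": "/blog/post-1/",
    "/old-images/team.jpg": "/media/team.jpg",  # images need redirects too
}

def respond(path):
    """Return the (status, location) pair the server should answer with."""
    new_path = REDIRECTS.get(path)
    if new_path is None:
        return 404, None   # unknown old URL: nothing to redirect to
    return 301, new_path   # permanent: transfer signals to the new URL

print(respond("/old-blog/post-1.html"))  # (301, '/blog/post-1/')
print(respond("/unknown"))               # (404, None)
```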
A: And then I have not a technical question, but a career question: what is your advice for someone interested in pursuing a career in SEO? What should they focus on if they have already started learning and are familiar with the basics? How do they go from there to become great at SEO?
A
I
don't
know
I
I
don't
get
these
kind
of
questions
often
so
I
don't
have
a
perfect
answer
lined
up
from
from
my
point
of
view.
What
what
I
kind
of
like
about
seo
is
that
there's
so
many
different
angles
where
people
can
kind
of
dive
in
and
understand
a
little
bit
more
and
oftentimes
people
have
interests
in.
A
I
don't
know
figuring
out
what
to
do
with
regards
to
content
or
figuring
out
trying
to
understand
the
audiences
that
are
involved,
trying
to
understand
the
technical
details
of
a
website,
maybe
even
going
into
the
direction
of
javascript
and
developing
things.
And
all
of
these
are,
I
think,
really
neat
angles
and
they're
all
important
for
seo.
A
It's
not
that
one
of
these
is
going
to
go
away
and
suddenly
you'll
be
stuck
so
my
recommendation,
there
would
kind
of
be
to
figure
out
which
of
these
angles
you
you
really
enjoy
doing
and
try
to
figure
out
ways
to
to
do
a
little
bit
more
of
that.
A
One
of
the
things
I've
recommended
in
the
past,
which
might
also
be
appropriate
here,
is
to
take
part
in
the
the
different
webmaster
and
seo
forums.
So
google
has,
I
think,
a
help
community
as
well,
and
there
are
definitely
others
out
there
as
well,
and
the
good
part
there
with
all
of
these
forums
is
that
you
see
a
lot
of
different
scenarios
and
you
run
into
situations
where
you
see
where
you
can
kind
of
like
dive
in
and
try
to
understand.
A
Well,
what
is
the
problem
here
and
how
could
I
solve
this
and
whether
or
not
you
actively
take
part
in
the
forum
and
kind
of
post,
your
suggestions
there
at
least
the
the
opportunity
to
look
at
individual
situations
where
you
see
well,
there's
a
problem
here.
How
can
I
try
to
figure
this
out
myself?
A
I
think
that
is
really
valuable
and
that's
really
important
in
the
long
run,
because
a
lot
of
times
you
will
see
the
same
kind
of
problem
come
back
over
and
over
again
and
if
you
understand
kind
of
the
patterns
there,
it's
a
lot
easier
for
you
to
jump
in
and
say
well.
This
is
the
same
pattern
as
I've
seen
before,
and
that
means
I
need
to
double
check
these
three
or
these
five
things
first.
A
So
those,
I
think,
that's
that's,
always
really
valuable
to
kind
of
see
and
practice
a
little
bit
more
but
again,
there's
so
many
different
options
and
so
many
different
directions
that
you
could
take.
It's
it's
hard
to
say,
like
you
need
to
do
this
one
thing
to
become
better
at
seo.
A
Let's
see,
I
have
a
situation
with
a
site
where
the
main
navigational
links
is
visually
different
from
the
mobile
and
the
desktop
versions.
The
mobile
version
is
slimmer
and
the
desktop
version
has
more
links.
Looking
exclusively
crawling,
the
desktop
is
more
helpful
for
technical
reasons.
We
can't
hide
the
desktop
links
in
the
source
and
the
site
is
on
mobile
first
indexing
already.
Is
there
a
chance
that
google
might
think
I'm
trying
to
game
the
algorithms
by
keeping
this
more
link,
rich
navigation,
even
though
it's
invisible
to
mobile
users
and
bots?
A
So
from
from
that
point
of
view,
I
don't
think
you
would
be
penalized
for
anything
like
this.
Well,
definitely
not
penalized,
because
it's
a
very
common
setup-
it's
probably
not
optimal,
but
it's
it's
also
not
something
where
I
would
say
you
need
to
jump
in
and
fix
that
right
away.
A
So
that's
kind
of
my
my
assessment
there
dealing.
My
question
is
about
dealing
with
outdated
blogs
on
our
platform.
We
have
about
450
blogs,
some
of
which
are
45
years
old
and
therefore
out
of
date
and
have
almost
no
traffic
on
them.
Do
you
recommend
deleting
them
because
they
hurt
our
general
search
rankings?
What's
the
best
way
delete
all
without
traffic
at
once
and
request
index
deletion
at
google?
Or
do
you
recommend
step-by-step
approach?
A
So
I
think
with
blogs,
you
probably
mean
like
blog
posts,
so
individual
pages
not
like
whole
sets
of
pages,
because
I
think
if
you
have
so
many
different
like
sets
of
pages
it's
it's
probably
a
bigger
change,
but
with
450
pages,
say
more
or
less,
where
you're
saying
well,
these
don't
get
a
lot
of
traffic.
Should
I
just
delete
them
or
not.
A
From
my
point
of
view,
probably
that's
that's
something
where
you
can
just
make
that
call
on
your
own.
I
I
don't
see
that
as
being
something
where,
from
from
an
seo
point
of
view,
you
would
see
a
significant
change
unless
these
are
really
really
terrible
blog
posts.
The
the
main
thing,
however,
I
would
kind
of
watch
out
for
is
that,
just
because
something
doesn't
have
a
lot
of
traffic
doesn't
mean
that
it's
a
bad
piece
of
content.
A
It
can
mean
that
it's
something
that
just
gets
traffic
very
rarely
maybe
once
a
year
or
maybe
it's
something
that
that
is
very
seasonal,
that
like
overall,
when
you
look
at
it
from
a
website
point
of
view,
it's
not
very
relevant,
but
it's
relevant.
I
don't
know
maybe
right
before
christmas,
for
example.
A
So
from
from
that
point
of
view,
I
would
say
it's
it's
fine
to
go
through
a
website
and
figure
out
which
parts
you
want
to
keep
and
which
parts
you
want
to
kind
of
clean
out,
but
just
purely
looking
at
traffic
for
figuring
out
which
parts
you
want
to
clean
out.
I
think
that's
too
simplified
but
again
from
an
seo
point
of
view,
like
removing
450
pages
from
a
larger
website.
A
That's
that's
like
tiny
change,
and
I
wouldn't
worry
about
when
you
do
that
and
how
exactly
you
do
that
delete
them
whenever
you
kind
of
like
recognize
that
they're
no
longer
valuable
delete
them
all
at
once.
That's
also
an
option
with
regards
to
submitting
the
them
with
the
removal
tool
as
well
in
search
console
that
probably
wouldn't
change
anything,
because
the
removal
tool
in
search
console
just
hides
the
page
in
their
search
results.
It
doesn't
actually
remove
anything
from
indexing,
so
that's
kind
of
like
one
thing
you
don't
have
to
do
but
again.
A
With
a
query
string
at
the
end
of
an
image
source
url
have
a
negative
respect
for
seo.
We
use
it
for
cache
invalidation
when
an
image
is
edited,
so
no,
it
wouldn't
cause
issues
with
regards
to
seo,
but
with
images
in
general,
we
tend
to
recrawl
and
reprocess
them
much
less
frequently.
A
If
it's
something
that
happens,
I
don't
know
very,
very
rarely
and
where
you're
saying
well,
it
doesn't
really
matter
too
much
how
things
are
in
image
search.
We
don't
rely
on
image,
search
for
traffic
to
our
website,
then,
from
that
point
of
view,
that's
that's
totally
non-issue.
A
The
thing
I
would
avoid,
especially
here
with
image
urls,
is
that
you
embed
something
that
changes
very
quickly,
so
something
like
a
session
id
or
just
like
always.
Today's
date,
because
that
would
probably
change
more
often
than
we
would
reprocess
the
image
urls
and
then
we
would
never
be
able
to
index
any
of
the
images
for
image
search.
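The distinction drawn here can be sketched by deriving the query string from the image bytes themselves: the URL then changes only when the image is actually edited, never per session or per day. Path and contents are made up:

```python
import hashlib

# Cache-busting via a stable content fingerprint: unchanged bytes yield
# an unchanged URL, so crawlers see a new URL only after a real edit.
def busted_url(path, content):
    fingerprint = hashlib.sha256(content).hexdigest()[:8]
    return f"{path}?v={fingerprint}"

a = busted_url("/img/logo.png", b"pixels-v1")
b = busted_url("/img/logo.png", b"pixels-v1")
c = busted_url("/img/logo.png", b"pixels-v2")
print(a == b, a == c)  # True False
```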
I: Our normal audience was around a thousand active users in real time, and now it's about 90 or 100 active users, and we didn't change anything technical or editorial. Search Console did not report any problems, we have all our Core Web Vitals ratings very high, and in our CrUX reports it's okay; two are above 80 percent.
A: So my main recommendation is not to rely on Google Discover for traffic, but rather to see it as an additional traffic source, and not as the main one. When it comes to Discover, there are a few things that kind of play in there. You mentioned some of the technical things, which I think are good practices.
A
One
of
the
things
that
also
plays
in
there
is,
for
example,
the
core
updates,
also
play
a
role.
We
recently
had
a
core
update,
maybe
from
a
timing,
point
of
view
that
matches
what
what
you
saw
there.
So
that's
something
where,
if
you
do
see
an
effect
from
the
core
update,
then
I
would
double
check
the
blog
post
that
we
have
about
core
updates
with,
like
the
the
large
number
of
tips
and
ideas
that
you
could
think
could
focus
on.
A
The
other
thing
is
with
discover.
In
particular,
we
have
a
set
of
content
guidelines
that
we
we
try
to
stick
to
in
an
algorithmic
way
and
depending
on
on
the
website
itself,
it
might
be
something
where
some
of
these
content
guidelines
your
website
is,
is
kind
of
borderline.
A
So,
for
example,
like
I
don't
know
the
content
guidelines
all
by
by
heart,
but
I
I
think
there
is
something
around
like
clickbaity
titles
or
clickbaity
content
in
general
or
kind
of
adult-oriented
content,
for
example,
and
it
might
be
that
your
website
is
kind
of
borderline
there.
With
the
regards
to
how
we
kind
of
evaluate
your
website
in
that
regard,
and
then
it
can
also
happen
that
our
algorithms
say
oh
well.
A: ...a large part of this website is just clickbait, or one of the other categories that we list in the content guidelines, and then we will be a lot more conservative with regards to how we show the website in Discover. So, without knowing your website, that's kind of the direction I would head.
A
On
the
one
hand,
the
the
core
updates
kind
of
think
about
that,
on
the
other
hand,
the
content
guidelines
that
we
have
and
then
finally,
I
would
still
make
sure
that
you
don't
rely
on
discover
for
for
your
business
overall,
because
it
can
change
fairly
quickly
and
it's
it's
something
where
often
they're,
not
pure
technical
reasons,
behind
those
changes.
I: Okay, okay, thank you.

A: Cool. Brian?
J: Hi John, nice to finally meet you. Okay, I have a couple of questions. First one: in a Reddit post a few weeks ago, I believe you mentioned that having less than 30 pages would make Google consider the website less authoritative, right? And I know that leaves it open to SEOs writing "you have to have a minimum of 30 articles to rank", et cetera, but my question concerns the other extreme.
A
I
I
think
you
could
make
good
one-page
sites
so
from
that
point
of
view,
I'm
I'm
not
too
worried
about
that.
The
I
think
the
the
reddit
post,
as
far
as
I
remember
it
was,
was
something
along
the
lines
I
created:
30
blog
posts
and
they're
really
good,
and
therefore
my
website
should
be
authoritative
and,
from
my
point
of
view,
like
you
going
off
and
creating
30
blog
posts
does
not
automatically
make
your
website
authoritative
and
especially
for
kind
of
the
the
higher
or
the
more
critical
topics.
A
It's something where you can't just create 30 blog posts on a medical topic and then say, "I am a doctor, I've written 30 articles." So that was kind of the direction I was headed there. And for a lot of websites, it's not that you need to be seen as an authority; you essentially just put your content out there.
A
If you're a small business selling something, you don't need to be an authority. And especially things like one-page websites: they're often very focused on this one thing, and you don't need to be an authority to do that one thing, to sell, I don't know, an ebook, or to give information about opening hours for a business. It's just information.
A
So from that point of view, having a one-page website, I think, is perfectly fine. With regards to starting out with a one-page website, I think that's fine; I would just think about, well, where do you want to go from there? At some point, maybe you do want to create more pages, so try to find a way that you don't paint yourself into a corner by saying, "Well, I have to put everything on one page, all the time," but rather expand when you see that it fits.
J
Okay. The second question is actually more of a hypothetical. So, for international SEO: say you have a site that is basically, generally well established. They have links going to it, all that stuff; it's generally good from an SEO standpoint. But at one point they decided to roll their international URLs and setup into one domain, say, for example, something like a media news site for the United States, as opposed to the international URLs and the international setup.
J
And aside from the obvious things, like links being combined or maybe not being redirected properly, stuff like that: say Google were to all of a sudden drop traffic 90% after that implementation, assuming that everything was done 100% correctly, which is likely not to be the case on a larger site like that. What would happen to cause Google to drop traffic like that in such a scenario?
A
It's hard to say. So usually what I would do in a case like this is try to figure out if things have settled down or not, because if it's in this flux period, it's really hard to determine what things should be like. And then I would try to dig into both individual pages and individual queries: so in Search Console, figure out what, I don't know, the top 50 queries were, look at the top 50 queries now, and see, are there certain queries which have dropped?
A
Or is it more a matter of everything overall dropping slightly? Try to figure out: is this based on individual parts of the site, or is it kind of an overall change? And the same thing on a page level as well. Often, what I find when I look at things on a page level is that individual pages that used to get a lot of traffic suddenly don't work anymore: they're 404, or they got redirected incorrectly, or something like that, and then it's something where it's suddenly not clear
A
What actually happened there, in that you see, well, some of these top traffic pages are gone now. Then, of course, the overall traffic would be visibly affected by that as well. But usually, going and trying to find ways to figure out what the details of that change were helps to understand a little bit better which direction people should start looking.
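The before/after query comparison described above can be sketched as a small script. This is a minimal sketch, assuming you have the "Queries" report from Search Console for both periods in hand as simple query-to-clicks mappings; the input shape, the sample queries, and the `dropped_queries` helper name are all assumptions for illustration, not a Search Console API.

```python
# Sketch: compare two Search Console "Queries" exports (before/after a
# migration) to surface the top queries whose clicks dropped the most.
# Adapt the input shape to however you actually export the report.

def dropped_queries(before: dict[str, int], after: dict[str, int], top_n: int = 50):
    """Return (query, before_clicks, after_clicks) tuples, sorted by largest
    drop, considering only the top_n queries from the 'before' period."""
    top_before = sorted(before.items(), key=lambda kv: kv[1], reverse=True)[:top_n]
    rows = []
    for query, clicks_before in top_before:
        clicks_after = after.get(query, 0)  # missing query => dropped out entirely
        if clicks_after < clicks_before:
            rows.append((query, clicks_before, clicks_after))
    return sorted(rows, key=lambda r: r[1] - r[2], reverse=True)


# Hypothetical sample data standing in for two report exports.
before = {"black friday tips": 900, "newsletter ideas": 400, "crm pricing": 150}
after = {"black friday tips": 50, "newsletter ideas": 390}

for query, b, a in dropped_queries(before, after):
    print(f"{query}: {b} -> {a}")
```

Sorting by absolute drop (rather than percentage) tends to answer the question asked here first: whether a few formerly high-traffic queries collapsed, or everything sagged slightly.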
A
Thank you, John. Thank you very much. Cool, thanks. All right, David?
G
Hi, thanks. Last one in, I guess. So, I posted this also in the comments. I have a small website, only a couple of hundred URLs. It's a software company, and there's a small team producing content like white papers and help articles and blog posts, and it has been going well for a long time. And suddenly now, in November, the published pieces are not indexed anymore (not all of them), which is very sad, because they're writing nice little pieces for Black Friday, how to write a newsletter for breakfast or for Christmas, and then we're sitting there and seeing that Google is crawling them or discovering them.
G
We see that in Search Console, but they're not indexed. So I tried everything; I looked at the technical issues, and the linking is good. So my question is: is there a paradigm shift, with Google saying, "Well, thanks for publishing these articles, but we don't want them right now"? Is this something new?
A
That has changed recently? Not really, at least not that I know of. But I think what I see a lot, with regards to a lot of the indexing questions that I get nowadays, is that from a technical point of view it's very easy to make websites that just work: you set up WordPress, and then essentially all of the SEO is done for you.
A
So a lot of it comes down to the overall quality of the website, the quality of the pieces of content that we get. And then it's something where, additionally, in Search Console we give you all of the information on things like "discovered, not indexed" or "crawled, but not indexed", and then suddenly you see all of these problems, and it seems like something that people have to fix.
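The two Search Console buckets mentioned here ("discovered, not indexed" vs. "crawled, but not indexed") point at different problems, so a first triage step is simply tallying which one dominates. A minimal sketch, assuming you have exported the page-indexing report as (url, state) rows; the row shape and sample data are assumptions for illustration:

```python
from collections import Counter

# Sketch: tally index states from a Search Console page-indexing export.
# "Discovered - currently not indexed" suggests Google chose not to crawl yet;
# "Crawled - currently not indexed" suggests it crawled but declined to index.
rows = [
    ("/blog/black-friday-2021", "Discovered - currently not indexed"),
    ("/blog/newsletter-ideas", "Crawled - currently not indexed"),
    ("/blog/christmas-campaigns", "Discovered - currently not indexed"),
    ("/pricing", "Indexed"),
]

counts = Counter(state for _url, state in rows)
for state, n in counts.most_common():
    print(f"{n:3d}  {state}")
```

A tally like this won't tell you *why* pages are being skipped, but it narrows whether the issue looks like crawl scheduling or a post-crawl quality decision, which is useful context when reporting examples back.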
G
Yeah, perfect. And I would love to hear that, because, you know, you could say it's not a big deal. I understand that Google cannot index everything, and we do rank for some of these topics, for example "Black Friday 2020"; we rank for that. And now we wrote a new piece, "Black Friday 2021: what to do with your customers", and it's a handwritten piece. So the question would be: should we just lay off the content team, because it's really expensive to produce content?
G
I mean, they're writing on those pieces for a couple of days. Those are nice ladies; they're sitting there and writing that content, and then it's not getting indexed, and we're not getting any traffic from Google for that. So basically, the management is asking me, well, why should we do that? There's no sense in producing good content if we don't get the part of the traffic that we normally get from Google, just because Google says, "Well, we just don't want it right now." So it's tricky, isn't it?
A
Yeah, I mean, that's why these examples are really useful, because we do bring them back to the indexing teams and say, well, what is wrong here? Why aren't we picking this up? This is something reasonable; we should be doing a better job at that. And especially when things are...
G
...not good, or there's something technically wrong. But I just don't see anything, and again, it's not a site with millions of pages; we have 500 pieces, and I think they're all handwritten, and yeah, it has always worked. Just suddenly, in November, it stopped, you know, during the time of the core update. So I was afraid that Google is busy with other things right now, really.
A
Twitter is probably easiest, and for the most part people can ping me and ask to kind of be followed, so they can send me a DM if that makes it easier. All right, thanks. Cool. All right, let me take a break here with regards to the recording. I can be around a little bit longer, but not as long as usual this time, because we have more meetings. But thank you for joining.
A
Thank you for all of the questions so far, and I hope we can get through some of the remaining questions as well, if people still have their hands raised. If you're watching this recording on YouTube, feel free to submit questions for the next sessions, which I'll be setting up as well. I don't know how the timing of the next sessions will be, with regards to holidays and things like that, but we'll figure something out. Cool. And with that, let me pause the recording here. Just a second.