Description
In this episode of CQC Connect, the podcast from the Care Quality Commission, we update on the development of our new regulatory model, a key part of our new strategy.
Hear from our Heads of Policy Dave James and Amanda Hutchinson on our latest thinking around the evidence categories and outputs from assessments.
Read all about our new strategy: https://www.cqc.org.uk/about-us/our-strategy-plans/new-strategy-changing-world-health-social-care-cqcs-strategy-2021
Listen to more podcasts from the CQC: https://soundcloud.com/carequalitycommission
Find out more information about the CQC and how it regulates health and social care in England: https://www.cqc.org.uk/
C
What we've also done is we've already consulted on some of the broader changes that we want to make, so, for example, moving away from set inspection frequencies and also being able to update ratings without inspection. What we're doing now is working up the detail of what this means in practice for our regulatory model.
C
So some of the things that we're looking at is obviously making sure that our ratings reflect the current view of the quality in a provider, and that's one of the things that we need to achieve through our approach going forward. We also, importantly, want to, in future, adopt a single view of quality that works across all of our functions, and I think Dave is going to talk a bit about our work on our assessment framework and how we're developing the thinking about that.
C
But
it's
about
how
we
make
sure
that,
whether
we
are
registering
whether
we
are
assessing
a
provider
or
whether
we
are
assessing
a
local
authority
or
potentially
an
integrated
care
system.
There's
a
kind
of
overall
unified
view
of
what
good
quality
care
looks
like
from
a
cqc
perspective
and
also
importantly,
from
the
perspective
of
people
using
services.
We're
looking
at
how
we're
going
to
bring
together
our
registration
and
ongoing
assessment
processes
in
a
more
seamless
way
and
also,
as
we've
said,
importantly,
how
we
put
people's
experiences
at
the
center
of
our
judgments
going
forward.
C
So there's lots of development work and thinking going on across all of those areas of CQC's work, and we're also really starting to get into some of the detail around what an assessment process that delivers that up-to-date view of quality actually looks like in practice, and how we deliver that in a way that improves the experiences of people using services, but also of providers as well.
A
Thanks, Amanda, you do my segues for me. I think it's a nice link into the assessment framework and process, so Dave, just coming to you: I know people heard a couple of months ago we did a podcast zeroing in on our assessment framework. It'd be really great to hear about where we're at with that work, and to give a bit of a recap for people.
B
Sure, yeah, and Amanda's touched on it there as well. So we are looking at developing one framework across everything that we do, whether that be registering providers, our ongoing quality assessments, our future role in local authority assurance and, potentially, the kind of system-level assessments and assurance that we will be delivering as part of our future regulatory approach. We're essentially in the process of testing our thinking: do people think this looks like a sensible structure?
B
Because a lot of this is actually about whether the structure that we've come up with makes sense. And just to recap on that for people that might not have heard previous podcasts: we are looking at having much clearer and shorter descriptions of quality, pitched at the level of good, under each of our five key questions. At the moment that's looking like there are between six and ten statements, which are in the form of, effectively, a commitment: "we do this, we do that".
B
It's that language, and it's very much plain English. Sitting underneath those, we have what we're calling evidence categories, but which some people might appreciate thinking of in terms of what we used to term our key lines of enquiry. They were essentially the questions and the ways into understanding if those commitments, those expectations, are being met. So we're currently talking to our partners and our stakeholders and trying to understand how they feel about what we've got to, and getting their views today is part of that as well.
B
This conversation will happen with you, but also at a more detailed level: are we covering the right topics? Our starting point was essentially building on the frameworks that we already had in place and transposing those into this new simplified structure. So that meant where we'd set the bar, and the topics, all went with that as well.
B
We're also doing some work internally at the moment, just to start to test out this idea of how we might use scoring as part of a more structured approach: how we might be able to generate scores at levels that make sense, to allow us to be smarter, as we've talked about in our strategy. So that's broadly where we are with the assessment framework.
A
Thanks very much, Dave, that's great. I mean, CQC has a really strong track record of engagement. You've talked about how we've already had quite a lot of those conversations, and we're very much in that listening mode now. What are the sorts of things that we've been hearing from providers? You know, are they fans? Are we getting the green light from them?
B
Yeah,
it
is
going
well,
certainly
not
trying
to
blow
our
trumpet
here,
but
in
some
ways
I'm
not
surprised,
because
our
starting
point
for
this
was
looking
at
everything
we've
heard
from
providers
across
all
of
our
sectors
over
the
years.
You
know
and
actually
learning
from
the
issues
that
we
now
mean
areas
that
we
know
we
want
to
improve,
but
also
just
some
of
the
problems
that
providers
have
come
to
us
with
and
just
trying
to
sort
of
take
a
step
back
from
that
and
take
a
holistic
view
as
to
okay.
B
What would a new model need to do? What would a new framework need to do as part of that, to start to address some of these? And it's great to hear that we seem to be getting that right; we're getting really positive responses. I think we're probably at the point now where a lot of our stakeholders are ready for the next level of detail, and I think that's right. It's kind of, okay: at a high level, we can see this works.
B
We've got confidence that we're not losing anything here, but how actually will it work in terms of, you know, a small learning disability residential service versus an NHS trust? What is the next level of detail in terms of the evidence that we'll look at, and how will that actually work? That's kind of where we're coming to now: starting to build up that detail and get thoughts on it. But yeah, it's good support from providers for what we've got to so far.
A
Great, it's really encouraging to hear. So, Amanda, just to focus in on the role that evidence might play in this new model. We know that having the right evidence is vital for making good regulatory decisions. Could you introduce some of our thinking in this area? I know Dave touched briefly on the scoring there, but it'd be really good to have a bit of an overview of the role that evidence and insight might play for us in the future.
C
Yeah, sure. So I think, as Dave said, we're looking very carefully at how we structure our assessment framework and assessment approach in future, and I think we're clear that we're holding on to the five key questions; they've served us well. We're also building underneath those that set of quality statements that are our view of what good quality care looks like, linking up with what we know people who use services look for in a service, but at the level underneath those quality statements.
C
What we're going to be much clearer about is the evidence that we will look for in different services in order to make our judgements. And I think, as Dave said, we've been able to look across all of the prompts that we have in our current regulatory model and distil those down into basically six evidence categories. So, looking across the quality statements, we'll be looking for evidence that we can categorise as either evidence about people's experiences of care, feedback from staff and leaders, feedback from partners, or observation of care, and that's really important.
C
So
so
that's
the
thinking
that
we're
we're
doing
at
the
moment
around
how
we
use
evidence
and,
as
dave
said,
I
think,
the
feedback
that
we've
had
so
far
has
been
that
this
feels
like
kind
of
really
useful
direction
of
travel,
certainly
from
a
provider
perspective,
because
it
does
give
that
clarity
and
transparency
around
what
we're.
Looking
for.
C
I think we kind of need to think through whether that's the right word or not, because I think there's a sense in which this is about a view of quality that's up to date. In order to keep that view of quality up to date, we need up-to-date information, but there are many sources and ways in which we might get that information.
A
Thank
you
so
now
we
just
want
to
move
on
to
looking
at
some
of
the
things
that
we've
produced
following
our
assessments
at
the
moment.
Obviously,
we've
got
carrying
out
inspection,
inspection
report
and,
alongside
that,
a
rating
or
a
judgement
of
quality
dave,
I
wonder
if
you
could
introduce
what
we're
starting
to
look
at
in
this
area.
Sure
I
mean.
B
It's
it
follows
on
from
what
amanda
is
saying
about
how
we
want
to
break
down
our
ratings,
to
to
show
more
transparency
really
and
show
how
we've
arrived
at
our
judgements,
and
so
I
think
that's.
The
first
thing
is:
how
can
we
best
share
that?
What's useful for all the range of people and their individual needs, whether that's the public looking to choose a service or whether it's, you know, the provider? What do those different customers of ours actually need to see?
B
You asked earlier about how providers are feeling, and one thing we heard recently at one of our events was that providers see it as key that we don't lose that story that we're able to tell, particularly where there is something perhaps unique happening, which of course, you know, every service really is unique.
B
So what we're doing now is, you know, starting to look at, and to seek views on, what would be useful in terms of the reporting that we do and what we put out there, how that might differ for different groups, and what's the right balance between that textual kind of report and more structured scoring. And I think there are some questions there about, you know, how often you update that, given we've got this ambition to have an always-on view of quality.
B
The other thing here for me would be that what we do makes sense to everybody, and that we actually do have that single shared view of quality across health and social care, in a way that genuinely meets the public's expectations and needs, but also means that organisations like CQC and other oversight agencies and improvement agencies are all pulling in the same direction, and can all sign up, effectively, I think, to what it is that we're trying to achieve here.
A
Great. I'm sure that last bit in particular will be music to providers' ears. I mean, we've covered a lot of ground there, and there's loads of great stuff, lots of stuff for people to look forward to. If people would like to get involved in helping shape this work on an ongoing basis, they can go on our digital engagement platform, CitizenLab; we'll pop the link in the description.
A
You
can
also
catch
more
on
how
we're
implementing
our
strategy
and
we'll
do
run
blogs
and
things
like
that
on
our
medium
page
and
also
keep
an
eye
on
our
youtube
channel
as
well.
Just
final
for
the
thing
for
me
is
just
to
say,
thanks
to
my
brilliant
guests,
dave
and
amanda.
Thanks
for
sharing
me
your
thoughts
on
our
latest
work
today
and
I'm
sure
sure
we'll
catch
you
again
down
down
the
line
thanks
very
much.