South Big Data Hub Education & Workforce Working Group
December 2021
Karl Schmitt, Trinity Christian College, Program Coordinator for Data Analytics
A: I did want to share this because, as Andy sort of realized, and as I realized while we were going over this at our last meeting, we were like, wow, we actually did make some real progress. It was super exciting, even though it feels like we kept having these meetings and just sort of poked along. So I want to give you that scope, tell the story of where we're at, and point to where we're hoping to get by, you know, May.
A: So, as Rana said, this is the program assessment subgroup. Okay.
A: So we were aiming for an end delivery in May. The summary of our recent progress is that we spent a lot of time narrowing the scope of what exactly we want to produce as our working document, and we've been doing some work on actually putting a framework around what that scope and assessment are going to look like. One of the challenges that we are still aware of, and that hopefully we'll have some chance to mitigate as we move through the spring, is making sure that we actually produce supporting documentation, details, and processes.
A: So that others besides us, people who have spent a lot of time thinking about it, can use it. There is also a worry that perhaps what we have focused on is either too narrow for other people to use, or can't be incorporated into their needs, or that what we have is potentially too complicated for others to use easily and quickly. So let me give you a deeper insight into what we've actually been doing. Our original charge was, as I originally proposed it.
A: So this is a really big, broad scope, and we knew when we proposed it that we weren't going to be able to tackle producing a program-level assessment plan for every program out there, no matter what they had. So we started by talking about the different possible focuses that we believed would show up in almost every program in some way, shape, or form. Those consisted of, first, technical outcomes. This could be things like various descriptions of data wrangling or machine learning knowledge.
A: It could be database knowledge, things like this, so very much focused on the technical skills or technical knowledge students need to have. There are also professional outcomes, and this could be things like presenting to clients, how you write, how you work as a team, etc. And then, taking the language from the ACM data science curriculum guidelines, there are also disposition outcomes. Those are things like ethics, so students approaching data science in an ethical manner, or knowing data privacy laws.
A: It could be about the term that was put into the National Academies report, which is data acumen. So those are all things that we could have focused on. Now, that's a lot, and again, we said we couldn't do it all. So where did we go? And apparently it's not showing. Why is it not showing? All right, hang on, I'm going to put it back up onto this one. So here is what we decided to do to figure this out.
A: We did a quick case study of existing learning outcomes, basically from the group members' programs, data science programs that they were involved in. So yes, a limited scope, but we figured that was enough of a perspective to figure out what people were actually doing, so that we might have a good idea of what fit into those spaces.
A: So what we ended up doing was, as a group, we wrote a generic program learning outcome. We are all well aware that if you read this you're going to be like, whoa, that's a giant program learning outcome. This is not something that we would actually expect a degree program to use as their program learning outcome, because it is way too complex and way too inclusive.
A: But what we wanted was something that covered all of the things that might go into one of those topics, so that when we wrote the assessment, people could pick and choose the pieces that made sense for them. So the generic learning outcome we came up with was: communicate about the data, models, methods, and results of the analysis, including the use of appropriate context, limitations, and biases.
A: Okay, a second pause: does anyone have any questions about that generic learning outcome before I keep going? I do want to make sure that people see what it is and what it covers before we talk about what we're going to do to assess it, because if you don't get the scope, then the next piece is just going to be noise to you and it's not going to help.
B: And when you talk about the different levels of expertise, do you mean the ones specified in the SCN document, tier one to tier...
A: Yeah. So, overall, the group decided that our big-picture recommended assessment process is that students should create some kind of portfolio of artifacts around each of these communication things, which could then be evaluated and assessed by program faculty. That leaves a couple of places where our subgroup can continue to do work and flesh out pieces, and I'm going to go into those right now.
A: So, in particular, a program needs to define for itself what categories of desired communication skills or practices it wants to see its students exhibit, and then it also needs the ways in which it plans to assess those. So let's take a look at what we have framed out. Here is our first pass at a sample execution of this assessment design.
A: We built out a nice little table here that we thought mostly captures the broad swaths of ways in which you could articulate different audiences for communication about data. You can see it's a nice little grid. One axis, the horizontal, is domain expertise versus no domain expertise, because throughout almost all of the curricular literature for data science, domain knowledge is regularly acknowledged as something that needs to be considered as part of data science students' skills.
A: We also know, sort of obviously, that having data expertise is part of the skill set that data scientists need to have, and for our purposes we decided to group coding and data expertise together, again recognizing that a particular program might choose to do something slightly different with this. Using this as our grid of the kinds of knowledge people have, in this case we can see a group of people who have both domain expertise and coding and data expertise. That might be the data science teams at a big company.
A: It might be specific scientists working on a project, things like that. These are people who have expertise in all of those pieces for whatever the content is. But we could also imagine people who have high levels of either coding or data expertise but don't know the specific domain. This might be data engineers, or software implementers or developers. If you're talking about some kind of biology application, or some bioinformatics thing, maybe they know nothing about biology.
A: We also imagined a group of people who might have domain expertise, but not as much coding or data expertise. Examples might be managers or CEOs, the bosses of the data science teams, who perhaps know something about data, but not really enough to know the processes that are happening. There are also large swaths of scientists who are highly intelligent and have lots of domain expertise, but simply don't have a lot of experience working in code or working with data. And then there is, for lack of a better term, the general public.
A: So the idea here is that a senior graduating out of a data science program or data analytics program would have collected, through their coursework, examples of them doing each of these pieces, and then, perhaps through the capstone course or as part of an exit process, they would submit the portfolio for faculty assessment, not as a grade necessarily, but for program purposes.
A: Those things include a code and documentation repository. Think of a GitHub repo, but it could be something else; again, we're not over-specifying here. We could be talking about a data storyboard or map. And in both of those cases they fit multiple categories, right? Code and documentation are probably being conveyed to data teams and other scientists, but also to software developers who might need to implement the things and ideas they've done.
A: The process is: identify what groups you want students to be able to communicate with; determine artifacts that would demonstrate that students can communicate with those groups; and then identify things that will allow you to assess those artifacts. That's the big picture. Our sample here shows how we might define the groups, and here is a list of items that would go into that portfolio.
A: What we haven't done for our sample assessment is the rubric pieces, and so one of the things that we want to do in the spring is put together some design rubrics, or some other processes and things, that might allow faculty to assess these outputs that are in the portfolio.
A: The other piece that I think we need to do is make sure that we put together good documentation that gives suggested questions for a program or a chair to walk through and say, hey, did you ask this about your program?
A: Did you ask this about what you want to assess, etc., so that they can figure out how to walk through and decide what they want to do for the assessment of their particular program, or whatever their learning outcome is. And then, ideally, there is the bottom one over here, potential actions or responses to assessment results. You know, it would be nice to be able to provide...
A: Here are a couple of ways you might choose to change what you're doing to improve that deficiency, or whatever. That's definitely a nice-to-have, and we'll see where we get to. And that is, I believe, the end of what I have to show you. So if there are any questions, I would be happy to take them.
C: Yeah, thank you, Karl. I think this is a really good summary of how to assess the piece around communication, which I think is really under-assessed, or under-thought-about, in some cases, you know, even though it's really highly prized by a lot of potential hiring...
B: So again, this is Mark. The assessment is for the Bachelor of Science in Data Science, correct?
B: Okay, and I bet they just released, in August, you know, the criteria. So maybe, did you try to align what you're doing with your ABET work?
A: We did, yeah. Well, actually, we didn't; I wasn't aware that ABET had finally released their accreditation pieces. What we pulled from were the ACM data science curricular guidelines, and then we cross-referenced that a little bit with some of the other documentation out there that had been published prior to August. I knew ABET was in process, but I wasn't aware that they were done.
C: And Karl, sorry...