From YouTube: Safer Driver: Human Driver Perception Platform, Bojan Mrazovac (NTT Data) and Ramakrishna Yekulla (Red Hat)

Description
Safer Driver: Human Driver Perception Platform That Prevents Accidents, Before They Happen
Bojan Mrazovac (NTT Data)
Ramakrishna Yekulla (Red Hat)
OpenShift Commons Gathering on Automotive
April 6th, 2022
Full agenda here: https://commons.openshift.org/gatherings/OpenShift_Commons_Gathering_on_Automotive.html
Bojan Mrazovac: Okay, so let me introduce myself first, as I'm the guest here. My name is Bojan Mrazovac, and I'm responsible for the research and development department at NTT Data Romania. Basically, if you go to the next slide: my company is a partner of Red Hat, and we belong to the NTT Group. NTT Group is one of the largest services and IT providers from Japan; it's a global corporation. The subsidiary of the group I come from is NTT Data. At NTT Data we focus mainly on providing services, and some of our key focus areas are automotive, which we are going to cover today as well, along with AI, blockchain, cybersecurity, data and intelligence, and other emerging technologies that can generate a big technical footprint in today's market.
Ramakrishna Yekulla: So we've been partnering with NTT as a system integrator over the past 18 months, specifically in the newer areas which Red Hat has come into: the edge, with OpenShift and Kubernetes being rapidly adopted into these spaces.
We found a perfect partner in NTT, where there are lots of industry edge cases in automotive that can be hosted on our Red Hat OpenShift Container Platform, it being a true hybrid solution. Taking your workloads to multi-cloud, and the hybrid cloud way, is the way to go forward. Over the past 18 months we've been working with various projects, traditionally written for the embedded side, that are to be onboarded onto the IT side, much as the previous speakers were discussing how to integrate developers at a car or automotive OEM onto the Red Hat platform.
So we use various accelerators in order to adopt these cloud practices and solutions onto OpenShift.
One of the ways we started integrating is with validated patterns. Red Hat has built expertise over the past decade in Linux containers, DevOps practices, and how to write enterprise solutions on an enterprise-grade Kubernetes distribution like OpenShift Container Platform. Many aspects of various industry verticals needed to be adopted in an opinionated way, in terms of what the right way is to get these solutions on top of OpenShift Container Platform.
Working with various engineering and field solution architects, they came out with validated patterns in which you can use the open source software built into those patterns as code. As Linus Torvalds put it: talk is cheap.
Show me the code. Many developers wanted to see how to integrate their solutions, and specifically their edge solutions, on OpenShift, and on Kubernetes platforms in general. Many of the developers are eager to get their workloads, predominantly automotive and industrial use cases, running on a modern distributed system like Kubernetes in their production environments.
So these validated patterns, and the accelerators which power them, could be one of the first steps by which both partner developers and customer developers can start accelerating, moving their code from staging to production environments fairly quickly. Onboarding developers is easy with these validated architectures, as is onboarding projects.
Getting your projects to products is also much easier with these. As for the software we're talking about: the accelerator we used for this particular product was Red Hat Bobbycar. It's a vehicle accelerator, a vehicle simulator, in which various Red Hat products are used, not just Linux containers: various Red Hat runtimes and various Kubernetes operators, which are used to get your solutions to staging and production.
So it's beyond just the vehicle workload; it's also about how you are going to take it to production across multiple clouds. Your customers could be using AWS, Google Cloud, or Azure, or they could have their own private data centers. These accelerators make it possible to run these applications and process these workloads at any of those endpoints. Another thing is that we believe in doing things in the open.
All the source code for this is available on GitHub, and most of it is reproducible to handle most of the automotive and industrial use cases. We are also part of various collaborations: ELISA is one of the foundations where we actively participate, and we also collaborate with many customers and partners, like NTT Data in this case, who help us onboard their automotive expertise and their automotive workloads onto OpenShift Container Platform.
So this is one of the ways we used Bobbycar: to accelerate onboarding an NTT in-car solution, the Human Driver Perception Platform, and one of the gateways it uses, called the multi-sensory analytics platform gateway, onto OpenShift. All the code you're seeing here is workable, including the integration platforms, and all of the source code is available on GitHub; we will share the links later.
Bobbycar itself is an accelerator, and we use it because once you're adopting an enterprise Kubernetes distribution like OpenShift, you're trying to integrate various Red Hat components, not just the operating system. You have messaging queues, you have time-series databases, you have the authentication mechanism, and then you need to build your AI/ML pipelines on top of your existing cloud and platform solutions.
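To make the data path concrete, here is a minimal sketch of the two central pieces just mentioned, a messaging queue feeding a time-series store. Both classes are toy in-memory stand-ins (the real platform would use a broker and a time-series database such as those shipped with OpenShift); all names here are illustrative, not part of Bobbycar.

```python
from collections import defaultdict, deque

class TelemetryBus:
    """Toy in-memory stand-in for a messaging layer (pub/sub broker)."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, payload):
        # Deliver the payload to every handler registered on the topic.
        for handler in self.subscribers[topic]:
            handler(payload)

class TimeSeriesStore:
    """Toy stand-in for a time-series database: per-signal bounded buffers."""
    def __init__(self, maxlen=1000):
        self.series = defaultdict(lambda: deque(maxlen=maxlen))

    def append(self, signal, timestamp, value):
        self.series[signal].append((timestamp, value))

    def latest(self, signal):
        return self.series[signal][-1]

# Wire the store to the bus: every CAN message gets persisted as a sample.
bus = TelemetryBus()
store = TimeSeriesStore()
bus.subscribe("vehicle/can", lambda m: store.append(m["signal"], m["ts"], m["value"]))
bus.publish("vehicle/can", {"signal": "speed_kmh", "ts": 0.0, "value": 87.5})
```

The same subscribe/publish shape carries over when the toy bus is swapped for a real broker; only the transport changes, not the application code.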
What Bobbycar did was provide an opinionated architecture for how these in-car solutions can be onboarded, and for which software components you would need from the vast Cloud Native Computing Foundation landscape, because there is quite a lot of software to choose from. So what Bobbycar does is accelerate the process: you focus on your own automotive code, and Red Hat takes the workload and distributes it across multiple clouds.
Bojan Mrazovac: Basically, if we take a look at the connected-vehicle market trends published by Gartner last year, we can see that a lot of topics around the metaverse and human-centric design have been raised recently, while the autonomous vehicles market is somewhat stagnating. This is something many other analysts have also reported, and according to that, we need to find a new way.
If you go to the next slide: it's not only the trends on the market, but also the regulations we've seen issued by the European Union, and also by the U.S. NHTSA. They are all requesting mandatory safety features which need to be implemented and supported by vehicles.
If you take a look at the list of such features, you will see different solutions which support the driver in preventing accidents, or at least in recognizing early signs of a possible accident and trying to adjust the behavior of the vehicle so that it is correlated with human perception. This is something we definitely want to have, shown on our next slide.
This is where we want to show the concept exactly. Ramki, if you can just move to the next slide: this is exactly the concept of the human-centered approach we are proposing here. What we want to do is get more feedback from the client. We know the famous HARA formula: exposure, controllability, and severity. These are the three main factors we calculate today to determine the ASIL level of certain models, which are developed according to functional safety regulation.
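The exposure/controllability/severity calculation mentioned here is the ASIL determination from the ISO 26262 HARA. A commonly cited shorthand for the standard's lookup table is that the sum of the three ratings fixes the level; this sketch encodes that shorthand (the S0/E0/C0 "no rating" cases are left out for brevity).

```python
def asil(severity, exposure, controllability):
    """Determine the ASIL from HARA ratings.

    severity:        S1-S3, passed as 1-3
    exposure:        E1-E4, passed as 1-4
    controllability: C1-C3, passed as 1-3

    Shorthand for the ISO 26262-3 determination table: the sum
    S + E + C maps 10 -> D, 9 -> C, 8 -> B, 7 -> A, and anything
    lower to QM (quality management, no ASIL required).
    """
    total = severity + exposure + controllability
    return {10: "D", 9: "C", 8: "B", 7: "A"}.get(total, "QM")
```

For example, a highly severe, frequently encountered, hard-to-control hazard (S3, E4, C3) yields ASIL D, the strictest level.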
But what is missing here is human sentiment, the perception of certain features. You can imagine a situation where emergency braking does its job, so the vehicle brakes on time, but how smooth will it be? Will it stress the people inside the vehicle? How will it cope with their emotions and level of stress? This is something that, unfortunately, we do not have many possibilities to understand at this moment, and the same goes for lane assistance, warnings, and similar features of the vehicle.
We can measure the behavior of a vehicle, but we cannot measure the human feedback on it, and this is very important. At the end of a test drive you can ask people to fill in a questionnaire, but in most cases it will not capture the exact moment in time when a certain maneuver happened and when a corrective operation needed to be taken. If you go to the next slide, this is exactly the solution we are proposing.
We take several different parameters together and apply them to a cohort of vehicles. We want to understand the signals coming from the CAN bus; we want to understand human behavior and use facial recognition to understand the emotional level; and we want to classify the objects in front of the vehicle.
We want to know the exact context, in terms of weather and in terms of the road, through real-time navigation, to understand exactly what is happening and why the situation occurred that produced a certain reaction from the driver or other passengers. And this is exactly what is shown, if you go to the next slide, in this short sequence: how we collect this data and how we try to understand the vehicle collision risk that arises, for instance, during this specific maneuver.
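One standard proxy for the collision risk mentioned above is time-to-collision (TTC): the current gap to the object ahead divided by the closing speed. The thresholds below are illustrative, not values from the platform.

```python
def time_to_collision(gap_m, ego_speed_mps, lead_speed_mps):
    """Time-to-collision in seconds: gap divided by closing speed.

    Returns float('inf') when the ego vehicle is not closing on the
    object ahead, i.e. no collision is projected.
    """
    closing = ego_speed_mps - lead_speed_mps
    if closing <= 0:
        return float("inf")
    return gap_m / closing

def risk_level(ttc_s, warn_s=3.0, critical_s=1.5):
    """Bucket a TTC value into a coarse risk label (thresholds illustrative)."""
    if ttc_s <= critical_s:
        return "critical"
    if ttc_s <= warn_s:
        return "warn"
    return "ok"
```

Closing on a vehicle 30 m ahead at 10 m/s of relative speed, for instance, gives a TTC of 3 s, which the illustrative thresholds classify as a warning.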
I must mention here, and I would like to thank, the AVL company, which is also one of our partners and which supported us in collecting this data during the large data-collection experiment we ran in the previous period.
Basically, the whole solution is very probabilistic. Our machine learning model has been trained to understand different signals and how they change in time. If you take one signal in isolation, you cannot understand the exact situation and you cannot predict a certain maneuver or event; but if you put them all together and train the model by monitoring, in our case, around 14 signals at the same time and their distribution through time, you can make that prediction.
A certain maneuver will happen during the drive, which gives us the possibility to check what the drivers are doing and, if necessary, to perform an additional corrective operation through monitoring of the vehicle; but we can also understand how people in general behave in that situation. This is very important because, beyond corrective operations correlated with human perception, we can also benchmark certain features, and this is what we are trying to achieve.
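The "many signals jointly over time, none decisive alone" idea can be sketched as a windowed deviation score: each signal's recent window is compared to an offline baseline, and only the aggregate across all signals triggers a maneuver flag. This is a simplified illustration, not the platform's actual model, and all names and thresholds are made up.

```python
from statistics import mean

def joint_deviation(windows, baselines):
    """Score how far a set of signal windows jointly drifts from baseline.

    windows:   {signal_name: [recent samples]}
    baselines: {signal_name: (mean, stdev)} learned offline

    Returns the average absolute z-score of each window's mean, so no
    single signal decides on its own; the score aggregates all of them.
    """
    scores = []
    for name, samples in windows.items():
        mu, sigma = baselines[name]
        scores.append(abs(mean(samples) - mu) / sigma)
    return mean(scores)

def maneuver_flag(windows, baselines, threshold=2.0):
    """Flag a maneuver only when the joint score crosses the threshold."""
    return joint_deviation(windows, baselines) >= threshold
```

With calm steering and braking windows the joint score stays near zero; a simultaneous swerve-and-brake pushes it past the threshold even though each signal alone might be ambiguous.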
So, as for our future cases, if you go, Ramki, to the next slide, please: this is the goal, and this is something we are achieving together with our colleagues from Red Hat.
We are trying to apply it to a large number of vehicles: not only to monitor one vehicle, but to apply a joint understanding, federated learning, which will help us improve the general model while preserving the safety and security of the information exchanged between a vehicle, that is, the in-vehicle device collecting information in real time, and the global knowledge stored in the cloud. This is something we believe could help.
It could even prevent hazards and accidents in the future, because we will try to put the human in the center of events, in the center of the drive, and in that way make for much better interaction between the machine, which in this case is of course the vehicle, and the human who is driving or being driven in an autonomous vehicle.
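The federated learning setup described here, in which each vehicle trains locally and only model updates reach the cloud, can be sketched with the classic federated averaging step. This is a bare-bones illustration, not the platform's implementation.

```python
def local_update(weights, gradients, lr=0.1):
    """One local training step on a vehicle.

    The raw sensor data never leaves the car; only the resulting
    weight vector is sent up for aggregation.
    """
    return [w - lr * g for w, g in zip(weights, gradients)]

def federated_average(client_weights):
    """Server-side aggregation (FedAvg): element-wise mean of the
    weight vectors contributed by all participating vehicles."""
    n = len(client_weights)
    return [sum(ws) / n for ws in zip(*client_weights)]
```

Each round, the cloud pushes the averaged global model back down, so every vehicle benefits from the fleet's experience without any vehicle's raw data being shared.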
Ramakrishna Yekulla: And if you're a customer, a partner, or an OEM, we are happy to demo the Human Driver Perception Platform. It is hosted on GitHub as part of the Bobbycar project, so if you already have an OpenShift cluster, you can just run the project and try it for yourself.
Host: Thank you so much for joining us today, and Ramakrishna, this is great. I am almost afraid to buy a new car until all of this stuff gets ready, because it's moving so fast I just don't know what I should do next.