From YouTube: 013 ONNX 20211021 Karzynski ONNX SIG Operators
Description
Event: LF AI & Data Day - ONNX Community Meeting, October 21, 2021
Talk Title: ONNX SIG Operators
Speakers/Co-Chairs: Michal Karzynski (Intel), Ganesan Ramalingam (Microsoft)
Okay, so good morning or evening, depending on where you are. I'm Michal, and together with Rama I am chairing the Operators SIG in ONNX. I wanted to give you a few updates and talk about some of the plans for the near future.
So, starting with the updates on what happened since the last ONNX workshop: after the last workshop was held, release 1.9 of ONNX was made public, and more recently release 1.10 was made available. With ONNX 1.9, operator set 14 was added to ONNX, and with 1.10, operator set 15.
These two new operator sets made some changes, and some of the opset 14 changes were already discussed during the last workshop, so I won't go into details. Just to give you an overview: with opset 14 we added a small number of new operators, and with opset 15 we added a few more.
With operator set 15, we added the Bernoulli random number generator as a function, built using the other random-number-generating operators already available in ONNX, and also CastLike, a simple function that casts data to a different type based on the type of another input. Those are additional functions that can be decomposed if you have support for ONNX functions. The other operators that were added all have to do with the new optional data type, which I will talk about in a minute. So these are the new operators.
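To make this concrete, here is a minimal sketch of how these two opset-15 functions might be wired together using the onnx Python helpers; the tensor and node names are illustrative assumptions, not from the talk.

```python
import onnx
from onnx import helper, TensorProto

# Bernoulli samples 0/1 values from a tensor of probabilities; CastLike then
# casts the samples to the element type of a second input ("target").
probs = helper.make_tensor_value_info("probs", TensorProto.FLOAT, [4])
target = helper.make_tensor_value_info("target", TensorProto.FLOAT16, [1])
samples = helper.make_tensor_value_info("samples", TensorProto.FLOAT16, [4])

nodes = [
    helper.make_node("Bernoulli", ["probs"], ["raw"]),
    helper.make_node("CastLike", ["raw", "target"], ["samples"]),
]

graph = helper.make_graph(nodes, "bernoulli_castlike", [probs, target], [samples])
model = helper.make_model(graph, opset_imports=[helper.make_opsetid("", 15)])
onnx.checker.check_model(model)
```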
A few more operator updates were added in opset 15. These are changes related to support for new types, or, in the case of Shape, a more substantial change: the Shape operator can now return a slice of the shape via new attributes. There were also a few clarifications related to operations, and updates to BatchNormalization.
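As an illustration of the Shape change (the names below are assumptions for the example), the new start and end attributes select a slice of the returned shape:

```python
from onnx import helper

# For an input x of shape [N, C, H, W], this node returns the slice [C, H]
# (dimensions 1 up to, but not including, 3) instead of the full shape.
node = helper.make_node("Shape", ["x"], ["chw_dims"], start=1, end=3)
```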
All of the numbers on the right represent pull requests where these changes were made, so you can review them by going to the appropriate pull request. The changes that are more impactful for the future of ONNX are the changes in the type system, which introduce new possibilities for expressing inputs to operators.
Some of the operators don't support them yet, but many operators will likely be updated in the near future to support them. One is the optional type, which represents a value or a placeholder for that value, depending on the scenario. The other is the sparse tensor, which was actually already available in ONNX-ML but has now been promoted to be part of the ONNX default type set.
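Here is a minimal sketch of the optional type in practice, assuming the onnx Python helpers and illustrative names: wrap a value into an optional, test for presence, and unwrap it.

```python
from onnx import helper, TensorProto

# An optional wrapping a float tensor: either a value or an empty placeholder.
tensor_type = helper.make_tensor_type_proto(TensorProto.FLOAT, [3])
opt_type = helper.make_optional_type_proto(tensor_type)

# The opset-15 operators for working with optionals.
nodes = [
    helper.make_node("Optional", ["x"], ["opt"]),              # wrap a value
    helper.make_node("OptionalHasElement", ["opt"], ["has"]),  # presence check
    helper.make_node("OptionalGetElement", ["opt"], ["y"]),    # unwrap
]
```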
For these, a specific conversion from the older opset to the newer opset should be provided as well, and these tests will make sure that this happens. We continue working on this; as was also mentioned in her presentation, there's investment into this. But I have a question for the audience, because we've mainly been focusing on upgrade adapters from a lower operator set to a higher operator set.
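For reference, these upgrade adapters are what the ONNX version converter applies; a minimal sketch, assuming a placeholder model path:

```python
import onnx
from onnx import version_converter

# Load a model and upgrade its default-domain opset to 15 using the
# registered upgrade adapters. "model.onnx" is a placeholder path.
model = onnx.load("model.onnx")
upgraded = version_converter.convert_version(model, 15)
onnx.checker.check_model(upgraded)
```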
There isn't complete support for downgrade adapters, so if you have use cases for that, if this is a pain point for you, then please let us know.

For the future, there are some directions that have appeared as part of the discussions of the ONNX roadmap. For example, there is a need to focus on pre-processing for models. It seems that in deployments using ONNX, pre-processing is becoming a bigger and bigger performance problem, since these pre-processing steps are often carried out in Python.
For the pre-processing steps, there are operations which are missing. Some of these have been pointed out during the roadmap discussions and are listed here, but perhaps more pre-processing operations are also needed. You can let us know which ones you see as required, so that more integrated machine learning pipelines could be performed entirely in ONNX.
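As one hypothetical example of what moving pre-processing into ONNX could look like (the operator choice and all names are assumptions, not a proposal from the talk):

```python
from onnx import helper

# Image pre-processing expressed as ONNX nodes instead of Python code:
# resize the input to the spatial dimensions given by "sizes", then
# normalize with a mean subtraction and a division by the std.
nodes = [
    # Empty strings skip the optional roi and scales inputs of Resize.
    helper.make_node("Resize", ["image", "", "", "sizes"], ["resized"],
                     mode="linear"),
    helper.make_node("Sub", ["resized", "mean"], ["centered"]),
    helper.make_node("Div", ["centered", "std"], ["normalized"]),
]
```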
Another direction is federated learning, which could be done with ONNX tools; in particular, the ONNX training extensions could be used for learning on edge devices. But there are probably functions and operators that are missing for these scenarios, and your feedback is also welcome here:
what is required to support your use cases? For the pre-processing work, there is actually a working group established already, focusing on pre-processing. The goal of this working group is to identify the requirements and present them in a form that can be integrated with ONNX, for example by defining new operations.
There are monthly meetings of this working group, so you can join those. Another direction that we're focusing on is unifying the interface of operations. During the OctoML presentation, an example of an inconsistency in the spec was pointed out to us.
We are aware that there are a few of these inconsistencies, and we would like to identify all of them and then align the spec so that there aren't inconsistencies and the interface is unified across all operations. To that end, there is now an issue on GitHub, number 3651, started by Jacky, who has been collecting these different inconsistencies and other problems in the spec into one list that we can use to converge on this unified interface style. So please contribute to this list.
If you have observations that aren't yet accumulated there, add your comments. Also, if somebody is interested in making small fix PRs, that list is a good starting point: go through it and just start making these fixes. We're also, as before, focusing on reducing the number of primitives added to ONNX. Most of the new operations are being added as functions if possible; we're making sure that happens, and where possible we're looking for existing operators that we could replace with functions.
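As a sketch of this functions-over-primitives approach (the domain and operator names are invented for the example), a new operation can be defined entirely in terms of existing primitives:

```python
from onnx import helper

# A "Square" function defined via the existing Mul primitive. Runtimes that
# don't know "Square" can decompose it into its Mul body automatically.
square = helper.make_function(
    domain="custom.example",  # illustrative domain, not a real ONNX domain
    fname="Square",
    inputs=["x"],
    outputs=["y"],
    nodes=[helper.make_node("Mul", ["x", "x"], ["y"])],
    opset_imports=[helper.make_opsetid("", 15)],
)
```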