Description
Susannah Engdahl, Postdoctoral Research Fellow at George Mason University, gives an overview of her work using sonomyography (ultrasound) to control upper limb prostheses.
A detailed write-up can be found here: https://drive.google.com/open?id=1ZBzx0Y_mm700YYkTy2DZ4nHfT65yctzy
More information and discussion about EnableCon 2019 here: https://hub.e-nable.org/s/e-nablecon-2019/
More information about e-NABLE here: https://enablingthefuture.org
I'm here today from George Mason University. I'll be giving another talk later on in the afternoon that will give more of a deep dive into this technology, but I wanted to present a quick overview of some work going on in our lab right now toward developing sonomyographic control for upper limb prostheses.
For those of you who aren't familiar with that term, it means using ultrasound images to control a prosthesis. Some quick background:
We know that traditional myoelectric prostheses can be very difficult to use. In fact, about a quarter, maybe more, of myoelectric users ultimately choose to not use their prostheses. That said, about 74 percent of those people say that they would reconsider trying a prosthesis if improvements were made to its functionality, so this is the space that we're working in now.
The question that we as researchers need to answer is: what's the best way to translate a person's intended movement into actual movement of the prosthesis?
Most technologies right now fall into the lower left corner when you plot a technology's ability to accurately detect an intended movement against its surgical invasiveness. Standard myoelectric controllers have fairly limited ability to accurately detect those movements, but on the positive side, they are completely non-invasive to use.
Now, a lot of the emerging technologies are showing comparatively improved ability to accurately detect those movements, but many of them require surgery in order to use; this can be as invasive as brain surgery.
Some people are okay with trying those technologies, but there's definitely a segment of the population that is not going to be interested in pursuing those options, and that's where our work comes in. As I'll show you throughout this talk, we believe that our approach has improved ability to accurately detect movements, but we're doing this completely non-invasively. As I said, our approach is sonomyography, which means we're using ultrasound to detect muscle activity.
Okay, so what happens when someone actually initiates a movement? In the video that's playing right now, you can see the muscle contractions associated with someone flexing their index finger. But if you ask them to perform a different movement, like flexing their middle finger, that pattern of activation changes a little bit, because they're using different muscles to initiate the movement.
So the trick for us, then, is to identify the differences between these patterns and translate them to the prosthesis. The system that we've developed, as I've said, uses ultrasound.
Just a quick video to show you a little bit about how this works. In this video, you can see a patient who has an ultrasound probe on his amputated limb. We're recording ultrasound images in real time; that's what's showing in the left panel there. The data being plotted at the bottom is essentially a correlation between the incoming signals, meaning the incoming ultrasound frames, and a frame that's recorded when the subject is completely at rest. The lower plateaus correspond to when the subject is at rest, and the higher plateaus correspond to when the subject has fully completed their grasp.
Those frames are what go into our database. Once we have that database constructed, then as we record new images in real time, we compare each of those images to the frames that are in the database and try to identify which grasp in the database it most closely matches.
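The matching scheme described here — correlating incoming ultrasound frames against a stored rest frame and against a database of completed-grasp frames — can be sketched roughly as follows. This is a minimal illustration, not the lab's actual pipeline: the function names, the use of normalized cross-correlation as the similarity measure, and the 0.9 rest threshold are all assumptions for the sake of the example.

```python
import numpy as np

def frame_correlation(frame, reference):
    """Normalized cross-correlation between two ultrasound frames."""
    a = frame.ravel() - frame.mean()
    b = reference.ravel() - reference.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def classify_grasp(frame, rest_frame, database, rest_threshold=0.9):
    """Compare an incoming frame to the rest frame and to a database of
    completed-grasp frames; return the best-matching label.

    `database` maps grasp names to representative ultrasound frames
    recorded at full grasp completion.
    """
    # High similarity to the rest frame -> the subject is at rest.
    if frame_correlation(frame, rest_frame) > rest_threshold:
        return "rest"
    # Otherwise pick the grasp whose stored frame is most similar.
    return max(database, key=lambda g: frame_correlation(frame, database[g]))
```

In practice a real system would operate on extracted features rather than raw pixels, but the structure — one reference frame per grasp, nearest-match lookup per incoming frame — mirrors the description above.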
With this approach, we can achieve individual finger-level control of a prosthetic hand. You can see that the movement being initiated by the participant here is very closely matched by the prosthesis. Another advantage of this technology is that not only can we achieve the motion end state, we can actually control all of the positions in between rest and motion completion, as you can see in this video.
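One way to think about commanding those intermediate positions is to map the activity signal linearly between its rest level and its completion level. A minimal sketch under that assumption (the linear mapping and the names are illustrative, not the actual controller):

```python
def proportional_position(signal, rest_level, complete_level):
    """Map an activity signal onto a 0-1 position command.

    0.0 = fully at rest, 1.0 = grasp fully completed; intermediate
    signal values command intermediate finger positions.
    """
    span = complete_level - rest_level
    if span == 0:
        return 0.0
    fraction = (signal - rest_level) / span
    return min(1.0, max(0.0, fraction))  # clamp to the valid range
```

A signal halfway between the rest and completion plateaus would then drive the finger to the halfway position, rather than snapping to either end state.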
A
So
putting
this
all
together,
what
happens
when
we
actually
put
this
prosthesis
on
a
person
for
comparison?
The
video
that's
playing
right
now
is
showing
a
prosthesis
user
who
has
a
your
prosthesis
standard
drug
control
and
he's
actually
owned
this
system
for
about
ten
years,
so
he's
a
very
experienced
user,
but
even
so,
you
can
tell
that
it's
really
struggling
to
pick
up
individual
blocks
to
move
them
over.
A
This
barrier
he's
very
slow
at
this
task
and
you
can
actually
see
a
lot
of
involvement
of
his
upper
body
as
he's
trying
to
manipulate
the
prosthesis
where
it
needs
to
be
now
contrast.
The video playing on the right is showing the same participant using a prosthesis controlled by sonomyography, and this is after essentially no training. He basically just put it on, and it's already showing clear visible improvements in functionality: he's much faster at picking up the blocks, you can see his movements are much smoother, and there's a lot less compensation involved in his upper body.
So these are all exciting things. What I want to close on is one question you may have been thinking about throughout this presentation: ultrasound machines are giant, if you've ever seen one, and you might be wondering why someone would ever want to drag one of those around just to use a prosthesis. That's a great question, and it's actually what we're working on right now: trying to miniaturize an ultrasound system so that it's small enough to incorporate into a prosthetic socket.