From YouTube: What's new in ML.NET?
Description
Announcing .NET Core 3.0: https://aka.ms/dotnetcore3
#dotNETConf
Get started with ML.NET so you can create machine learning models in C# to run impressive predictions and use them in your .NET applications!
Create innovation in your apps with scenarios such as sentiment analysis, anomaly detection, text classification, object detection and many more!
So if you want a comprehensive introduction to ML.NET — there are many things we don't have time to talk about today — you can listen to the session we had at Build in May; there's a link. Today we're going to focus on what's new: a very short bit on all of that, but then focusing on things like the new database loader, deep learning — the new things we have for image classification and training on TensorFlow — and finally a sneak peek at Jupyter notebooks with .NET as well.
So let's start with a quick intro to ML.NET. First of all, ML.NET is most of all a framework, and it is cross-platform and open source, and it is built for .NET developers. We aim to make it simple, so you can create your own machine learning models without a lot of machine learning knowledge, and of course you just use .NET natively.
Another important point is that with ML.NET you can create your own models — it is custom AI, as opposed to prebuilt AI where you just consume services. Here you are creating your own models for your own business domain, your own data and images, for instance. We are also extending ML.NET to other frameworks and libraries, like deep learning with TensorFlow, and many tools that I will be showing later.
B
And
finally,
it
is
trusted
or
prove
another
scale,
because
even
when
we
launch
it
publicly
in
May
this
year,
there's
been
like
around
more
than
eight
years
that
that
has
been
used
internally
by
Microsoft
in
large
products
like
being
or
Windows
or
office
or
sure
as
well.
A
few
products
in
usher
are
using
internally
for
many
years
ml
dotnet,
so
you
can
go
and
search
more
about
in
there.
Another
point
that
I
want
to
highlight
about
ml.
Dotnet
is
that
ml
dotnet,
because
it
is
a
framework
it
runs
anywhere.
You can run it on premises — on your servers, on your laptop — or in Azure or any cloud, anywhere you want, and because it is cross-platform you can run it on Windows, Linux, or macOS. The frameworks we support are .NET Core — that's why it's cross-platform — but for legacy applications you can also infuse machine learning with the .NET Framework, and we even have Python bindings with NimbusML, so you can also generate ML.NET models with Python. We support both 64-bit and 32-bit.
So what can you do with ML.NET? There are many scenarios. You can create models that, for example, take text such as customer feedback and do sentiment analysis, or make recommendations about products, or predict the price of something like a house, or predict the fare of a taxi trip, like the demo I'm going to do later, or anomaly detection, etc. You can go to that URL and see the many samples we have there and build any of those scenarios.
You need to prepare your data and build and train your model — that is roughly the process you follow as a model creator or as a developer, using tools like the CLI, an app, or console applications to build the model. Once you have the model, which is a zip file when you serialize it, you can deploy it into your end-user applications in production. That would be the model consumption.
So there are two very different cycles when working with machine learning. Now, what are the ways you can use ML.NET? Basically we have three. One is to just use the plain API in C# or F#: you can read the docs, go to the samples, find a case similar to yours, write your code, train your model, and then consume it.
Here you can see I have Visual Studio with a completely empty console app. From there I'm going to right-click and add a machine learning model. This opens a wizard with different scenarios — plus custom ones, where you can pick specific machine learning tasks — but for this demo I'm going to use the price prediction scenario, which under the covers is a regression. I want to predict the prices — the taxi fares — of taxi trips, depending on input variables like distance, time, and so on.
So basically you can load the data and train the model with data from a file or from a database. In this case I'm going to use a file. The file is basically a CSV with a bunch of data about many taxi trips: how long those trips were in distance and in time, and then, of course, what the fare amount was in the past.
B
So
with
that
data-
and
we
have
here
like
a
20,000
records
or
more-
we
can
train
the
model
and
I
need
to
select.
What's
going
to
be
the
column
that
I
want
to
predict
on,
which
is
the
fair
amount
right.
So
with
that
I
can
train
the
model,
let's
put
20
seconds
and
start
training.
So
something
interesting
here
is
that
it's
not
just
training
with
a
single
algorithm
under
the
covers
we're
using
Auto
ml,
which
is
looking
for
different
algorithms
and
hyper
parameters,
combinations
and
then
selecting
the
best
model
for
that
data
set
right.
In this case, in just 20 seconds, it has been trying a few models, and you'll see that the best algorithm so far has been LightGBM regression. But you don't really need to know all these algorithms when you use Model Builder, because it's doing all those trials for you and finding the best model for you.
Now that the training is complete — and this is just a demo; if you have huge data you might be training for minutes or even hours, but in this case it's quick because the data is small — when you go to the evaluation you can see the best five models, and LightGBM regression was the best one. Finally, we can generate the code and the model.
If I click here and add the projects, these projects get added to your solution. Here you can see Model.zip, which is precisely the serialized model for the best algorithm, so you can just use it in code like the code we have here, generated for you. In just a couple of lines — and I'm going to put a breakpoint — you can see that you create sample data and then predict with that sample data. So let's debug it.
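The generated consumption code the demo steps through looks roughly like this — a minimal sketch, assuming Model Builder-style names such as `ModelInput`, `ModelOutput`, and `MLModel.zip` (the actual generated identifiers may differ), and it needs the Microsoft.ML package:

```csharp
using System;
using Microsoft.ML;

public class ModelInput
{
    public float TripDistance { get; set; }
    public float TripTime { get; set; }
    public float FareAmount { get; set; }   // label column
}

public class ModelOutput
{
    public float Score { get; set; }        // predicted fare
}

public static class Program
{
    public static void Main()
    {
        var mlContext = new MLContext();

        // Load the serialized model (the zip file produced by training).
        ITransformer model = mlContext.Model.Load("MLModel.zip", out _);

        // A prediction engine scores one sample at a time.
        var engine = mlContext.Model.CreatePredictionEngine<ModelInput, ModelOutput>(model);

        var sample = new ModelInput { TripDistance = 3.5f, TripTime = 900f };
        ModelOutput prediction = engine.Predict(sample);
        Console.WriteLine($"Predicted fare: {prediction.Score}");
    }
}
```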
We just got sample data from a record in a file, but you could also create your own data, hard-coded, or it could be coming from your application. In this case you can see this sample data as the input data, and then we make our prediction with that data, calling the Predict method of the prediction engine. Here you can see the prediction, which in this case is around 16 dollars.
If I move forward, you can see that for that test data the actual value was 17 and a half dollars, so the predicted value was pretty close. But this is not all — it's not just about generating the model and how to run it. We also provide you the code that we actually used for training the model. If I go to the generated model builder class and its create-model function, this is a custom function.
You can see how it loads the data from the file. Then we build the pipeline, doing data transformations like converting all the text into numbers — because machine learning is all about math, and we need everything as numbers — then concatenating all the values as the input features, and finally adding the algorithm it is going to use. When you call Fit, you train the model, and then you can save it as the zip file we were seeing.
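The generated training code described above can be sketched like this — an approximation only, with illustrative column names, file name, and the LightGBM trainer (which requires the Microsoft.ML.LightGbm package), not the exact generated code:

```csharp
using Microsoft.ML;
using Microsoft.ML.Data;

public class ModelInput
{
    [LoadColumn(0)] public string PaymentType { get; set; }
    [LoadColumn(1)] public float TripDistance { get; set; }
    [LoadColumn(2)] public float FareAmount { get; set; }
}

public static class ModelBuilderSketch
{
    public static void BuildAndSave()
    {
        var mlContext = new MLContext(seed: 1);

        // Load the training data from the CSV file.
        IDataView trainingData = mlContext.Data.LoadFromTextFile<ModelInput>(
            "taxi-fare-train.csv", hasHeader: true, separatorChar: ',');

        // Turn text into numbers, concatenate the input features, then
        // append the regression trainer AutoML picked (LightGBM here).
        var pipeline = mlContext.Transforms.Categorical
                .OneHotEncoding("PaymentTypeEncoded", "PaymentType")
            .Append(mlContext.Transforms.Concatenate("Features",
                "PaymentTypeEncoded", "TripDistance"))
            .Append(mlContext.Regression.Trainers.LightGbm(
                labelColumnName: "FareAmount"));

        // Fit trains the model; Save serializes it to the zip file.
        ITransformer model = pipeline.Fit(trainingData);
        mlContext.Model.Save(model, trainingData.Schema, "MLModel.zip");
    }
}
```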
Before that, just to mention that with the CLI — the command-line interface — you can do the same thing I just did in Visual Studio. If you're on a Mac or Linux with just an editor and you're not using Visual Studio, you can use the CLI with a single command: mlnet auto-train, passing the task, the data, and what you want to predict, and you will get the code and the model generated in the same way.
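The command referred to here looked roughly like the following at the time of this talk; flag names changed in later versions of the CLI, so treat this as a sketch:

```shell
# Explore models for up to 20 seconds and generate code for the best one.
mlnet auto-train \
  --task regression \
  --dataset "taxi-fare-train.csv" \
  --label-column-name "FareAmount" \
  --max-exploration-time 20
```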
So let's see what's new in ML.NET. The first thing is the database loader. Basically, with the database loader, in just three or four lines of code you can access a database and use it for training. It loads the data by streaming — it can be a huge amount of data — and it is consumed for training your model as needed.
You can use it with SQL Server or any relational database supported by System.Data, so it could also be Oracle or PostgreSQL or MySQL, etc. This is in preview, but it's been used for a few weeks by customers. Let's see a demo of how easy it is to use a database instead of a file like we were using in the other demo.
In this case I have a project that I generated from Model Builder; it's pretty similar to the previous one. I just made a change in the class used for loading the data, adjusting its attributes. The other thing I did is comment out the code that was loading from the text file; instead of that, we're going to put in the code for loading data from a SQL database.
This code should be very familiar to any .NET developer. You can see that I have a connection string here — in this case it's SQL Server LocalDB, but it could be any other database — and the SQL statement against my table, where I have the same data I had in the file. Then I create the database loader and finally provide those parameters — in this case it's SqlClient, but it could be Oracle or any other provider — and we load that data into the IDataView. The IDataView is able to stream the data as needed when training. Then the training is exactly the same code, doing the data transformations and then calling Fit, so there is nothing new there.
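The database-loading code walked through above can be sketched as follows; the connection string, table, and column names are illustrative, and the Microsoft.ML and System.Data.SqlClient packages are assumed:

```csharp
using System.Data.SqlClient;
using Microsoft.ML;
using Microsoft.ML.Data;

public class TaxiTrip
{
    public float TripDistance { get; set; }
    public float TripTime { get; set; }
    public float FareAmount { get; set; }
}

public static class DatabaseLoaderSketch
{
    public static IDataView LoadTrainingData(MLContext mlContext)
    {
        // Any provider supported by System.Data works; this uses SQL LocalDB.
        string connectionString =
            @"Data Source=(localdb)\MSSQLLocalDB;Initial Catalog=TaxiFareDb;Integrated Security=True";
        string sqlCommand =
            "SELECT TripDistance, TripTime, FareAmount FROM TaxiTrips";

        // Create the loader for the row class, then point it at the database.
        DatabaseLoader loader = mlContext.Data.CreateDatabaseLoader<TaxiTrip>();
        var dbSource = new DatabaseSource(
            SqlClientFactory.Instance, connectionString, sqlCommand);

        // The resulting IDataView streams rows as training consumes them.
        return loader.Load(dbSource);
    }
}
```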
Before getting into this scenario, I want to give a little bit of insight. In the market — in the deep learning world — many organizations like Google or Microsoft, or universities, have created complex DNN (deep neural network) architectures and pre-trained models, like Inception v3 or ResNet and many others, focused on specific scenarios like image classification or object detection. They trained those models with a huge amount of material — of images — and you can use those models in ML.NET.
The difference is that, later on, we are also going to be able to train and do transfer learning with those. But let's start with a simple scenario: an image classifier. You could use one of those pre-trained models, give it a picture, and it says it is a dog, or it is an elephant, or it is a flower.
You would be using any of those pre-trained models, like Inception v3 or ResNet; you can take that model from TensorFlow and just score it — use it — in ML.NET. You can see the samples we had previously. The same goes for object detection, which is not the same thing: you provide a photo, for instance, and then you detect different objects — like, in that case, a dog or a person within the same photo, or in this case a cat and a dog within the same picture.
But in those cases it's just about consuming the pre-trained models. The key question is: what if you want to classify images based on your custom domain — based on your own images and your own labels?
For that, you need to train your own model. For instance, what if I don't want to just say "this picture is a flower" — I want to be more precise and say this is a rose, or this is a tulip? For that you can train a model in ML.NET, and what we're doing is using an approach called transfer learning: essentially deriving from those pre-trained models and then training on your own images.
So let me show you very quickly that you can do that with Model Builder. Here I have another sample where I just have an empty project. I can add machine learning again, and in this case I click Image Classification. I select the folder where I have the images. This folder basically has one subfolder per class — per different type of image — as you can see: one subfolder for roses, others for other flowers, and so on. So we're basically going to train this model based on those classes. This will take a few minutes, so I want to move on and show you how we are also working on the API, doing the native transfer learning that I mentioned.
Basically, what we are doing is using the internal, low-level TensorFlow bindings for C# — TensorFlow.NET — and with that we are retraining your model using transfer learning. Those models, like Inception v3, were trained with many images — many photos that are comparable or similar to the ones you want to use — so we take the knowledge from that pre-trained model and then retrain the last layer of the TensorFlow model with your own images.
That means we are really creating a TensorFlow model under the covers, so it's deep learning training as well. And there's a big benefit to what we are doing with this API for image classification: we do that transfer learning in a single line — or maybe three lines, with all the parameters you can see here.
So just with this code we are able to do all this transfer learning and then finally generate the TensorFlow model, which is a .pb file, plus the ML.NET model that you can use. I'm going to run the sample, and here you can see the code using the new ImageClassification API, where we specify the pre-trained model — in this case ResNet, but it could be Inception v3 — and a few parameters.
For instance, whether you want to use the callback to get feedback, and so on. I'm going to debug and start running the training; it will take a few seconds, so meanwhile I'll show you the images. In this case it's going to train with around 400 images — about 40 images per class — and if you compare this code with the code you'd need to write using just TensorFlow and its low-level API, that would be around 400 lines, while in this case we are able to do it in far less.
We do it with just this single call to the ImageClassification estimator: loading the images from the folder, specifying the architecture, and saying how many passes you want to make over those images with the epoch parameter — and that's it. The training should be starting now: it did the first epoch with around 80% accuracy, and in a few seconds we'll get many more epochs of training with the images. Finally, we'd get the predictions.
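As a rough sketch, that single-call API looks like this in a later stable ML.NET surface (the preview shown in the talk exposed it slightly differently, so the parameter names are an approximation; `mlContext` and `trainingImages` — an IDataView with image bytes in "Image" and class names in "Label" — are assumed to be prepared earlier):

```csharp
using System;
using Microsoft.ML;
using Microsoft.ML.Vision;

var options = new ImageClassificationTrainer.Options
{
    FeatureColumnName = "Image",
    LabelColumnName = "LabelAsKey",
    Arch = ImageClassificationTrainer.Architecture.ResnetV2101, // pre-trained DNN
    Epoch = 100,                       // passes over the training images
    BatchSize = 10,
    MetricsCallback = m => Console.WriteLine(m) // progress/accuracy feedback
};

var pipeline = mlContext.Transforms.Conversion
        .MapValueToKey("LabelAsKey", "Label")   // labels must be keys
    .Append(mlContext.MulticlassClassification.Trainers.ImageClassification(options))
    .Append(mlContext.Transforms.Conversion.MapKeyToValue("PredictedLabel"));

ITransformer model = pipeline.Fit(trainingImages);
```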
Okay, another thing is about data exploration and machine learning experimentation, and environments that are good for learning, like courses and hands-on labs. If I talk about that in machine learning, what would you think of? You would probably think about Jupyter notebooks. So this is what we are working on now as well: we are providing support for .NET in general — C# and F# — on Jupyter notebooks. Right now we are working on it.
If you want to try it, ping me through Twitter and I will explain how to get access and use it, but the public preview will be coming pretty soon, in a few weeks. Basically, with this support we have a Jupyter kernel for .NET, which is based on dotnet try, and we can run ML.NET code. So I want to show you a demo here.
I have Jupyter installed on my machine — you can install it very easily with Anaconda and then launch it — and after that I installed the new kernel. Once you have it installed, you can see that I can now create a notebook for C# or F#, and here I have one of the notebooks that I created using ML.NET.
It loads data from the file, like the previous code you just saw, but I can run this piece of code live and then show the schema — it's really running — and then show what the dataset looks like and which fields I have. I could also show a few records, for instance, and see what kind of data it is; I could do data transformations and show them afterwards, and even go further.
You can also use XPlot to plot charts — in this case, the distribution of taxi trips per cost. I can see that, for instance, most of the taxi trips were between five and ten dollars. I can also plot other interesting information, like time versus distance, with a different color depending on the fare.
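A notebook cell producing that fare histogram might look like the following — a sketch assuming `trainingData` is the IDataView loaded in an earlier cell and the XPlot.Plotly package is referenced in the notebook:

```csharp
using System.Linq;
using Microsoft.ML;
using Microsoft.ML.Data;
using XPlot.Plotly;

// Pull the fare column out of the data view as a plain array.
float[] fares = trainingData.GetColumn<float>("FareAmount").ToArray();

// Plot the distribution of taxi trips per fare amount.
var histogram = Chart.Plot(new Graph.Histogram { x = fares });
histogram.WithTitle("Taxi trips per fare amount");
display(histogram); // notebook helper that renders the chart inline
```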
So I can see that the fare is pretty much related, in this case, to the distance and the time. Or, if I just want to see the relationship between the taxi fare and the time, or the distance, but then also the number of passengers, I can see, for instance, that there is no real relationship there. So you can get a lot of insights when plotting data in Jupyter, and then, of course, you can also build and run your model here.
Here we have the ML.NET pipeline for this model. I could just run it here and see the data transformations, or I could append the trainer, call Fit, and measure how much time the instruction takes — I can see that it took three seconds to train this model. Or I can make predictions and then display the metrics, and you see it all in a very neat way in this document.
So basically you can document your exploration or your model creation in a very neat way with Jupyter notebooks. Finally, I could see how the predictions compare against the actual values — in this case it was a regression, so I could plot the regression line and where the predictions fall versus the actual values — or even save the model.
With that, just to finish, I want to highlight that there are many other important features in ML.NET: for instance, how to take your model lifecycle into account in your CI/CD pipelines in DevOps — not just for your application but also for your model — or model explainability and feature importance.
That is, the most important input columns you have for your model — plus cross-validation, or how to deploy your model into ASP.NET Core applications or Azure Functions. By the way, I also want to highlight that we have an ML.NET YouTube playlist, so you can take a look at many short videos about ML.NET on YouTube — like one single video just about cross-validation, or one single video, I'm pretty sure, about how to deploy into a Web API, things like that. And about the roadmap...