From YouTube: Magento Async Import #1 - Grooming (Aug 1, 2018)
Description: First grooming of async import functionality, with good discussion around deadlocks and scalability.
A
[unclear] the idea is to make a small prototype around that: take the default import/export with this data, with, let's say, 1,000 products out of ten thousand products, and make the time calculation between the initial start time and the time when all products are imported into the Magento backend, and then do the same for the bulk API. Just to see the timing internally.
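A minimal sketch of the timing comparison described above. The two import functions are stand-ins (a real prototype would call the Magento REST and bulk endpoints); only the measurement harness is the point here.

```python
import time

def measure_import(import_fn, products):
    """Time how long import_fn takes to push all products."""
    start = time.monotonic()
    import_fn(products)
    return time.monotonic() - start

def sync_import(products):
    # stand-in for one synchronous REST call per product
    for _ in products:
        pass

def bulk_import(products):
    # stand-in for one bulk-API call carrying the whole payload
    pass

products = [{"sku": f"SKU-{i}"} for i in range(1000)]
t_sync = measure_import(sync_import, products)
t_bulk = measure_import(bulk_import, products)
```

Running both paths against the same 1,000-product sample gives the start-to-finish comparison the prototype is after.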
C
It's not really to validate whether it's needed; it's rather to see what drawbacks and, like, pluses and minuses you have, correct? Because if you switch to asynchronous import, you might notice that, all together, every product is persisted more slowly, because we will use a queue instead of the direct import procedure as it is implemented right now in Magento.

C
On the other hand, overall scalability will be higher, because you can run multiple consumers for the queue and process those imports concurrently. So measuring those drawbacks and advantages of this process will allow you to communicate it better, like: if you use that, then you should probably be aware of this and that.
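The scalability point above can be sketched as several consumers draining one operation queue concurrently (a toy stand-in for Magento's message queue, not its actual implementation):

```python
import queue
import threading

# Fill the queue with 100 pending operations.
ops = queue.Queue()
for i in range(100):
    ops.put({"operation_id": i})

done = []
lock = threading.Lock()

def consumer():
    """Drain operations until the queue is empty."""
    while True:
        try:
            op = ops.get_nowait()
        except queue.Empty:
            return
        with lock:
            done.append(op["operation_id"])  # "process" the operation

# Four consumers work the same queue in parallel; adding consumers
# scales throughput without changing the producer side.
threads = [threading.Thread(target=consumer) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```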
A
Which is open for anybody who wants to take it. And then I came up with another epic, which originally appeared on the bulk API project and was moved here; that was already the async implementation. So the idea was that we use the Redis database for all bulk API calls. A bulk API call works like this: you make an API call to the system, it stores this operation, the object, in an operation table of Magento, and then a consumer reads it afterwards from that table.
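The flow just described can be sketched with an in-memory table: the API call inserts an operation row, and a consumer later reads and completes it. Table and column names here are illustrative, not Magento's actual schema.

```python
import json
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE operation (
    id INTEGER PRIMARY KEY,
    bulk_uuid TEXT,
    serialized_data TEXT,
    status TEXT)""")

def api_call(bulk_uuid, payload):
    """What the endpoint does: persist the operation and return immediately."""
    db.execute(
        "INSERT INTO operation (bulk_uuid, serialized_data, status) "
        "VALUES (?, ?, 'open')",
        (bulk_uuid, json.dumps(payload)))
    db.commit()

def consume_one():
    """What the queue consumer does: read an open operation and process it."""
    row = db.execute(
        "SELECT id, serialized_data FROM operation "
        "WHERE status = 'open' LIMIT 1").fetchone()
    if row:
        op_id, data = row
        json.loads(data)  # ... persist the product here ...
        db.execute("UPDATE operation SET status = 'complete' WHERE id = ?",
                   (op_id,))
        db.commit()

api_call("uuid-1", {"sku": "SKU-1"})
consume_one()
status = db.execute("SELECT status FROM operation").fetchone()[0]
```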
A
Good. Then, I don't know where I stopped, but we made early performance testing for using Redis instead of the database for the Magento operations, and the performance improvement was, I guess, something like 30 percent faster without using the tables, using Redis for that instead. And we also came up with the idea that we can move forward and exclude the database usage for that part entirely, and I made some profiling tests with [unclear].
A
[unclear], but just one screenshot of all the output which I got: there was a request which Magento makes to select from the EAV attributes on step one, creating a new [unclear]. And when we send a new API call to Magento, it is also doing the authentication, so it is always getting the user based on their token; there are those requests, which is quite a lot of them, and then comes the Magento operation tables usage. I also made a small code check, and I removed this select from the EAV attributes on the import step and, for example, I can leave it right here, just for the future.
B
So when I tried to do something similar at my company, I wound up with a lot of deadlocks, and that was the big issue, and I couldn't get past it. It was mainly in attributes: first you import those attributes, and those attributes caused a lot of other places to change, and getting past those deadlocks was... I just decided not to do it, because it was too much, yeah.
A
If we're coming back to the history, that was the point where we came to the bulk API import. So there were a lot of problems with the deadlocks which users reported to us, and we came up with the idea of using async requests. So, I mean, you push in all your product data, as many products as you have, you push them to Magento, and Magento puts them in a queue and then executes them one by one. This means you don't have any deadlock problems anymore.
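A sketch of the serialization idea above: instead of many clients writing to the product tables concurrently (where lock contention can deadlock), everything goes into one queue and is executed strictly one by one.

```python
import queue

# All incoming product writes are enqueued instead of applied directly.
incoming = queue.Queue()
for i in range(5):
    incoming.put({"sku": f"SKU-{i}"})

applied = []
while not incoming.empty():
    op = incoming.get()          # exactly one write at a time:
    applied.append(op["sku"])    # no two transactions contend for locks
```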
C
Yeah, so this is what I want to talk about, actually: what version of Magento was that? Because in 2.2.6, or maybe even 2.2.5, probably already in 2.2.5, there is a fix for a lot of deadlock issues on product creation. Okay, so the issues you're talking about might already be resolved. If there are different issues, though, it would be very good to report them back to Magento and have them fixed, because that affects all other performance aspects of Magento, not just this one, yeah.
B
It was when I was working on 2.2... 2.2 point, I want to say, three, so it was a little while back. So, okay, I'm just wondering, because I really want to... this is a huge aspect for me and for my company. So if there's something that I can get started on, I can actually start working on this, you know, actually working on and solving some of these issues, then, I mean.
C
I was talking about that: if you start working on that and you still see deadlocks in the newest version of Magento, it would be nice to report them as soon as possible, so that Magento can start working on them; it's important to get them fixed as soon as possible. Great, that's an important outcome of this exercise as well.
A
The biggest one... Do you have any other questions that we can help with? No? Good, cool. Yes, so I'll give just a short overview of the tickets which I created. We have generally some initial ones; let's start from this one: to create the key-value framework module. There was a prototype provided, you know, by Alexander or by Youichi; some of you guys did it for the bulk API approach.
A
It creates already a small module itself, which will read the config data from env.php, and if it's enabled and uses a Redis connection, then we have the storage to write the operation status to Redis. So we store the operations not so much in the database; instead we are doing it in a dedicated storage, in a Redis database, and...
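A sketch of that idea: read a flag from config and, when enabled, write operation status into a key-value store instead of the database. The config shape, class, and key names are illustrative, not the module's actual ones, and a plain dict stands in for a live Redis connection.

```python
# Illustrative stand-in for the env.php config section.
config = {"bulk_operations": {"storage": "redis", "enabled": True}}

class KeyValueOperationStorage:
    """Operation-status storage backed by a key-value store
    (a dict here, in place of a real Redis connection)."""

    def __init__(self):
        self._kv = {}

    def write_status(self, operation_id, status):
        self._kv[f"operation:{operation_id}:status"] = status

    def read_status(self, operation_id):
        return self._kv.get(f"operation:{operation_id}:status")

storage = None
if config["bulk_operations"]["enabled"]:
    storage = KeyValueOperationStorage()
    storage.write_status(42, "open")       # API call accepted
    storage.write_status(42, "complete")   # consumer finished it
```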
A
After that, the consumers consume the calls; that was the part which is related to Redis. And we also have a part of the story about the EAV attributes, also related to Redis: in this case the idea is to eliminate those SELECT calls to the EAV attributes, and we came up with the idea that, in that case, we have to cache them.
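The caching idea above is a cache-aside lookup: check the key-value cache first and only fall back to the database SELECT on a miss. The table contents and names here are illustrative; a dict stands in for both the `eav_attribute` table and the cache.

```python
# Simulated eav_attribute rows (illustrative data).
eav_table = {
    "sku":  {"attribute_id": 1, "backend_type": "static"},
    "name": {"attribute_id": 2, "backend_type": "varchar"},
}

cache = {}
db_selects = 0  # counts how many SELECTs actually hit the "database"

def load_attribute(code):
    global db_selects
    if code in cache:        # cache hit: no SELECT issued
        return cache[code]
    db_selects += 1          # cache miss: one SELECT against eav_attribute
    row = eav_table[code]
    cache[code] = row
    return row

load_attribute("sku")
load_attribute("sku")        # second lookup is served from the cache
```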
A
Yeah, also one more story: invalidate the EAV cache type on attribute save. So in any case, when something changes in the EAV structure, we have to invalidate the cache correctly.
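A sketch of that invalidation story: any save that changes the EAV structure must also drop the cached attribute data, so the next lookup re-reads fresh rows. The cache key and table shape are illustrative.

```python
# Cached EAV data under one illustrative cache type key.
cache = {"eav_attributes": {"sku": {"attribute_id": 1}}}

def save_attribute(code, row, eav_table):
    """Apply the structural change, then invalidate the EAV cache type."""
    eav_table[code] = row
    cache.pop("eav_attributes", None)  # force a re-read on next lookup

eav_table = {"sku": {"attribute_id": 1}}
save_attribute("color", {"attribute_id": 3}, eav_table)
```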
But generally, for this part, I guess that's all from my side. Eugene, maybe... I don't know if you're with us... Done? Yeah.
B
I can't rewrite what I have for my company to use this yet, right? Or start working full-time using my company's resources to do so. What I would like to do is pull down the repo and start testing and seeing, and perhaps doing some of the performance testing and stuff, because I can probably knock that out pretty quickly.
B
I'm using raw queries right now, so I... so I have an intermediary between my PIM and Magento, which takes the exports from the PIM and creates the deltas from them. So now I'm only importing deltas into the intermediary, and then I take that, and it exports CSVs, and then I take those and import them with LOAD DATA INFILE and some other, you know...
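The delta step in that pipeline can be sketched as comparing the previous PIM export with the new one and keeping only rows that are new or changed, so only deltas flow downstream to the CSV import. The field names and values are illustrative.

```python
# Previous and current PIM exports, keyed by SKU (illustrative data).
previous = {
    "SKU-1": {"name": "Chair", "price": "10"},
    "SKU-2": {"name": "Table", "price": "25"},
}
current = {
    "SKU-1": {"name": "Chair", "price": "12"},  # changed
    "SKU-2": {"name": "Table", "price": "25"},  # unchanged: dropped
    "SKU-3": {"name": "Lamp",  "price": "7"},   # new
}

# Keep only rows that differ from the previous export.
delta = {sku: row for sku, row in current.items()
         if previous.get(sku) != row}
```

Only the delta rows would then be written out as CSV for the bulk load.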
B
I'll see if I can take it on later tonight, when I'm... or, you know, sometime in the next couple of days.
E
So I'm not really familiar with the current scope and what is done and what has to be done, so I think it will take some time to get familiar with the backlog, and I think when I have set up my development environment I will be able to test it and pick something for me. So that will take, I don't know, a couple of days, if someone will give me rights to the backlog and to the repository, I mean.
C
I'm still in progress reviewing the two PRs, and still on the testing; last week I had some problems with my environment, so it's a little bit blocked by that, but I'm still progressing on it and I hope to have more updates this week.

Great, thanks. I just want you to finish the first two, and then we can move on to the two separate scopes of work. Sure? Good! That's fine for me, cool.