From YouTube: Discussion of Performance Improvements for the Bulk API
Description
Discussed possibilities for eliminating excessive database queries during asynchronous request processing; the outcome will go into the project backlog.
B
Yes, correct. So the very idea was the one we planned earlier: the Redis implementation for bulk operations. The initial idea was to write the operation statuses to Redis and then also read them from Redis, so we exclude the Magento database at this step, which gives us a speed improvement, which is quite good. But still, at this step...
B
We still have the database connection open, so there are still SQL queries going in and out of the database. For this task, when we create a new API request to Magento via the bulk interface, we generally don't need the Magento database at all, because all the data comes with the request and we can just put it into a queue, whose configuration is stored in the app/etc/env.php file. So the idea was that at this step we can skip the database.
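A rough sketch of that idea: publish the bulk operations straight to a queue instead of writing them to the database first. This is an illustrative Python stand-in (Magento itself is PHP, and the real broker would be the one configured in app/etc/env.php); the message fields are hypothetical.

```python
import json
import queue

# Hypothetical in-memory stand-in for a message broker (a real Magento
# setup would publish to RabbitMQ or the DB queue from app/etc/env.php).
broker = queue.Queue()

def publish_bulk_operations(bulk_uuid, operations):
    """Publish each operation of a bulk request straight to the queue,
    skipping the intermediate database write entirely."""
    for index, payload in enumerate(operations):
        message = {
            "bulk_uuid": bulk_uuid,
            "operation_key": index,
            "data": payload,
        }
        broker.put(json.dumps(message))

publish_bulk_operations("abc-123", [{"sku": "P1"}, {"sku": "P2"}])
print(broker.qsize())  # 2
```

The point of the sketch is only that the request payload alone is enough to build the queue message; no database read or write is involved.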
A
Yeah. First of all, the performance test indicates that it will be at least twice as fast for any given asynchronous API request to write the message to the message queue instead of going through the database. It was very simplistic performance testing. First, we just ran asynchronous requests, something like a thousand concurrent requests against a Magento instance, and we measured...
A
...how long it took to get a response for every request. Second, we took a piece of code where the database writes and reads were removed and ran the same test. It was six seconds versus fifteen seconds, so roughly twice as fast.
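The test described above could look roughly like this. It is a toy reproduction in Python where `time.sleep` stands in for the two request handlers, so only the shape of the measurement, not the numbers, mirrors the real test.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def handle_request_with_db():
    # Placeholder for an async request that writes/reads its status in the
    # database; the longer sleep simulates the extra round trips.
    time.sleep(0.01)

def handle_request_queue_only():
    # The same request with the database writes removed.
    time.sleep(0.001)

def measure(handler, n=100, concurrency=25):
    """Fire n concurrent requests and return the total wall-clock time."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        list(pool.map(lambda _: handler(), range(n)))
    return time.perf_counter() - start

baseline = measure(handle_request_with_db)
optimized = measure(handle_request_queue_only)
print(f"baseline={baseline:.3f}s optimized={optimized:.3f}s")
```

In the real test the handlers were actual Magento bulk API endpoints, and the measured result was the fifteen-versus-six-second difference mentioned above.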
Now, where do we hit the database as a request comes in? First of all, like I mentioned, the status of the bulk API request is written to the database: we take the bulk request, which includes the individual requests, generate a bulk UUID, write it to the database, and write the status of the request.
A
So later, when we process the request, we can report its status correctly. The second thing is that the framework itself requires the database connection to read information about the web API user, which is loaded from the database based on the authorization token. We parse the token, check that it is valid, load the user from the database, check that it is the correct user, and then make requests in the context of that user. So there is a database connection to check permissions and ACLs.
A
The third thing: for the array of attributes, we read the database to get the metadata for the entity, so we know which attributes belong to a particular entity. For example, on product load, all product attributes: when we deserialize the data, we know that this particular field in the custom attributes of the product is actually an attribute, and this is validated. We also read some additional metadata which helps us understand the types. So I think the biggest thing we can attack right now is: first, don't write the status to the database.
A
For that, we have discussed that we can use Redis and correctly display statuses, just writing them by the bulk ID of the operation to Redis and having the correct status reflected. Then, every time a status changes, an atomic operation on Redis can update the status so that it is always correct.
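A minimal sketch of the status-in-Redis idea, assuming one hash per bulk keyed by operation. Here an in-memory class with a lock stands in for Redis (a real implementation would rely on atomic server-side commands such as HSET), and the status names and rollup rules are illustrative assumptions, not Magento's actual ones.

```python
import threading

class InMemoryStatusStore:
    """Stand-in for Redis: one hash per bulk, updated atomically.
    A real implementation would use Redis HSET, which is atomic on
    the server side, instead of this lock."""

    def __init__(self):
        self._lock = threading.Lock()
        self._hashes = {}

    def set_status(self, bulk_uuid, operation_key, status):
        # Atomic update: status transitions never interleave.
        with self._lock:
            self._hashes.setdefault(bulk_uuid, {})[operation_key] = status

    def bulk_summary(self, bulk_uuid):
        # Roll per-operation statuses up into a single bulk status.
        statuses = set(self._hashes.get(bulk_uuid, {}).values())
        if statuses == {"complete"}:
            return "complete"
        if "failed" in statuses:
            return "failed"
        return "in_progress"

store = InMemoryStatusStore()
store.set_status("abc-123", 0, "complete")
store.set_status("abc-123", 1, "in_progress")
print(store.bulk_summary("abc-123"))  # in_progress
store.set_status("abc-123", 1, "complete")
print(store.bulk_summary("abc-123"))  # complete
```

Because each status write is atomic, readers polling the bulk status always see a consistent value, with no database involved on this path.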
And the second simple thing to attack is the entity attributes.
If you look at why we need the database here, it is only because entity attributes can be changed dynamically by an admin user when he adds or removes particular attributes from a particular entity.
A
On the other side, yeah, potentially it can be possible. So that was a short summary of the idea we discussed. However, it will not solve the same problem for synchronous requests: a synchronous request does the same things, but there you don't have a place where you can move this logic, because every single call is one request.
A
If you actually fixed the way serialization is realized for the Web API, it would bring good performance improvements both for synchronous and asynchronous APIs. And why do we need to deserialize the data and check it against the metadata? Because we want to validate that the data is correct: that we don't have any fields which do not belong to the entity, and that we have all the fields which are required for it.
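The validation just described (no unknown fields, all required fields present) can be sketched against hypothetical entity metadata. The field names and the shape of the metadata here are assumptions for illustration, not Magento's actual schema.

```python
# Hypothetical entity metadata, as would be read (and later cached)
# from the database: which fields exist and which are required.
PRODUCT_METADATA = {
    "fields": {"sku", "name", "price", "status"},
    "required": {"sku", "name"},
}

def validate_payload(payload, metadata):
    """Check a deserialized payload against entity metadata:
    reject unknown fields and report missing required fields."""
    unknown = set(payload) - metadata["fields"]
    missing = metadata["required"] - set(payload)
    errors = []
    if unknown:
        errors.append(f"unknown fields: {sorted(unknown)}")
    if missing:
        errors.append(f"missing required fields: {sorted(missing)}")
    return errors

print(validate_payload({"sku": "P1", "name": "Pen"}, PRODUCT_METADATA))  # []
print(validate_payload({"sku": "P1", "color": "red"}, PRODUCT_METADATA))
```

The metadata lookup is the part that currently costs database queries; the check itself is cheap once the metadata is in hand.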
A
I think we will be able to start from some prototype where we can, first of all, cache the structure, then invalidate the structure once it changes, and then use the cache for Web API serialization and deserialization. So these are the three most important things to investigate and see how it goes. And a second, a little bit less important thing to investigate is how much performance gain we get out of it, because...
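The prototype steps above (cache the structure, invalidate it on change, serve serialization from the cache) can be sketched like this. It is a minimal illustration in Python with made-up names, not Magento's actual cache API.

```python
class MetadataCache:
    """Cache entity structure; fall back to the expensive DB read
    only on a cache miss or after invalidation."""

    def __init__(self, load_from_db):
        self._load = load_from_db  # the expensive database read
        self._cache = {}

    def get(self, entity):
        if entity not in self._cache:
            self._cache[entity] = self._load(entity)
        return self._cache[entity]

    def invalidate(self, entity):
        # Called when an admin adds/removes an attribute on the entity.
        self._cache.pop(entity, None)

db_reads = []
def load_from_db(entity):
    db_reads.append(entity)
    return {"fields": {"sku", "name"}}  # placeholder structure

cache = MetadataCache(load_from_db)
cache.get("product")
cache.get("product")      # served from cache, no second DB read
print(len(db_reads))      # 1
cache.invalidate("product")
cache.get("product")      # reloaded after invalidation
print(len(db_reads))      # 2
```

Invalidation on admin changes is what keeps the cache safe despite attributes being editable at runtime, which was the reason the database was consulted on every request in the first place.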
A
You would say, yes, it should be different, but it should work the same for every object. Basically, objects should all be serialized the same way and use the same caching capabilities, because in the future we might decide to expose any object through the system, or even new objects: if you introduce new entities, a new API would load from the cache as well.
B
Okay, so yeah, then what I propose: I will go through the stories and formulate them. We already have a story for Redis for the operation status; I will extend it. To make it more general, I would maybe make a minimum of three different stories out of it, smaller issues, so that each developer or contributor can start with something small.
A
And also, we are starting a performance track, and the track will be dedicated specifically to reducing the number of queries for any Magento page click. If you load the homepage and you see something like 20 queries, but you want only one, that will be part of the project.
So for our specific case, it intersects a little bit. I think we should track this work here, because this is important. There were 50 contributors, and they were looking forward to contributing additional fixes to the performance track. That will also be one of the options.