Description
In this episode, we review PRs and new issue tickets that have been created after the recent remote Contribution Day in Kyiv, as well as work done by Hittarth in advance of the architect review.
A
Hi guys, welcome to our meeting. We have some updates for today. So, we had a Contribution Day on Friday in Ukraine, and I've been there. We basically saw a lot of interest; people are very interested in the Async Import project, and I just want to make a small recap about it.
A
So, what happened there: before the event I created a bunch of issues labeled for the Contribution Day. I have already removed the labels from the issues that were taken, but still, there were multiple pull requests, some of them already merged. I will go through the short list of them.
There was one pull request which I will start from. It was a pull request from Alex that delivered changes not to the Async Import project but to the Bulk API project. That's why I asked him to move it to the Magento 2 repository, and it has basically already been validated, tested, and officially accepted by a reviewer. And what is it basically doing?
I guess it's nice to see it here on the QA screenshot: it's delivering the data hash. Before this, our bulk operation had a data_hash field, but it was never used; it always delivered null. What we have there now is the sha-256 of the input body.
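The data-hash change described above boils down to hashing the raw request body. A minimal sketch of that idea (in Python rather than the project's PHP, and with a hypothetical function name, purely for illustration):

```python
import hashlib


def data_hash(body: bytes) -> str:
    """Return the hex-encoded SHA-256 digest of a raw request body,
    i.e. the value that would populate a data_hash field instead of null."""
    return hashlib.sha256(body).hexdigest()
```

For example, `data_hash(b"")` yields the well-known empty-input digest `e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855`.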
A
That was the pull request related to the Bulk API rather than to the Async Import API. Then we have multiple ones for Async Import; I will start from the small ones. There was one from Stepan, quite a small one: he created the ACL resource for the import service. That pull request was even merged on the same day. Then we have one that was closed, and here we have one that corrects the created_at date.
A
Basically, those are the merged ones, I guess, from the Contribution Day, and we still have some open ones. Let me check: there was "create import start endpoints", which Stepan started to work on. I had no chance to review it yet, but what it does is create an endpoint with a service contract for import type and start, so that we can basically start importing a file. I guess that one is also from the Contribution Day. And that's all that we have, yep.
A
Sorry to change topic directly. First of all, you asked me to prepare some tickets for Hittarth, but unfortunately for him, a lot of the small tasks were dealt with on the Contribution Day, and currently I'm a bit out of tickets. So I still made a bunch of tickets for him for the Bulk API, three of them, which are not so easy, and I added two of them for test coverage of our processors, which maybe somebody wants to pick up on a Contribution Day.
A
There is also a more complex one for import; I don't know if somebody will take it over or not. Okay, that's about the topic of Contribution Days. Then, what do we have next?
We have a draft pull request from Mario about the UUIDs implementation. I made some review; I guess, Mario, you didn't check it yet, so do so when you have time. The review is quite minor fixes, nothing crucial; after you check it, just merge the changes into your branch.
A
Okay, and yeah, Hittarth is doing a cool job and keeps providing us with pull requests. First of all, he completed the partial upload, which I started to validate but didn't finish. Yesterday we had a chat with Vova about the implementation; conceptually it looks good, and I will still check the implementation and also the validators. Hittarth, you're welcome to present your pull request to us, if you wish.
B
Someone had created one whole class to manage all the different validations for the processors: for the local files, for the external files, for the base64. All the different validations were implemented in the same validator class. So the requirement, or the issue, was to split it into a couple of small functions, and to implement it in such a way that if someone later wants to implement another validation, or extend an existing one, they can, based on the requirement.
B
All the different request validations, local file validations, and remote file validations were implemented in the same class. And there is another issue which relates to this: "validate base64 before upload". So based on these two issues I have created one pull request which covers all these scenarios, and I have split that validator class into different validator classes. There is one common request validator.
B
So I have created an interface which will be implemented by all the different validators. It has only one validate function, and every validator implements that validator interface and defines the validate function. This is the request validator, which I have injected into all the processors.
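The single-method validator contract described here is a classic pattern. A minimal sketch of the idea (Python rather than the project's PHP; class and field names are hypothetical, and where the speaker mentions a regex check for base64, this sketch uses a decode-based check instead):

```python
import base64
import binascii
from abc import ABC, abstractmethod


class ValidationError(Exception):
    """Raised when a request payload fails validation."""


class ValidatorInterface(ABC):
    """Single-method contract implemented by every concrete validator."""

    @abstractmethod
    def validate(self, request: dict) -> None:
        """Raise ValidationError if the request is invalid; return None otherwise."""


class Base64Validator(ValidatorInterface):
    """Checks that the import data field holds well-formed base64."""

    def validate(self, request: dict) -> None:
        try:
            base64.b64decode(request.get("import_data", ""), validate=True)
        except binascii.Error as exc:
            raise ValidationError("import_data is not valid base64") from exc
```

Adding a new validation then means adding one small class implementing the interface, instead of growing a monolithic validator.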
B
So these are the processors, and each processor knows how to validate its requests. I injected the request validators into all of them: the external file processor, the base64 encoded data processor, and the local file processor. And there is a base64 validator as well, which just validates the import data with a regex, and if it is invalid, it throws an error.
B
In the same way, I have created an external file validator, which checks whether the file exists and also checks the mime type, and there is a local file validator, which likewise checks whether the file exists and checks the mime type. All these validators are injected into the relevant processor. So, for example, the local file validator is injected into the local file processor...
B
...the base64 validator is injected into the base64 encoded data processor, and the external file validator is injected into the external file processor. So really, if someone wants to create a new validation, they just need to implement this validator interface and inject it through the DI XML, and it will work; otherwise the same code would have to be written into all three different processors.
B
All three do the same thing, but taking the local file processor as an example: the validators are injected into the constructor, and there is a foreach loop which just runs each validation, and if one fails it throws an error. That's it. So this is how I have implemented the validators.
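The constructor injection plus foreach loop just described can be sketched like this (a hypothetical Python illustration, not the PR code; validators are modeled as callables that raise on failure, standing in for the validate() method of the interface):

```python
from typing import Callable, Iterable


class ValidationError(Exception):
    """Raised when a request fails one of the injected validations."""


class LocalFileProcessor:
    """Sketch: a processor that runs every injected validator before doing work."""

    def __init__(self, validators: Iterable[Callable[[dict], None]]):
        # In the real project this list would come from DI configuration.
        self._validators = list(validators)

    def process(self, request: dict) -> str:
        for validate in self._validators:  # the foreach loop over validators
            validate(request)              # raises ValidationError on failure
        return "processed"


def require_file_path(request: dict) -> None:
    """Hypothetical example validator: the request must name a file."""
    if not request.get("file"):
        raise ValidationError("file path is missing")


processor = LocalFileProcessor([require_file_path])
```

With this shape, each processor stays oblivious to which validations exist; wiring a new one in is purely a configuration change.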
And yeah, one more thing: earlier there was an issue where the mime types had to be implemented in different classes. Now I have used the file source type and the source type pool. With the source type pool I get all the allowed mime types, and I have validated those in the local file validator and the external file validator.
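Centralizing the allowed mime types in a pool, as described, might look roughly like this (a hedged Python sketch with assumed class names and example mime-type entries; the real pool would be populated through DI, not hardcoded):

```python
class SourceTypePool:
    """Hypothetical pool mapping a source type to its allowed mime types,
    so validators no longer carry their own duplicated mime-type lists."""

    _ALLOWED = {
        "csv": ["text/csv", "text/plain"],
        "zip": ["application/zip"],
    }

    def get_allowed_mime_types(self, source_type: str) -> list:
        return self._ALLOWED.get(source_type, [])


class MimeTypeError(Exception):
    """Raised when a file's mime type is not allowed for its source type."""


def validate_mime(pool: SourceTypePool, source_type: str, mime: str) -> None:
    """Shared check used by both local and external file validators."""
    allowed = pool.get_allowed_mime_types(source_type)
    if mime not in allowed:
        raise MimeTypeError(f"{mime} is not allowed for source type {source_type}")
```

Both file validators can then call the same check instead of each re-declaring the mime types.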
So this is one thing. Does anyone want to ask anything, or maybe give me suggestions to improve something? Anyone?
B
So basically the idea is, as I said: whether it is the base64 processor, the external file source, or the local file source, if someone wants to upload their file in different splits and later on merge them, they can do it. So I found this a good place to implement the partial source type.
B
So what I did is: in the API I have created a partial source upload interface. Initially I created a separate web API route for the partial source, which calls this partial source interface, and I also created one partial data interface, which extends the source interface. I know it's not advisable to extend classes or interfaces like this, but later on I found that...
B
...there are fields already implemented in the source interface which we can reuse to save the data into the database tables and all this stuff. So I found this the best way: just introduce the new fields, take the request, and proceed. After someone hits this URL, it straight away calls the partial source upload, which redirects to the partial source upload execute class, in which I have used the partial source.
B
And actually, since the partial source interface is also an instance of the source interface, later on, when I am forwarding this request to the different other classes, it will not give us an error, and it also leaves me the opportunity to process this source the same way that we have implemented for the source interface.
B
I have checked whether the partial source type is a valid source or not. It requires three fields: data hash, pieces count, and piece number. When it is a valid source, the save method just puts the piece into a temporary directory, and if it is not the final piece, it just returns success. That's it: in the save I am setting the status to "uploaded", and it is not returning the source ID or any other source resource.
B
It just says it's uploaded, that's it. In the final-piece check, I have checked whether all the pieces are there or not. If each and every piece has been uploaded, then I use a merge function to merge those pieces: I get the source type, get the content of all the merged pieces, read that content, set it into the source, and the rest is the same thing: I am getting the source type processor, saving via the processor, and returning the object.
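The piece-saving and merge flow described above (the transcript suggests pieces land under a path like var/import/&lt;data hash&gt;/&lt;piece number&gt;) can be sketched as follows. This is a hypothetical Python illustration under assumed names, not the PR code:

```python
import os
import tempfile


class MissingPiecesError(Exception):
    """Raised when a merge is attempted while some pieces are absent."""


class PartialUploadStore:
    """Stores upload pieces under <base>/<data_hash>/<piece_number> and
    merges them only once every piece is present."""

    def __init__(self, base_dir: str):
        self._base = base_dir

    def save_piece(self, data_hash: str, piece_number: int, content: bytes) -> str:
        piece_dir = os.path.join(self._base, data_hash)
        os.makedirs(piece_dir, exist_ok=True)
        with open(os.path.join(piece_dir, str(piece_number)), "wb") as fh:
            fh.write(content)
        return "uploaded"  # non-final pieces get only a status, no source ID

    def missing_pieces(self, data_hash: str, pieces_count: int) -> list:
        piece_dir = os.path.join(self._base, data_hash)
        return [n for n in range(1, pieces_count + 1)
                if not os.path.exists(os.path.join(piece_dir, str(n)))]

    def merge(self, data_hash: str, pieces_count: int) -> bytes:
        missing = self.missing_pieces(data_hash, pieces_count)
        if missing:
            raise MissingPiecesError(f"cannot merge: pieces {missing} are missing")
        piece_dir = os.path.join(self._base, data_hash)
        parts = []
        for n in range(1, pieces_count + 1):
            with open(os.path.join(piece_dir, str(n)), "rb") as fh:
                parts.append(fh.read())
        return b"".join(parts)


# Example usage: storage is order-independent; merge happens once complete.
store = PartialUploadStore(tempfile.mkdtemp())
store.save_piece("abc", 3, b"!")  # out-of-order arrival is fine for storage
store.save_piece("abc", 1, b"hello ")
store.save_piece("abc", 2, b"world")
assert store.merge("abc", 3) == b"hello world!"
```

Keying the directory by data hash is what groups out-of-order pieces back into one file, which is exactly the mapping question raised later in the discussion.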
B
So with the path, I just get the total count, the number of pieces, and I check whether all the pieces exist or not. If there is any piece missing while I am trying to upload the final piece, then it says something like: trying to save the final piece of the partial request, but there are some pieces missing, so please try again with the missing pieces.
B
So the idea was: if someone has already uploaded the first piece, misses the second, and tries to upload the third piece, then it gives an error that the second piece is missing. Later on, when someone uploads the second piece, it just checks that all three pieces now exist and straight away processes the content. So it will not wait for the third piece to be uploaded again: when someone has uploaded the first piece and then the last piece, and uploads the second piece later on, after that error check, then with the request for the second piece alone it will process the whole import.
A
So, just to check that I understood it correctly. Say I upload them asynchronously. What if I send the first one, then the third one, and then the second one, so in a different sort order? Will it accept all three and merge them together, or will it throw an exception?
B
So basically, it will throw an error when someone sends the last piece first. But even though it throws an error, it has already saved that particular piece. So if someone sends the first request first and then the third request second, it will still save that piece and process it, but it gives an error saying that it's not the last one.
A
Okay, now I get what you mean. I would say, first point: throwing an exception there is not the best way, because we still save the object; in our case the operation is actually successful, but we throw an exception, and that's an inconsistent state. Second one: the idea also was that you can, and are allowed to, send the files in a different sort order, and you should get a successful response in all those cases. So if you have five files...
A
You
can
send
three
seconds
throughout
first
and
then
forth
in
a
different
sort
order,
and
the
system
will
accept
them
all
with
a
success
message
back
only
point,
which
is
you
know
that
we
need
to
define
the
point
when
we
can
merge
them
together
and
to
deliver
it
back
to
customer
I.
Don't
remember
if
we
discussed
it
before
forward.
You
remember
such
cases.
B
C
B
A
A
That was our first thought, that we would go ahead with it, but the point here is that when we have two of them arriving out of order, we have to map them together somehow, so that they are recognized as parts of the same file. I mean, how are you doing it currently? You do it based on the data hash, yeah?
B
So, if you are sending the same data hash in three different pieces, then I am creating a directory with the name of the data hash, and inside it there is the piece number. So it will be something like var/import/&lt;data hash&gt;/&lt;piece number&gt;.
B
I'm not sure, you know; I have used small data hashes as of now, but there might be very large pieces as well, so I was not quite sure that it is the best idea to save them to the database temporarily and then remove them from the database afterwards.
B
So in the partial processor, let's say you have sent the second piece first, and the final piece is missing; it checks whether all the pieces are there or not. If they are not all there, then it just sends back the status "uploaded". That's it; it is not returning the source, it is not returning a source ID or anything. But in the final piece, in the partial save, it...
A
We will have a look at the pull request together with Vova, and we'll review it, and maybe we will get a bit more of an idea. I currently don't have a good solution in mind, so I would propose to discuss it next time; by then we'll have a bit more time to check the pull requests, make a review, and make some decision here, if you're okay with that.