From YouTube: Import/Export Knowledge Sharing (Manage 201)
Description
The Manage team at GitLab is giving talks to share knowledge on particular topics. The aim is to make it easier for others to contribute, both within the team and beyond.
This presentation introduces the Import/Export feature (https://docs.gitlab.com/ee/user/project/settings/import_export.html) at GitLab, emphasizing some known issues, where it can be improved, and some relevant parts of the code.
Slides are linked to from https://gitlab.com/gl-retrospectives/manage/issues/7, along with details of upcoming 201 sessions.
---------------
Read more about our product vision: http://bit.ly/2IyXDOX
Learn about FOSS & GitLab: http://bit.ly/2KegFjx
Get in touch with Sales: http://bit.ly/2IygR7z
Then we'll cover the versioning, and probably the most interesting question, which is when we have to bump the version. Then we'll look at the code, and hopefully we'll have some time for questions, but feel free to interrupt me at any time as well.
So I have this project here. The import/export basically means that we can export an archive of data for a project, at GitLab in this case.
If we want to export, we go to the general settings and then to the export section. There are a few things that get exported, such as the project configuration, the wiki, the repository, uploads, and all the issues, merge requests, snippets, labels, the LFS objects, and many other entities as well, because we keep adding entities to the project, so there are a lot of them, and in every release we keep adding things to this.
So if we click on export project, it should be fast, because this project is more or less empty anyway. When we refresh the page, there's also a notification that gets sent to the email of the user that scheduled it. By the way, you need to be a maintainer in order to export any project. Then you can click on the download export, which is ready now, and we have the archive.
Now we can move this window, go to create a new project, and choose the GitLab export that we just downloaded. There are a few options here; the top one is the GitLab export. There's also this one that people sometimes don't know about, which is the GitLab.com import. It sounds similar, but the difference is that the export, as I mentioned before, does export quite a few things related to the project, while the GitLab.com import uses authentication.
So you need to configure the GitLab.com integration through OAuth, and then you'll be able to import, I think, mainly issues, and not many other things at all. So people don't normally use this option that much and just go for the export one. Question: do we avoid that for security reasons? Yes, mainly for security reasons, and the variables are another reason, because it's not easy to export these variables. Most of these tokens and everything are encrypted, which means that we use this DB encryption key.
That key is sort of unique per instance. Although I'm doing the import in the same instance right now, normally this is used for moving projects between instances, like exporting from GitLab.com to a local instance or the other way around. A lot of people are migrating from self-hosted instances to GitLab.com, and they will export from the local instance, and GitLab.com has a different DB encryption key. So those variables, those columns that are encrypted, won't work properly, because the key is different.
So there's an issue about this somewhere, I think. This comes up very often with the backup and restore, on the restore side of it: people are more interested in exporting these things there, while in the import/export it's not something that people normally ask about, although we could be asked for the CI variables at some point. But yeah, that's one of the reasons, and the other is obviously security as well. So let me just drop this one.
So that's the demo. There's also a bunch of things that I won't cover today, because the import/export is used in a couple of places, such as the instance-level templates. Then you also have the built-in templates, and soon the group-level ones, which I think are coming in an upcoming milestone, and all of these templates are using the import/export behind the scenes as well. There's also the API for the import/export, which provides this for you.
I won't cover them today, because there's a lot going on there, but just so you're aware that all of these use the import/export. The next point is specific to debugging and what to do when we find a problem. Normally, you get instant feedback in here: if the project isn't actually imported, there should be an error there that states that something happened, which also reflects the import error that we can check from the console.
We probably get notified there, but sometimes it's not as simple, so we can check what's going on, and there are a few things that we can do in order to debug any errors. I think the key columns to check are these three. First, the job ID, which comes from Sidekiq: the import gets scheduled as a background job, and Sidekiq returns this job ID. We keep it in the database because it's quite useful to have, so we can grep for it later. Then the status.
The status is significant as well, and we'll talk about why a bit later. And the import error, which should be reflecting the errors I mentioned earlier. Then there are the logs. These haven't been transitioned to structured logs yet, but hopefully soon, so they're annoying to grep. But one thing that we always do is, if there's any error, we log it so that it starts with "Import/Export error". So we can grep for that, and there will be a backtrace.
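The grep step described above can be sketched in plain Ruby. The exact message format is an assumption here, not the real GitLab log layout:

```ruby
# Minimal sketch: pull "Import/Export error" lines out of a plain-text log.
# The "Import/Export error" prefix matches what the talk describes; the rest
# of the log format is assumed for illustration.
def import_export_errors(log_text)
  log_text.each_line
          .select { |line| line.include?('Import/Export error') }
          .map(&:strip)
end
```

With structured logs this would become a field lookup instead of a substring scan, which is why the transition mentioned above matters.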
Then, with the job ID, we can grep the Sidekiq logs for it, and in the next slide we'll see a bit more about why that is useful. Question: those columns kind of got migrated to another model, ProjectImportState, didn't they? Yeah, that's right, I think, and if you call project.import_status or something like that, I think it still works: all of these columns have a method that basically calls this other model anyway.
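That delegation pattern can be sketched in plain Ruby. The class and column names below are illustrative stand-ins, not the actual GitLab models:

```ruby
# Illustrative sketch of delegating legacy columns to a separate state model,
# so old callers like project.import_status keep working after the migration.
# ProjectImportState / Project are stand-ins, not the real GitLab classes.
require 'forwardable'

class ProjectImportState
  attr_accessor :jid, :status, :last_error

  def initialize(jid:, status:, last_error: nil)
    @jid = jid
    @status = status
    @last_error = last_error
  end
end

class Project
  extend Forwardable

  attr_reader :import_state

  def initialize(import_state)
    @import_state = import_state
  end

  # Legacy column readers forward to the new model.
  def_delegator :import_state, :status, :import_status
  def_delegator :import_state, :jid, :import_jid
  def_delegator :import_state, :last_error, :import_error
end
```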
The best way to check this is with the Sidekiq logs, and this is why the job ID was important, because then we will see something like "this worker is still in progress", blah blah, and this is for the first kill signal that we sent. In here you will see a struct with a list of job IDs, and one of them would be the one related to the import/export.
This is a bit hard to debug, because we couldn't grep for the kill signal: if we kill the process, it emits a random job ID, and not necessarily this one. Basically, when we schedule a Sidekiq worker for doing the import job, it will be picked up by a process that has different threads, and each thread will have a job ID. If a thread ran two different kinds of jobs, one of them will be the import.
So in the logs we'll see something like this: again, the job ID. We can grep for that and see what happened, and in this case we mark it as failed, which means that we'll be able to see the error in the import error. Question: would it be possible to predict the size of an import before we do it? Most of the time we can, roughly; this is the next slide anyway.
Let me check... yes, actually, it may be easier if I open one, so you can see the contents of one of these archives. Basically the key one is this JSON file, because we load all of it in memory, so the size of this will roughly represent the memory that we're going to use. There are issues, which I will link later, to fix this.
For this problem there are a few things we can do, basically, but none of them is fast or anything. The memory corresponds to the size of this file, or maybe a bit more, because in the case of the import we are creating ActiveRecord models out of some of these things, but pretty much the size of the JSON is what we can use as a guess.
Question: how can we, or a customer, find the amount of memory required by the instance for the import of a project? This is, I mean, difficult to figure out. The project JSON is a good approximation, because the performance issue is related to it: you can check the size of it, and then you can pretty much guess that we are at least going to load the whole JSON in memory, so it's going to use that plus a bit more.
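That rule of thumb can be sketched as a lower-bound estimate. The `project.json` filename and the extracted-archive layout are assumptions for illustration:

```ruby
# Rough lower-bound estimate as described above: the importer loads the whole
# JSON file into memory, so its on-disk size is a floor for memory use.
# The 'project.json' name and extracted-export layout are assumptions here.
def import_memory_floor_bytes(extracted_export_dir)
  json_path = File.join(extracted_export_dir, 'project.json')
  File.size(json_path)
end
```

Actual usage will be higher, since ActiveRecord objects get built on top of the parsed JSON.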
Then we do batching here, and this is quite important, because if we didn't do this, it would use way more memory. We load, say, a bunch of issues and comments, then we commit to the database and we free the memory: we get rid of those objects and then we let the garbage collector do its thing, which works well. We don't have to call it or anything; it just gets freed after a while. But this is still not enough.
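The batching idea can be sketched like this. `save_batch` is a stand-in for the real per-batch database commit, and the real importers batch ActiveRecord models rather than plain values:

```ruby
# Simplified sketch of batched importing: process records in fixed-size
# slices so only one slice of objects is live at a time. save_batch is a
# stand-in for the real per-batch database commit.
BATCH_SIZE = 100

def import_in_batches(records, batch_size: BATCH_SIZE)
  saved = 0
  records.each_slice(batch_size) do |batch|
    saved += save_batch(batch)
    # After this iteration the batch array is unreachable; nothing forces a
    # GC run, it just gets collected eventually, as described above.
  end
  saved
end

def save_batch(batch)
  batch.size # pretend we inserted them; return the row count
end
```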
So slow JSON is one of the key problems, both loading and dumping, especially when you think about millions of objects in merge requests, and sometimes we have to touch the repository as well. Think about merge requests: we may have to do some action per merge request, such as writing their refs, because we don't yet have the full project
models, and this is what's causing the high memory use. So one of the things we can do here is split the worker. We did this already for the GitHub importer, which improved it quite a lot, and I remember one community import we saw earlier that took at least three months to get imported, which was crazy, but it was much faster after we split the worker. So we can split this into different workers.
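Splitting one monolithic job into smaller jobs can be sketched with a toy in-memory queue. Sidekiq and the real GitLab workers are not used here, and the stage names are made up:

```ruby
# Toy sketch of splitting one monolithic import job into per-stage jobs.
# QUEUE stands in for Sidekiq; the stage list is illustrative only.
QUEUE = []

IMPORT_STAGES = %i[repository issues merge_requests uploads].freeze

def enqueue_import(project_id)
  # Instead of one giant job, enqueue one small job per stage so each
  # job's working set (and therefore memory) stays bounded.
  IMPORT_STAGES.each { |stage| QUEUE << { project_id: project_id, stage: stage } }
end

def drain(queue)
  processed = []
  processed << queue.shift until queue.empty?
  processed
end
```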
Into separate worker processes, really. Also, someone made a few good points about this in the link to this issue here. Basically, one of the things is that we use ActiveRecord's to_json, and because it doesn't know what we're going to do with each model, it doesn't do anything clever: we may make unnecessary selects to the database that we could have batched, and there are a few more things to it.
Some of the things we talked about in those issues: as I mentioned, splitting the export into parts, since everything is done in one go at the moment; and optimizing the SQL and file calls, as there are a lot of them, instead of issuing single inserts model by model. We could also batch the reading and writing to disk, which we don't do at all.
Moving away from some ActiveRecord callbacks would be great, because then we could practically just insert into the database as if it were a CSV import, but it'd be difficult, because sometimes we do rely on callbacks. The DB commit is another issue: for each part, it depends on when we commit to the database, and this is related to ActiveRecord transactions.
So, basically, when we import something, we sort of batch a few records, a few inserts, into the database, and then after each part we commit to the database, because if we don't, then all of that stays in memory, the volume it uses grows, and things get slower. On the other hand, the more often we commit, the slower the import itself gets, so it's a trade-off.
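That trade-off can be made concrete with a toy model. The two functions below just expose the two opposing pressures; the numbers are invented for illustration:

```ruby
# Toy model of the commit-frequency trade-off described above: more commits
# mean more per-commit overhead, while fewer commits mean more uncommitted
# rows accumulating in memory. Purely illustrative.
def commit_count(total_rows, rows_per_commit)
  (total_rows.to_f / rows_per_commit).ceil
end

def peak_uncommitted_rows(total_rows, rows_per_commit)
  [total_rows, rows_per_commit].min
end
```

Committing every 100 rows out of 1000 costs 10 commits but holds at most 100 rows in the open transaction; committing once costs a single commit but holds all 1000.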
So that's where the time is spent, and it's quite a different problem. We tried, well, I did try, a few other tools such as Oj, but it doesn't help much. I tested this with, I think, a four-gig import, and it helped for just a few seconds, and the memory was exactly the same, because Oj doesn't change the ActiveRecord side, which is where the memory goes. This fast-JSON area is promising, though; I think Stan mentioned a gem as well that may help.
Then, well, there are a few links there, so feel free to dig a bit more into this if you're interested, as there are a few open issues and suggestions there at the moment. This is ongoing, and it's hard to give a schedule for it, but I think it would be useful. One thing that we do now for customers, on the infrastructure side, is that we have these foreground imports: if we want to import a big project for a customer, we do it in the foreground instead.
Okay, moving on to security. The thing with the import/export is that it's about three years old now, about as long as I've been in the company, and we haven't touched it that much; it's pretty much the same as it was. That's part of why the performance is what it is now: we kept adding things to it and it got really slow. We should also perform a code audit on it, because we keep encountering a few security issues, especially lately, and we haven't really dug much into it.
There's an issue there to do this, I think, and as I mentioned, we should prioritize it to help with the security aspect. There are a few more things in terms of security that we do. I mentioned earlier the attribute cleaner: this removes anything that ends with ID, since there are some references in the archive that we shouldn't trust; we rewrite the IDs and we verify the formatting, basically.
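The attribute-cleaning step can be sketched like this. It is a simplified stand-in for GitLab's real attribute cleaner, which has more rules than just the ID suffix shown here:

```ruby
# Simplified sketch of cleaning untrusted import attributes: drop any key
# ending in _id (plus a bare 'id'), since numeric references from another
# instance are meaningless or dangerous here. GitLab's real cleaner has
# more rules; this shows only the suffix-based one.
def clean_attributes(attributes)
  attributes.reject { |key, _value| key == 'id' || key.end_with?('_id') }
end
```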
Now, the versioning. The version is basically this: if we increase this number, it means that older archives won't be compatible, and this matters because we keep changing things. For a specific release we may have quite a few changes that do not require a version bump, because they won't break the import/export. If we were to increase this number for every change, even ones that don't break anything, then it would be bumped maybe ten times every GitLab release, because we keep adding new things.
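A compatibility check along these lines can be sketched with Ruby's built-in Gem::Version. The version numbers and the "not newer than current" rule are illustrative; the real check lives in GitLab's import/export code:

```ruby
# Illustrative sketch: treat an archive as importable only if its export
# version is not newer than the version this instance writes. The numbers
# and the exact comparison rule are assumptions for illustration.
CURRENT_EXPORT_VERSION = '0.2.4'

def compatible_archive?(archive_version)
  Gem::Version.new(archive_version) <= Gem::Version.new(CURRENT_EXPORT_VERSION)
end
```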
So it's a bit of a pain for the customers, because, obviously, every time we bump the version up, the customers may find that they can't import an archive from their old instances into the new one. So we try to limit this as much as possible, which leads to the next question.
Renaming is one of the main problems we have, especially because we rename relations and columns quite often, and for most of our customers or users that's a bit of a problem. This has changed recently: we now try to support at least one further version, so we don't have this problem. Basically, now we can use these services so that for one version, say 11.6, we make it compatible; in this example, we renamed pipelines to ci_pipelines.
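The renaming shim can be sketched like this. The `pipelines` to `ci_pipelines` rename is the example from the talk; the hash-based implementation is simplified, since the real code operates on the relation tree during import:

```ruby
# Sketch of a backwards-compatible rename on import: accept the legacy key
# from older archives and map it to the new name. Hash-based and simplified.
LEGACY_RENAMES = { 'pipelines' => 'ci_pipelines' }.freeze

def upgrade_relation_names(project_tree)
  project_tree.each_with_object({}) do |(key, value), upgraded|
    upgraded[LEGACY_RENAMES.fetch(key, key)] = value
  end
end
```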
So, for 11.6, exports from 11.6 will work on older instances too, because the relation will still get exported as pipelines as well, while in 11.7 we will expect this to be removed. And this is great for customers, because before, once we deployed a new GitLab version and bumped the export version, most of the exports that they had wouldn't work; with this change, hopefully it will help.
Okay, someone is asking: what was the reason for not using standard semver, given that it can be bumped a number of times each release? Yes, because there are compatible changes all the time; we keep adding not only models but columns, and changing stuff very frequently. We could do it, but I think it would be a waste of time and not very useful.
Okay, let's go to the next slide, where we basically do a quick dive into the code. Probably the most important thing about the import/export, and this is used by customers quite a lot, is this configuration file, import_export.yml, and in here we can specify what we actually export or import in an instance.
And sometimes customers may not be interested in exporting certain things. The issue with that is that they do need to restart the instance after a change here. But yeah, there's another thing about this file: we can say which attributes we want to include and which ones we don't want to include. So we can exclude certain attributes, which is quite useful for security purposes as well; we can exclude tokens and things like that.
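The include/exclude mechanism can be sketched like this. The YAML fragment and its keys are illustrative, not the real import_export.yml schema:

```ruby
# Sketch of config-driven attribute filtering, in the spirit of
# import_export.yml. The config structure here is illustrative only.
require 'yaml'

CONFIG = YAML.safe_load(<<~YAML)
  excluded_attributes:
    project:
      - runners_token
      - import_url
YAML

def exportable_attributes(relation, attributes)
  excluded = CONFIG.fetch('excluded_attributes', {}).fetch(relation, [])
  attributes.reject { |key, _| excluded.include?(key) }
end
```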
There's this methods section in here as well. We don't use it very often, but sometimes we may want to export certain things that may not be an attribute or a relation: we may want to call a method on the model and export the result. That's the case for a few things, like this utf8 one. This has been there a few years, but I think there was an issue when text wasn't UTF-8 and the JSON wouldn't work properly, so we changed it to use this and export the UTF-8 version.
It means that we can keep the archives in object storage if we configure them like that. So, basically, this import file service extracts that file, and then the next thing we do is check the version to see if it's compatible or not. Then we call a few restorers; what they basically do is either extract the repo, or handle everything to do with the database models, and the uploads, LFS, and a few other things.
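The sequence of restorers can be sketched as a simple fail-fast pipeline. The lambda "restorers" below are stand-ins for the real restorer classes, each of which reports whether it succeeded:

```ruby
# Sketch of running restorers in sequence: each step returns true/false and
# the import fails fast on the first failure (all? short-circuits). The
# lambdas in the tests stand in for GitLab's real restorer classes.
def restore_all(restorers)
  restorers.all? { |restorer| restorer.call }
end
```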
And then the link to the presentation will be here, although there's not a presentation per se; as I mentioned, it's a markdown document, and I'm hoping I will submit a merge request to add this to the root of the import/export namespace, which is around here. I think this would be useful for developers to check, so we'd have like a README there where we fill in all of this information.