Description
Elizabeth Cohen and Maggie Hays sat down with Sergio Gomez, Senior Data Engineer from Adevinta - a global classifieds platform - to hear about his experience working with the DataHub Community and rolling out DataHub across the organization.
Learn more about DataHub: https://datahubproject.io
Join us on Slack: http://slack.datahubproject.io
Follow us on Twitter: https://twitter.com/datahubproject
B: Adevinta is a global classifieds specialist with market-leading positions around the world. The Adevinta portfolio spans about 40 marketplaces for real estate, jobs, motors, consumer goods and so on, operating in 15 countries, covering one billion people, with around three billion monthly visits. So quite challenging and exciting for a data engineer like me.
B: The first product is about collecting all the user activity on the different marketplaces, storing the data in a data lake, and making the data easily accessible for analytics, machine learning and so on. And with the other product, we are moving towards a decentralized model for data management in the whole organization.
B: We had to decide whether to redesign our solution or adopt a new one, so we did an assessment of existing data catalog solutions, and we quickly found DataHub; when looking for candidates, it's obvious. From that assessment we first learned that redesigning our solution would be a lot of work, and also that DataHub was covering most of our needs. So it was a perfect match at that moment.
B: Yeah, actually, having an active community was one of the evaluation criteria in the assessment, and DataHub got one of the best results there. So we quickly knew that the community was very open, and that also gives us the opportunity to somehow influence the product: by sharing use cases, requesting features, discussing and raising issues, and also, of course, by contributing to the source code, as we have.
A: Yeah, we love contributors. So I'm curious, with the roll-out of DataHub, what has DataHub enabled within Adevinta so far?
C: You know, with rolling out DataHub and using DataHub, what are you most excited to see in 2022? I will also say that when we first started doing Humans of DataHub it was January, so it was a new year, and now it's May. So, yeah, what are you excited for in the rest of 2022, and maybe even thinking about 2023?
B: Yeah, yeah. Well, actually, it's hard to keep up with the pace of the features that you release; we lag behind with the upgrades, so it's a challenge in that sense. Looking at your roadmap, I'm really curious about the improvements to access policy management. At this moment the management is done in the UI, so it requires manual operations, and our expectation here is to be able to define roles and policies as code. So I'm curious about what these improvements will look like.
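The "roles and policies as code" wish could look something like the following minimal sketch. The policy fields loosely mirror what a catalog UI asks for (a name, actor groups, privileges), but `validate_policy`, `apply_policies`, and the privilege strings here are illustrative assumptions, not DataHub's actual API.

```python
# A hypothetical "policies as code" sketch: policies are declared as
# data, validated, and would then be pushed to the catalog by a
# deployment pipeline instead of being clicked together in a UI.

POLICIES = [
    {
        "name": "analysts-read-only",
        "actors": {"groups": ["analysts"]},
        "privileges": ["VIEW_ENTITY_PAGE", "SEARCH"],
    },
    {
        "name": "platform-admins",
        "actors": {"groups": ["data-platform"]},
        "privileges": ["MANAGE_POLICIES", "MANAGE_INGESTION"],
    },
]


def validate_policy(policy: dict) -> bool:
    """A policy must declare a name, at least one actor group,
    and at least one privilege before it can be applied."""
    return bool(
        policy.get("name")
        and policy.get("actors", {}).get("groups")
        and policy.get("privileges")
    )


def apply_policies(policies: list) -> list:
    """Collect the names of the valid policies; a real pipeline would
    push each valid policy to the catalog's policy endpoint here."""
    return [p["name"] for p in policies if validate_policy(p)]
```

The point of the pattern is that the policy definitions live in version control, so changes are reviewed and reproducible rather than applied by hand.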
A: That's awesome.

B: And I'm thinking in the long term, so I'm not sure if by 2022 or later, but the big challenge I see, not just for DataHub but when adopting this technology in our organization, is the data culture, because it also requires changing the culture of the people. A simple example I like to share: solving all the missing table or field descriptions.
B: This is critical for effective data discovery, but it requires people to adopt this data governance. Or data quality: we can profile data sources so users can check, somehow, the quality of the data, but making data quality part of your dataset contract requires a new mindset.
A: It's a big challenge, yeah. The data culture component, right? There's no one tool that can solve that, but figuring out how do we... I think that's where I'm excited to see the DataHub community start to cultivate more of those conversations: how do you roll out? How do you start to shift that data culture within an organization? Because you can introduce all of the tools in the world.
A: So it sounds like you all are maybe a little bit behind in your version, which is totally fine; we are shipping releases very rapidly. But with your current instance, what's your favorite use case within DataHub, or what do you find to be most impactful within Adevinta in your current deployment?
B: At the same time, it can also be extended if needed, and we also love the stream-based model for both publishing and consuming the metadata.
B: Yeah, well, I think you are doing a great job in terms of communication by having different channels with different purposes. That really helps to reduce the noise and keep the focus. But as an engineer, definitely the troubleshooting channel is the one I love the most, because it's the one solving my problems when I'm facing some issues. So it's great. And it's not a Slack channel, but I also like a lot the town hall sessions, because they are great to keep track of the progress and also to learn about the new features. Sometimes a five-minute session is much better than any documentation and so on.

A: Right, yeah.
A
Let's
get
nice
a
nice
way
to
kind
of
contextualize,
all
all
of
the
stuff,
we're
cranking
out
all
right.
So
one
one
final
question
for
you:
you've
been
a
tremendous
contributor
back
to
the
code
base.
So
not
only
are
you
active
in
the
community
and
not
only
are
you
jumping
in
to
help
other
people,
but
you
also
contribute
back
to
the
code
as
well,
and
I'm
curious.
You
know
kind
of
thinking
back
to
your
contributions.
A
Is
there
one
that
that
you're,
particularly
proud
of
or
one
that
you
feel
like
solved
a
you
know,
a
an
interesting
problem
and
and
that
you
just
wanted
to
kind
of
give
back
to
the
community.
B: Yeah, well, yes, there is one contribution about adding the platform instance capability to a connector, and I'm very proud of, or happy with, that contribution: not because it was especially complex or tricky, but because it showed up a misunderstanding, using the environment for the platform instance, that was affecting other connectors too. So I was really happy to find such a guiding issue, let's say, and, more importantly, to help with the real solution.
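The distinction behind that contribution can be sketched as follows: the environment (PROD, DEV, ...) and the platform instance are separate parts of a dataset's identity, and using the environment field to tell two clusters of the same platform apart was the misunderstanding. The sketch below assumes DataHub's `urn:li:dataset:(<platform-urn>,<name>,<env>)` URN shape, with the instance prefixed to the dataset name; the helper function itself is illustrative, not DataHub's client API.

```python
from typing import Optional


def make_dataset_urn(platform: str, name: str, env: str = "PROD",
                     platform_instance: Optional[str] = None) -> str:
    """Build a dataset URN. An optional platform instance is prefixed
    to the dataset name, so two instances of the same platform do not
    collide while both stay in the same environment."""
    qualified = f"{platform_instance}.{name}" if platform_instance else name
    return f"urn:li:dataset:(urn:li:dataPlatform:{platform},{qualified},{env})"


# Two production MySQL clusters stay distinct via the instance,
# without abusing the environment field:
a = make_dataset_urn("mysql", "shop.orders", platform_instance="cluster_eu")
b = make_dataset_urn("mysql", "shop.orders", platform_instance="cluster_us")
```

Here both URNs keep `PROD` as the environment, which is exactly why connectors that lacked platform-instance support had to misuse the environment to separate clusters.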