CAREERS AT EDC RETAIL

(Senior) Data Engineer

Our Data Engineering team plays a central role in our data-driven strategy, designing and developing our infrastructure, processes and analytics capabilities! The team is looking for a (Senior) Data Engineer to help us design and implement a modern data architecture based on Snowflake. Are you curious? Then please continue reading and apply immediately!

 

HBO

 

32-40 hours per week

What will you be doing

As the Data Team, we deliver high-quality data, information and knowledge that enable relevant decision-making across the organization, in close collaboration with our partners. The team plays a central role in our strategy, designing and developing our infrastructure, processes and analytics capabilities.

The team is also responsible for making our huge volume of data accessible, trustworthy and efficient to use, so that our teams can make the best data-driven decisions.

Elements of your role include:

  • You design and implement ELT-based technology to achieve scale and performance;
  • You build and maintain our data infrastructure;
  • You help to grow our analytics capabilities with faster, more reliable data pipelines, and better tools;
  • You work closely with data analysts, data scientists and IT engineers to implement new ELT pipelines and find new ways to leverage our data (see the pipeline sketch after this list);
  • You develop and transform raw data sources into powerful, reliable components of our data lake;
  • You optimize and improve existing features or data processes for performance and stability;
  • You lead and participate in cross-functional projects that support data applications and reporting;
  • You investigate next-generation data and analytics technologies to expand the capacity and performance of our data stack.
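
To give a flavour of the kind of ELT pipeline this involves, here is a minimal, purely illustrative sketch of an Airflow DAG that loads raw data into Snowflake and then builds a reporting table inside the warehouse. It assumes Airflow with the Snowflake provider installed; the connection id, stage, schema and table names (snowflake_default, raw.orders, mart.daily_orders) are hypothetical placeholders, not our actual setup.

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

    # Illustrative only: connection id, stage and table names are hypothetical.
    with DAG(
        dag_id="orders_elt",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        # Step 1: land the raw files in the raw layer of the warehouse (the "EL").
        load_raw = SnowflakeOperator(
            task_id="load_raw_orders",
            snowflake_conn_id="snowflake_default",
            sql="COPY INTO raw.orders FROM @raw_stage/orders/ FILE_FORMAT = (TYPE = 'CSV')",
        )

        # Step 2: transform the raw layer into a reporting table (the "T" in ELT).
        build_mart = SnowflakeOperator(
            task_id="build_orders_mart",
            snowflake_conn_id="snowflake_default",
            sql="""
                CREATE OR REPLACE TABLE mart.daily_orders AS
                SELECT order_date, COUNT(*) AS orders, SUM(amount) AS revenue
                FROM raw.orders
                GROUP BY order_date
            """,
        )

        load_raw >> build_mart

In practice the transformation step is more likely to live in dbt models that the DAG triggers, but the structure stays the same: extract and load first, then transform inside the warehouse.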

Who are you

  • You love to learn, are a critical thinker and creative problem solver who strives for excellence;
  • You have a degree in computer science, informatics or a related field focused on data processing/analytics;
  • You have at least 3 years of professional experience in designing, building, monitoring, or scaling data processing pipelines;
  • Experience with Python or R (or Java/Scala with Spark, and a willingness to work in Python);
  • Advanced knowledge of building and maintaining complex, modern DWH architectures and data services/pipelines (ETL);
  • Experience with at least one DWH-oriented DBMS (Snowflake, Vertica, Exasol, Exadata, etc.);
  • Understanding of big data (batch and streaming), data warehousing and tools such as Apache Spark, Snowflake, Azure Data Lake Storage and Airflow (or similar);
  • Strong SQL skills, including advanced querying (see the query sketch after this list);
  • Solid experience working with Snowflake and ETL/ELT tools (e.g. Airflow, Fivetran, dbt);
  • Comfortable with using the shell command line and Git.
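
As an impression of what we mean by advanced querying, here is a small, purely illustrative Python sketch that runs a window-function query against Snowflake using the snowflake-connector-python package. The credentials, schema and column names (raw.orders, customer_id, amount) are hypothetical placeholders.

    import snowflake.connector

    # Running spend over each customer's first three orders, using window
    # functions and Snowflake's QUALIFY clause. Names are hypothetical.
    QUERY = """
        SELECT customer_id,
               order_date,
               SUM(amount) OVER (PARTITION BY customer_id ORDER BY order_date) AS running_spend
        FROM raw.orders
        QUALIFY ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY order_date) <= 3
    """

    conn = snowflake.connector.connect(
        account="my_account",      # placeholder credentials
        user="my_user",
        password="my_password",
        warehouse="analytics_wh",
        database="analytics",
    )
    try:
        cur = conn.cursor()
        cur.execute(QUERY)
        for customer_id, order_date, running_spend in cur:
            print(customer_id, order_date, running_spend)
    finally:
        conn.close()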

What do we ask

  • Proficiency in English for communicating by email and on conference calls;
  • Team player with excellent communication and collaboration skills;
  • Able to convey complex technical topics clearly and directly;
  • Proactive, self-directed and organized;
  • You are a data nerd who can tell big data buzzwords from Pokémon names ;-)

What do we offer you

  • A unique opportunity within a young company;
  • A job at one of the most exciting companies in the Netherlands;
  • A 'work hard, play harder' atmosphere;
  • The space to develop yourself;
  • Free use of our own in-house gym and/or a discount on your sports subscription via Bedrijfsfitness Nederland;
  • A job for 32-40 hours a week with a competitive salary.

Apply

Enthusiastic? Respond quickly via the link below.