Job Title: Senior Data Engineer
Company Description:
Founded in 2014, Shippeo is a leading European SaaS company specializing in supply chain visibility. Fueled by $110 million in funding from top investors, the company has expanded rapidly and now counts a diverse team of 27 nationalities speaking 29 languages. With offices across Europe, North America, and Asia, Shippeo provides real-time transportation visibility to customers in industries including retail, manufacturing, automotive, and CPG. Our ultimate goal is to become the leading data platform for the freight industry, using our growing network, real-time data, and AI to enhance supply chain operations and customer service.
Position Overview:
The Data Intelligence Tribe at Shippeo uses data from our large shipper and carrier base to build data products that deliver predictive insights to our users. As a Data Engineer, you will ensure that data is readily available and actionable for our Data Scientists and Analysts across our data platforms. You will be instrumental in building and maintaining our modern data stack, which includes technologies such as Kafka, Airflow, Snowflake, and Kubernetes.
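To give a concrete flavor of this stack, here is a minimal, illustrative Airflow DAG sketch; the DAG name, schedule, and load step are hypothetical placeholders, not Shippeo's actual pipelines.

    # Illustrative sketch only: a minimal Airflow DAG of the kind this role
    # builds and maintains. DAG name, schedule, and the load step are
    # hypothetical placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def load_events_to_warehouse(**context):
        # Placeholder extract/load step, e.g. moving Kafka-sourced events into
        # a Snowflake staging table before downstream transformations run.
        print(f"Loading events for {context['ds']}")


    with DAG(
        dag_id="example_events_pipeline",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@hourly",
        catchup=False,
    ) as dag:
        PythonOperator(
            task_id="load_events_to_warehouse",
            python_callable=load_events_to_warehouse,
        )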
Responsibilities:
– Build, maintain, test, and optimize data pipelines and architectures
– Extract and analyze data to provide actionable insights for operational performance
– Implement advanced cleansing and enhancement rules to ensure high data quality
– Work with technologies such as Kafka, Kafka Connect, RabbitMQ, Airflow, DBT, Snowflake, and BigQuery
– Apply strong Python programming skills and advanced SQL knowledge
– Collaborate with cross-functional teams to support data-driven decision-making
– Continuously improve and evolve CI/CD pipelines for efficient data processing
– Monitor and optimize infrastructure performance
Qualifications:
Required:
– MSc or equivalent degree in Computer Science
– 3+ years of experience as a Data Engineer
– Experience with data pipelines and architectures
– Proficiency in Python and SQL
– Familiarity with message queuing and stream processing
– Advanced knowledge of Docker and Kubernetes
– Experience with a cloud platform, preferably GCP
– Experience with a cloud-based data warehousing solution, preferably Snowflake
– Experience with Infrastructure as Code, specifically Terraform/Terragrunt
– Experience with CI/CD pipelines, preferably GitHub Actions
Desired:
– Experience with Kafka and Kafka Connect, specifically Debezium
– Experience with monitoring and alerting tools such as Grafana and Prometheus
– Experience with workflow management systems like Airflow
– Experience with Apache NiFi
Additional Information:
Application Requirements:
As Shippeo operates internationally, please submit your CV in English. We are looking for individuals who are ready to take on a challenge and grow their career in a supportive and innovative environment.
Recruitment Process:
– Interview with our Talent Acquisition Manager
– Interview with the Hiring Manager
– Business Case
– Final Interview
Our Values:
At Shippeo, we value the following:
– Ambition
– Care
– Deliver
– Collaboration
Learn more about our values in Our Culture Book.
Discover Your Dream Team:
Get to know our team members and their roles at Shippeo by watching their videos.
Diversity Statement:
We are dedicated to fostering diversity and inclusion within our workplace. We believe in the value of unique perspectives and experiences that individuals from all backgrounds bring to our team. We provide equal employment opportunities to all candidates, regardless of their background or abilities. Our commitment to inclusion is reflected in our policies, practices, and workplace culture.
We understand that candidates may have unique needs or questions related to disability inclusion. For any inquiries or requests for accommodations during the application process, please reach out to our dedicated Disability Advisor at inclusion@shippeo.com.