Data Engineer – Data Ingestion Platforms

Job Locations
RO-B-Bucharest | RO-IS-Iași
Job area
IT & Digital
Employment type
Permanent
Workplace
Hybrid
Experience level
Associate / Mid-Level

Overview

Expleo is a global engineering, technology, and consulting service provider that partners with leading organizations to guide them through their business transformation, helping them achieve operational excellence and future-proof their businesses.
Expleo benefits from more than 50 years of experience developing complex products in automotive and aerospace, optimizing manufacturing processes, and ensuring the quality of information systems. Leveraging its deep sector knowledge and wide-ranging expertise in fields including AI engineering, digitalization, automation, cybersecurity, and data science, the group’s mission is to fast-track innovation through each step of the value chain.
With 19,000 employees and a presence in 30 countries, our global footprint includes centers of excellence around the world. We have been present in Romania since 1994 and currently count 1,700 colleagues.

Responsibilities

We are seeking a Data Engineer to join a growing data ingestion team responsible for building and operating scalable, production-ready data pipelines. You will work under the guidance of a Lead Data Engineer and collaborate with distributed team members to deliver reliable, automated, and observable ingestion solutions across multiple data sources and environments.

  • Develop and maintain batch and streaming data ingestion pipelines using established patterns
  • Implement data extraction from APIs, databases, filesystems, and event-based sources
  • Write clean, testable, and well-documented Python code aligned with team standards
  • Contribute to automation, monitoring, and logging of data pipelines (see the sketch after this list)
  • Participate in code reviews, integration testing, and agile delivery ceremonies
  • Support operational activities including incident investigation and hotfix implementation
  • Apply cost-aware practices when using cloud resources
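
To give a flavour of the work described above, here is a minimal, hedged sketch of an observable batch ingestion step in Python with Pandas. It is illustrative rather than Expleo's actual code: the extract/transform/load function names, the in-memory source, and the Parquet target are assumptions (writing Parquet also assumes pyarrow is installed).

    import logging
    from datetime import datetime, timezone

    import pandas as pd

    logging.basicConfig(level=logging.INFO,
                        format="%(asctime)s %(levelname)s %(message)s")
    logger = logging.getLogger("ingestion")

    def extract(records: list[dict]) -> pd.DataFrame:
        # Placeholder source; a real pipeline would pull from an API,
        # database, filesystem, or event-based source.
        return pd.DataFrame.from_records(records)

    def transform(df: pd.DataFrame) -> pd.DataFrame:
        # Normalize column names and stamp each row with the ingestion time.
        df = df.rename(columns=str.lower)
        df["ingested_at"] = datetime.now(timezone.utc).isoformat()
        return df

    def load(df: pd.DataFrame, target: str) -> None:
        # Persist the batch and log the row count so the run is observable.
        df.to_parquet(target, index=False)
        logger.info("wrote %d rows to %s", len(df), target)

    def run_batch(records: list[dict], target: str) -> None:
        # One batch run; failures are logged with a stack trace and re-raised.
        try:
            load(transform(extract(records)), target)
        except Exception:
            logger.exception("batch failed for target %s", target)
            raise

    if __name__ == "__main__":
        run_batch([{"ID": 1, "Value": "a"}, {"ID": 2, "Value": "b"}],
                  "batch.parquet")

In a production pipeline the load step would target storage such as S3 or Databricks; the structured logging shown here is what makes a failing run easy to investigate during incident work.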

Qualifications

Required Skills & Experience:

  • Solid experience with Python for data engineering, including unit testing, packaging, and documentation (a small test sketch follows this list)
  • Hands-on exposure to PySpark and Pandas for data processing
  • Basic experience with streaming or event-driven architectures
  • Familiarity with containerization and CI/CD workflows (e.g., Docker, GitHub Actions)
  • Experience working with cloud-based data platforms, preferably AWS and Databricks
  • Practical use of AI-assisted coding tools (e.g., GitHub Copilot) in daily development
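
As a concrete, hedged example of the unit-testing expectation above, here is a small Pandas transform with a pytest-style test. The deduplicate_events function, its columns, and its latest-record-wins rule are invented for illustration.

    import pandas as pd

    def deduplicate_events(df: pd.DataFrame, key: str = "event_id") -> pd.DataFrame:
        # Keep the most recent row per key, assuming an "updated_at" column.
        return (
            df.sort_values("updated_at")
              .drop_duplicates(subset=key, keep="last")
              .reset_index(drop=True)
        )

    def test_deduplicate_events_keeps_latest():
        df = pd.DataFrame({
            "event_id": [1, 1, 2],
            "updated_at": ["2024-01-01", "2024-01-02", "2024-01-01"],
            "value": ["old", "new", "only"],
        })
        out = deduplicate_events(df)
        # Duplicate key 1 collapses to its newest row; key 2 is untouched.
        assert len(out) == 2
        assert out.loc[out["event_id"] == 1, "value"].item() == "new"

A test like this runs under pytest locally or in a CI workflow such as GitHub Actions, which is the kind of setup the containerization and CI/CD bullet points toward.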

Nice to have:

  • Experience with monitoring tools and operational dashboards 
  • Exposure to FinOps or cost monitoring concepts   
  • Prior work in distributed or remote delivery teams  

Benefits

  • Benefit Platform 
  • Holiday Voucher 
  • Private medical insurance  
  • Performance bonus 
  • Easter and Christmas bonus 
  • Employee referral bonus 
  • Bookster subscription  
  • Work-from-home options, depending on the project

