
Data Engineers for a food delivery project

Ongoing

EUR 5 200 - 7 700 for Senior Data Engineer

EUR 3 300 - 5 000 for Regular Data Engineer


Join Netguru Talent Marketplace, a proven partner for tech-minded freelancers and experts. Thanks to us, you will have access to various project-based opportunities and can collaborate with different companies and industries. As a result, you will not only gain more experience but also develop a variety of skills you didn’t even know you had. Work the way you like, on your terms, with no strings attached.

We're developing four mobile food delivery applications for an innovative multinational enterprise. Netguru has become a strategic development and design partner for this client, taking ownership of 25+ projects so far across various products within the group. One of our teams has been supporting the company in optimizing the customer journey since the beginning of October 2020. We are currently looking for Data Engineers with excellent technical and soft skills to join ongoing, long-term projects for this client.

  • Required skills: Python, SQL, Spark/Glue, ETL (Airflow), GitHub Actions, CI/CD, Docker, English at B2+ level.
  • Nice to have: AWS Redshift/GCP BigQuery, Snowflake, Scala, Hadoop, experience with TB+ datasets in the cloud, HDFS/Parquet/Avro, Terraform for IaC.
  • We offer: 100% remote work, flextime & flexplace, dev-friendly processes, long-term collaboration.

Apply if you:

  • Are advanced in the Python programming language (iterators, generators, exceptions, OOP, and popular data engineering libraries).
  • Have advanced SQL knowledge.
  • Have experience in Docker (instantiating a container from a configuration).
  • Are passionate about data and have computer science fundamentals.
  • Have experience running, maintaining, and deploying your own code to production.
  • Have 3+ years of experience building data pipelines in a professional working environment.
  • Have experience processing large amounts of structured and unstructured data.
  • Have a good understanding of distributed and streaming data processing.
  • Have experience with Apache Spark or similar solutions.
  • Have experience with ETL (Airflow) or other data processing automation approaches.
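To illustrate the kind of Python fluency the list above refers to (iterators and generators applied to pipeline work), here is a minimal sketch; the function and field names are hypothetical, not taken from the project:

```python
import json

def read_events(lines):
    """Lazily parse raw JSON lines into dicts. Because this is a
    generator, arbitrarily large inputs are streamed, never loaded
    into memory at once."""
    for line in lines:
        line = line.strip()
        if line:
            yield json.loads(line)

def valid_orders(events):
    """Filter step: keep only events with a positive order amount."""
    for event in events:
        if event.get("amount", 0) > 0:
            yield event

def total_amount(events):
    """Terminal step: reduce the stream to a single aggregate."""
    return sum(event["amount"] for event in events)

raw = [
    '{"order_id": 1, "amount": 12.5}',
    '{"order_id": 2, "amount": 0}',
    '{"order_id": 3, "amount": 7.5}',
]
print(total_amount(valid_orders(read_events(raw))))  # -> 20.0
```

Chaining generators like this is the same pattern, in miniature, that scales up to batch ETL jobs in Spark or Airflow tasks.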

We'll be happy to see that you have:

  • Experience with Google BigQuery, PubSub, AWS SNS, AWS Lambda.
  • Experience with data sources like Salesforce and Microsoft Dynamics.
  • Experience with Docker, Travis, Airflow, Terraform, Kubernetes.
  • Practical knowledge of DevOps practices, i.e. CI/CD, Terraform, and observability.
  • The ability to debug complex data infrastructures.

Depending on your skills, joining Netguru as a Data Engineer could mean:

Project 1

  • Working with the client’s Data Engineering and Data Science team (around 15 people) to build custom data pipelines to support 4,000+ users.
  • Building ingestion pipelines from multiple source systems.
  • Taking part in a company-wide analytical reporting redesign.
  • Working with both batch and streaming data, in an approximate 60/40 proportion respectively*.
  • Data-driven mindset - our clients require PoCs, data exploration/normalization, and expertise.
  • Monitoring data flows and making continuous improvements to data pipelines with custom Airflow operators.
  • *Good understanding of streaming data processing, experience with Google BigQuery, PubSub, AWS SNS, AWS Lambda, and data sources like Salesforce and Microsoft Dynamics are required for this position.

Project 2

  • Working with the client’s Data Engineering and Data Science team (around 15 people) to build custom data pipelines to support 4,000+ users.
  • Building the data model of customers from scratch (empty BigQuery project), data model adjustment to deliver required analytical features*.
  • Taking part in a company-wide analytical reporting redesign.
  • Data-driven mindset - our clients require PoCs, data exploration/normalization, and selection of the right technological stack.
  • Monitoring data flows and making continuous improvements to data pipelines with custom Airflow operators.
  • *Experience with Google BigQuery and with data sources like Salesforce and Microsoft Dynamics is required for this position.

Project 3

  • Working with the client’s product teams to build custom data pipelines*.
  • Data-driven mindset - our clients require PoCs, data exploration/normalization, and expertise.
  • Monitoring data flows and making continuous improvements to data pipelines with custom Airflow operators.
  • *Experience with Google BigQuery is required for this position.
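Several project bullets above mention custom Airflow operators. In Airflow, these are ordinary Python classes subclassing `airflow.models.BaseOperator` and overriding `execute`. The sketch below shows the shape of the pattern with a stand-in base class so it runs without an Airflow installation; the operator name and check logic are hypothetical:

```python
class BaseOperator:
    """Stand-in for airflow.models.BaseOperator, so this sketch
    runs without an Airflow installation."""
    def __init__(self, task_id):
        self.task_id = task_id

class RowCountCheckOperator(BaseOperator):
    """Hypothetical data-quality operator: fail the task if a
    table's row count falls below a threshold."""
    def __init__(self, task_id, fetch_count, min_rows):
        super().__init__(task_id)
        self.fetch_count = fetch_count  # callable returning a row count
        self.min_rows = min_rows

    def execute(self, context=None):
        # In real Airflow, `context` carries runtime metadata
        # (execution date, task instance, etc.).
        count = self.fetch_count()
        if count < self.min_rows:
            raise ValueError(
                f"{self.task_id}: expected >= {self.min_rows} rows, got {count}"
            )
        return count

check = RowCountCheckOperator("check_orders",
                              fetch_count=lambda: 120,
                              min_rows=100)
print(check.execute())  # -> 120
```

In a real DAG the same class would subclass the genuine `BaseOperator`, and `fetch_count` would be replaced by a database hook query.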

In return, we offer:

  • working with an experienced, distributed team;
  • a mentor who will assist you during your first days;
  • possibility of a long-term collaboration on other challenging products in the future;
  • continuous development of your hard and soft skills.

Looking for a full-time job? Check out our Career Page and find out more about our open recruitment processes.

If you need any disability-related adaptation at any step of the recruitment process – simply let the recruiter know! We'd be happy to help.