
Senior Data Engineer for mobility fintech project

From the Data & Machine Learning department

Opening soon

EUR 5 200 - 7 700

Before Tax (Business Contract)


If you run into any issues with our application forms, please contact us at hi@netguru.com.

Our mission at Netguru is to help entrepreneurs and innovators shape the world through beautiful software. We care about trust, taking ownership, and transparency. As a Certified B Corporation®, we offer a safe, inclusive, and productive environment for all team members, and we're always open to feedback. If you want to work from home and be a full-time employee, great! We want to create the right opportunities for you.

  • Required skills: 3+ years of experience in Data Engineering or 4+ years in Software Engineering; strong skills in Python and the GCP platform; experience with GCP BigQuery, Apache Airflow, and DataOps.
  • We offer: 100% remote work, flextime & flexplace, dev-friendly processes, long-term collaboration.

We are looking for a Senior Data Engineer to join one of our key projects in the area of mobility financing and build a long-term relationship with our client: a startup developing a platform that is revolutionizing vehicle financing in countries with limited or no access to such services.

Project description:

  • You will join a team of experts building a large-scale product for millions of users - already live in 6 countries, with the potential to expand to other continents,
  • You will co-create cutting-edge solutions with clients, partners, fin-techs, and other technology leaders,
  • You will contribute to the development of several innovation projects.

Ready to apply? First check if you:

  • Have 3+ years of experience in Data Engineering or 4+ years in Software Engineering,
  • Are advanced in Python programming (good knowledge of OOP, iterators, generators, lambda functions, data structures, and the Python ecosystem's code-quality tools),
  • Have experience with DataOps, which is essential for everyday tasks (observability, data quality, and infrastructure monitoring),
  • Have strong skills in the GCP platform (with a focus on data services) and know AWS well,
  • Have practical experience with GCP BigQuery (writing and optimizing advanced SQL queries, partitioning, and understanding the cost model),
  • Have at least 1 year of experience in Apache Airflow.
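As a minimal illustration of the Python fluency described above (iterators, generators, lambda functions), here is a self-contained sketch; the trip records and function names are invented for illustration only:

```python
# Illustrative only: a generator-based pass over fictional trip records.
from typing import Dict, Iterator


def read_trips() -> Iterator[Dict]:
    """Yield trip records lazily (a generator, so nothing is held in memory at once)."""
    trips = [
        {"country": "KE", "amount": 120.0},
        {"country": "NG", "amount": 80.0},
        {"country": "KE", "amount": 40.0},
    ]
    for trip in trips:
        yield trip


def total_for(country: str, trips: Iterator[Dict]) -> float:
    """Sum amounts for one country using filter/map with lambdas over the iterator."""
    in_country = filter(lambda t: t["country"] == country, trips)
    return sum(map(lambda t: t["amount"], in_country))


print(total_for("KE", read_trips()))  # 160.0
```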

We'd also appreciate:

  • Experience in Apache Beam framework (with Python SDK; Pipeline, PCollection, PTransform concepts),
  • Knowledge about GCP Dataflow (ready to switch from Airflow to Dataflow),
  • Good understanding of GCP pub/sub service (connecting from Apache Beam) and experience with the Apache Kafka ecosystem,
  • Experience with Change Data Capture (CDC) concept,
  • Knowledge about building/consuming data from APIs,
  • Experience with the Great Expectations Python library (a plus).
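To make the Change Data Capture (CDC) bullet concrete: CDC means emitting change events (inserts, updates, deletes) rather than repeatedly re-reading full tables. In production this is usually done by tailing a database log (e.g. Debezium publishing to Kafka), but the idea can be sketched as a snapshot diff in plain Python; the table contents below are made up:

```python
# Toy CDC: diff two snapshots of a table (keyed by primary key) into change events.
def capture_changes(before: dict, after: dict) -> list:
    """Return (op, key, row) events describing how `before` became `after`."""
    events = []
    for key, row in after.items():
        if key not in before:
            events.append(("insert", key, row))
        elif before[key] != row:
            events.append(("update", key, row))
    for key in before:
        if key not in after:
            events.append(("delete", key, None))
    return events


snapshot_v1 = {1: {"status": "pending"}, 2: {"status": "paid"}}
snapshot_v2 = {1: {"status": "paid"}, 3: {"status": "pending"}}
print(capture_changes(snapshot_v1, snapshot_v2))
# [('update', 1, {'status': 'paid'}), ('insert', 3, {'status': 'pending'}), ('delete', 2, None)]
```

Log-based CDC avoids the cost of full snapshots, but the event model - insert/update/delete per key - is the same.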

Joining Netguru as a Senior Data Engineer on this project means:

  • Developing a clear understanding of the data use cases on the client's side and the critical user journeys, in order to design data systems
  • Defining the database structure and how we store our data (e.g., creating Entity Relationship Diagrams (ERDs))
  • Selecting database technology for both relational and non-relational databases (in particular, time-series databases)
  • Defining APIs and services for accessing data
  • Defining data validation and constraints to meet business needs
  • Integrating our data systems into an event-based architecture (especially with Kafka)
  • Working with an experienced team
  • Enjoying 100% remote work - we've developed a perfect remote work culture
  • Following processes based on the Scrum and Agile methodologies
  • Benefiting from dev-friendly processes such as Continuous Integration, Continuous Delivery, Code Review, and bug bashes
  • Continuously developing your hard and soft skills
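The "data validation and constraints" responsibility above can be sketched, in a deliberately simplified form, as rule-based row checks; a real project would typically reach for a library such as Great Expectations, and the field names and rules here are invented:

```python
# Minimal rule-based row validation; each rule is a (description, predicate) pair.
RULES = [
    ("amount is non-negative",
     lambda row: row.get("amount", 0) >= 0),
    ("currency is a 3-letter code",
     lambda row: isinstance(row.get("currency"), str) and len(row["currency"]) == 3),
]


def validate(row: dict) -> list:
    """Return descriptions of every rule the row violates (empty list = valid)."""
    return [desc for desc, check in RULES if not check(row)]


print(validate({"amount": 99.5, "currency": "EUR"}))  # []
print(validate({"amount": -5, "currency": "euros"}))  # both rules fail
```

Keeping the rules as data rather than inline `if` statements makes it easy to report every violation at once and to grow the rule set alongside the schema.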


What will happen next?

  • If your application meets our requirements, you will be invited to a short meeting with the recruiter - we will be happy to get to know you and tell you more about us!
  • The next step is a technical interview with one of our Data Engineers (you may be asked to solve a short task beforehand).
  • If everything goes well, we will be happy to welcome you on board!

If you need any disability-related adaptation at any step of the recruitment process – simply let the recruiter know! We'd be happy to help.