Senior Data Engineer

for Smart Metering solutions
Location: Remote; Kyiv, Ukraine; Wroclaw, Poland; Bucharest, Romania
Area: Data
Tech Level: Senior
Tech Stack: Scala, Apache Spark, AWS (S3, Glue, Athena), Apache Hudi, Apache Kafka, Terraform, Docker

About the Client

A leading European smart metering manufacturer serving approximately 800 customers through its proprietary IoT platform. The company is undergoing a major digital transformation, migrating from on-premises monolithic infrastructure to a modern cloud-native, multi-tenant architecture on AWS.

Project details

The cloud transformation program focuses on migrating existing infrastructure to AWS, transitioning from a monolithic single-tenant setup to a scalable, cloud-native platform. The current phase involves building a data lakehouse on AWS: migrating and re-architecting data interpretation and processing pipelines, and working with Scala-based services and Terraform-managed infrastructure.

Your Team

You will work alongside Brightgrove's Technical Lead and the client's Product Owner and Team Lead.

What's in it for you

  • Interview process that respects people and their time
  • Professional and open IT community
  • Internal meet-ups and resources for knowledge sharing
  • Time for recovery and relaxation
  • Bright online and offline events
  • Opportunity to become part of our internal volunteer community

Responsibilities

  • Design and implement data pipelines for migrating data interpretation services into the AWS environment
  • Contribute to building and evolving the data lakehouse architecture on AWS
  • Work with Scala-based data processing components, adapting them for cloud-native operation
  • Support infrastructure setup and maintenance using Terraform
  • Collaborate with the client team on data architecture decisions and pipeline design
  • Write supporting scripts and tooling in Python as needed
  • Participate in daily standups and task planning sessions with the client team
  • Take full ownership of deliverables end-to-end - from clarifying requirements through implementation to validation
  • Flag risks, gaps, and blockers early
  • Document technical decisions and migration patterns for knowledge transfer

Skills

  • Proven data engineering experience at a senior level
  • Strong Scala proficiency - hands-on production experience required
  • Experience with Apache Spark, Apache Hudi, Apache Kafka
  • Familiarity with AWS data services: Athena, Redshift, Glue, S3, or similar
  • Python scripting skills for tooling and automation
  • Working knowledge of Terraform (infrastructure as code)
  • Linux, Docker, GitHub Actions
  • Experience building or contributing to data lakehouse architectures
  • Dedicated and self-organized - able to manage your own workload, set priorities, and deliver consistently
  • Ownership mindset - you follow through without being chased and hold yourself accountable for outcomes
  • Proactive - you push things forward and ask the right questions
  • Strong organizational and communication skills
  • Experience with LLM-assisted development tools (Cursor or similar) is a plus
  • Familiarity with data migration projects and ETL/ELT patterns
  • English proficiency sufficient for daily technical collaboration
Your personal recruiter: Viktoriia Dorosh
