Brightgrove
Senior Data Engineer

AI-driven solutions
Location: Bucharest, Romania
Area: Data
Tech Level: Senior
Tech Stack: PySpark, Databricks, Python, AWS

About the Client

Our client delivers digital and AI-driven solutions specifically for the life sciences and healthcare industries.
The company provides end-to-end engineering, informatics, and data science services. They utilize scientific and technical expertise to build scalable, secure, and compliant digital solutions.
Core services and activities include:
• Artificial Intelligence.
• Scientific Informatics and Laboratory Informatics.
• Developing Custom Laboratory Software Solutions.
• Data Science and Engineering.
• Solution Design, such as designing data management infrastructure to support scientific research.
• Cloud Engineering, utilizing expertise in GCP and AWS.
• Developing High-Performance Computing (HPC) platforms for tasks like cell image analysis and molecular docking.

Project details

The Senior Data Engineer will focus on data processing to develop intelligent solutions that enable our customers to accelerate scientific discoveries, generate clinical data, and solve R&D challenges.

Your Team

We are looking for a skilled and motivated Senior Data Engineer to join a dynamic team.

What's in it for you

  • Interview process that respects people and their time
  • Professional and open IT community
  • Internal meet-ups and resources for knowledge sharing
  • Time for recovery and relaxation
  • Bright online and offline events
  • Opportunity to become part of our internal volunteer community

Responsibilities

  • Design and build data ingestion pipelines on the Databricks platform
  • Prepare data transformations with Python and the PySpark library
  • Conduct data discovery with data lineage to build data domains

Skills

  • 5+ years of experience in Data Engineering
  • Strong familiarity with the Databricks platform (including Unity Catalog)
  • Ability to write robust code with Python
  • Proficiency with Spark (PySpark)
  • Knowledge of cloud platforms: AWS
  • Exposure to Data Mesh architecture
  • Ability to collaborate closely with US time zones (EST or CST)
Your personal recruiter: Alona Mylashenko

Apply Now
