Senior Data Engineer for Smart TV Product

  • Location
  • Area: Data Engineer
  • Tech Level
Tech Stack

Terraform, Airflow, SQL, PySpark, Python, AWS



    About the Client

    Founded in 2021 by serial entrepreneurs and investors (with multiple exits and prior companies with annual revenues in billions of dollars).

    Backed by some of Silicon Valley’s top VC firms.

    Project details

    A new revolutionary smart TV product that redefines what a TV can be.

    With future-proofed sensors and seamless OTA updates, the product is envisioned as the hub of the reinvented living room.

    These and a host of other next-gen features make our product a truly revolutionary technology that goes beyond just smart.

    Your Team

    The team consists of 35+ members working with Agile methodology.

    Come be a part of the future of entertainment!

    What's in it for you

    • 20 days off per year for your recreation and health
    • Long-term and stable projects
    • Strong experts you can collaborate with and learn from
    • Smooth and respectful interviews


    Your responsibilities will include:

    • Data pipeline development
    • Database management
    • AWS services management
    • Data quality and testing
    • Automation and orchestration
    • Performance optimization and documentation
    • Security and compliance
    • Troubleshooting and support

    Outstanding team members also keep up to date with the latest advancements in data engineering, cloud computing, and relevant technologies, and propose improvements and optimizations.


    Requirements

    • Bachelor’s degree in Computer Science, Data Engineering, or a related field; advanced degree preferred
    • Strong programming skills in Python
    • Proficiency in data engineering frameworks, particularly PySpark, is a plus
    • Solid experience with SQL and database systems
    • Hands-on experience with cloud platforms, specifically AWS (Glue, Athena, S3, Redshift)
    • Knowledge of infrastructure as code (IaC) principles and experience with Terraform
    • Familiarity with data orchestration tools like Apache Airflow
    • Excellent problem-solving skills and attention to detail
    • Strong communication and collaboration skills
    • Ability to work independently and as part of a team in a fast-paced environment
    • AWS certification(s) related to data and analytics (e.g., AWS Certified Data Analytics – Specialty) is a plus

    Tech stack:

    • Python
    • PySpark (or another data engineering framework)
    • SQL
    • Terraform
    • AWS – Glue, Athena, S3, Redshift
    • Airflow
    Your personal recruiter
    Juan Hernandez

    Apply Now
