At NobleAI, we believe that energy, materials science, and chemistry are key to building a sustainable world, and that artificial intelligence is essential to unlocking this potential. NobleAI leverages innovative Science-Based AI technology to revolutionize energy workflows, materials development, and chemical design. We enable companies to accelerate innovation and reduce costs in developing sustainable technologies and products.

We're a team of excellence-driven individuals who value thoughtfulness and respect while focusing on delivering products that empower engineers and researchers to create optimized solutions faster.

We are seeking a Lead DevOps Data Engineer to be an integral part of our growing team. This role will focus on building, maintaining, and scaling the infrastructure that supports our data pipelines and systems. 

Join us in building a more sustainable world through the power of AI and scientific innovation.

Requirements

Key Responsibilities

  • Design, build, and maintain scalable, resilient data pipelines and workflows, following engineering quality standards and architecture best practices
  • Implement data storage and processing solutions using cloud technologies (e.g., AWS, Azure, GCP)
  • Develop CI/CD pipelines to automate the deployment and testing of data applications
  • Troubleshoot and resolve issues across infrastructure and data pipelines
  • Work closely with software engineers, data engineers, and research scientists to understand and support their data needs
  • Understand the pros and cons of implementation options and articulate the cost, performance, timing, and effort details
  • Understand market trends in technology and architecture, and propose solutions aligned to the problem at hand
  • Collaborate with peers on ideas/solutions and provide mentorship to junior members/interns
  • Adapt communications to technical and non-technical stakeholders

What We’re Looking For

  • BSc degree in Computer Science, Engineering, or a related field
  • 5+ years working in DevOps or Data Engineering for enterprise software applications
  • 3+ years of experience with cloud platforms (AWS, Azure, GCP)
  • 3+ years of experience in containerization and orchestration tools (Docker, Kubernetes)
  • 5+ years of experience with databases and data platforms (Postgres, Hive, Snowflake, Databricks, Redshift, BigQuery, Athena)
  • 3+ years of experience in building data pipelines and features for analytics models

Benefits

Did we mention we offer great pay & benefits? 

  • Benefits coverage including medical, dental, vision, disability, and life insurance
  • Retirement fund employer contribution 
  • Generous Paid Time Off & Holidays
  • Stock options
  • Performance-based bonus
  • Salary commensurate with experience & geographic location

Location

Mexico - Remote

Job Type

Full Time
