JOB DESCRIPTION

  • Develop, maintain, and optimize scalable data pipelines using Python and AWS services, and build new integrations to support continuing increases in data volume and complexity, ensuring seamless integration with AI platforms such as AWS Bedrock and Google Gemini.
  • Collaborate with analytics and business teams to refine data models for business intelligence tools, improving data accessibility and supporting data-driven decision making.

JOB REQUIREMENTS

  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • 5-6 years of experience in data engineering or a similar role.
  • Strong programming skills in Python and hands-on experience with AWS and the related tech stack.
  • Excellent communication, collaboration, and problem-solving skills.
  • Strong exposure to AI services such as AWS Bedrock, Google Gemini, and OpenAI models.
  • Develop, maintain, and optimize scalable data pipelines using Python and AWS services (e.g., S3, Lambda, ECS, RDS, SNS/SQS, Vector DB).
  • Develop unit and integration tests and contribute to engineering documentation.
  • Collaborate with analytics and business teams to improve data models for business intelligence.
  • Work closely with frontend and backend engineers, product managers, and analysts to ensure successful project delivery.
  • Participate in code reviews and contribute to the improvement of data engineering best practices.

BUSINESS SEGMENT

Corporate

PLATFORM

Operating Division

LOCATION

SGP Keppel Bay Tower, Singapore

JOB TYPE
Full Time
