Responsibilities 

  • Architect, design, and implement a scalable AI/ML platform that seamlessly integrates with existing CI/CD pipelines and data platforms, ensuring optimal performance and reliability
  • Lead the development of E2E machine learning pipelines for both structured data and Large Language Models (LLMs), tailored for internal platform use cases
  • Conceptualize, design, and implement innovative LLM applications to enhance developer productivity, system efficiency, and cost-effectiveness across the platform ecosystem
  • Identify, develop, and maintain low-code automated tools and frameworks for end-to-end machine learning processes, including model training, testing, registration, deployment, and monitoring, as well as experiment tracking
  • Optimise core data pipelines to handle high-volume, high-velocity data flows for training and deploying ML models, while maintaining compliance checks
  • Develop comprehensive, high-quality documentation, including detailed technical architecture/design documents, developer-friendly guides, and clear operational procedures to ensure knowledge transfer and system maintainability
  • Collaborate closely with data scientists, software engineers, and cross-functional teams to ensure operational excellence and architectural alignment across the organisation
  • Lead code reviews, promote best coding practices, mentor junior team members and ensure high code quality standards across the team
  • Stay abreast of the latest advancements in AI/ML technologies and MLOps practices, and drive innovation within the organisation

Qualifications

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field
  • 7+ years of total industry experience, with at least 3 years in AI/ML operations and/or platform development
  • Proficiency in at least one programming language such as Java, Python, Go, or Scala, and familiarity with ML libraries such as TensorFlow, PyTorch, or scikit-learn
  • Hands-on experience with infrastructure as code (e.g., Terraform)
  • Advanced knowledge of AWS cloud services, including but not limited to SageMaker, Lambda, S3, Glue, Redshift, EC2, and IAM
  • Experience working on at least one production-level, end-to-end AI/ML-powered use case
  • Familiarity with large language models (LLMs) and generative AI, including experience with Amazon Bedrock, Google Vertex AI, or similar platforms
  • Familiarity with Kubernetes for container orchestration and Airflow DAGs for workflow orchestration
  • Relevant certification(s), such as AWS Machine Learning Specialty or Google Cloud Professional Machine Learning Engineer, are a huge plus
  • Excellent problem-solving and analytical skills
  • Strong communication and collaboration abilities

About Fanatics

Fanatics is building a leading global digital sports platform. We ignite the passions of global sports fans and maximize the presence and reach for our hundreds of sports partners globally by offering products and services across Fanatics Commerce, Fanatics Collectibles, and Fanatics Betting & Gaming, allowing sports fans to Buy, Collect, and Bet. Through the Fanatics platform, sports fans can buy licensed fan gear, jerseys, lifestyle and streetwear products, headwear, and hardgoods; collect physical and digital trading cards, sports memorabilia, and other digital assets; and bet as the company builds its Sportsbook and iGaming platform. Fanatics has an established database of over 100 million global sports fans; a global partner network with approximately 900 sports properties, including major national and international professional sports leagues, players associations, teams, colleges, college conferences and retail partners, 2,500 athletes and celebrities, and 200 exclusive athletes; and over 2,000 retail locations, including its Lids retail stores. Our more than 22,000 employees are committed to relentlessly enhancing the fan experience and delighting sports fans globally.

Location

Hyderabad, Telangana, India

Job Type

Full Time
