At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. 

EY GDS – Data and Analytics (D&A) – AWS DBX - Manager
As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key business functions like Banking, Insurance, Manufacturing and Auto, Healthcare, Retail, Supply Chain, and Finance.

The opportunity
We’re looking for Managers (GTM plus Cloud/Big Data Architects) with a strong understanding of technology and data, and proven capability in delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your key responsibilities

  • Drive Analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations across BCM, WAM, and Insurance. Activities include pipeline building, RFP responses, creating new solutions and offerings, conducting workshops, and managing in-flight projects focused on cloud and big data.
  • Work with clients to convert business problems/challenges into technical solutions, considering security, performance, scalability, etc. [8-11 years]
  • Understand current and future-state enterprise architecture.
  • Contribute to various technical streams during project implementation.
  • Provide product- and design-level technical best practices.
  • Interact with senior client technology leaders, understand their business goals, and architect, propose, develop, and deliver technology solutions.
  • Define and develop client-specific best practices around data management within a Hadoop or cloud environment.
  • Recommend design alternatives for the data ingestion, processing, and provisioning layers.
  • Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop, and Spark.
  • Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming, and related technologies.

Tech Stack

AWS

  • Experience building on AWS using S3, EC2, Redshift, Glue, EMR, DynamoDB, Lambda, QuickSight, etc.
  • Experience in PySpark/Spark/Scala
  • Experience using version control and CI tools (Git, Jenkins, Apache Subversion)
  • AWS certifications or other related professional technical certifications
  • Experience with cloud or on-premises middleware and other enterprise integration technologies
  • Experience in writing MapReduce and/or Spark jobs
  • Demonstrated strength in architecting data warehouse solutions and integrating technical components
  • Good analytical skills with excellent knowledge of SQL
  • 8+ years of work experience in very large data warehousing environments
  • Excellent communication skills, both written and verbal
  • 7+ years of experience with detailed knowledge of data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools
  • 5+ years of experience with data modelling concepts
  • 3+ years of Python and/or Java development experience
  • 3+ years of experience in Big Data stack environments (EMR, Hadoop, MapReduce, Hive)

Skills and attributes for success

  • Experience architecting highly scalable solutions on AWS
  • Strong understanding of and familiarity with AWS/GCP/Big Data ecosystem components
  • Strong understanding of underlying AWS/GCP architectural concepts and distributed computing paradigms
  • Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming
  • Hands-on experience with major components such as cloud ETL tools, Spark, and Databricks
  • Experience with at least one NoSQL data store: HBase, Cassandra, or MongoDB
  • Knowledge of Spark-Kafka integration, including multiple Spark jobs consuming messages from multiple Kafka partitions
  • Solid understanding of ETL methodologies in a multi-tiered stack, integrating with Big Data systems like Cloudera and Databricks
  • Strong understanding of underlying Hadoop architectural concepts and distributed computing paradigms
  • Good knowledge of Apache Kafka and Apache Flume
  • Experience in enterprise-grade solution implementations
  • Experience in performance benchmarking of enterprise applications
  • Experience in data security (in transit and at rest)
  • Strong knowledge of UNIX operating system concepts and shell scripting

To qualify for the role, you must have

  • Flexible, proactive, and self-motivated working style with strong personal ownership of problem resolution.
  • Excellent written and verbal communication skills, both formal and informal.
  • Ability to multi-task under pressure and work independently with minimal supervision.
  • A team player who enjoys working in a cooperative and collaborative team environment.
  • Adaptable to new technologies and standards.
  • Participate in all aspects of the Big Data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support.
  • Responsible for evaluating technical risks and mapping out mitigation strategies.
  • Working knowledge of at least one cloud platform: AWS, Azure, or GCP.
  • Excellent business communication, consulting, and quality process skills.
  • Excellence in leading solution architecture, design, build, and execution for leading clients in the Banking, Wealth and Asset Management, or Insurance domains.
  • Minimum of 10 years of industry experience.

Ideally, you’ll also have

  • Strong project management skills
  • Client management skills
  • Solutioning skills

What we look for

  • People with technical experience and enthusiasm to learn new things in this fast-moving environment

What working at EY offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are.
You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:

  • Support, coaching and feedback from some of the most engaging colleagues around
  • Opportunities to develop new skills and progress your career
  • The freedom and flexibility to handle your role in a way that’s right for you
     

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets.

Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate.

Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Location

Kochi, KL, IN, 682020

Job Overview
Job Posted: 1 month ago
Job Type: Full Time
