Data Integration Engineer

Company Overview: At Codvo, software and people transformations go hand in hand. We are a global, empathy-led technology services company. Product innovation and mature software engineering are part of our core DNA. Respect, Fairness, Growth, Agility, and Inclusiveness are the core values that we aspire to live by each day. We continue to expand our digital strategy, design, architecture, and product management capabilities to offer expertise, outside-the-box thinking, and measurable results.

We are looking for a motivated and experienced data engineer to join a growing data science team. As a senior data engineer, you will work closely with customers, data scientists, and solution architects to build robust, production-ready data infrastructure and data pipelines that help scale machine learning and analytics solutions. In this role, you will drive the development of data engineering solutions from initial experimentation to production-level deployment. You will also work with data science leadership to develop internal tools for rapid ingestion and integration of customer data into selected cloud platforms and other SaaS solutions.
Responsibilities:
• Collaborate and work in a global data science team to develop scalable and robust data integration infrastructure.
• Engage directly with customers and partners to design and develop data requirements based on functional requirements.
• Build custom data integration pipelines from existing source systems into cloud platforms such as AWS, Microsoft Azure etc.
• Enable data ingestion, pre-processing, and custom data wrangling from filesystems, databases, queues, and streams to enable rapid prototyping.
• Work with customers to develop custom data handlers and connectors as needed.
• Perform a variety of data loads & data transformations.
• Improve database and application performance through fine-tuning.
• Work with other project teams for data integrations and data lake requirements.
• Automate processes for better stability and performance of applications.

Qualifications/Requirements:
• BS or MS degree in Computer Science, Engineering, or IT.
• Minimum 5 years of professional experience in data engineering, databases, and/or business analytics.
• Experience reading and parsing industrial P&ID and PFD documents.
• Solid background and experience in SQL, Python, and/or Java/Scala.
• Minimum 3 years of experience in ETL and data pipeline design for heterogeneous data such as time series, streams, queues, and text.
• Familiarity with containers and related tooling such as Docker, container registries, and Kubernetes.
• Knowledge of test-driven development and agile software development methodologies and tools.
• Excellent verbal and written communication skills for working in a global team and with customers on a regular basis.

Location: Bangalore (remote during COVID)
Work timings: 2:30 pm to 11:30 pm
Notice period: Immediate to 30 days

Location: Maharashtra, Pune, India (remote)

Job type: Full Time
