Job Description

The PySpark and Synapse Notebooks developer/application engineer will work as part of the project team and will be primarily responsible for developing, deploying, and operating Synapse Notebooks and Spark SQL workloads.

• Strong skills and hands-on experience in PySpark/Spark SQL
• Strong SQL skills with experience in Azure SQL & Synapse
• Strong skills in developing Synapse Notebooks and Azure Databricks
• Experience in ETL tools (SAS/Informatica/ADF Data Flows)
• Experience implementing Synapse Pipelines/Dataflows
• Use the interactive Synapse/Databricks notebook environment with SQL to examine external data sets and query existing data sets
• Perform ETL transformations and loads using Synapse Notebooks/Azure Databricks, applying built-in functions to manipulate data
• Perform ETL jobs on streaming data sources; parameterize a code base and manage task dependencies; submit and monitor jobs using the REST API or command-line interface
• Manage Delta Lake using Databricks in the interactive notebook environment; create, append, and upsert data into a data lake (a brief illustrative sketch follows this list)
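The kind of Delta Lake upsert mentioned above might look like the following minimal PySpark sketch, assuming a Databricks or Synapse Spark notebook session with Delta Lake available; the storage path, schema, and join key "id" are placeholders for illustration only, not part of this posting.

```python
# Minimal sketch of a Delta Lake upsert (merge) from a notebook environment.
# Assumes Delta Lake is available; path, schema, and key column are hypothetical.
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()  # already provided in notebook sessions

# Hypothetical incoming batch to merge into the lake
updates_df = spark.createDataFrame(
    [(1, "alice", "2024-01-01"), (2, "bob", "2024-01-02")],
    ["id", "name", "updated_at"],
)

target_path = "abfss://container@account.dfs.core.windows.net/delta/customers"  # placeholder

if DeltaTable.isDeltaTable(spark, target_path):
    # Upsert: update rows that match on the key, insert the rest
    (DeltaTable.forPath(spark, target_path)
        .alias("t")
        .merge(updates_df.alias("s"), "t.id = s.id")
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute())
else:
    # First load: create the Delta table by writing the batch
    updates_df.write.format("delta").save(target_path)
```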

Location

Mumbai, Maharashtra, India

Job Overview
Job Posted:
6 days ago
Job Expires:
Job Type
Full Time