Technical Lead – Azure Data Factory with PySpark
City : Waterloo, Ontario
Category : Technology & Engineering
Industry : Information Technology
Employer : Cognizant
At Cognizant, our global community sets us apart—an energetic, collaborative and inclusive workplace where everyone can thrive. And with projects at the forefront of innovation, you can build a varied, rewarding career and draw inspiration from dedicated colleagues and leaders. Cognizant is right where you belong.
We are Cognizant Artificial Intelligence
Digital technologies, including analytics and AI, give companies a once-in-a-generation opportunity to perform orders of magnitude better than ever before. However, clients need new business models built from analyzing customers and business operations at every angle to really understand them. With the power to apply artificial intelligence and data science to business decisions via enterprise data management solutions, we help leading companies prototype, refine, validate, and scale the most desirable products and delivery models to enterprise level within weeks.
Job Summary: The ideal candidate will have extensive experience in PySpark, Azure Synapse, and Azure Data Factory, including ETL development. The candidate should be able to design, develop, and maintain data pipelines and data streams, and extract and transform data across various data processing layers using Databricks and Python. The candidate will play a crucial role in driving technical excellence and innovation within our projects.
Required Technical Skills: Azure Data Factory (including ETL), PySpark, Azure Synapse.
What you will do:
- Lead the design and implementation of data solutions using PySpark, Azure Synapse, and Azure Data Factory.
- Oversee the development and maintenance of ETL processes to ensure data accuracy and reliability.
- Provide technical guidance and mentorship to team members to foster a collaborative and high-performing environment.
- Collaborate with stakeholders, customers, partners, and team members to understand business requirements and translate them into technical specifications.
- Ensure the scalability and performance of data solutions to meet the organization's growing needs.
- Perform code reviews to ensure compliance with coding standards.
- Develop and maintain documentation for data solutions, processes, and workflows.
- Troubleshoot and resolve technical issues related to data processing and integration.
- Implement data security and privacy measures to protect sensitive information.
- Collaborate with teams from various functions to ensure seamless integration of data solutions.
- Monitor and optimize data pipelines for efficiency and performance.
What you bring to the role:
- 8+ years of hands-on experience in PySpark, Azure Synapse, and Azure Data Factory ETL.
- Proficiency in designing and implementing scalable data solutions.
- Experience leading and mentoring technical teams.
- Ability to translate business requirements into technical specifications.
- Knowledge of data security and privacy compliance.
- Ability to optimize data pipelines for performance.
- Domain expertise in Life and Annuities Insurance.
Working Arrangements
We believe hybrid work is the way forward as we strive to provide flexibility wherever possible. Based on this role's business requirements, this is a hybrid position requiring three days a week in our client's Waterloo, ON office. Regardless of your working arrangement, we are here to support a healthy work-life balance through our various well-being programs.
Note: The working arrangements for this role are accurate as of the date of posting. They may change based on the project you're engaged in, as well as business and client requirements. Rest assured, we will always be clear about role expectations.