Job Title: Tech Lead - Data Engineering
Job Summary:
We are looking for a highly skilled and motivated Tech Lead of Data Engineering to spearhead the development of scalable and efficient data engineering solutions. The ideal candidate will possess deep expertise in Python, PySpark, AWS services, and streaming data platforms, with a proven ability to integrate complex data sources and develop distributed data processing frameworks. This role requires a strong technical leader who can guide the team, solve complex challenges, and deliver optimal solutions that align with client requirements. Wealth management experience is a plus.
Key Responsibilities:
- Technical Leadership:
  - Provide hands-on technical leadership to the team in designing and implementing data engineering solutions.
  - Lead by example in adopting best practices for coding, testing, and deployment.
- ETL Development:
  - Design and develop robust ETL pipelines using AWS Glue, Lambda, and other AWS services to process large volumes of data efficiently.
  - Implement complex data transformations and integrate data from multiple sources such as APIs, databases, and streaming platforms.
- Streaming Data Processing:
  - Design and implement streaming data pipelines using Kafka, AWS Kinesis, or similar technologies.
  - Build scalable frameworks to handle real-time data ingestion and processing.
- Distributed Data Processing:
  - Develop distributed data processing frameworks to ensure performance and scalability in handling large datasets.
  - Optimize the performance of data processing jobs for both batch and real-time workloads.
- Solutioning & Architecture:
  - Provide optimal data engineering solutions aligned with client requirements and business objectives.
  - Collaborate with architects to design scalable and secure data solutions leveraging AWS cloud services.
- AWS Expertise:
  - Utilize AWS services (e.g., S3, Glue, Lambda, Kinesis, DynamoDB) to build efficient and scalable cloud-based solutions.
  - Stay updated with the latest AWS services and features to continuously improve system performance and cost efficiency.
- Stakeholder Collaboration:
  - Work closely with clients, business analysts, and other stakeholders to understand requirements and translate them into technical solutions.
  - Communicate progress, challenges, and solutions effectively to both technical and non-technical stakeholders.
Qualifications:
- Education:
  - Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
- Experience:
  - 10+ years of experience in data engineering, including at least 3 years in an architect role.
  - Strong hands-on expertise in Python, PySpark, and AWS services for data processing and integration.
  - Proven experience designing and developing streaming data solutions using Kafka, AWS Kinesis, or similar technologies.
  - Solid experience working with APIs and integrating data from diverse sources.
  - Extensive knowledge of distributed data processing frameworks and best practices.
- Skills:
  - Strong problem-solving and solution-oriented mindset to deliver optimal results.
  - Excellent knowledge of data integration techniques and cloud-based architecture.
  - Proficient in implementing complex data transformations and scalable data workflows.
  - Exceptional team leadership and mentoring abilities.
  - Strong communication skills for effective stakeholder collaboration.
Preferred Qualifications:
- AWS Certified Solutions Architect (or an equivalent certification).
- Familiarity with Terraform or CloudFormation for AWS infrastructure as code.
- Wealth Management domain experience.
Why Join Us?
- Be a key player in building cutting-edge data engineering solutions for large-scale projects.
- Work with a talented and collaborative team in a dynamic environment.
- Competitive salary, benefits, and opportunities for career growth.