Role
We are looking for a seasoned Tech Lead with strong hands-on expertise in Data Engineering, ETL tools (Informatica), and Cloud platforms. The ideal candidate will lead the design, development, and delivery of large-scale data pipelines and conversion initiatives, working closely with cross-functional teams.
Responsibilities
· Lead end-to-end data engineering efforts; drive and manage Informatica-based data conversion and migration projects
· Collaborate with business stakeholders to translate data requirements into scalable technical solutions
· Ensure data quality, optimize performance, and enforce best practices across pipelines
· Mentor and guide junior engineers within the team
· Evaluate and adopt emerging technologies including AI/ML tooling where applicable
· Work closely with SMEs, analysts, and product teams on delivery
Mandatory Skills
· Informatica PowerCenter / IICS — Strong hands-on experience in ETL development, mapping design and data conversion projects
· Python — Proficient in data engineering, scripting, automation and pipeline development
· Teradata — Experience in data warehousing, SQL optimization and working with large-scale Teradata environments
· Google BigQuery — Hands-on experience in BigQuery data modeling, query optimization and large dataset handling
· GCP (Google Cloud Platform) — Experience with GCP services such as Dataflow, Cloud Storage and Cloud Composer/Airflow
Soft Skills
· Strong communication and stakeholder management
· Ability to lead teams and drive delivery independently
· Problem-solving mindset with attention to detail
Good to Have
· SSIS / KNIME — Exposure to additional ETL/data integration tools
· Understanding of Prompt Engineering and working with LLMs (Large Language Models)
· Experience integrating AI/GenAI capabilities into data workflows or pipelines
· Awareness of responsible AI practices