Job Title: Senior Databricks Architect
Domain: Financial Services
Job Summary:
We are seeking an experienced Senior Databricks Architect to design and deliver scalable, high-performance data platforms for our Financial Services clients. The ideal candidate will have deep expertise in Databricks, cloud data architectures, and modern data engineering practices, with a strong focus on building governed, secure, and efficient data ecosystems.
Key Responsibilities:
- Architect and implement end-to-end data solutions using Databricks Lakehouse Platform
- Design and develop scalable data pipelines using Apache Spark (PySpark) and SQL
- Lead cloud data platform implementations on AWS/Azure/GCP
- Drive modernization of legacy data systems to cloud-native architectures
- Implement Medallion Architecture (Bronze, Silver, Gold layers)
- Ensure data governance, security, and compliance (especially for Financial Services)
- Collaborate with business stakeholders, data scientists, and engineering teams
- Optimize performance, cost, and reliability of data pipelines
- Mentor and guide data engineering teams and provide architectural leadership
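To illustrate the Medallion Architecture responsibility above, here is a minimal conceptual sketch of the Bronze/Silver/Gold flow in plain Python. All names are hypothetical; in a real Databricks implementation each layer would be a Delta Lake table transformed with PySpark.

```python
# Conceptual Medallion Architecture sketch (hypothetical names).
# Bronze = raw landing, Silver = cleansed/validated, Gold = business aggregates.

def to_bronze(raw_records):
    """Bronze: land raw data as-is, tagging each record with lineage metadata."""
    return [{**r, "_source": "ingest"} for r in raw_records]

def to_silver(bronze):
    """Silver: cleanse and validate -- drop records missing an account id,
    normalize the amount field to float."""
    return [
        {"account_id": r["account_id"], "amount": float(r["amount"])}
        for r in bronze
        if r.get("account_id") is not None
    ]

def to_gold(silver):
    """Gold: business-level aggregate -- total amount per account."""
    totals = {}
    for r in silver:
        totals[r["account_id"]] = totals.get(r["account_id"], 0.0) + r["amount"]
    return totals

raw = [
    {"account_id": "A1", "amount": "100.0"},
    {"account_id": "A1", "amount": "50.5"},
    {"account_id": None, "amount": "9.9"},  # invalid record: filtered at Silver
]
gold = to_gold(to_silver(to_bronze(raw)))
print(gold)  # {'A1': 150.5}
```

The same layering applies at scale: Bronze preserves an auditable raw copy (important for Financial Services compliance), Silver enforces schema and quality rules, and Gold serves curated aggregates to analysts and models.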
Required Skills & Qualifications:
- 12+ years of experience in Data Engineering / Data Architecture
- Strong expertise in Databricks, PySpark, and the Apache Spark ecosystem
- Hands-on experience with cloud platforms (Azure, AWS, or GCP)
- Experience with data orchestration tools (Airflow, Azure Data Factory (ADF), etc.)
- Solid understanding of data modeling, ETL/ELT, and distributed systems
- Experience with Delta Lake and Unity Catalog is a plus
- Strong knowledge of Financial Services domain (Banking, Insurance, Capital Markets)
- Excellent problem-solving and stakeholder management skills
Preferred Qualifications:
- Databricks certifications
- Experience with real-time data processing (e.g., Kafka, Spark Structured Streaming)
- Exposure to AI/ML pipelines and MLOps
- Knowledge of regulatory and compliance frameworks in Financial Services