➜
~
cat about.md
Data Architect & Principal Data Engineer
I am a seasoned Data Architect and Principal Data Engineer with deep expertise in Azure data platforms, Snowflake, and modern data engineering practices. I specialize in designing scalable cloud-based solutions, leading on-prem to cloud migrations, and implementing robust data pipelines with a strong focus on data quality, metadata management, and observability. My work bridges architecture and hands-on engineering, enabling businesses to unlock value from their data through secure, high-performance systems.
$
ls expertise/
Areas of Expertise
Cloud Data Architecture
Designing scalable data platforms using Azure services (Data Lake, Synapse, Databricks, etc.).
Snowflake Development
Building efficient data models, pipelines, and governance frameworks in Snowflake.
On-Prem to Cloud Migration
Leading end-to-end modernization of legacy data systems to the cloud.
Data Engineering
Building batch and streaming data pipelines using modern ETL/ELT tools and best practices.
Metadata Management
Implementing data catalogs, lineage tracking, and data documentation for governance.
Data Quality & Observability
Designing frameworks for monitoring, validation, and anomaly detection.
CI/CD & Automation
Automating deployments, tests, and monitoring via YAML pipelines and DevOps tools.
Big Data & Performance Tuning
Optimizing storage, compute, and query performance at scale.
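The "monitoring, validation, and anomaly detection" item above can be sketched with a minimal example: a z-score check that flags anomalous points in a pipeline metric series (such as daily row counts). The function name, metric, and threshold are illustrative, not taken from any specific framework or engagement.

```python
from statistics import mean, stdev

def detect_anomalies(values, threshold=2.0):
    """Flag indices whose z-score exceeds the threshold.

    A toy version of the kind of anomaly check a data
    observability framework might run on daily row counts.
    """
    if len(values) < 2:
        return []
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []
    return [i for i, v in enumerate(values)
            if abs(v - mu) / sigma > threshold]

# Daily row counts for an illustrative ingestion table;
# the 5000 spike is the anomaly we expect to catch.
row_counts = [1000, 1020, 990, 1010, 5000, 1005]
print(detect_anomalies(row_counts))  # [4]
```

In practice this kind of rule would run per table and per metric, with thresholds tuned to each series, and feed alerts into the platform's monitoring stack.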
$
cat projects.json | jq '.recent[]'
Recent Projects
Cloud Data Platform Modernization
Client: Global Retail Group
Scope: Migrated an on-premises SQL Server DWH to Azure Data Lake + Synapse + Databricks.
- Designed end-to-end architecture for ingestion, transformation, and consumption layers.
- Implemented CI/CD pipelines using Azure DevOps (YAML-based).
- Reduced data processing time by 70%, enabling near real-time analytics.
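The ingestion/transformation/consumption split above can be sketched as a tiny layered pipeline. This is a simplified stand-in for what the actual Azure Data Lake + Databricks layers do at scale; the layer functions, source tag, and sample records are all hypothetical.

```python
def ingest(raw_rows):
    """Landing layer: keep everything as-is, tag with a source field."""
    return [dict(r, _source="pos_feed") for r in raw_rows]

def transform(rows):
    """Curated layer: drop malformed rows, normalise types."""
    return [
        {"store": r["store"], "amount": float(r["amount"])}
        for r in rows
        if r.get("store") and r.get("amount") is not None
    ]

def consume(rows):
    """Consumption layer: aggregate for reporting."""
    totals = {}
    for r in rows:
        totals[r["store"]] = totals.get(r["store"], 0.0) + r["amount"]
    return totals

raw = [
    {"store": "A", "amount": "10.5"},
    {"store": "A", "amount": "4.5"},
    {"store": None, "amount": "1.0"},   # malformed, dropped in transform
    {"store": "B", "amount": "7.0"},
]
print(consume(transform(ingest(raw))))  # {'A': 15.0, 'B': 7.0}
```

The point of the layering is that each stage has one responsibility, so each can be tested, monitored, and redeployed independently via the CI/CD pipeline.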
Snowflake Enablement for Financial Reporting
Client: Large Financial Institution
Scope: Designed and built a scalable Snowflake-based reporting platform.
- Developed automated ELT pipelines using dbt and Snowpipe.
- Implemented role-based access control and data masking for compliance.
- Delivered metadata and lineage tracking via integration with Collibra.
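The role-based masking in this project can be illustrated with a small sketch of the logic a Snowflake masking policy expresses in SQL: privileged roles see the clear value, everyone else sees a masked one. The role names, column, and masking rule here are hypothetical, not the client's actual policy.

```python
# Roles permitted to see unmasked account numbers (illustrative).
ALLOWED_ROLES = {"FINANCE_ANALYST", "COMPLIANCE_OFFICER"}

def mask_account_number(value: str, role: str) -> str:
    """Return the clear value for privileged roles; otherwise
    mask all but the last four characters."""
    if role in ALLOWED_ROLES:
        return value
    return "*" * (len(value) - 4) + value[-4:]

print(mask_account_number("9876543210", "FINANCE_ANALYST"))  # 9876543210
print(mask_account_number("9876543210", "REPORT_VIEWER"))    # ******3210
```

In Snowflake itself this logic lives in a masking policy attached to the column, so it is enforced at query time for every consumer, not in application code.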
Enterprise Data Quality & Observability Framework
Client: Multinational Logistics Company
Scope: Developed a centralized data quality monitoring system.
- Introduced validation layers using Great Expectations and custom metrics.
- Integrated monitoring into Azure Data Factory pipelines.
- Increased data trust score across domains by over 40%.
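The validation layers above can be sketched with two expectation-style checks. This is a simplified stand-in for what Great Expectations provides, not its actual API; the table, columns, and rules are illustrative.

```python
def expect_column_values_not_null(rows, column):
    """Fail for any row where the column is missing or None."""
    failed = [i for i, r in enumerate(rows) if r.get(column) is None]
    return {"success": not failed, "failed_rows": failed}

def expect_column_values_between(rows, column, lo, hi):
    """Fail for any row whose value is missing or out of range."""
    failed = [i for i, r in enumerate(rows)
              if r.get(column) is None or not (lo <= r[column] <= hi)]
    return {"success": not failed, "failed_rows": failed}

shipments = [
    {"shipment_id": 1, "weight_kg": 120.0},
    {"shipment_id": 2, "weight_kg": None},   # missing weight
    {"shipment_id": 3, "weight_kg": -5.0},   # impossible weight
]
results = [
    expect_column_values_not_null(shipments, "shipment_id"),
    expect_column_values_not_null(shipments, "weight_kg"),
    expect_column_values_between(shipments, "weight_kg", 0, 1000),
]
passed = sum(r["success"] for r in results)
print(f"{passed}/{len(results)} checks passed")  # 1/3 checks passed
```

Pass rates like this, tracked per domain over time, are one way a "data trust score" can be computed and surfaced from pipeline runs.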
$
cat philosophy.md
Project Management Philosophy
I believe in clear goals, lean execution, and continuous delivery. My approach emphasizes aligning stakeholders early, building with iteration in mind, and maintaining high visibility across all stages of the project. I prioritize outcomes over outputs, invest in risk reduction through proactive planning, and promote a culture of accountability, communication, and adaptability. Every project is an opportunity to create value, build trust, and improve team performance through clarity, empathy, and data-driven decisions.
$
_