Data Architect with Databricks
Role: Data Architect with Databricks
Location: Remote
Required Skills & Experience
Bachelor's or Master's degree in Computer Science, Data Engineering, Data Science, or a related field.
13 years of experience in data engineering, cloud platform, or data architecture roles.
3 years of hands-on experience with Databricks, including Delta Lake, MLflow, Workflows, and SQL/PySpark development.
Strong knowledge of Azure, AWS, or Google Cloud Platform services and data ecosystem tools.
Proven experience designing scalable data lake and data warehouse solutions.
Expertise in Python, SQL, and PySpark for large-scale distributed data processing.
Strong understanding of MLOps principles and ML lifecycle management.
Excellent communication and client-facing consulting skills.
Ability to lead solution design discussions and document architectures clearly.
Preferred Qualifications
Databricks certifications (e.g., Databricks Certified Data Engineer/Architect, ML Professional).
Experience with Unity Catalog, Feature Store, Model Serving, and Lakehouse AI features.
Familiarity with Terraform, Kubernetes, CI/CD pipelines (GitHub Actions, Azure DevOps, etc.).
Experience working with enterprise clients in consulting, professional services, or SI environments.