The Databricks Architect is a subject matter expert in the Databricks Lakehouse Platform. The
role involves designing and implementing large-scale data solutions and migration strategies,
and establishing best practices for data engineering, data science, and BI teams using
Databricks in cloud environments.
Key Responsibilities
● Lakehouse Design: Design and build optimized data architectures within the Databricks
Lakehouse using Delta Lake, typically following the medallion pattern (Bronze, Silver, and
Gold layers).
● Platform Governance: Establish governance, security, and operational best practices for
the Databricks environment, including cluster management, workspace security, and
Unity Catalog implementation.
● Performance Tuning: Optimize Spark jobs and notebooks on Databricks clusters to
balance performance and cost-efficiency.
● Data Migration: Lead the technical planning and execution of migrations from existing
data warehouses and data lakes to the Databricks platform.
● Integration: Architect integrations between Databricks and other enterprise tools, cloud
services, and BI platforms (e.g., Tableau, Power BI).
Required Qualifications
● Bachelor’s degree in Computer Science or a related technical field.
● 6+ years of experience in data warehousing, data engineering, or cloud architecture.
● 3+ years of deep, hands-on experience designing and deploying solutions on the
Databricks platform.
● Expert proficiency in Spark, Python, and SQL.
● A Databricks Certified Data Engineer or cloud architect certification is highly
desirable.