Responsibilities
• Lead technical decisions for enterprise data architecture using Databricks
• Develop strategies for cost-effective, high-performance data architectures, covering everything from data streams to analytical storage
• Design and implement data solutions in Microsoft Azure, based on project needs
• Build and maintain ETL processes and data models for analytical pipelines, especially in Finance
• Establish and uphold best practices for software development, such as version control, testing, and CI
• Support pre-sales activities, including RFPs, proposals, and client communications
• Guide discussions on cloud data warehousing and machine learning platforms
• Provide technical support for vendor and third-party engagements
• Offer ongoing support for business and technology teams on platform strategy and quality assurance
Requirements
• Bachelor’s degree in Computer Science, IT, or a related field
• Minimum of 2 years of Databricks experience and 4 years of Azure experience (certifications preferred)
• Experience as a Data Architect/Engineer with hands-on expertise in Data Lakehouse, Data Lake, and Data Warehouse design
• Strong knowledge of Databricks, Data Lakes, and Spark
• Strong experience with Azure Platform Services (Identities, Network, Storage)
• Proficiency with Azure Data Services, including Azure Data Factory and Azure SQL
• Experience with streaming data technologies such as Databricks Streaming, Kafka, and Azure Streaming Services
• Experience with Azure DevOps, especially Azure Pipelines
• Proficiency in Python, PySpark, and SQL
• Familiarity with DataOps