Responsibilities:
• Contribute to the core engine powering the platform, handling large-scale and complex datasets
• Design, build, and maintain data pipeline architectures for geospatial data
• Develop and optimize ETL / ELT workflows
• Integrate and work with geospatial databases
• Ensure data quality, performance, and lineage across systems
Requirements:
• Strong proficiency in Python
• Experience building and managing Airflow pipelines
• Hands-on experience with geospatial tools and technologies (e.g., PostGIS, QGIS)
• Familiarity with cloud-based data services, such as:
  ◦ Blob Storage
  ◦ Key Vault