What you’ll already have:
Expert skills in building cloud-based data pipelines using data orchestration and workflow platforms. We use Airflow/Cloud Composer with Python, but you may have experience with different orchestration/workflow tools or programming languages.
Advanced SQL skills, with a deep understanding of how to write performant SQL and debug problems
Experience with a cloud-based relational Data Warehousing product
Data transformation tools, preferably dbt
Cloud-based data platforms, preferably Snowflake or BigQuery
Designing secure data pipelines and fixing security/performance issues
Identifying data quality issues
Deploying to at least one major Cloud Platform such as AWS, GCP or Azure
Continuous Integration/Continuous Delivery
TDD, pair programming
Agile development methods such as Scrum or Kanban
What else you could bring:
Containerisation, Kubernetes
Event-driven design
GitLab CI, Jenkins, CircleCI, Jira
Data orchestration and workflow technologies such as Airflow
Visualisation & reporting tools such as Tableau or Looker
ETL tools such as Apache Beam or Ab Initio
Knowledge or experience of Data Vault (DV2)
Data Visualisation
Understanding of Data Management Concepts