
Contract Architect - Databricks
- Auckland
- Contract
- Full-time
- Design data platforms, distributed systems, data lakes & data stores
- Creating software solutions for data ingest & integration
- Developing and operationalising reliable data pipelines & ETL patterns
- Building analytics tools to provide actionable insights and solve business problems
- Infrastructure development
- Wrangling and integrating data from multiple sources
- Identifying ways to improve data reliability, efficiency and quality
- Extensive knowledge of Databricks architecture and features
- Experience in the design, development and implementation of data solutions using Databricks
- Experience with data ingestion tools such as Apache Kafka, Informatica, AWS Glue, Azure Event Hubs and Fivetran
- Strong programming skills, especially in Python, Java, Scala, C++ or C#
- Cloud platforms such as Azure, AWS & GCP
- Relational database management systems such as SQL Server or Redshift
- Distributed processing technologies such as Apache Spark
- Working knowledge of message queuing, stream processing, and highly scalable data stores
- Experience building CI/CD pipelines with tools such as GitHub Actions or Azure DevOps
- Degree in Computer Science, Data Science, Statistics or related field
- Ability to connect customers' specific business problems to Databricks solutions
- Minimum 3 years' experience in Databricks development
- Experience developing and deploying data pipelines into live environments
- A passion for lean, clean and maintainable code
- Strong analytic skills related to working with datasets, both structured and unstructured
- Curious and enthusiastic mindset
- The desire to grow and to share insights with others
- Machine Learning frameworks and theory
- Analytic platforms & tools such as Databricks, Alteryx, SAS, KNIME or Datarobot
- Data Vault / Kimball modelling methodologies
- DevOps / DataOps