Vacancy title:
Senior Analytics Engineer
Jobs at:
M-Kopa Solar
Deadline of this Job:
08 May 2022
Summary
Date Posted: Monday, April 25, 2022, Base Salary: Not Disclosed
JOB DETAILS:
Responsibilities
• Accelerate our transition to a modern data stack (dbt, Airflow, Python, Spark, Synapse, Kubernetes, Docker, and more).
• Design and collaborate on efficient data ingestion into our data warehouse from data sources including published events, data lakes, databases, and API integrations.
• Develop automation frameworks for data pipelines and analytics processes (see the sketch after this list).
• Set high standards of work, including security, infrastructure as code, and documentation.
• Integrate and maintain the infrastructure used in the data analytics workstream (data lakes, data warehouses, automation frameworks).
• Contribute to the design and implementation of a clear, concise data model.
• Contribute to efficient storage of our data in the data warehouse, identifying performance improvements from table and query redesign.
• Write quality ELT code with an eye towards performance and maintainability, and empower other analysts to use and contribute to ELT frameworks and tools.
• Improve the overall Data team’s workflow through knowledge sharing, proper documentation, and code review.
• Abstract logic into libraries and patterns of work that enable teams to build value from our data independently.
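The responsibilities above revolve around orchestrating ELT work with tools such as Airflow and dbt. As a minimal illustrative sketch only (the posting does not describe M-Kopa's actual pipelines), the Python below shows an Airflow DAG that runs a hypothetical ingestion script and then dbt transformations and tests; every identifier, path, and schedule in it is an assumption, not something stated in this vacancy.

# Minimal sketch of an Airflow DAG orchestrating an ELT flow with dbt.
# All names, paths, and the schedule are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="warehouse_elt",            # hypothetical DAG name
    start_date=datetime(2022, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Placeholder ingestion step: in practice this might load published
    # events, data-lake files, or API extracts into the warehouse.
    ingest = BashOperator(
        task_id="ingest_sources",
        bash_command="python /opt/pipelines/ingest.py",  # hypothetical script
    )

    # Run dbt models, then dbt tests, against the warehouse.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt",   # hypothetical project dir
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt",
    )

    ingest >> dbt_run >> dbt_test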
Qualifications
• You have a bachelor's degree in a relevant field.
• A minimum of 5 years’ professional experience in a similar field.
• You enjoy abstracting, generalizing, and creating efficient, scalable solutions where they are needed.
• You enjoy creating patterns and processes as well as solving presented problems.
• Strong foundation of software development best practices (collaborative development via git, testing, continuous integration, deployment pipelines and infrastructure as code).
• Strong SQL skills.
• Have experience with Python.
• Have experience deploying code to production via automated deployments on the cloud.
• Have experience working on associated platforms (Azure, AWS, etc.).
• Experience building ingestion and/or reporting from streaming datasets and event architectures.
• Experience with distributed compute tools such as Spark and Databricks.
• Experience with dbt, or a good basis from which to learn it.
• Experience with orchestration tools such as Airflow.
• Familiarity with using analytics or working with analytics teams.
• Experience with Kubernetes is not expected but is a plus.
• Experience with data visualization tools such as Power BI, Looker or Tableau.
Work Hours: 8
Experience in Months: 60
Level of Education: Bachelor Degree
Job application procedure
Please click here to apply.