Vacancy title:
Data Engineer
Jobs at:
Sama
Deadline of this Job:
28 October 2022
Summary
Date Posted: Friday, October 14, 2022, Base Salary: Not Disclosed
JOB DETAILS:
About the Job:
• As a Data Engineer at Sama, you will build and optimize data architectures and data pipelines, and build and maintain ETL and ELT data flows for different cross-functional teams. You will support our data analysts, data scientists, and other stakeholders on data initiatives. Your primary goal is to ensure optimal and consistent data availability, data quality, and data delivery architecture.
Key Responsibilities:
• Create and maintain optimal data pipeline architectures that serve key business stakeholders
• Assemble large, complex data sets that meet business requirements for different stakeholders and teams.
• Build and maintain the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources.
• Develop and maintain a data catalog of data sets, scripts, tools and pipelines as part of documentation.
• Work with stakeholders to identify their data needs and provide consistent data availability and quality to meet those needs.
• Work with business analysts to build ETL pipelines that serve various areas of the business.
• Identify any bottlenecks or challenges in the current data pipelining approaches and suggest areas of improvement.
• Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
• Build analytics tools that utilize the data pipeline to provide actionable insights on key business metrics
• Maintain the daily relationship with stakeholders to understand their data needs and communicate results intuitively.
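The extract-transform-load work described above can be sketched in a minimal form. This is an illustration only, not Sama's actual stack or schema: the table, field names, and in-memory SQLite warehouse are all hypothetical stand-ins.

```python
import sqlite3

def extract(rows):
    """Extract: in practice this would read from a source system (API, file, or database)."""
    return rows

def transform(rows):
    """Transform: normalize names, cast types, and drop records missing a required field."""
    return [
        {"name": r["name"].strip().title(), "score": float(r["score"])}
        for r in rows
        if r.get("name") and r.get("score") is not None
    ]

def load(rows, conn):
    """Load: write cleaned records into a warehouse table (SQLite stands in here)."""
    conn.execute("CREATE TABLE IF NOT EXISTS metrics (name TEXT, score REAL)")
    conn.executemany("INSERT INTO metrics (name, score) VALUES (:name, :score)", rows)
    conn.commit()

source = [
    {"name": "  alice ", "score": "91.5"},
    {"name": "bob", "score": "78"},
    {"name": "", "score": "50"},  # dropped by transform: missing required field
]

conn = sqlite3.connect(":memory:")
load(transform(extract(source)), conn)
loaded = conn.execute("SELECT name, score FROM metrics ORDER BY name").fetchall()
# loaded == [('Alice', 91.5), ('Bob', 78.0)]
```

In a production pipeline each stage would be a separate, monitored step (orchestrated by a scheduler), but the extract → transform → load shape is the same.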
Minimum Qualifications:
• Advanced working knowledge of SQL.
• Experience with Google Cloud Platform and its services.
• Experience working with relational and non-relational databases such as BigQuery, PostgreSQL, and AWS database services.
• Working knowledge of data pipelining tools such as Hevo.
• Experience with transformation tools such as Dataform and dbt (data build tool).
• Experience with object-oriented and functional scripting languages: Python, JavaScript, Java, C++.
• Experience working on CI/CD processes and source control tools such as GitHub.
• A successful history of manipulating, processing and extracting value from large disconnected datasets.
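The SQL and transformation-tool skills above center on queries of the kind a Dataform or dbt model would contain. A minimal sketch, run here against SQLite purely for illustration (the `orders` table and its columns are invented for the example):

```python
import sqlite3

# Hypothetical raw data: one row per order, as it might land from a source system.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (customer TEXT, amount REAL);
INSERT INTO orders VALUES ('acme', 100.0), ('acme', 50.0), ('globex', 75.0);
""")

# Aggregate raw orders into a per-customer summary table, the kind of
# transformation a dbt/Dataform model would define declaratively in SQL.
summary = conn.execute("""
    SELECT customer, COUNT(*) AS n_orders, SUM(amount) AS total
    FROM orders
    GROUP BY customer
    ORDER BY customer
""").fetchall()
# summary == [('acme', 2, 150.0), ('globex', 1, 75.0)]
```

Tools like dbt version-control such SQL models and run them against the warehouse (e.g., BigQuery), which is also where the CI/CD and GitHub experience listed above comes in.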
Preferred Qualifications:
• Outstanding communication and collaboration skills, and the ability to stay self-motivated and work with little or no supervision.
• Excellent time management and organizational abilities.
Education Requirement: No Requirements
Job Experience: No Requirements
Work Hours: 8
Job application procedure
• Interested and qualified? Click here to apply