Job Summary
- Minimum Qualification: Bachelor
- Experience Level: Mid level
- Experience Length: 5 years
Job Description/Requirements
Duties and Responsibilities:
- Design and deploy an end-to-end data pipeline system that centralizes and processes large volumes of structured and unstructured data from various sources.
- Develop user-friendly interfaces that enable users to easily retrieve relevant information about products and customers.
- Collaborate with data scientists, data analysts, and other stakeholders to understand data requirements and design data solutions that meet their needs.
- Design and implement efficient data extraction, transformation, and loading (ETL) processes to populate the pipeline with data from various sources.
- Build and maintain robust data pipelines that ensure data is accurate, up-to-date, and easily accessible.
- Develop and maintain data models, data schemas, and data dictionaries.
- Use APIs, batch exports, and SQL queries to extract data from various sources and integrate it into a SQL database (a minimal illustrative sketch follows this list).
- Perform data cleaning, data transformation, and data integration tasks to ensure data quality and consistency.
- Collaborate with data analysts, data scientists, and other stakeholders to ensure data is processed and analyzed effectively.
- Monitor and optimize data pipelines to ensure they are performing efficiently.
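To give candidates a concrete sense of the extraction-and-integration duty above, the sketch below shows one minimal ETL step in Python. The API endpoint, field names, and table schema are hypothetical, and a production pipeline would add pagination, retries, scheduling, and monitoring.

```python
# Minimal ETL sketch: pull records from a hypothetical REST API,
# apply basic cleaning, and load them into a SQL staging table.
import sqlite3

import requests

API_URL = "https://api.example.com/v1/customers"  # hypothetical endpoint


def extract(url: str) -> list[dict]:
    """Fetch one batch of records as a list of dicts."""
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    return response.json()


def transform(records: list[dict]) -> list[tuple]:
    """Drop incomplete rows and normalize fields before loading."""
    rows = []
    for rec in records:
        if not rec.get("id") or not rec.get("email"):
            continue  # skip rows missing required keys
        rows.append((rec["id"], rec["email"].strip().lower(), rec.get("product")))
    return rows


def load(rows: list[tuple], db_path: str = "warehouse.db") -> None:
    """Upsert cleaned rows into the staging table."""
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS customers "
            "(id INTEGER PRIMARY KEY, email TEXT, product TEXT)"
        )
        conn.executemany(
            "INSERT OR REPLACE INTO customers (id, email, product) VALUES (?, ?, ?)",
            rows,
        )


if __name__ == "__main__":
    load(transform(extract(API_URL)))
```

In practice, work of this kind would typically run under an orchestrator (for example, Airflow or a cron-based scheduler) against a managed warehouse rather than SQLite.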
Success in this role will be measured by delivery of the following within the first few months:
- Successful deployment of the end-to-end data pipeline system, including system implementation, ETL processes, and data handling capabilities.
- Data accessibility and usability, measured by the ease of use of the interfaces and the speed of retrieving relevant information.
- Data quality and consistency, monitored through data accuracy, completeness, consistency, and integrity.
- Collaboration and stakeholder satisfaction, measured by stakeholder feedback on the effectiveness of data solutions and by maintaining positive working relationships.
Skills and Experience:
- Bachelor’s degree in Computer Science, Information Systems, or a related field.
- At least 5 years of experience in designing and deploying end-to-end data pipelines.
- Strong knowledge of SQL, ETL tools, and data warehousing concepts.
- Experience working with APIs, batch exports, and SQL queries to extract data from various sources.
- Experience with cloud computing platforms such as AWS, Azure, or Google Cloud Platform.
- Strong data analysis and problem-solving skills.
- Experience working with Microsoft Dynamics, open-source data systems like KOBO, and call center platforms would be an added advantage.
- Excellent communication skills and ability to work in a team environment.