Job Description
Purpose of the job
Develop, build, and maintain new data models and data pipelines that integrate various source systems. Manage the development of data resources and support new product launches. Ensure data quality across the product vertical and related business areas. Develop all integration activities within the Data Warehouse (DWH), covering the staging, DWH model, CLDM, and PL layers. Ensure that the data management, integration, and quality standards and processes of the data engineering workflow are met.
Duties and responsibilities
– Analyze new business integration needs, fully understand the requirements, and perform the technical analysis required to accommodate them.
– Perform the required development on the DWH model to meet the targeted business needs, including model structure updates, ETL/ELT load jobs, and PL layer job modifications.
– Implement the appropriate controls to maintain DWH quality per the standard specifications, and continuously update these quality control checks.
– Work closely with DWH BI analysts to ensure correct and consistent mapping criteria between source systems and DWH databases.
– Coordinate with data operations and system and database administrators to ensure effective performance monitoring and tuning.
– Optimize the data integration platform to maintain performance as data volumes grow.
– Build data pipelines, analyze and organize data, build data ingestion scripts/workflows, and perform quality checks on data.
– Drive integration projects from idea formulation through design and implementation, collaborating with partner teams.
– Participate in vendor technical requirements, gap analyses, and RFPs.
– Support technical decision-making on the Big Data ecosystem, data platform architecture, and relevant migration use cases.
– Collaborate with data scientists, BI analysts, modelers and IT team members on project goals.
– Monitor and optimize the performance of data platform systems and computing infrastructure.