BRCA20012

Data Integrator – Senior

Location:

Brussels area

Category:

IT Database

Project description:

The customer is currently looking for a Data Integrator to help the current team adapt to the rising demands of the business.
Based on the defined Data Architecture, the Data Integrator develops and documents ETL solutions, which translate complex business data into usable physical data models. These solutions support enterprise information management, business intelligence, machine learning, data science, and other business interests.
The Data Integrator applies ETL (“Extract, Transform, Load”): extracting business data, transforming it, and loading it into a data warehousing environment, then testing it for performance and troubleshooting it before it goes live. The Data Integrator is responsible for developing the physical data models designed to improve efficiency and outputs: the Data Vault and Data Marts, the Operational Data Store (ODS), and the Data Lakes on the target platforms (SQL/NoSQL).
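Purely as an illustration of the extract–transform–load cycle described above (not part of the vacancy): the sketch below uses Python and in-memory SQLite with hypothetical table and column names, whereas the actual environment uses DataStage against DB2/Netezza.

```python
import sqlite3

# Hypothetical source system and warehouse, for illustration only.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (id INTEGER, amount_cents INTEGER, country TEXT)")
src.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(1, 1250, "be"), (2, 990, "fr"), (3, None, "be")])

dwh = sqlite3.connect(":memory:")
dwh.execute("CREATE TABLE fact_orders (order_id INTEGER PRIMARY KEY, amount REAL, country TEXT)")

# Extract: read the raw records from the source.
rows = src.execute("SELECT id, amount_cents, country FROM orders").fetchall()

# Transform: drop incomplete records, convert cents to units, normalise country codes.
clean = [(oid, cents / 100.0, country.upper())
         for oid, cents, country in rows if cents is not None]

# Load: write the cleaned records into the warehouse layer.
dwh.executemany("INSERT INTO fact_orders VALUES (?, ?, ?)", clean)
dwh.commit()

loaded = dwh.execute("SELECT COUNT(*) FROM fact_orders").fetchone()[0]
print(loaded)  # 2: one record was rejected by the quality filter
```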


Responsibilities:

Primary Tasks and responsibilities
Design, implement, maintain and extend physical data models and data pipelines:
• Develop and document ETL solutions that translate complex business data into usable physical data models (Data Vault, Data Marts, ODS, Data Lakes), as described in the project description above.
• Analyse the Data Integration Needs: Data Integrators analyse the technical requirements behind the business's data integration needs.
• Design, Create and Maintain ETL Solutions: Data Integrators work closely with the Data Modelers to create optimal physical data models. The Data Integrator implements data models that translate complex business data into usable systems (ODS, Data Vault, Data Marts). Once the database structures are deployed, the Data Integrator extracts the necessary data, transforms and loads it to the new layer.
• Propose and Follow Data Integration Standards and Best Practices: Data Integrators propose and follow data integration standards, tools and best practices.
• Ensure that the solution is scalable, performant, maintainable, correct, and of high quality.
• Ensure that the solution is fully documented.

Ensure Security, Testing and Support:
• The Data Integrator ensures the security and testing of new solutions, and provides the necessary support.
Tasks:
• Ensure Security: Data Integrators manage security to protect the data in the Data Vault and Data Marts, as well as the underlying IT infrastructure.
• Ensure Testing: at the end of ETL development, Data Integrators test their data integration to ensure it meets the business requirements (quality, coherence, performance). They fix any problems that arise and guarantee the availability of the data (assuring its proper maintenance and functioning).
• Ensure Support: Data Integrators provide support for client requests, incidents, and problems.
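The kind of post-load quality checks described above could, for illustration only, look like the sketch below; the table, rules, and expected counts are hypothetical:

```python
import sqlite3

# Hypothetical data mart populated by an earlier ETL run.
dwh = sqlite3.connect(":memory:")
dwh.execute("CREATE TABLE dm_sales (sale_id INTEGER PRIMARY KEY, amount REAL, region TEXT)")
dwh.executemany("INSERT INTO dm_sales VALUES (?, ?, ?)",
                [(1, 120.0, "EU"), (2, 80.5, "EU"), (3, 15.0, "US")])

def check_not_empty(conn, table):
    """Quality check: the target received at least one row."""
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0] > 0

def check_no_nulls(conn, table, column):
    """Quality check: a mandatory column contains no NULLs."""
    bad = conn.execute(f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL").fetchone()[0]
    return bad == 0

def check_row_count_matches(conn, table, expected):
    """Coherence check: target row count equals what the source delivered."""
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0] == expected

results = {
    "not_empty": check_not_empty(dwh, "dm_sales"),
    "no_null_amounts": check_no_nulls(dwh, "dm_sales", "amount"),
    "row_count": check_row_count_matches(dwh, "dm_sales", 3),
}
print(results)  # all checks pass for this load
```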


Technical skills:

Technical profile requirements
• 5+ years of experience in a similar role
• Expert knowledge of ETL tools and development (IBM InfoSphere DataStage)
• Expert knowledge of Database Platforms (IBM DB2 LUW & Netezza)
• Expert knowledge of SQL Databases (DDL)
• Expert knowledge of SQL and PL/SQL (advanced querying, query optimisation, creation of stored procedures)
• Good knowledge of Data Modeling Principles / Methods including Conceptual, Logical & Physical Data Models, Data Vault, Dimensional Modelling
• Good knowledge of Test Principles (test scenarios, test use cases, and testing)
• Good knowledge of ITIL
• Good knowledge of OLAP
• Good knowledge of NoSQL databases
• Good knowledge of Hadoop components: HDFS, Spark, HBase, Hive, Sqoop
• Good knowledge of Big Data
• Good knowledge of Data Science / Machine Learning / Artificial Intelligence
• Good knowledge of Data Reporting Tools (TABLEAU)
• Very good knowledge of Atlassian Suite: JIRA / CONFLUENCE / BITBUCKET
• Very good knowledge of TWS (IBM Tivoli Workload Scheduler)
• Very good knowledge of UNIX Scripting (KSH / SH / PERL / PYTHON)


Methodology/Certification requirements
- Bachelor's degree or equivalent through experience.
- Expert knowledge of Agile methodology


Soft skills:

Non-Technical profile requirements
- Customer satisfaction oriented
- Good analytical and problem-solving skills
- Must be able to work on multiple simultaneous tasks with limited supervision
- Good interpersonal, communication and team collaboration skills
- Able to follow change management procedures and internal guidelines
- Good people management, coaching/training skills

Language proficiencies
• FR and ENG are a must
• NL is an advantage

Contact person:

Contact name: Yves Cambron