PROFESSIONAL EXPERIENCE
BNP Paribas Asset Management 02/2023 to present
Data Engineer (Python, PySpark & IBM Cloud)
Assignment
Goals of the mission and achievements:
Migration of the entire system to the Cloud and rewriting of legacy R and Python code in PySpark
Python, Spark, Cloud:
• Rewriting of legacy code in PySpark and creation of a secured data pipeline from scratch to receive data from data providers in the Cloud landing zone
• Implementation of PySpark jobs for KPI calculations
• Support for data teams in the integration and consumption of data
• Enrichment of Data Platform mechanisms to standardize usage
• Improvement of automated jobs and the framework ingestion process
• Acceleration of the migration from the legacy DWH to the cloud
• Gain quantification and cost optimization
• Organization of a data catalog: business domains, terminology, tags, data quality/lifecycle
• Security and fine-grained access to data
• Design of architectures for inter-application exchanges
• POC of different innovative solutions
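The KPI jobs above follow a standard group-and-aggregate shape. A minimal sketch of that shape, using Pandas (also part of the stack) with invented column names — the actual jobs were PySpark running against Cloud data:

```python
import pandas as pd

# Hypothetical daily positions feed; column names are illustrative only.
positions = pd.DataFrame({
    "portfolio": ["P1", "P1", "P2", "P2"],
    "asset": ["A", "B", "A", "C"],
    "market_value": [100.0, 300.0, 250.0, 250.0],
})

# Example KPIs: total AUM and number of assets per portfolio.
kpis = (
    positions.groupby("portfolio")
    .agg(aum=("market_value", "sum"), n_assets=("asset", "count"))
    .reset_index()
)
print(kpis)
```

In PySpark the same aggregation reads `df.groupBy("portfolio").agg(F.sum("market_value"), F.count("asset"))`.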
Team composition and technical environment:
• PySpark, Python, Pandas
• IBM Cloud (Code pipeline...)
• Agile method, sprints in Jira
• GitHub
Cora & LDIT 12/2021 to 02/2023
Data Engineer (Python, PySpark, Snowflake, AWS, Stambia ELT & Tableau)
Assignment
Goals of the mission and achievements:
• Development of Python/PySpark data flows for the Louis Delhaize holding company, with activities in 4 countries
• Support in structuring the Cora loyalty activity.
• Participation in the construction of the GDPR compliance framework
Python, Spark, AWS:
• Creation of a secured data pipeline from scratch to receive data from the brands in the Cloud landing zone while complying with GDPR rules
• Encryption and decryption of files by AWS Lambda functions developed in Python (validated by the legal team and the Cora IT Cloud architects)
• Development of AWS Lambda functions to ensure proper organization of data in the data lake
• Implementation of PySpark jobs for KPI calculations
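The Lambda that organizes incoming files reduces to a pure key-mapping function. A sketch under an assumed landing-zone naming convention (`<brand>_<YYYYMMDD>_<name>.csv` — hypothetical, not the actual Cora convention); in the real handler the computed destination would feed an S3 copy:

```python
import re

# Hypothetical landing-zone convention: <brand>_<YYYYMMDD>_<name>.csv
LANDING_RE = re.compile(r"^(?P<brand>[a-z]+)_(?P<date>\d{8})_(?P<name>.+\.csv)$")

def target_key(source_key: str) -> str:
    """Map a landing-zone object key to a date-partitioned datalake key."""
    m = LANDING_RE.match(source_key)
    if not m:
        raise ValueError(f"unexpected key: {source_key}")
    d = m.group("date")
    return (f"datalake/brand={m.group('brand')}/"
            f"year={d[:4]}/month={d[4:6]}/day={d[6:]}/{m.group('name')}")

def handler(event, context):
    # Shape of an S3-trigger Lambda: compute the destination for each new
    # object; the actual copy (boto3 s3.copy_object) is omitted here.
    return [target_key(r["s3"]["object"]["key"]) for r in event["Records"]]
```

For example, `target_key("cora_20220115_sales.csv")` yields `datalake/brand=cora/year=2022/month=01/day=15/sales.csv`, so downstream PySpark jobs can prune partitions by date.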
Snowflake, Stambia & Tableau:
• Creation of SQL scripts for data validation
• Monitoring of SQL database operations on Snowflake
• Creation of data pipelines from S3 buckets to Snowflake tables
• Creation of Stambia mappings and processes for different purposes
• Creation of dashboards in Tableau following chart design rules
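The validation scripts above boil down to a few recurring SQL checks (row counts, nulls, duplicates). A sketch of that pattern on an in-memory SQLite database — the real scripts ran the same kind of queries on Snowflake, and the table and column names here are invented:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE loyalty (customer_id TEXT, points INTEGER);
    INSERT INTO loyalty VALUES ('c1', 10), ('c2', NULL), ('c2', 5);
""")

# Typical validation checks run after each load.
checks = {
    "row_count": "SELECT COUNT(*) FROM loyalty",
    "null_points": "SELECT COUNT(*) FROM loyalty WHERE points IS NULL",
    "dup_customers": ("SELECT COUNT(*) FROM (SELECT customer_id FROM loyalty "
                      "GROUP BY customer_id HAVING COUNT(*) > 1)"),
}
results = {name: con.execute(sql).fetchone()[0] for name, sql in checks.items()}
print(results)  # flag the load if null/duplicate counts are non-zero
```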
Team composition and technical environment:
• PySpark, Python, Pandas
• AWS services (Lambda, EMR, EC2, ECR, CodeDeploy, CodeBuild, CodePipeline...)
• Stambia ELT
• Snowflake, DB2, Oracle
• Tableau Software
• Agile method, sprints in Jira
Bonduelle 04/2020 to 11/2021
Stambia ELT, database and AWS developer
Assignment
Goals of the mission:
Development of new Stambia flows to support the migration from an Oracle database to Redshift (AWS)
Actions:
• Analysis of needs and issues with IT teams
• Creation of Stambia mappings and processes for the migration to Redshift (AWS)
• Unit tests
• Third-party application maintenance (TMA) monitoring and ticket handling
• Verification of the daily production run, with technical and/or functional fixes
• Optimization of flows dealing with large volumes (ETL partitioning, restructuring of mappings, creation of indexes)
• Organization of meetings, technical guild with the different members of the team
Results:
• Creation of test datasets
• Development of integration steps and Integration tests
• Improvement of the mapping process for better system management
Team composition and technical environment:
Stambia ELT, Oracle 11g, Redshift, S3, Qlik Sense
Cardif – BNP Paribas 08/2018 to 03/2020
Informatica PowerCenter developer
Assignment
Goals of the mission:
The overall goal of the project was the harmonization of data coming from the Cardif PMS so that it could be used by the accounting, finance and actuarial departments. The data had to comply with new standards such as Solvency II, IFRS 17, etc.
Achievements:
• Data Quality management and improvement
• Indicators calculation for actuarial science department
• Balance calculation
• Document writing, project specification improvement and functional analysis
• Development testing and scenario analysis
• Setup of ETL processes, Mappings and workflows development
• Mapping development for KPIs calculation and management rules implementation
• Mappings and workflows development for feeding the facts tables
Results:
• Production of several Informatica mappings with SQL optimization under Informatica PowerCenter
• Significant functional changes to management rules having an impact on processes
• Implementation of new test scenarios
Team composition:
• 8 Informatica ETL developers (including experts and a tech lead),
• 6 Qlik Sense and PL/SQL developers (including experts and a tech lead),
• 3 to 4 PL/SQL experts,
• 4 JAVA developers,
• 2 DBA,
• 4 test developers
• 3 to 4 tech leads (database, KSH, PL/SQL, DBWatcher, Control-M)
Technical/functional environment / methodologies:
• Informatica PowerCenter v10, Oracle 12c
• Languages: SQL, PL/SQL, Java
• Tools: Informatica, SQL Developer, Qlik Sense, PuTTY, WinSCP, ALM, PowerAMC
• Integration: Jenkins
• Database: Oracle
• System: UNIX
• Methodology: Agile & V-model
• Architecture: Cardif
CHANEL 2014 – 07/2017
Manufacturing and control technician
ASSIGNMENT :
To cope with the increase in production and the launch of new products, Chanel reinforced its teams.
• Production monitoring
• Management of the production of samples and verification of their conformity
• Participation in the optimization and improvement of the production process (production protocol and cleaning)
• Management of risk analysis and communication of anomalies
• Modification and improvement of production processes
• Improvement of control and monitoring processes
• Participation in work meetings
Technical environment: Industrial production tools, Dedicated software, SAP
CAMFIL 2011 – 08/2014
Production and quality control technician
Camfil is a company producing industrial filters for the nuclear industry, hospitals, pharmaceutical industry, etc.
ASSIGNMENT :
Support on monitoring the production and quality of filters for nuclear activity.
• Monitoring the quality of production
• Analysis of samples and verification of conformity
• Participation in the development of new products (pilots)
• Participation in the implementation of analysis protocols
• Optimization of production processes
• Improvement of production processes and quality control protocols
Technical environment: Industrial production and quality control tool, Software dedicated to production and quality monitoring