Function: DevOps & Data Engineer
Client : Électricité De France - Paris
10/2018 -
Project: DevOps Big Data
Mission: Set up a DevOps tool for Big Data applications
The goal of the mission is to set up a DevOps deployment system for Big Data processing jobs, then to
deploy the solution across all EDF units, with support and project training.
The mission within the project consists of:
 Create, industrialize and automate Big Data infrastructure projects as Infrastructure as Code
 Set up CI/CD pipelines on projects, with tests, validation and security checks
 Propose and implement architecture evolutions in conjunction with the architects
 Guarantee the security of project data by implementing the right protection mechanisms
 Participate in studies, costing and scoping together with the architects and project managers
 Take charge of all BUILD and RUN aspects ("You build it, you run it")
Environment: Ansible, Jenkins, Git, Groovy, Docker, Kubernetes, Ambari, Ranger, Kerberos, Cloudera, Knox,
Spark, Hive, Linux, YAML, Python, Scala, Java, Shell
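As an illustration of the pipeline stages listed above (tests, validation, security checks, deployment), a minimal sketch in Python — the stage names and commands are hypothetical placeholders; the actual pipeline was built with Jenkins and Ansible:

```python
import subprocess

# Hypothetical ordered CI/CD stages. In the real project these were
# Jenkins pipeline stages invoking Ansible playbooks; echo commands
# stand in for the actual tools here.
STAGES = [
    ("unit tests", ["echo", "running unit tests"]),
    ("config validation", ["echo", "validating YAML inventories"]),
    ("security scan", ["echo", "scanning dependencies"]),
    ("deploy", ["echo", "deploying via ansible-playbook"]),
]

def run_pipeline(stages=STAGES):
    """Run each stage in order; stop at the first failure."""
    completed = []
    for name, cmd in stages:
        result = subprocess.run(cmd, capture_output=True, text=True)
        if result.returncode != 0:
            return completed, name  # name of the failed stage
        completed.append(name)
    return completed, None  # all stages passed

if __name__ == "__main__":
    done, failed = run_pipeline()
    print(done, failed)
```

The key property this sketch shows is fail-fast ordering: a deploy stage only runs once tests, validation and the security scan have all succeeded.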
Function: Data & Devops Engineer
Client : Crédit Agricole - CACEIS - Paris
5/2018 - 10/2018
Project: Set up a Data-Lake
Mission: Architect, Hadoop Administrator, Data-Lake, DevOps
The objective of the mission is to define the architecture of the new data lake, then to implement and
manage the Hortonworks Big Data infrastructure, and to put in place a security and data governance
policy.
The mission within the project consists of:
 Definition of the technical architecture
 Setting up the technical infrastructure
 Installation and configuration of the software components: Hortonworks, Talend Big Data, Vertica, Attunity
 Connecting all the components and making them work together
 Hortonworks Hadoop administration
 Configuration and customization of Big Data security
 Setting up the CI/CD installation pipeline with tests, validation and security checks
 Automation of the installation of infrastructure components with Ansible
Environment: Hortonworks, Ambari, Ranger, Ansible, Jenkins, Kerberos, Talend Big Data, Attunity Replicate,
Kafka, Vertica, Spark, Hive, Linux, Shell
Function: Big Data Engineer
Client : Société Générale - Paris
5/2017 - 5/2019
Project: Compliance
Mission: Data Analyst / Big Data Developer
The client has global responsibility for all IT tools for market abuse control, compliance and operational risk. Beyond
this direct responsibility, it may also steer projects executed in other related entities.
The project objective is the implementation of solutions based on the Hadoop stack: Hortonworks distribution,
Parquet storage, denormalization in Hive, Solr indexing, Spark/Java for distributed processing, Oozie and
Spring, and data visualization with Tableau Software.
The mission within the project consists of:
 Analysis and design of the Big Data solution to put in place
 Loading data into the data lake
 Spark/Java development of new detection patterns
 Reporting via a web interface or Tableau Software
 Development and calculation of KPIs/KRIs
 Indexing data with Solr
Environment: Hadoop Hortonworks, Java, Python, Spark, Hive, Hue, Kerberos, Parquet, ElasticSearch, Solr, Linux Shell, REST web services
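To give a flavour of the detection-pattern work described above, a deliberately simplified sketch in plain Python rather than the project's Spark/Java — the order fields, the cancellation-ratio rule and the threshold are all illustrative assumptions, not the client's real compliance logic:

```python
from collections import defaultdict

def cancellation_ratio_alerts(orders, threshold=0.8):
    """Flag accounts whose order-cancellation ratio exceeds a threshold.

    A toy stand-in for a distributed detection pattern: in the real
    project this kind of aggregation ran as Spark/Java jobs over
    Parquet data in Hive.
    """
    placed = defaultdict(int)
    cancelled = defaultdict(int)
    for order in orders:
        placed[order["account"]] += 1
        if order["status"] == "CANCELLED":
            cancelled[order["account"]] += 1
    # Alert on accounts with a suspiciously high cancellation ratio.
    return sorted(
        acct for acct in placed
        if cancelled[acct] / placed[acct] > threshold
    )

orders = [
    {"account": "A", "status": "CANCELLED"},
    {"account": "A", "status": "CANCELLED"},
    {"account": "A", "status": "FILLED"},
    {"account": "B", "status": "FILLED"},
]
print(cancellation_ratio_alerts(orders, threshold=0.5))  # account A: 2/3 > 0.5
```

The same group-then-filter shape maps directly onto a Spark `groupBy`/aggregation followed by a filter, which is why it scales to the distributed setting.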
Function: Data Engineer & Data Scientist
Certificate Data Science - University Paris Dauphine - Paris
3/2016 - 6/2016
Project: Amazon open data project
Mission: Data exploration and sentiment analysis (opinions of Amazon customers)
 Development of several algorithms on Apache Spark using the functional language Scala, and Python
 Implementation of the algorithms in R with RStudio
 Loading the data using Pig/Hive/Impala on a Cloudera Hadoop distribution
 Development of a MapReduce program in Java to compute statistics on the dataset
 Construction of the feature set used for classification, using text-mining methods
 Proposal and use of a set of supervised learning techniques (decision trees, random forests, SVM, naive Bayes) to build a sentiment analysis model
 Proposal of a set of metrics to compare the performance of these methods (accuracy, precision, recall, F-measure, ROC, AUC)
 Computations were carried out on a cluster of 10 servers, each with 40 GB of RAM; the total secondary storage capacity of the cluster reached 10 TB
Environment: Hadoop Cloudera, Spark, Yarn, MapReduce, Python, Java, Hive, Pig, Impala, Sqoop, R, RStudio, Machine Learning, Scala, ElasticSearch, Neo4j
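The supervised sentiment-analysis step can be sketched with a minimal multinomial naive Bayes in pure Python — toy reviews and Laplace smoothing only; the project's real models were built on Spark (Scala/Python) and in R, with text-mining features:

```python
import math
from collections import Counter

def train(docs):
    """Count word and label frequencies from (text, label) pairs."""
    word_counts = {"pos": Counter(), "neg": Counter()}
    label_counts = Counter()
    for text, label in docs:
        label_counts[label] += 1
        word_counts[label].update(text.lower().split())
    vocab = set()
    for counter in word_counts.values():
        vocab |= set(counter)
    return word_counts, label_counts, vocab

def predict(model, text):
    """Pick the label maximizing log P(label) + sum log P(word|label)."""
    word_counts, label_counts, vocab = model
    total = sum(label_counts.values())
    best, best_lp = None, float("-inf")
    for label in label_counts:
        lp = math.log(label_counts[label] / total)
        # Laplace smoothing: add 1 to each word count.
        denom = sum(word_counts[label].values()) + len(vocab)
        for word in text.lower().split():
            lp += math.log((word_counts[label][word] + 1) / denom)
        if lp > best_lp:
            best, best_lp = label, lp
    return best

docs = [
    ("great product loved it", "pos"),
    ("excellent quality great value", "pos"),
    ("terrible waste of money", "neg"),
    ("broken on arrival terrible", "neg"),
]
model = train(docs)
print(predict(model, "great value"))       # expected: pos
print(predict(model, "terrible product"))  # expected: neg
```

The evaluation metrics listed above (accuracy, precision, recall, F-measure) would then be computed by comparing such predictions against held-out labels.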
Function: Data engineer
Électricité de France - Paris
3/2015 - 5/2017
Project: Hadoop projects
Mission: Big Data architecture and development
For the projects of the "EDF data warehouses" department, the role involved conducting many architecture
studies and then implementing them within operational projects.
For example: setting up Kafka/Spark as a streaming ingestion component, and implementing the ELK stack
for log ingestion and reporting.
The mission within the project consists of:
 Carrying out Big Data architecture studies in support of the department's lead architect
 Carrying out targeted development work to implement these architectures
Projects carried out:
 Connection of a new CRM (MY) to the Big Data platform, 360° customer view: real time with Kafka and
Spark Streaming, storage and indexing in HBase and ElasticSearch, customer matching with Comscore and
WebAnalytics
 Creation of a repository of customers eligible for electronic invoicing (AIFE Chorus): Spark batch,
Apache Nifi, HBase, Java, XML
 Development of several Hive and Spark batch jobs to integrate data from several applications into the
Hadoop platform (Linky, GazPar, Comscore…)
Environment: Hadoop Hortonworks, HDFS, Spark, Yarn, MapReduce, Hive, Pig, Hoot, Sqoop, RStudio, ElasticSearch, HBase, Linux, Kerberos, Ranger, Python, Java, Scala
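A simplified illustration of the 360° customer view pattern from the first project above — plain Python standing in for the Kafka + Spark Streaming + HBase pipeline, with invented field names: per-customer events are folded into a single, most-recent view:

```python
def build_customer_view(events):
    """Fold a stream of per-customer events into one merged view each.

    Toy stand-in for the real pipeline, where events arrived on Kafka,
    were processed by Spark Streaming, and the merged views were stored
    and indexed in HBase/ElasticSearch. Field names are illustrative.
    """
    views = {}
    for event in events:
        cid = event["customer_id"]
        view = views.setdefault(cid, {"customer_id": cid})
        # Later events overwrite earlier attributes, keeping the view current.
        for key, value in event.items():
            if key != "customer_id":
                view[key] = value
    return views

events = [
    {"customer_id": "c1", "email": "c1@example.com"},
    {"customer_id": "c1", "contract": "elec-basic"},
    {"customer_id": "c2", "email": "c2@example.com"},
    {"customer_id": "c1", "contract": "elec-plus"},  # latest update wins
]
views = build_customer_view(events)
print(views["c1"])
```

The last-write-wins merge per key is what makes the view idempotent to replay, a useful property when the same events can be re-consumed from Kafka.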
Function: Technical Engineer
ACCENTURE- ERDF (Electricity Network Distribution of France) - Paris
1/2012 - 3/2016
Project: SOA portal
Mission: Technical Lead and SOA Architect
 Participation in setting up the project's technical architecture, in collaboration with the Basis,
Architecture and Interfaces teams
 Participation in drafting and updating the technical architecture document (DAT)
 Configuration of WebDispatcher (load balancing) to improve performance
 Installation of the environments (dev, test, preprod, prod): Unix, Oracle, RAC, SAP…
 Administration of the WAS application servers
 Administration of the development environment
 Drafting of the detailed functional specifications (SFD), standards and unit tests (TU)
 Expert trainer (2 developers) on Java/J2EE technologies
 Leading several onshore and offshore developers in a 4-layer SOA context
 Definition of project best practices
 Object-oriented design and development of microservices in Java
Context:
The project infrastructure is based on the structuring principles of high availability and replication:
fault tolerance on a primary site and a disaster recovery plan on a backup site. To this end, the
infrastructure relies on two rooms for the primary site and one room on a remote backup site.
Environment: Java, SOA, microservices, Unix, Oracle, VMware, WebDispatcher, SAP, Visual Composer, LoginModule
Function: Development Engineer
ACCENTURE - ERDF (Electricity Network Distribution of France) - Paris
4/2011 - 12/2011
Project: Processing of metering data and publication flows (IT and Telecommunications Department)
Mission: Technical Lead
 Supervision of the development team
 Participation in optimizing the application's technical architecture
 Analysis, object-oriented design and definition of the technical specifications in UML
 Java development
Context:
The project is based on a 3-tier architecture, a standard thin-client web application using:
 Apache (web server)
 WebLogic Server (application server / middleware)
 Oracle DBMS (data persistence)
 CFT, FTP and MQSeries-JMS for exchanges with partner information systems
The Apache server and the WebLogic domain are hosted on the same physical machine, the WebLogic domain
consisting of two instances:
 WLS-01 for the presentation layer
 A WLS-02 / WLS-03 cluster for the business services layer
The Oracle database is hosted on a separate machine, with the database files stored on an external disk array.
Functional areas of the information system:
 Management of producer contracts
 Processing of metering data (control, correction, validation and energy calculations)
 Publication of flows
 Invoicing of the components of access to the public distribution network, and collections
Environment: Java, SAP, Weblogic 10, PowerDesigner, Enterprise Architect, Spring, Spring Batch,
Ant, Linux
Function: Studies and Development Engineer
ACCENTURE - France Telecom - Orange Business Services - Paris
1/2011 - 3/2011
Project: IT department - development of a near-real-time web portal for monitoring an IP VPN network
(IP VPN Dashboard Monitoring), in Agile mode
Mission: Technical referent
 Analysis and design in UML
 Selection of the tasks/modules to deliver at the beginning of each sprint
 Delivery of the selected modules at the end of each sprint
 Immediate notification...