CV / Freelance Hadoop expert assignments


Sample assignments from Ali,
Hadoop expert based in Hauts-de-Seine (92)

Professional experience

[November 2018-2023] Cloud and data architect: data platform at Carrefour (Phenix core team)
As a member of the core and ops team of the Phenix project, I held the following roles:
Data and software architect:
▪ Design and industrialization of Cloud Composer (Airflow) to deliver batch workflows:
o Migrating batch workflows from Azkaban to Composer.
o Implementing the CI pipeline with Jenkins/k8s/Pants and the CD pipeline with a custom Kubernetes
operator to deploy DAGs.
▪ Design and industrialization of Spring Boot to develop REST APIs:
o Implementing the API security layer (JWT and LdapWs).
o Implementing a spring-data-bigtable module from scratch.
o Implementing a spring-data-bigquery component to write Avro data into BigQuery.
o Implementing an Avro SerDe converter in Spring and a Maven plugin to download Avro schemas
from a custom schema manager.
o Industrializing the use of several spring-data backends.
o Industrializing the use of Testcontainers for integration tests.
o Mentoring all data teams (60 developers) through the migration from Scalatra to Spring Boot and the
adoption of DDD and hexagonal architecture.
▪ Kafka expert: administration, monitoring, security, etc.
▪ Implementing a Kafka Bigtable sink connector and custom Connect transformers in Scala (a minimal
transformer is sketched after the technology list below).
▪ Implementing a streaming-pipeline-adaptor in Golang using the Sarama library, to migrate real-time data
pipelines from a custom Avro serialization to the Confluent serialization, enabling Schema Registry and
standardizing Avro messages (headers/key/value) across data pipelines.
▪ Leading a project to migrate Kafka clusters running on IaaS to Strimzi-managed Kafka clusters.
▪ Industrializing AKHQ to monitor Kafka topics, consumer groups, etc.
▪ Design and industrialization of Spark on GKE:
o Industrializing the use of the Spark operator at Carrefour.
o Contributing to Google's spark-on-k8s-operator GitHub repository to run Spark jobs
on Kubernetes and industrializing the operator (in Golang):
o PR#952: Filtering custom resources on specific labels to allow running multiple operator instances
on GKE.
o PR#935: Exposing container ports for Prometheus scraping.
o PR#914: Supporting ingress configuration in the CRD to expose the Spark UI on private networks.
o Extending the operator webhook to mutate pods with specific security features before
instantiation.
o Migrating the core Phenix pipeline libraries (Scala) from Spark 2.2.1/Kafka 0.8 to
Spark 2.4.5/Kafka 2.4, including all the breaking changes of managing consumer offsets in Kafka
instead of ZooKeeper.
▪ Leading the migration of more than 70 Spark streaming pipelines (data normalizers and persisters) from
Mesos/Marathon to GKE.
Technologies: Spark, Scala, sbt/mvn, Golang, Spring Boot (2.x), Java 11/17, Kafka, AKHQ, JWT, Jenkins, Nexus,
Artifact Registry, Cloud Composer, Python, Pants (monorepo for Airflow DAGs), Dataproc, Avro.
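As an illustration of the custom Connect transformers mentioned above, here is a minimal sketch of a Kafka Connect single message transform (SMT) in Scala that stamps a header on every record. The class name AddPipelineHeader and the pipeline.name property are hypothetical stand-ins, not the actual Carrefour code.

```scala
import java.util

import org.apache.kafka.common.config.ConfigDef
import org.apache.kafka.connect.connector.ConnectRecord
import org.apache.kafka.connect.transforms.Transformation

// Hypothetical SMT: tags each record with the name of the pipeline that
// produced it, e.g. so a sink connector can trace a record's origin.
class AddPipelineHeader[R <: ConnectRecord[R]] extends Transformation[R] {
  private var pipelineName: String = _

  override def configure(props: util.Map[String, _]): Unit =
    pipelineName = String.valueOf(props.get("pipeline.name"))

  override def apply(record: R): R = {
    // Record headers are mutable; add ours and pass the record through.
    record.headers().addString("pipeline", pipelineName)
    record
  }

  override def config(): ConfigDef =
    new ConfigDef().define(
      "pipeline.name", ConfigDef.Type.STRING,
      ConfigDef.Importance.HIGH, "Value stamped into the 'pipeline' header")

  override def close(): Unit = ()
}
```

A transform like this would be wired into a connector with transforms=addHeader and transforms.addHeader.type=<fully qualified class name> in the connector configuration.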
Kubernetes architect:
▪ Design and industrialization of using, securing and exposing GKE (GCP-managed Kubernetes) in the
data platform:
o Using Kustomize to deploy and maintain the cluster state.
o Setting up RBAC and enabling Workload Identity.
o Exposing services with NGINX ingress controllers, and using proxies and load balancers to wire networks.
o Defining a functional namespace-splitting strategy.
▪ Design and industrialization of the CI/CD pipeline with Jenkins (with the k8s plugin enabled) to deploy to
different environments.
▪ Migrating workloads and data pipelines from Mesos/Marathon to GKE.
▪ Deploying and maintaining Kubernetes webhooks and operators: Strimzi, Spark, Prometheus, ingress and OPA operators.
▪ Implementing NGINX ingress controllers to expose services and securing the communication between
internal/external services and GKE deployments.
▪ Implementing a monitoring stack with Prometheus + Grafana + Alertmanager.
Technologies: GKE, Kustomize, Kubernetes security, Jenkins, GCP networking, monitoring.

Migration from the IBM datacenter to GCP:
▪ Contributing to defining the migration strategy of the data platform to GCP.
▪ Contributing to defining and securing the network connections between the legacy IaaS datacenters and the
private VPC hosting all data backends, APIs, etc.
▪ Setting up a one-way Kerberos trust between the legacy datacenters and GCP to back up data with DistCp.
▪ Enabling and industrializing the use of GCP services.
▪ Defining methods and architecture to read and write BigQuery data from Spark (a sketch follows the technology list below).
▪ Defining a security and best-practices framework for GCP services and for running applications.
▪ Full automation with Ansible, Terraform and Google Deployment Manager.
Technologies: Cloudera, security, GCP KMS, Ansible, Deployment Manager, Terraform, GCP, BigQuery, MIT
Kerberos, DNS, HAProxy, LB4, SSL, etc.
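As a hedged illustration of the BigQuery read/write methods mentioned above, here is a minimal Spark/Scala sketch using Google's open-source spark-bigquery connector. The project, dataset, table and bucket names are placeholders, and the connector jar plus a suitably entitled service account are assumed.

```scala
import org.apache.spark.sql.SparkSession

object BigQueryRoundTrip {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("bq-roundtrip").getOrCreate()

    // Read a table through the connector (BigQuery Storage API under the hood).
    val sales = spark.read
      .format("bigquery")
      .load("my-project.sales.daily_sales") // placeholder table

    // Write an aggregate back, staging through a GCS bucket (indirect mode).
    sales.groupBy("store_id").count()
      .write
      .format("bigquery")
      .option("temporaryGcsBucket", "my-staging-bucket") // placeholder bucket
      .mode("append")
      .save("my-project.sales.store_counts")
  }
}
```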
Security referent:
▪ Reshaping authentication and authorization on the Carrefour data platform by implementing an
OpenLDAP cluster with saslauthd enabled to proxy authentication to the group LDAP; groups are
defined locally on the OpenLDAP.
▪ Installing and securing Cloudera clusters, leveraging the LDAP as the main entry point for authentication
and authorization.
▪ Proposing and implementing new ways for clients outside the cluster to access HDFS/Hive
without a Kerberos ticket: deploying and enabling the Knox parcel on the cluster (instead of HttpFS,
which requires Kerberos) and configuring extra Hive servers with LDAP authentication, all while
preserving user impersonation (a client sketch follows the technology list below).
▪ Extending a Python client library for Cloudera Manager to implement the REST calls required to
install and configure the Knox parcel.
▪ Providing support and expertise to all data teams and their clients.
▪ Full automation of all deployments with Ansible, orchestrated through Rundeck.

Technologies: LDAP, saslauthd, Knox, Python, MIT Kerberos.
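As an illustration of Kerberos-free access through Knox, here is a minimal Scala sketch that lists an HDFS directory via the gateway's WebHDFS API with LDAP basic auth, using the JDK 11 HTTP client. The host, topology, path and credentials are placeholders.

```scala
import java.net.URI
import java.net.http.{HttpClient, HttpRequest, HttpResponse}
import java.util.Base64

object KnoxWebHdfsList {
  def main(args: Array[String]): Unit = {
    // Basic credentials are validated by Knox against the LDAP backend,
    // so the caller never needs a Kerberos ticket.
    val creds = Base64.getEncoder.encodeToString("alice:secret".getBytes("UTF-8"))

    val request = HttpRequest.newBuilder()
      .uri(URI.create(
        "https://knox.example.com:8443/gateway/default/webhdfs/v1/data/raw?op=LISTSTATUS"))
      .header("Authorization", s"Basic $creds")
      .GET()
      .build()

    val response = HttpClient.newHttpClient()
      .send(request, HttpResponse.BodyHandlers.ofString())
    println(response.body()) // JSON FileStatuses payload returned by WebHDFS
  }
}
```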

[October-November 2018] Hadoop Expert: Douane + CNAM

▪ Auditing Spark jobs implemented by the data scientists and speeding up their execution by a factor of 10
(an illustrative fix is sketched below).
▪ Delivering more than 15 guidelines on how to fine-tune the cluster.
▪ Auditing the CNAM Hortonworks clusters and fixing several blocking security issues, mainly related to
Kerberos, Ranger and Knox.
▪ Proving the feasibility of a multihomed Hadoop cluster (hosts with several network
interfaces) with Kerberos enabled, routing the Kerberos traffic through the exposed network interface to
communicate with the Active Directory.
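As a hedged illustration of the kind of fix behind such a speed-up (not the actual audited code), the sketch below turns a shuffle-heavy fact-to-dimension join into a broadcast join and right-sizes shuffle parallelism; the paths and table names are invented.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.broadcast

object AuditedJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("audited-job").getOrCreate()

    // A common audit finding: the default 200 shuffle partitions fit few
    // clusters; size to roughly 2-3x the total executor cores.
    spark.conf.set("spark.sql.shuffle.partitions", "400")

    val events    = spark.read.parquet("/data/events")    // large fact table
    val referents = spark.read.parquet("/data/referents") // small dimension

    // Broadcasting the small side removes the shuffle of the large one.
    events.join(broadcast(referents), Seq("referent_id"))
      .write.mode("overwrite").parquet("/data/enriched_events")
  }
}
```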

[October 2016-August 2017 and September 2017-Today] Hortonworks Professional Services (PS):

Technical leader / solution architect at Société Générale: technical development leader of a regulatory
project (Mesh Contrat) addressing the IFRS 9.2 regulatory requirements with Big Data technologies:
▪ Hortonworks consultant.
▪ Defining the software stack for the project.
▪ Contributing to and leading the developments: Mesh Contrat relies on many technical components: Oozie, Spark (Scala), Spark Streaming, Kafka, Teradata, Sqoop, Elasticsearch and Kibana.
▪ Implementing the continuous delivery/integration process for the project with Nexus, Jenkins and Ansible.
▪ Successful production deployment of the project.

Hortonworks solution architect at Société Générale:
Hadoop (Hortonworks):
▪ Hadoop security expert: designing and implementing secure solutions for security requirements.
▪ Installing and configuring a new secured development/integration cluster for projects, with Ranger and Kerberos enabled.
▪ Synchronizing Ranger with LDAP and configuring SSSD for LDAP authentication.
▪ Full automation of the installation and configuration of the cluster's components/products with Ansible.
▪ Configuring a backup cluster and providing solutions for disaster-recovery strategies.
▪ Configuring and running MirrorMaker to back up streaming data in secured environments (Kafka ACLs, SSL and Kerberos); a secured-producer sketch follows this section.
▪ Defining and implementing the migration strategy from Kafka ACLs to Ranger policies, and from self-signed certificates to CA-signed certificates for the Kafka SSL listener.
▪ Enabling wire encryption and managing SSL certificates on the major Hadoop components.
▪ Installing and configuring Hue on an HA, Kerberized cluster, synchronized with LDAP.
▪ Installing and configuring Knox to connect reporting tools such as Tableau to Hive.
▪ Setting up Prometheus for monitoring and alerting on the most critical components: LDAP, filesystem size, etc.
Talend:
▪ Defining and implementing Talend across all Société Générale environments.
▪ Connecting the different TAC instances to the Active Directory group and securing the communication with SSL.
▪ Implementing Ansible playbooks to install TAC and job servers.
▪ Defining and implementing the logging strategy for Talend projects that use Kafka (SASL).
▪ Defining best practices and security strategies to isolate job servers per project with cgroups and to authenticate each job server with Kerberos.
▪ Installing and configuring Talend Data Quality in a Kerberized environment: integration with Kafka for the data dictionary service and with HDFS to import/export data.
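To make the secured Kafka setups above concrete, here is a minimal sketch of a producer talking to a Kerberized, TLS-enabled broker (SASL_SSL with GSSAPI), the client-side counterpart of the MirrorMaker and Talend configurations described in this section. The broker address, truststore path and topic are placeholders, and the JAAS configuration is assumed to be supplied via -Djava.security.auth.login.config.

```scala
import java.util.Properties

import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}

object SecuredProducer {
  def main(args: Array[String]): Unit = {
    val props = new Properties()
    props.put("bootstrap.servers", "broker1.example.com:9093")
    // Kerberos authentication over TLS-encrypted connections.
    props.put("security.protocol", "SASL_SSL")
    props.put("sasl.mechanism", "GSSAPI")
    props.put("sasl.kerberos.service.name", "kafka")
    props.put("ssl.truststore.location", "/etc/security/kafka.truststore.jks")
    props.put("ssl.truststore.password", "changeit")
    props.put("key.serializer",
      "org.apache.kafka.common.serialization.StringSerializer")
    props.put("value.serializer",
      "org.apache.kafka.common.serialization.StringSerializer")

    val producer = new KafkaProducer[String, String](props)
    producer.send(new ProducerRecord("audit-events", "key", "value"))
    producer.close()
  }
}
```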

[September 2015-Today] Hadoop administration trainer (HDP Administrator certified)

▪ HDP administration trainer at Ysance: administration, security, etc.
