Ali - BIG DATA Architect

Ref : 171207G004
Professional experience
  • Professional experience

    [November 2018-2023] Cloud and Data Architect: Data platform at Carrefour (Phenix core team)
    As a member of the core and ops team of the Phenix project, I held the following roles:
    Data and Software Architect:
    ▪ Design and industrialization of the use of Cloud Composer (Airflow) to deliver batch workflows:
    o Migrating batch workflows from Azkaban to Composer
    o Implementing the CI pipeline with Jenkins/k8s/Pants and the CD pipeline with a custom Kubernetes operator to deploy DAGs.
    ▪ Design and industrialization of the use of Spring Boot to develop REST APIs:
    o Implementing the API security layer (JWT and LdapWs).
    o Implementing a spring-data-bigtable module from scratch.
    o Implementing a spring-data-bigquery component to write Avro data into BigQuery.
    o Implementing an Avro SerDe converter in Spring and a Maven plugin to download Avro schemas from a custom schema manager.
    o Industrializing the use of several spring-data backends.
    o Industrializing the use of Testcontainers for integration tests.
    o Mentoring all data teams (60 developers) through the migration from Scalatra to the Spring Boot framework and the use of DDD and hexagonal architecture.
    ▪ Kafka expert: administration, monitoring, security, …
    ▪ Implementation of a Kafka Bigtable sink connector and custom Connect transformers in Scala (see the sketch after the technologies list below).
    ▪ Implementing a streaming-pipeline-adaptor in Golang using the Sarama library, to migrate real-time data pipelines from a custom Avro serialization to the Confluent serialization, in order to use the Schema Registry and to standardize Avro messages (headers/key/value) across data pipelines.
    ▪ Leading a project to migrate from Kafka clusters running on IaaS to Strimzi-managed Kafka clusters.
    ▪ Industrialization of AKHQ to monitor Kafka topics, consumer groups, …
    ▪ Design and industrialization of the use of Spark on GKE:
    o Industrializing the use of the Spark operator at Carrefour
    o Contributing and committing to the Google spark-on-k8s-operator GitHub repository to run Spark jobs on Kubernetes, and industrializing the operator (in Golang):
    o PR#952: filter custom resources on specific labels to allow running multiple operator instances on GKE.
    o PR#935: expose container ports to Prometheus scraping.
    o PR#914: support ingress configuration in the CRD to expose the Spark UI in private networks.
    o Extending the operator webhook to mutate pods with specific security features before instantiation.
    o Migrating the core Phenix pipeline libraries, developed in Scala, from Spark 2.2.1/Kafka 0.8 to Spark 2.4.5/Kafka 2.4, with all the breaking changes of using Kafka instead of ZooKeeper to manage consumer offsets.
    ▪ Leading the migration of more than 70 Spark streaming pipelines (data normalizers and persisters) from Mesos/Marathon to GKE.
    Technologies: Spark, Scala, sbt/mvn, Golang, Spring Boot (2.x), Java 11/17, Kafka, AKHQ, JWT, Jenkins, Nexus, Artifact Registry, Cloud Composer, Python, Pants (monorepo for Airflow DAGs), Dataproc, Avro.
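
    A minimal sketch of a custom Kafka Connect single message transform (SMT) of the kind mentioned above, written in Scala against the standard Connect Transformation API; the class name and the topic.prefix option are illustrative, not the actual Carrefour code.

    ```scala
    import java.util.{Map => JMap}

    import org.apache.kafka.common.config.ConfigDef
    import org.apache.kafka.connect.connector.ConnectRecord
    import org.apache.kafka.connect.transforms.Transformation

    /** Illustrative SMT: prefixes the topic name before the record
      * reaches the sink (e.g. a Bigtable sink connector). */
    class TopicPrefixTransform[R <: ConnectRecord[R]] extends Transformation[R] {

      private var prefix: String = ""

      override def configure(configs: JMap[String, _]): Unit = {
        val v = configs.get("topic.prefix")
        prefix = if (v == null) "" else v.toString
      }

      // Rebuild the record with a rewritten topic; everything else passes through.
      override def apply(record: R): R =
        record.newRecord(
          prefix + record.topic(), record.kafkaPartition(),
          record.keySchema(), record.key(),
          record.valueSchema(), record.value(),
          record.timestamp())

      override def config(): ConfigDef =
        new ConfigDef().define("topic.prefix", ConfigDef.Type.STRING, "",
          ConfigDef.Importance.MEDIUM, "Prefix added to the topic name")

      override def close(): Unit = ()
    }
    ```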
    Kubernetes Architect:
    ▪ Design and industrialization of using, securing and exposing GKE (GCP-managed Kubernetes) in the data platform:
    o Use of Kustomize to deploy and maintain the cluster state
    o Setting up RBAC and enabling Workload Identity
    o Exposing services with NGINX ingress controllers; use of proxies and load balancers to wire networks
    o Defining a functional namespace-splitting strategy
    ▪ Design and industrialization of the CI/CD pipeline with Jenkins (with the k8s plugin enabled) to deploy to different environments.
    ▪ Migration of workloads and data pipelines from Mesos/Marathon to GKE.
    ▪ Deploying and maintaining K8s webhooks and operators: Strimzi, Spark, Prometheus, ingress and OPA operators.
    ▪ Implementing NGINX ingress controllers to expose services, and securing the communication between internal and external services and the deployments in GKE.
    ▪ Implementing a monitoring stack with Prometheus + Grafana + Alertmanager.
    Technologies: GKE, Kustomize, Kubernetes security, Jenkins, GCP networking, monitoring.

    Migration from the IBM datacenter to the GCP cloud:
    ▪ Contributing to defining the migration strategy of the data platform to GCP.
    ▪ Contributing to defining and securing the network connections between the legacy IaaS datacenters and the private VPC that hosts all data backends, APIs, …
    ▪ Setting up a one-way Kerberos trust between the legacy datacenters and GCP to back up data with distcp.
    ▪ Enabling and industrializing the use of GCP services.
    ▪ Defining methods and architecture to write and read data from BigQuery (see the sketch below).
    ▪ Defining a security and best-practices framework for GCP services and for running applications.
    ▪ Full automation with Ansible, Terraform and Google Deployment Manager.
    Technologies: Cloudera, security, GCP KMS, Ansible, Deployment Manager, Terraform, GCP, BigQuery, MIT Kerberos, DNS, HAProxy, LB4, SSL, …
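
    The BigQuery read/write methods can be illustrated with a minimal Spark sketch using Google's spark-bigquery-connector; the project, dataset, table and bucket names are placeholders, not actual Carrefour resources.

    ```scala
    import org.apache.spark.sql.SparkSession

    // Read a BigQuery table into a DataFrame, aggregate, and write back.
    object BigQueryIO {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("bigquery-io-sketch")
          .getOrCreate()

        // Read straight from BigQuery via the connector.
        val sales = spark.read
          .format("bigquery")
          .option("table", "my-project.retail.sales")
          .load()

        val counts = sales.groupBy("store_id").count()

        // The connector stages data in a temporary GCS bucket
        // before loading it into the target BigQuery table.
        counts.write
          .format("bigquery")
          .option("table", "my-project.retail.daily_counts")
          .option("temporaryGcsBucket", "my-staging-bucket")
          .mode("overwrite")
          .save()

        spark.stop()
      }
    }
    ```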
    Security referent:
    ▪ Reshaping the authentication and authorization methods on the Carrefour data platform by implementing an OpenLDAP cluster with saslauthd enabled to proxy authenticated users to the LDAP group; groups are defined locally on the OpenLDAP.
    ▪ Installing and securing Cloudera clusters by leveraging the LDAP as the main entry point for authentication and authorization.
    ▪ Proposing and implementing new methods to let clients outside the cluster access HDFS/Hive without a Kerberos token, by implementing and enabling the Knox parcel on the cluster instead of HttpFS (which requires Kerberos) and by configuring extra Hive servers with LDAP authentication, all while preserving user impersonation (see the sketch after the technologies line below).
    ▪ Extending a Python client library to communicate with Cloudera Manager and implementing the REST calls required to install and configure the Knox parcel.
    ▪ Providing support and expertise to all data teams and their clients.
    ▪ Full automation, with Ansible, of all kinds of deployments through Rundeck.

    Technologies: LDAP, saslauthd, Knox, Python, MIT Kerberos.
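
    A sketch of what this Knox-based access looks like from a client outside the cluster: listing an HDFS directory through Knox's WebHDFS gateway with plain LDAP credentials instead of a Kerberos token, using the Java 11+ HTTP client from Scala. The gateway host, topology name and path are placeholders.

    ```scala
    import java.net.URI
    import java.net.http.{HttpClient, HttpRequest, HttpResponse}
    import java.util.Base64

    object KnoxWebHdfsClient {
      def main(args: Array[String]): Unit = {
        val user = "alice"
        val password = sys.env("LDAP_PASSWORD")
        val auth = Base64.getEncoder
          .encodeToString(s"$user:$password".getBytes("UTF-8"))

        // Knox proxies the WebHDFS REST API under its gateway topology,
        // so basic auth against the LDAP replaces the Kerberos handshake.
        val uri = URI.create(
          "https://knox.example.com:8443/gateway/default/webhdfs/v1/data/raw?op=LISTSTATUS")

        val request = HttpRequest.newBuilder(uri)
          .header("Authorization", s"Basic $auth")
          .GET()
          .build()

        val response = HttpClient.newHttpClient()
          .send(request, HttpResponse.BodyHandlers.ofString())

        println(response.body()) // JSON FileStatuses listing
      }
    }
    ```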

    [October-November 2018] Hadoop Expert: Douane + CNAM

    ▪ Auditing the Spark jobs implemented by the data scientists and speeding up their execution by a factor of 10 (see the tuning sketch below).
    ▪ Delivering guidelines of more than 15 points on how to fine-tune the cluster.
    ▪ Auditing the CNAM Hortonworks clusters and fixing many blocking security points, mainly related to Kerberos, Ranger and Knox.
    ▪ Proving the feasibility of implementing a multihomed Hadoop cluster (hosts with several network interfaces) with Kerberos enabled: exposing the Kerberos traffic through the exposed network interface to communicate with the Active Directory.
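
    For illustration only, the kind of knobs such a tuning pass typically adjusts; the values are placeholders and the actual audit guidelines are not reproduced here.

    ```scala
    import org.apache.spark.sql.SparkSession

    // Illustrative Spark tuning settings of the sort reviewed in the audit.
    object TunedSession {
      val spark: SparkSession = SparkSession.builder()
        .appName("tuned-job")
        // Right-size the shuffle instead of the 200-partition default.
        .config("spark.sql.shuffle.partitions", "400")
        // Kryo is usually much faster than Java serialization.
        .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
        // Fewer, fatter executors reduce shuffle fan-out.
        .config("spark.executor.memory", "8g")
        .config("spark.executor.cores", "4")
        // Broadcast small dimension tables to avoid shuffles.
        .config("spark.sql.autoBroadcastJoinThreshold", (64 * 1024 * 1024).toString)
        .getOrCreate()
    }
    ```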

    [October 2016-August 2017] Technical leader/Solution Architect at Société Générale
    Technical development leader of a regulatory project -Mesh Contrat- to address the IFRS 9.2 requirements in terms of regulation, using Big Data technologies at Société Générale:
    ▪ Hortonworks consultant.
    ▪ Defining the software stack for the project.
    ▪ Contributing to and leading the developments: Mesh Contrat relies on many technical components: Oozie, Spark (Scala), Spark Streaming, Kafka, Teradata, Sqoop, Elasticsearch and Kibana.
    ▪ Implementing the continuous delivery/integration process for the project with Nexus, Jenkins and Ansible.
    ▪ Successful production deployment of the project.

    [September 2017-Today] Hortonworks Professional Services (PS): Hortonworks Solution Architect at Société Générale
    Hadoop (Hortonworks):
    ▪ Hadoop security expert: designing and implementing secured solutions for security requirements.
    ▪ Installation and configuration of a new secured development/integration cluster for projects, with Ranger and Kerberos enabled.
    ▪ Synchronizing Ranger with LDAPs, and configuring SSSD for LDAP authentication.
    ▪ Full automation of the installation and configuration of components/products for the cluster with Ansible.
    ▪ Configuring a backup cluster and providing solutions for disaster-recovery strategies.
    ▪ Configuring and running MirrorMaker to back up streaming data in secured environments (Kafka ACLs, SSL and Kerberos); see the client-configuration sketch below.
    ▪ Defining and implementing the migration strategy from Kafka ACLs to Ranger policies, and the migration from self-signed certificates to CA-signed certificates for the Kafka SSL listener.
    ▪ Enabling wire encryption and managing SSL certificates on the major Hadoop components.
    ▪ Installing and configuring Hue on an HA, kerberized cluster and synchronizing it with LDAP.
    ▪ Installing and configuring Knox to connect reporting tools such as Tableau to Hive.
    ▪ Setting up Prometheus for monitoring and alerting on the most critical components: LDAP, FS size, …
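
    A sketch of the client-side configuration such a secured setup implies: a Kafka consumer authenticating with Kerberos (GSSAPI) over SSL against CA-signed broker certificates. The hosts, keytab path and principal are placeholders.

    ```scala
    import java.util.Properties

    import org.apache.kafka.clients.consumer.KafkaConsumer

    object SecureConsumer {
      def main(args: Array[String]): Unit = {
        val props = new Properties()
        props.put("bootstrap.servers", "broker1.example.com:9093")
        props.put("group.id", "backup-mirror")
        props.put("key.deserializer",
          "org.apache.kafka.common.serialization.StringDeserializer")
        props.put("value.deserializer",
          "org.apache.kafka.common.serialization.StringDeserializer")

        // Kerberos over TLS.
        props.put("security.protocol", "SASL_SSL")
        props.put("sasl.mechanism", "GSSAPI")
        props.put("sasl.kerberos.service.name", "kafka")
        props.put("sasl.jaas.config",
          """com.sun.security.auth.module.Krb5LoginModule required
            |useKeyTab=true keyTab="/etc/security/keytabs/mirror.keytab"
            |principal="mirror@EXAMPLE.COM";""".stripMargin)

        // Trust the CA that signed the broker certificates.
        props.put("ssl.truststore.location", "/etc/kafka/ssl/truststore.jks")
        props.put("ssl.truststore.password", sys.env("TRUSTSTORE_PASSWORD"))

        val consumer = new KafkaConsumer[String, String](props)
        consumer.subscribe(java.util.List.of("transactions"))
        // ... poll loop elided
      }
    }
    ```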
    Talend:
    ▪ Defining and implementing Talend in all Société Générale environments.
    ▪ Connecting the different TAC instances to the Active Directory group and securing the communication with SSL.
    ▪ Implementing Ansible playbooks to install TAC and the jobservers.
    ▪ Defining and implementing the logging strategy for Talend projects that use Kafka (SASL).
    ▪ Defining best practices and security strategies to isolate jobservers with cgroups per project and to authenticate each jobserver with Kerberos.
    ▪ Configuration and installation of Talend Data Quality in a kerberized environment: integration with Kafka for the data-dictionary service and with HDFS to import/export data.

    [September 2015-Today] Hadoop administrator trainer - HDP Administrator certified

    ▪ HDPA trainer at Ysance: administration + security + preparation for the HDP administrator certification.
    ▪ Trainer at Canal+ on how to set up Cloudbreak on AWS.
    ▪ Trainer at Canal+ on the full HDP stack.
    ▪ Hortonworks Administrator certified (HDPCA).
    [July 2016-October 2016] Data Architect/Developer at SFR
    PoC design and implementation of a monitoring solution based on Big Data technologies for SFR:
    ▪ PoC design and implementation of a monitoring solution for the VoD platform of SFR.
    ▪ Specification, with the customer, of the KPIs to be monitored.
    ▪ Building from scratch and securing the monitoring platform on AWS. The platform relies on the following components: Logstash, S3, Elasticsearch, EMR, Spark, Kibana, NGINX, SSL.
    ▪ Implementing the Logstash configuration to parse and normalize the logs, and the Spark jobs to generate the business views and index them in Elasticsearch (see the sketch below).
    ▪ The PoC was mature enough to run for around 1 year on AWS without any issue, and triggered a new project at SFR to internalize it.
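
    A minimal sketch of such a Spark job: building a daily business view from the Logstash-normalized logs on S3 and indexing it with the elasticsearch-spark connector. The paths, field names and index are placeholders.

    ```scala
    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    object VodViewsIndexer {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("vod-views-indexer")
          .config("es.nodes", "es.example.com")
          .config("es.port", "9200")
          .getOrCreate()

        // Logstash-normalized logs stored on S3 as JSON lines.
        val logs = spark.read.json("s3a://vod-logs/normalized/2017/01/*")

        // Business view: play counts per asset per day.
        val views = logs
          .groupBy(col("asset_id"), to_date(col("@timestamp")).as("day"))
          .agg(count(lit(1)).as("plays"))

        // Index the view with the elasticsearch-spark connector.
        views.write
          .format("org.elasticsearch.spark.sql")
          .option("es.resource", "vod-views/daily") // index/type (ES 5.x era)
          .mode("append")
          .save()

        spark.stop()
      }
    }
    ```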
    [January 2016-July 2016] Data Architect/Developer at BNP - Solution architect:
    ▪ Definition and implementation of a new monitoring and alerting solution based on the HDP and InfluxData (Telegraf, Kapacitor, InfluxDB) stacks to monitor more than 500 VMs plus databases.
    ▪ Automation of the deployment of the monitoring stack with Puppet and SaltStack.
    ▪ Configuring all Telegraf agents to send telemetry data to Kafka, then implementing a Spark Streaming job to consume the data from Kafka, commit the offsets to ZooKeeper for rollback, and load the data into InfluxDB (see the sketch below).
    ▪ Automation of the deployment of HDP clusters on VMs using blueprints and SaltStack.
    ▪ PoC: configuration and installation of the HDFS Transparency connector to plug IBM GPFS into YARN, and execution of Spark jobs on GPFS.
    ▪ Definition of the indexing strategies on an 84-node Elasticsearch cluster.
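
    A sketch of that streaming job under stated assumptions: a direct Kafka stream, a placeholder writeToInfluxDb call standing in for the real InfluxDB client, and offsets persisted to ZooKeeper (via Curator) only after each batch is written, so the job can be rolled back to the last committed offset.

    ```scala
    import org.apache.curator.framework.CuratorFrameworkFactory
    import org.apache.curator.retry.ExponentialBackoffRetry
    import org.apache.kafka.common.serialization.StringDeserializer
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka010._
    import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe
    import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent

    object TelemetryLoader {
      def main(args: Array[String]): Unit = {
        val ssc = new StreamingContext(
          new SparkConf().setAppName("telemetry-loader"), Seconds(10))

        val kafkaParams = Map[String, Object](
          "bootstrap.servers" -> "broker1:9092",
          "key.deserializer" -> classOf[StringDeserializer],
          "value.deserializer" -> classOf[StringDeserializer],
          "group.id" -> "telemetry-loader",
          "enable.auto.commit" -> (false: java.lang.Boolean))

        // ZooKeeper client used to persist offsets on the driver.
        val zk = CuratorFrameworkFactory.newClient(
          "zk1:2181", new ExponentialBackoffRetry(1000, 3))
        zk.start()

        // Hypothetical InfluxDB line-protocol write, elided here.
        def writeToInfluxDb(line: String): Unit = ???

        val stream = KafkaUtils.createDirectStream[String, String](
          ssc, PreferConsistent,
          Subscribe[String, String](Seq("telegraf"), kafkaParams))

        stream.foreachRDD { rdd =>
          val offsets = rdd.asInstanceOf[HasOffsetRanges].offsetRanges
          rdd.foreachPartition(records =>
            records.foreach(r => writeToInfluxDb(r.value())))
          // Persist offsets only after the batch is fully written.
          offsets.foreach { o =>
            val path = s"/telemetry-loader/offsets/${o.topic}/${o.partition}"
            val data = o.untilOffset.toString.getBytes("UTF-8")
            if (zk.checkExists().forPath(path) == null)
              zk.create().creatingParentsIfNeeded().forPath(path, data)
            else zk.setData().forPath(path, data)
          }
        }

        ssc.start()
        ssc.awaitTermination()
      }
    }
    ```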

    [2012-2015] R&D engineer at Orange Labs

    ▪ PhD on recommendation systems and video-caching algorithms.
    ▪ Analysis of Orange data traffic to derive user behavior patterns and fine-tune the recommendation system.
    ▪ 4 publications at top computer-science conferences.
    ▪ Participation in 2 European projects: ecousin/ocean.

Education and training
  • Skills

    ▪System/Data architect
    ▪Hadoop security
    ▪Tech lead developer
    ▪Cloud and automation
    ▪Hortonworks Data Platform expert
    ▪Recommendation systems
    ▪Big Data Trainer

    SKILLS
    Programming languages: Java, Scala, Python, Talend
    System/stack: HDP / HDP Cloud, ELK/TICK
    Databases: MySQL

    LANGUAGE SKILLS
    French, English
