Freelance PySpark CV / missions


Employment summary of Jugurtha,
a freelance PySpark engineer based in the Oise (60), France

  • Data Engineer (Python, PySpark & IBM Cloud)

    BNP Paribas Asset Management
    Jan 2023 - present

    Assignment
    Goals of the mission and achievements:
    Migration of the whole system to the cloud and rewriting of legacy R and Python code in PySpark

    Python, Spark, Cloud:
    • Rewriting of legacy code in PySpark and creation of a secured data pipeline from scratch to receive data from data providers in the cloud landing zone
    • Implementation of PySpark jobs for KPI calculations
    • Support for data teams in the integration and consumption of data
    • Enrichment of data platform mechanisms to standardize usage
    • Improvement of automation and of the framework's ingestion process
    • Acceleration of the migration from the legacy DWH to the cloud
    • Quantification of gains and cost optimization
    • Organization of a data catalog: business domains, terminology, tags, data quality and lifecycle
    • Security and fine-grained access control on data
    • Design of architectures for inter-application exchanges
    • PoCs of various innovative solutions

    Team composition and technical environment:
    • PySpark, Python, pandas
    • IBM Cloud (Code pipeline...)
    • Agile methodology, sprints tracked in Jira
    • GitHub

  • Data Engineer (Python, PySpark, Snowflake, AWS, Stambia ELT & Tableau)

    Cora & LDIT
    Jan 2021 - Jan 2023

    Assignment
    Goals of the mission and achievements:
    • Development of Python/PySpark data flows for the Louis Delhaize holding company, active in four countries
    • Support in structuring the Cora loyalty activity
    • Participation in building GDPR compliance

    Python, Spark, AWS:
    • Creation of a secured data pipeline from scratch to receive data from the group's brands in the cloud landing zone while complying with GDPR rules
    • Encryption and decryption of files by AWS Lambda functions written in Python (validated by the legal team and Cora's IT cloud architects)
    • Development of AWS Lambda functions to ensure proper organization of data in the data lake
    • Implementation of PySpark jobs for KPI calculations
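
As an illustration of the data-lake organization step, a Lambda of this kind typically maps each landing-zone object key to a partitioned path. The naming convention and helper below are hypothetical, not the mission's actual code:

```python
import re
from datetime import datetime

def datalake_key(source_key: str) -> str:
    """Derive a partitioned data-lake path from an incoming landing-zone key.

    Hypothetical naming convention: <brand>_<YYYYMMDD>_<dataset>.csv
    """
    name = source_key.rsplit("/", 1)[-1]
    m = re.match(r"(?P<brand>[a-z]+)_(?P<date>\d{8})_(?P<dataset>\w+)\.csv$", name)
    if m is None:
        raise ValueError(f"unexpected file name: {name}")
    d = datetime.strptime(m["date"], "%Y%m%d").date()
    return (f"raw/{m['dataset']}/brand={m['brand']}/"
            f"year={d.year}/month={d.month:02d}/day={d.day:02d}/{name}")

def handler(event, context):
    """Sketch of the Lambda entry point for S3 notification events:
    the real function would copy each object to its partitioned
    location with boto3 (omitted here)."""
    return [datalake_key(rec["s3"]["object"]["key"])
            for rec in event["Records"]]
```

Partitioning by brand and date this way keeps downstream PySpark reads selective (partition pruning) and makes GDPR-scoped deletions per brand straightforward.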

    Snowflake, Stambia & Tableau:
    • Creation of SQL scripts for data validation
    • Monitoring of SQL database operations on Snowflake
    • Creation of data pipelines from S3 buckets to Snowflake tables
    • Creation of Stambia mappings and processes for various purposes
    • Creation of dashboards in Tableau following charting guidelines
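
The S3-to-Snowflake loads described above usually come down to a `COPY INTO` statement over an external stage. The stage, table, and file-format names below are illustrative assumptions:

```python
def copy_into_sql(table: str, stage: str, prefix: str, file_format: str) -> str:
    """Build a Snowflake COPY INTO statement loading S3-staged files into a table."""
    return (
        f"COPY INTO {table}\n"
        f"  FROM @{stage}/{prefix}\n"
        f"  FILE_FORMAT = (FORMAT_NAME = '{file_format}')\n"
        f"  ON_ERROR = 'ABORT_STATEMENT'"
    )

# Hypothetical object names; in a real pipeline the statement would be
# executed through snowflake-connector-python (cursor.execute(sql)).
sql = copy_into_sql("RAW.SALES", "S3_LANDING_STAGE", "sales/2022/01/", "CSV_GZ")
print(sql)
```

`ON_ERROR = 'ABORT_STATEMENT'` fails the whole load on the first bad record, which pairs naturally with separate SQL validation scripts run after each load.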

    Team composition and technical environment:
    • PySpark, Python, pandas
    • AWS services (Lambda, EMR, EC2, ECR, CodeDeploy, CodeBuild, CodePipeline...)
    • Stambia ELT
    • Snowflake, DB2, Oracle
    • Tableau
    • Agile methodology, sprints tracked in Jira

  • Bonduelle
    Jan 2020 - Jan 2021

    Assignment
    Goals of the mission:
    Development of new Stambia flows to support the migration from an Oracle database to Redshift (AWS)

    Actions:

    • Analysis of needs and issues with IT teams
    • Creation of Stambia mappings and processes for the migration to Redshift (AWS)
    • Unit tests
    • Third-party application maintenance (TMA): intervention on tickets
    • Verification of daily production RUN, making technical and/or functional modifications
    • Optimization of flows dealing with large volumes (ETL partitioning, restructuring of mappings, creation of indexes)
    • Organization of meetings and technical guild sessions with the various team members

    Results:
    • Creation of test data sets
    • Development of integration steps and integration tests
    • Improvement of the mapping process for better system management

    Team composition and technical environment:
    Stambia ELT, Oracle 11g, Redshift, S3, Qlik Sense

  • Informatica PowerCenter developer

    Cardif – BNP Paribas
    Jan 2018 - Jan 2020

    Assignment
    Goals of the mission:
    The overall goal of the project is the harmonization of the data coming from Cardif PMS so that it can be used by the accounting, finance, and actuarial departments. This data must comply with new standards such as Solvency II, IFRS 17, etc.

    Achievements:
    • Data quality management and improvement
    • Indicator calculations for the actuarial department
    • Balance calculations
    • Document writing, project specification improvement, and functional analysis
    • Development testing and scenario analysis
    • Setup of ETL processes, development of mappings and workflows
    • Mapping development for KPI calculations and implementation of management rules
    • Development of mappings and workflows feeding the fact tables

    Results:
    • Production of several Informatica mappings with SQL optimization under Informatica PowerCenter
    • Significant functional changes to management rules with an impact on processes
    • Implementation of new test scenarios

    Team composition:
    • 8 Informatica ETL developers (including experts and a tech lead)
    • 6 Qlik Sense and PL/SQL developers (including experts and a tech lead)
    • 3 to 4 PL/SQL experts
    • 4 Java developers
    • 2 DBAs
    • 4 test developers
    • 3 to 4 tech leads (database, ksh, PL/SQL, DBWatcher, Control-M)

    Technical/functional environment and methodologies:

    • Informatica PowerCenter v10, Oracle 12c
    • Languages: SQL, PL/SQL, Java
    • Tools: Informatica, SQL Developer, Qlik Sense, PuTTY, WinSCP, ALM, PowerAMC
    • Integration: Jenkins
    • Database: Oracle
    • System: UNIX
    • Methodology: Agile & V-model
    • Architecture: Cardif

  • Manufacturing and control technician

    CHANEL
    2014 - Jan 2018

    ASSIGNMENT:
    To cope with increased production and the launch of new products, Chanel reinforced its teams:
    • Production monitoring
    • Management of sample production and verification of their conformity
    • Participation in the optimization and improvement of the production process (production protocol and cleaning)
    • Management of risk analysis and communication of anomalies
    • Modification and improvement of production processes
    • Improvement of control and monitoring processes
    • Participation in work meetings

    Technical environment: industrial production tools, dedicated software, SAP

  • Production and quality control technician

    CAMFIL
    2011 - Jan 2014

    Camfil is a company producing industrial filters for the nuclear industry, hospitals, pharmaceutical industry, etc.

    ASSIGNMENT:
    Support in monitoring the production and quality of filters for the nuclear activity.
    • Monitoring of production quality
    • Analysis of samples and verification of conformity
    • Participation in the development of new products (pilots)
    • Participation in the implementation of analysis protocols
    • Optimization of production processes
    • Improvement of production processes and quality control protocols

    Technical environment: industrial production and quality control tools, software dedicated to production and quality monitoring