
Sample experience of Jugurtha,
a freelance PySpark developer based in the Oise department (60)

PROFESSIONAL EXPERIENCE

BNP Paribas Asset Management 02/2023 – present
Data Engineer (Python, PySpark & IBM Cloud)

Assignment
Goals of the mission and achievements:
Migration of the entire system to the cloud and rewriting of the existing R and Python code in PySpark

Python, Spark, Cloud:
• Rewriting of legacy code in PySpark and creation of a secured data pipeline from scratch to receive data from data providers in the cloud landing zone
• Implementation of PySpark jobs for KPI calculations (see the sketch after this list)
• Support for data teams in the integration and consumption of data
• Enrichment of data platform mechanisms to standardize usage
• Improvement of automation and of the framework's ingestion process
• Acceleration of the migration from the legacy DWH to the cloud
• Quantification of gains and cost optimization
• Organization of a data catalog: business domains, terminology, tags, data quality/lifecycle
• Security and fine-grained access to data
• Design of architectures for inter-application exchanges
• POCs of various innovative solutions
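
As an illustration of the KPI jobs mentioned above, here is a minimal PySpark sketch; the input path, column names, and KPI definitions are hypothetical placeholders, not the client's actual pipeline.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("kpi_job").getOrCreate()

# Read raw positions delivered to the landing zone (path and schema are illustrative)
positions = spark.read.parquet("s3a://landing-zone/positions/")

# Aggregate simple per-portfolio KPIs: total market value and distinct instrument count
kpis = (
    positions
    .groupBy("portfolio_id")
    .agg(
        F.sum("market_value").alias("total_market_value"),
        F.countDistinct("instrument_id").alias("instrument_count"),
    )
)

# Write the results for downstream consumers
kpis.write.mode("overwrite").parquet("s3a://curated/kpis/portfolio/")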

Team composition and technical environment:
• PySpark, Python, Pandas
• IBM Cloud (Code pipeline...)
• Agile method, sprints, Jira
• GitHub

Cora & LDIT 12/2021 – 02/2023
Data Engineer (Python, PySpark, Snowflake, AWS, Stambia ELT & Tableau)

Assignment
Goals of the mission and achievements:
• Development of Python/PySpark flows for the Louis Delhaize holding company, which operates in four countries
• Support in structuring Cora's loyalty activity
• Participation in building the GDPR compliance framework

Python, Spark, AWS:
• Creation of a secured data pipeline from scratch to receive data from the group's brands in the cloud landing zone while respecting GDPR rules
• Encryption and decryption of files by AWS Lambdas developed in Python, validated by the legal team and the Cora IT cloud architects (see the sketch after this list)
• Development of AWS Lambdas to ensure proper organization of data in the data lake
• Implementation of PySpark jobs for KPI calculations
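
To make the Lambda encryption step concrete, here is a minimal sketch of an S3-triggered handler that re-writes incoming files encrypted at rest with a customer-managed KMS key; the bucket names and key alias are hypothetical, and the actual mechanism validated by the legal team may have differed.

import boto3

s3 = boto3.client("s3")

DEST_BUCKET = "secure-landing-zone"     # illustrative bucket name
KMS_KEY_ID = "alias/landing-zone-key"   # illustrative KMS key alias

def handler(event, context):
    # Triggered by an S3 put event on the inbound bucket
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        # Re-write the object with SSE-KMS so it is encrypted at rest
        s3.put_object(
            Bucket=DEST_BUCKET,
            Key=key,
            Body=body,
            ServerSideEncryption="aws:kms",
            SSEKMSKeyId=KMS_KEY_ID,
        )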

Snowflake, Stambia & Tableau:
• Creation of SQL scripts for data validation
• Monitoring of SQL database operations on Snowflake
• Creation of data pipelines from S3 buckets to Snowflake tables (see the sketch after this list)
• Creation of Stambia mappings and processes for various purposes
• Creation of Tableau dashboards in line with charting guidelines
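
A minimal sketch of such an S3-to-Snowflake load, assuming an external stage has already been created; the connection parameters, stage, table, and file format are hypothetical placeholders.

import snowflake.connector

# Connection parameters are illustrative placeholders
conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="ETL_WH", database="LOYALTY", schema="RAW",
)

# COPY INTO pulls files from the S3-backed external stage into the target table
conn.cursor().execute("""
    COPY INTO RAW.SALES
    FROM @S3_LANDING_STAGE/sales/
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
""")
conn.close()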

Team composition and technical environment:
• PySpark, Python, Pandas
• AWS services (Lambda, EMR, EC2, ECR, CodeDeploy, CodeBuild, CodePipeline...)
• Stambia ELT
• Snowflake, DB2, Oracle
• Tableau Software
• Agile method, sprints, Jira

Bonduelle 04/2020 – 11/2021
Stambia ELT, database and AWS developer
Assignment
Goals of the mission:
Development of new Stambia flows to support the migration from an Oracle database to Redshift (AWS)

Actions:

• Analysis of needs and issues with the IT teams
• Creation of Stambia mappings and processes for the migration to Redshift (AWS)
• Unit tests
• Application maintenance (TMA): handling support tickets
• Verification of the daily production run, making technical and/or functional fixes
• Optimization of flows handling large volumes (ETL partitioning, restructuring of mappings, creation of indexes); see the sketch after this list
• Organization of meetings and a technical guild with the various members of the team
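
Stambia itself is configured in its designer rather than in code, but the large-volume optimizations above ultimately translate into choices on the Redshift side, where distribution and sort keys replace classic indexes. A minimal sketch, assuming a psycopg2 connection and hypothetical table, column, bucket, and role names:

import psycopg2

# Connection parameters are illustrative
conn = psycopg2.connect(host="my-cluster.redshift.amazonaws.com",
                        port=5439, dbname="dwh", user="etl", password="***")
cur = conn.cursor()

# DISTKEY/SORTKEY play the role that indexes play on Oracle
cur.execute("""
    CREATE TABLE IF NOT EXISTS sales_fact (
        sale_id   BIGINT,
        store_id  INT,
        sale_date DATE,
        amount    DECIMAL(12, 2)
    )
    DISTKEY (store_id)
    SORTKEY (sale_date)
""")

# Bulk-load one partition of the data directly from S3
cur.execute("""
    COPY sales_fact
    FROM 's3://landing-bucket/sales/2021/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy'
    FORMAT AS CSV IGNOREHEADER 1
""")
conn.commit()
conn.close()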

Results:
• Creation of test datasets
• Development of integration steps and integration tests
• Improvement of the mapping process for better system management

Team composition and technical environment:
Stambia ELT, Oracle 11g, Redshift, S3, Qlik Sense

Cardif – BNP Paribas 08/2018 – 03/2020
Informatica PowerCenter developer

Assignment
Goals of the mission:
The overall goal of the project is the harmonization of the data coming from the Cardif PMS so that it can be used by the accounting, finance and actuarial departments. The data must comply with new standards such as Solvency II and IFRS 17.

Achievements:

• Data quality management and improvement
• Calculation of indicators for the actuarial department
• Balance calculation
• Document writing, improvement of project specifications and functional analysis
• Development testing and scenario analysis
• Setup of ETL processes; development of mappings and workflows
• Development of mappings for KPI calculation and implementation of management rules
• Development of mappings and workflows to feed the fact tables

Results:
• Production of several Informatica mappings with SQL optimization under Informatica PowerCenter
• Significant functional changes to management rules, with an impact on processes
• Implementation of new test scenarios

Team composition :
• 8 Informatica ETL developers (including experts and a tech lead),
• 6 Qlik Sense and PL/SQL developers (including experts and a tech lead),
• 3 to 4 PL/SQL experts,
• 4 Java developers,
• 2 DBAs,
• 4 test developers,
• 3 to 4 tech leads (database, KSH, PL/SQL, DBWatcher, Control-M)

Technical/functional environment and methodologies:

• Informatica PowerCenter v10, Oracle 12c
• Languages: SQL, PL/SQL, Java
• Tools: Informatica, SQL Developer, Qlik Sense, PuTTY, WinSCP, ALM, PowerAMC
• Integration: Jenkins
• Database: Oracle
• System: UNIX
• Methodology: Agile & V-model
• Architecture: Cardif

CHANEL 2014 – 07/2017
Manufacturing and control technician
ASSIGNMENT:
To cope with the increase in production and the launch of new products, Chanel reinforced its teams:
• Production monitoring
• Management of sample production and verification of their conformity
• Participation in the optimization and improvement of the production process (production protocol and cleaning)
• Management of risk analysis and communication of anomalies
• Modification and improvement of production processes
• Improvement of control and monitoring processes
• Participation in work meetings

Technical environment: industrial production tools, dedicated software, SAP

CAMFIL 2011 – 08/2014
Production and quality control technician
Camfil produces industrial filters for the nuclear industry, hospitals, the pharmaceutical industry, etc.

ASSIGNMENT:
Support in monitoring the production and quality of filters for the nuclear sector:
• Monitoring of production quality
• Analysis of samples and verification of conformity
• Participation in the development of new products (pilots)
• Participation in the implementation of analysis protocols
• Optimization of production processes
• Improvement of production processes and quality control protocols

Technical environment: industrial production and quality control tools, software dedicated to production and quality monitoring
