Freelance PySpark CV / Missions


Sample missions of Jugurtha,
a freelance PySpark developer based in Oise (60)

PROFESSIONAL EXPERIENCE

BNP Paribas Asset Management 02/2023 to present
Data Engineer (Python, PySpark & IBM Cloud)

Assignment
Goals of the mission and achievements:
Migration of the entire system to the cloud and rewriting of legacy R and Python code in PySpark

Python, Spark, Cloud:
• Rewriting of legacy code in PySpark and creation of a secured data pipeline from scratch to receive provider data in the cloud landing zone
• Implementation of PySpark jobs for KPI calculations
• Support for data teams in the integration and consumption of data
• Enrichment of Data Platform mechanisms to standardize usage
• Improvement of the ingestion automation and framework
• Acceleration of the migration from the legacy DWH to the cloud
• Gain quantification and cost optimization
• Organization of a data catalog: business domains, terminology, tags, data quality and lifecycle
• Security and fine-grained access control on data
• Design of architectures for inter-application exchanges
• POCs of various innovative solutions
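The KPI jobs mentioned above typically boil down to grouped aggregations over ingested datasets. A minimal sketch, with hypothetical portfolio columns, shown here with Pandas (also part of the stack) so it runs without a Spark cluster; the PySpark version follows the same groupBy/agg shape.

```python
import pandas as pd

# Hypothetical positions file as delivered to the landing zone.
positions = pd.DataFrame({
    "portfolio": ["EQ-EU", "EQ-EU", "FI-US", "FI-US"],
    "market_value": [1_200_000.0, 800_000.0, 500_000.0, 1_500_000.0],
    "benchmark_weight": [0.55, 0.45, 0.30, 0.70],
})

# KPIs: total market value and average benchmark weight per portfolio.
kpis = (
    positions
    .groupby("portfolio", as_index=False)
    .agg(total_mv=("market_value", "sum"),
         avg_bench_weight=("benchmark_weight", "mean"))
)
```

In PySpark the last step would read `df.groupBy("portfolio").agg(F.sum("market_value"), F.avg("benchmark_weight"))`.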

Team composition and technical environment:
• PySpark, Python, Pandas
• IBM Cloud (Code pipeline...)
• Agile methodology, Jira sprints
• GitHub

Cora & LDIT 12/2021 to 02/2023
Data Engineer (Python, PySpark, Snowflake, AWS, Stambia ELT & Tableau)

Assignment
Goals of the mission and achievements:
• Development of a Python/PySpark flow for the Louis Delhaize holding company, active in four countries.
• Support in structuring the Cora loyalty activity.
• Participation in building GDPR compliance.

Python, Spark, AWS:
• Creation of a secured data pipeline from scratch to receive data from the brands in the cloud landing zone while respecting GDPR rules
• Encryption and decryption of files by AWS Lambda functions developed in Python (validated by the legal team and the Cora IT cloud architects)
• Development of Lambda functions on AWS to ensure proper organization of data in the data lake
• Implementation of PySpark jobs for KPI calculations
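A Lambda function of the kind described above is essentially an S3-event handler wiring a decrypt step between a landing bucket and the data lake. A minimal sketch of that routing logic, with hypothetical bucket and key names; the S3 accessors and the cipher are injected so the logic runs without AWS (in the real Lambda they would wrap boto3 and the encryption library approved by the legal team).

```python
import posixpath

def handle_s3_event(event, s3_get, s3_put, decrypt, target_bucket="datalake-clear"):
    """Decrypt every object referenced in an S3 event notification and
    store the cleartext in the target bucket.

    s3_get(bucket, key) -> bytes, s3_put(bucket, key, data) and
    decrypt(data) -> bytes are injected dependencies.
    """
    written = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        cleartext = decrypt(s3_get(bucket, key))
        # Keep the landing-zone layout under a "clear/" prefix.
        target_key = posixpath.join("clear", key)
        s3_put(target_bucket, target_key, cleartext)
        written.append((target_bucket, target_key))
    return written
```

In AWS the actual entry point `lambda_handler(event, context)` would simply call this function with boto3-backed accessors.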

Snowflake, Stambia & Tableau Software:
• Creation of SQL scripts for data validation
• Monitoring of SQL database operations on Snowflake
• Creation of data pipelines from S3 buckets to Snowflake tables
• Creation of Stambia mappings and processes for various purposes
• Creation of dashboards in Tableau Software following charting guidelines
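S3-to-Snowflake pipelines like those above commonly rely on an external stage plus a `COPY INTO` statement. A small helper sketching that statement, with hypothetical table, stage and file-format names; in production the SQL would be executed through the Snowflake connector or a Stambia process.

```python
def build_copy_into(table, stage, prefix, file_format="csv_fmt"):
    """Build a Snowflake COPY INTO statement loading the files under an
    S3 prefix (exposed as an external stage) into a target table."""
    return (
        f"COPY INTO {table} "
        f"FROM @{stage}/{prefix} "
        f"FILE_FORMAT = (FORMAT_NAME = '{file_format}') "
        f"ON_ERROR = 'ABORT_STATEMENT'"
    )

sql = build_copy_into("loyalty.transactions", "s3_landing", "cora/2023/01/")
```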

Team composition and technical environment:
• PySpark, Python, Pandas
• AWS services (Lambda, EMR, EC2, ECR, CodeDeploy, CodeBuild, CodePipeline...)
• Stambia ELT
• Snowflake, DB2, Oracle
• Tableau Software
• Agile methodology, Jira sprints

Bonduelle 04/2020 to 11/2021
Stambia ELT, database and AWS developer
Assignment
Goals of the mission:
Development of new Stambia flows to support the migration from an Oracle database to Amazon Redshift (AWS)

Actions:

• Analysis of needs and issues with the IT teams
• Creation of Stambia mappings and processes for the migration to Redshift (AWS)
• Unit testing
• Application maintenance (TMA) monitoring and ticket handling
• Verification of the daily production run, with technical and/or functional fixes
• Optimization of flows handling large volumes (ETL partitioning, mapping restructuring, index creation)
• Organization of meetings and a technical guild with the team members

Results:
• Creation of test sets
• Development of integration steps and integration tests
• Improvement of the mapping process for better system management

Team composition and technical environment:
Stambia ELT, Oracle 11g, Redshift, S3, Qlik Sense

Cardif – BNP Paribas 08/2018 to 03/2020
Informatica PowerCenter developer

Assignment
Goals of the mission:
The overall target of the project is the harmonization of the data coming from the Cardif PMS so that it can be used by the accounting, finance and actuarial departments. These data must comply with new standards such as Solvency II, IFRS 17, etc.

Achievements:

• Data quality management and improvement
• Indicator calculation for the actuarial department
• Balance calculation
• Documentation writing, project specification improvement and functional analysis
• Development testing and scenario analysis
• Setup of ETL processes, development of mappings and workflows
• Mapping development for KPI calculation and implementation of management rules
• Development of mappings and workflows feeding the fact tables

Results:
• Production of several Informatica mappings with SQL optimization under Informatica PowerCenter
• Significant functional changes to management rules with an impact on processes
• Implementation of new test scenarios

Team composition :
• 8 Informatica ETL developers (including experts and a tech lead),
• 6 Qlik Sense and PL/SQL developers (including experts and a tech lead),
• 3 to 4 PL/SQL experts,
• 4 Java developers,
• 2 DBAs,
• 4 test developers,
• 3 to 4 tech leads (database, KSH, PL/SQL, DBWatcher, Control-M)

Technical / functional environment / Methodologies:

• Informatica PowerCenter v10, Oracle 12c
• Languages: SQL, PL/SQL, Java
• Tools: Informatica, SQL Developer, Qlik Sense, PuTTY, WinSCP, ALM, PowerAMC
• Integration: Jenkins
• Database: Oracle
• System: UNIX
• Methodology: Agile & V-model
• Architecture: Cardif

CHANEL 2014 – 07/2017
Manufacturing and control technician
ASSIGNMENT:
To cope with the increase in production and the launch of new products, Chanel reinforced its teams.
• Production monitoring
• Management of sample production and verification of their conformity
• Participation in the optimization and improvement of the production process (production protocol and cleaning)
• Management of risk analysis and communication of anomalies
• Modification and improvement of production processes
• Improvement of control and monitoring processes
• Participation in work meetings

Technical environment: Industrial production tools, Dedicated software, SAP

CAMFIL 2011 – 08/2014
Production and quality control technician
Camfil is a company producing industrial filters for the nuclear industry, hospitals, pharmaceutical industry, etc.

ASSIGNMENT :
Support in monitoring the production and quality of filters for the nuclear activity.
• Monitoring of production quality
• Analysis of samples and verification of conformity
• Participation in the development of new products (pilots)
• Participation in the implementation of analysis protocols
• Optimization of production processes
• Improvement of production processes and quality control protocols

Technical environment: Industrial production and quality control tools, software dedicated to production and quality monitoring
