BIGQUERY CVs: Select the best profiles for free


The latest BIGQUERY profiles online

CV Senior Data Analyst - Data Engineer
Martin
  • BORDEAUX
BIGQUERY SQL Google Cloud Platform BIG DATA PYTHON
Available
CV Data Science & Machine Learning Engineer
Hamid
  • CHESSY
C# TRANSACT SQL PL SQL POSTGRESQL APACHE SPARK PYSPARK SQL SERVER DATA ORACLE PYTHON
Available
CV JAVA Architect
Alain Narcisse
  • Antony
JAVA UNIX UML J2EE JBOSS SYBASE INFORMIX HIBERNATE SPRING IBM WEBSPHERE SERVER
Available
CV APACHE SPARK Architect
Mohamed Oussama
  • MONTROUGE
APACHE SPARK PYTHON APACHE KAFKA APACHE HADOOP SCALA MONGODB APACHE HBASE CASSANDRA BIG DATA Data lake
Available
CV Data Solution Architect
Naoufal
  • Meudon
Cloud AWS AZURE Google Cloud Platform MDM BI Cloudera TOGAF BIG DATA SAFE ARCHIMATE
Available
CV Data Scientist Google Cloud Platform
Omar
  • COURBEVOIE
PYTHON SQL DATA Google Cloud Platform Cloud AWS
CV Data Scientist & Embedded Systems Engineer
Abdelkarim
  • BRÉTIGNY-SUR-ORGE
PYTHON LINUX UBUNTU EMBEDDED SYSTEMS AGILE AZURE MONGODB NoSQL C PHP Data science
Available
CV Data Scientist / Engineer & Flutter / Python Developer
Jaber
  • PARIS
PYTHON Dataiku SQL DATA FLUTTER JAVASCRIPT NoSQL R
Available
CV Data Engineer / Analyst
Ayoub
  • Maisons-Laffitte
DATA PYTHON SQL Microsoft Power BI Google Cloud Platform APACHE SPARK
Available
CV Data Analyst SQL
Fatma
  • AULNAY-SOUS-BOIS
SQL BIGQUERY Microsoft Power BI JIRA AGILE
Available

Example of positions held by Berramou, a BIGQUERY freelancer based in Isère (38)

  • Full Stack Data - DPD FRANCE
    2022 - present

    • On-prem Postgres data warehouse: Write comprehensive specifications for the IT department, enabling the construction of a PostgreSQL data warehouse that consolidates data from the agencies' local servers.
    • Datamarts for delivery activity reporting: Build robust datamarts with advanced SQL queries and dbt to monitor and analyze delivery activity within the organization.
    • Data migration to GCP: Build a secure, GDPR-compliant Google Cloud infrastructure (Terraform, CI/CD). Transfer data to Cloud Storage in Parquet format, then transform and ingest it into BigQuery using dbt for transformation and Airflow / Cloud Composer for orchestration (a minimal sketch of this load follows below). Ensure GDPR compliance through purge and anonymization procedures.
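
    A minimal Airflow / Cloud Composer sketch of the Parquet-to-BigQuery load and dbt run described above; the bucket, dataset, table and dbt selector names are illustrative placeholders, not values from the actual project.

    # Sketch: daily DAG that loads Parquet exports from Cloud Storage into a raw
    # BigQuery table, then runs the dbt models that build the reporting datamarts.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator
    from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

    with DAG(
        dag_id="load_parquet_to_bigquery",   # hypothetical DAG name
        start_date=datetime(2023, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        # Load the day's Parquet files into a raw BigQuery table.
        load_raw = GCSToBigQueryOperator(
            task_id="gcs_parquet_to_bq",
            bucket="example-landing-bucket",                   # placeholder bucket
            source_objects=["deliveries/{{ ds }}/*.parquet"],  # placeholder prefix
            destination_project_dataset_table="example-project.raw.deliveries",
            source_format="PARQUET",
            write_disposition="WRITE_TRUNCATE",
        )

        # Rebuild the reporting datamarts on top of the raw table with dbt.
        run_dbt = BashOperator(
            task_id="dbt_run",
            bash_command="dbt run --project-dir /home/airflow/gcs/dags/dbt --select delivery_marts",
        )

        load_raw >> run_dbt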

  • Full Stack Data - Madiaperformance
    2021 - 2023

    • Categorical analysis (Carrefour retail data): Automate categorical analysis to provide insights on a brand or product category from Carrefour France transactional data (big data), using Python and BigQuery (advanced SQL queries).
    • Data migration (on-prem to GCP): Migrate data from PostgreSQL to Google Cloud Storage, then ingest it into BigQuery in scheduled batches using Python and FastAPI, deployed on Cloud Run and managed with Cloud Scheduler (all compute resources provisioned with Terraform); see the sketch below.
    • Data retrieval and structuring: Automate Mixpanel data retrieval via the Mixpanel API to Google Cloud Storage, then ingest the data into BigQuery using Python and FastAPI, deployed on Cloud Run and scheduled with Cloud Scheduler.
    • Data quality / analysis: Data analysis and data quality checks using Python (pandas, data prep, the FuzzyWuzzy algorithm), BigQuery and Dataiku.
    • GCP: Set up and manage GCP, create an organization within it using Cloud Identity, and configure Azure Active Directory single sign-on (SSO) with the Google Cloud connector.
    • Cloud provider: Benchmark multiple cloud providers using TPC-DS to find the most suitable solution for the company (all compute resources provisioned with Terraform).
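
    A minimal sketch of the batch ingestion pattern used in the migration and Mixpanel bullets above: a FastAPI service deployed on Cloud Run, triggered by Cloud Scheduler, that loads exported Parquet files from Cloud Storage into BigQuery. The bucket, table and endpoint names are illustrative placeholders.

    # Sketch: Cloud Scheduler POSTs to /ingest; the handler loads Parquet exports
    # from Cloud Storage into a BigQuery table and waits for the job to finish.
    from fastapi import FastAPI
    from google.cloud import bigquery

    app = FastAPI()
    bq_client = bigquery.Client()

    @app.post("/ingest")
    def ingest() -> dict:
        job_config = bigquery.LoadJobConfig(
            source_format=bigquery.SourceFormat.PARQUET,
            write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
        )
        # Load every exported Parquet file under the prefix into the target table.
        load_job = bq_client.load_table_from_uri(
            "gs://example-export-bucket/postgres/*.parquet",  # placeholder URI
            "example-project.raw.postgres_export",            # placeholder table
            job_config=job_config,
        )
        load_job.result()  # block until the load job completes
        return {"rows_loaded": load_job.output_rows}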

  • Data Scientist / Data Analyst

    Prisma Media (PMS)
    2019 - 2021

    • Scoring: Predict the gender and age of visitors to Prisma Media's websites from their browsing behaviour and the CRM database. Developed in Python with scikit-learn decision tree models; a minimal sketch follows below.
    • Segment manager: An interactive segment catalogue. Developed with R Shiny and Python, deployed on Google Compute Engine (GCE) with Cloud Build for CI/CD.
    • Revenue dashboard: Detailed reporting of revenues generated from advertising and segment (cookie) data. Developed with R Shiny, Python and the Google Ad Manager API, deployed on GCE with Cloud Build for CI/CD.
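
    A minimal sketch of the scoring approach above (a scikit-learn decision tree predicting a visitor attribute from browsing features); the feature names and toy data are illustrative, not taken from the actual CRM.

    # Sketch: train a decision tree to score visitors' gender from browsing features.
    import pandas as pd
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    # Toy features: page views per content section, with gender labels from the CRM.
    visits = pd.DataFrame({
        "news_pages":    [12, 0, 3, 8, 1, 15],
        "sport_pages":   [0, 9, 1, 2, 7, 0],
        "cooking_pages": [2, 1, 10, 0, 3, 1],
        "gender":        ["F", "M", "F", "F", "M", "F"],
    })

    X = visits.drop(columns="gender")
    y = visits["gender"]
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.33, random_state=0)

    # Shallow tree keeps the scoring model interpretable.
    model = DecisionTreeClassifier(max_depth=3, random_state=0)
    model.fit(X_train, y_train)
    print(accuracy_score(y_test, model.predict(X_test)))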

  • Data Scientist / Data Analyst

    Internship (5 months), John Deere
    2019 - present

View this freelancer's full profile