CV/Mission Apache HBase freelance


Experience overview of Rafiqul,
freelance APACHE HBASE consultant based in Paris (75)

PROFESSIONAL EXPERIENCE

Research Engineer January 2016 - June 2016
• Organization: Telecom St-Etienne
St-Etienne, France.
• Duties:
– Collect RDF (graph) data from DBpedia, NELL Datastore, and Yago Datastore.
– Clean and enrich data.
– Develop a reasoning solution over the dataset to discover hidden relations between RDF
triples and interlink the newly discovered relations automatically.
– Produce technical reports.
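The reasoning duty above can be sketched with one classic inference rule, transitivity: if a predicate is transitive, the triples (a, p, b) and (b, p, c) imply the hidden triple (a, p, c). The data, predicate name, and choice of rule below are illustrative assumptions, not the original system:

```python
# Minimal sketch of transitive inference over RDF triples (illustrative data).

def infer_transitive(triples, transitive_predicates):
    """Apply the transitivity rule to a fixed point: no new triples appear."""
    inferred = set(triples)
    changed = True
    while changed:
        changed = False
        for (s1, p1, o1) in list(inferred):
            if p1 not in transitive_predicates:
                continue
            for (s2, p2, o2) in list(inferred):
                # (s1, p, o1) and (o1, p, o2) imply the new triple (s1, p, o2)
                if p2 == p1 and s2 == o1 and (s1, p1, o2) not in inferred:
                    inferred.add((s1, p1, o2))
                    changed = True
    return inferred

triples = {
    ("dbpedia:Lyon", "dbo:partOf", "dbpedia:Rhone"),
    ("dbpedia:Rhone", "dbo:partOf", "dbpedia:France"),
}
result = infer_transitive(triples, {"dbo:partOf"})
# the hidden relation (Lyon, partOf, France) is discovered and interlinked
assert ("dbpedia:Lyon", "dbo:partOf", "dbpedia:France") in result
```

A production reasoner would index triples by subject and predicate rather than scan the whole set, but the fixed-point loop is the same idea.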

Research Engineer 2015 - 2016
• Organization: Laboratoire d’InfoRmatique en Image et Systèmes
Lyon, France.
• Duties:
– Build a Hadoop cluster for storing large-scale streaming data from different sources.
– Integrate MongoDB, Apache HBase, and Apache Cassandra into the Hadoop ecosystem.
– Collect streaming data from the Lyon Smart City and Twitter using Apache Kafka.
– Integrate Apache Flume for collecting data from logistics business processes.
– Integrate Apache Storm into the Hadoop ecosystem for real-time data processing.
– Integrate Apache Spark into the Hadoop ecosystem for batch-style data processing.
– Design and lead the development of real-time sentiment analytics and route optimization
analytics.
– Design and lead the development of a solution for clustering text in real time.
– Design and lead the development of a real-time recommendation system.
– Conduct experiments with graph pattern mining algorithms, including gSpan and FFSM.
– Produce technical reports.
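Streaming time-ordered events (e.g. from Kafka) into Apache HBase commonly calls for a salted row key, so that monotonically increasing timestamps do not hotspot a single region. The sketch below shows the idea under assumed bucket count and key layout; it is not the actual schema of the cluster described above:

```python
# Hedged sketch of a salted HBase row-key design for streaming time-series data.
import hashlib

NUM_BUCKETS = 8  # assumed number of salt buckets (region prefixes)

def salted_row_key(sensor_id: str, timestamp_ms: int) -> bytes:
    # Deterministic salt: hash of the logical key modulo the bucket count,
    # so readers can reconstruct which bucket prefix a sensor lives under.
    salt = int(hashlib.md5(sensor_id.encode()).hexdigest(), 16) % NUM_BUCKETS
    # Zero-padded fields keep keys lexicographically ordered within a bucket.
    return f"{salt:02d}|{sensor_id}|{timestamp_ms:013d}".encode()

key = salted_row_key("lyon-traffic-042", 1449568496000)
# keys for the same sensor share a prefix, so its scans stay contiguous
assert key.split(b"|")[1] == b"lyon-traffic-042"
```

Reads for one sensor scan a single bucket; full scans fan out over all NUM_BUCKETS prefixes in parallel, which is the usual trade-off of this design.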

Software Experimental Track Manager 2013-2015
• Organization: Laboratoire d’InfoRmatique en Image et Systèmes
Lyon, France.
• Duties:
– Design and implement a scalable Hadoop infrastructure on Grid5000.
– Conduct experiments with distributed file systems, including the Hadoop Distributed File
System and Lustre.
– Conduct experiments with MapReduce-based complex query processing engines, including
SHARD, RDF3X, H2RDF, Virtuoso, RDFPig, and Jena TDB.
– Design and lead the development of an n-tier scalable framework for collecting, cleaning,
distributing, and querying Big Linked Data.
– Test and debug the framework.
– Produce technical reports.
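How a MapReduce-based engine evaluates a single SPARQL triple pattern can be sketched in miniature: the map phase filters triples against the pattern, the reduce phase groups the bindings. The split below is illustrative only and is not taken from any of the engines listed above:

```python
# Illustrative map/reduce evaluation of the triple pattern (?s, "rdf:type", ?o).
from collections import defaultdict

def map_phase(triples, predicate):
    # Emit (object, subject) for every triple matching the fixed predicate.
    for s, p, o in triples:
        if p == predicate:
            yield (o, s)

def reduce_phase(pairs):
    # Group all subjects under each object key (the shuffle+reduce step).
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return dict(groups)

triples = [
    ("ex:alice", "rdf:type", "ex:Person"),
    ("ex:lyon", "rdf:type", "ex:City"),
    ("ex:bob", "rdf:type", "ex:Person"),
    ("ex:alice", "ex:livesIn", "ex:lyon"),
]
result = reduce_phase(map_phase(triples, "rdf:type"))
assert result["ex:Person"] == ["ex:alice", "ex:bob"]
```

Joining several triple patterns chains further map/reduce rounds on the shared variables, which is where engines like H2RDF or SHARD differ in their optimizations.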

Business Process Architect 2012-2013
• Organization: Governance, Risk, and Technology Research Center
Cork, Ireland.
• Responsibilities:
– Design a framework for smart financial service-based applications.
– Develop and test a prototype of the framework.
– Design, verify and validate financial service reference process models.
– Encode financial regulations using Semantic Web technologies such as OWL (Web
Ontology Language).
– Produce technical reports.

Research Associate 2009-2010
• Organization: European Research Institute of Service Science (ERISS)
Tilburg, the Netherlands.
• Responsibilities:
– Design a declarative language for modeling business transactions.
– Develop a service-based solution for designing and running flexible business transactions of
complex purchase-order business processes.
– Produce technical reports.

Developer 2007-2009
• Organization: SAP AG, Germany.
• Responsibilities:
– Develop Web Services for the logistics system.
– Design end-to-end logistics management processes for a small and medium-sized enterprise.
– Conduct feasibility studies on industry-standard technologies such as ebXML.
– Develop a website for the Sekhukhune Living Lab, South Africa.
– Produce technical reports.

Head of the Computer Lab 2004-2006
• Responsibilities:
– Build the local area network (LAN) of the computer lab.
– Manage and administer the computer lab network.

PROTOTYPES DEVELOPED
• CedTMart: A framework for efficient processing of complex, distributed, and parallel queries
on massive-scale RDF graph data.
• CedCOM: A Cache Only Memory Architecture for Big Data Applications.
• GAIA: A generic large-scale RDF data generator.
• Xadoop: A multi-master Hadoop for building highly fault-tolerant clusters.
• FraTAct: A framework for executing large-scale financial processes while checking compliance
with financial regulations.
• RODL: An agile methodology for encoding financial regulations into machine-readable code.
• Reference models for designing financial service business processes.
• PAEAN4CLOUD: A framework for monitoring and managing SLA violations of Cloud
service-based applications.
• PAEAN: A risk-mitigation framework for business transactions at runtime.
