Philippe Rossignol
About
Philippe Rossignol is from Niort, Nouvelle-Aquitaine, France, and works in the insurance industry. Philippe is currently Head and Technical Lead of the Data offer (Responsable et référent technique offre Data) at MATIS, located in Niort. Philippe has also been Technical Director of the Data-Intelligence offer at AKKA Digital (Akka Technologies), a job Philippe has held since Jan 2017. In Philippe's previous role as a Big Data Manager at Akka Technologies, Philippe worked in the Niort area of France until Apr 2017. Prior to joining Akka Technologies, Philippe was an Architecture and Data-Engineering expert at SPHEREA in the Toulouse area of France. Prior to that, Philippe was a Senior Big Data Expert and Architect at MAAF Assurances - Groupe Covéa, based in the Niort area of France, from Jan 2016 to Jul 2016. Philippe started working as a Big Data Consultant at BNP Paribas Cardif in Nanterre in Dec 2014. From Sep 2014 to Nov 2014, Philippe was a Big Data Architecture & Development Consultant at Cdiscount, based in the Bordeaux area of France, having held the same position there from May 2014 to Aug 2014. Philippe started working as a Big Data Architecture & Development Consultant at Bull in the Bordeaux area of France in Mar 2014.
Philippe Rossignol's current jobs
Philippe Rossignol's past jobs
Goals: enable the building of different turnkey Big Data architectures targeted at clients: data lake, data collection, transformation and visualisation. Tech: building a data lake based on Hortonworks; on-premise and AWS installations mixed with Docker containers; collecting and transforming data with Jupyter/Spark and Zeppelin (MongoDB, CSV, JSON and XML).
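The collection layer described above normalizes heterogeneous sources (CSV, JSON, XML) into a common record shape before loading them into the data lake. The real pipelines ran on Jupyter/Spark and Zeppelin; the following is only a minimal standard-library sketch of that normalization idea, with invented sample data and field names:

```python
import csv, io, json
import xml.etree.ElementTree as ET

def from_csv(text):
    # Each CSV row becomes a plain dict keyed by the header row.
    return list(csv.DictReader(io.StringIO(text)))

def from_json(text):
    # Accept either a single object or a list of objects.
    data = json.loads(text)
    return data if isinstance(data, list) else [data]

def from_xml(text, record_tag):
    # Flatten each <record_tag> element into a dict of child tag -> text.
    root = ET.fromstring(text)
    return [{child.tag: child.text for child in rec}
            for rec in root.iter(record_tag)]

# Invented sample inputs: the real sources were MongoDB exports and CSV/JSON/XML feeds.
records  = from_csv("id,name\n1,alpha\n2,beta")
records += from_json('[{"id": "3", "name": "gamma"}]')
records += from_xml("<rows><row><id>4</id><name>delta</name></row></rows>", "row")
```

With every source reduced to the same list-of-dicts shape, the downstream Spark transformations can stay format-agnostic.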
Leading data-driven projects, with data-centric and data-preparation approaches using Spark; dashboards on SolrCloud indexes; machine learning with Spark; prototyping with Jupyter and its kernels; support and training; supervising as tech lead and architect; installations and on-demand deployments; development in Python, Java, Scala and shell.
Big Data expert in the DataLab team of CARDIF: • Role: in order to implement business use cases (such as anti-fraud detection, churn, appetence, and text analysis for automatic classification): consulting; responsible for the installation, configuration and security of the cluster based on Hadoop and Spark; in charge of injecting business data into the DataLake. • Architecture, administration: Hadoop Cloudera cluster (HDFS, YARN); installation, configuration and tuning of the clusters, Spark/YARN, Anaconda and Jupyter; Linux shell development. • Development and data injection: development of Java MapReduce jobs, Pig and Hive scripts, and Spark projects (RDDs and DataFrames) for ETL functions; shell scripts to simplify data injection. • Data science: evangelization to help the data scientists use Spark DataFrames and Spark Machine Learning the way they use Pandas and scikit-learn. • Machine-learning challenge (with retail data): CDiscount multi-categorization (www.datascience.net): 500 multinomial Naive Bayes models, with stemming, stratified sampling and mutual information. • Training: Spark for developers (RDDs, Spark Core); Spark for data scientists (DataFrames, Spark Machine Learning, Pipelines).
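The CDiscount challenge above relied on multinomial Naive Bayes scoring. As an illustration only, here is a toy from-scratch version of such a classifier with Laplace smoothing; the categories and training tokens are invented, and the real solution (500 models, stemming, stratified sampling, mutual information) was far richer:

```python
import math
from collections import Counter, defaultdict

def train_mnb(samples):
    """samples: list of (category, tokens). Returns (doc counts, word counts, vocab)."""
    cat_docs = Counter()
    word_counts = defaultdict(Counter)
    vocab = set()
    for cat, tokens in samples:
        cat_docs[cat] += 1
        word_counts[cat].update(tokens)
        vocab.update(tokens)
    return cat_docs, word_counts, vocab

def predict(tokens, cat_docs, word_counts, vocab):
    total_docs = sum(cat_docs.values())
    best, best_lp = None, float("-inf")
    for cat, n_docs in cat_docs.items():
        lp = math.log(n_docs / total_docs)          # log prior
        denom = sum(word_counts[cat].values()) + len(vocab)
        for tok in tokens:
            # Laplace smoothing: unseen words still get a count of 1.
            lp += math.log((word_counts[cat][tok] + 1) / denom)
        if lp > best_lp:
            best, best_lp = cat, lp
    return best

# Invented training data standing in for stemmed product titles.
samples = [("phone", ["iphone", "case", "screen"]),
           ("phone", ["android", "phone", "charger"]),
           ("book",  ["novel", "paperback", "author"]),
           ("book",  ["cook", "book", "recipes"])]
model = train_mnb(samples)
print(predict(["phone", "screen"], *model))   # -> phone
```

Log probabilities are summed rather than probabilities multiplied, which avoids underflow when titles contain many tokens.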
Solr search engine: definition of a test plan and development of a dedicated tool to compare the performance and strengths of the two well-known search engines Solr and Exalead. Proposal of a model to increase the relevance of query results from the web front end, consisting of a first NLP step using clustering to automatically discover article tags, followed by a second classification step using Naive Bayes, before finally executing the Solr search function on the lemmas.
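The final step of the proposed model runs the search over lemmas. As a stand-in for Solr's actual relevance scoring, the sketch below ranks documents with a simple TF-IDF sum over query terms; the documents and lemmas are invented, and lemmatization is reduced to pre-tokenized input:

```python
import math
from collections import Counter

def tfidf_rank(query_terms, docs):
    """docs: {doc_id: list of lemmas}. Rank doc ids by summed TF-IDF of query terms."""
    n = len(docs)
    df = Counter()                       # document frequency per lemma
    for lemmas in docs.values():
        df.update(set(lemmas))
    scores = {}
    for doc_id, lemmas in docs.items():
        tf = Counter(lemmas)
        score = 0.0
        for term in query_terms:
            if term in tf:
                # Smoothed IDF so a term present in every document still counts a little.
                score += tf[term] * math.log(1 + n / df[term])
        scores[doc_id] = score
    return sorted(scores, key=scores.get, reverse=True)

# Invented documents standing in for lemmatized article texts.
docs = {"a1": ["traffic", "jam", "city"],
        "a2": ["recipe", "cook", "city"],
        "a3": ["traffic", "accident", "road"]}
print(tfidf_rank(["traffic", "city"], docs))  # "a1" ranks first
```

Solr's own similarity functions are considerably more sophisticated, but the idea is the same: matching rarer lemmas contributes more to a document's score.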
NoSQL Cassandra and distributed messaging with Kafka: as part of the implementation of a large strategic project, definition of a performance test plan based on the Apache Kafka broker with the NoSQL database Cassandra. Development of a configurable tool covering all the Kafka-Cassandra test cases in the new Cdiscount environments. Studies of a hybrid SQL-NoSQL scenario for a future multilingual platform based on Cassandra, with direct involvement in a call for tenders. Creation of the open-source project KafkaGust on GitHub, which uses Apache Kafka to produce, in real time, both statistics and large volumes of data, with customizable message formats such as text, JSON and XML.
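A tool like KafkaGust essentially generates parameterized messages at volume and reports throughput statistics. The sketch below imitates that idea with a plain-Python JSON message generator; the Kafka producer itself is stubbed out (a real client would call something like KafkaProducer.send), and the template fields are invented:

```python
import json
import time

def generate_messages(template, count, start_id=0):
    """Yield (key, payload) pairs from a message template; a stand-in for
    customizable Text/JSON/XML message formats."""
    for i in range(start_id, start_id + count):
        payload = json.dumps({**template, "id": i, "ts": time.time()})
        yield str(i), payload

def send_batch(messages):
    """With a real Kafka client this loop would send each message to a topic;
    here we just drain the generator and return simple volume statistics."""
    batch = list(messages)
    return {"count": len(batch), "bytes": sum(len(p) for _, p in batch)}

stats = send_batch(generate_messages({"type": "order", "status": "new"}, 1000))
print(stats["count"])  # 1000
```

Varying the template and count from configuration is what makes such a tool usable for broker performance test plans.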
Hadoop platform, Java, Hive, Pig, Sqoop: development of proofs of concept based on the Hadoop Cloudera distribution. Development of recommenders using mutual information for scoring, Naive Bayes for classification and prediction, and collaborative filtering with Python (Tanimoto and Euclidean distance algorithms) and Apache Mahout. Proofs of concept around Apache Kafka & ZooKeeper, Apache Storm, and Cassandra. Studies of machine learning and tests of algorithms (clustering, classification and collaborative filtering). Helped launch a new European R&D project combining M2M and Big Data technologies. Presentations and demos on multi-layer neural networks: how supervised training works for classification (predictions), and unsupervised training for clustering.
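The collaborative-filtering proofs of concept used Tanimoto and Euclidean distances to measure user similarity. Here is a minimal pure-Python sketch of the Tanimoto variant, with an invented purchase history standing in for the retail data:

```python
def tanimoto(a, b):
    """Tanimoto coefficient between two sets of liked items: |A ∩ B| / |A ∪ B|."""
    a, b = set(a), set(b)
    union = a | b
    return len(a & b) / len(union) if union else 0.0

def recommend(user, likes, top=3):
    """Score items liked by other users, weighted by their Tanimoto
    similarity to `user`, and return the best unseen items."""
    scores = {}
    for other, items in likes.items():
        if other == user:
            continue
        sim = tanimoto(likes[user], items)
        for item in items:
            if item not in likes[user]:
                scores[item] = scores.get(item, 0.0) + sim
    return sorted(scores, key=scores.get, reverse=True)[:top]

# Invented purchase history: u1 and u2 overlap, u3 does not.
likes = {"u1": {"tv", "hifi", "dvd"},
         "u2": {"tv", "hifi", "phone"},
         "u3": {"book", "pen"}}
print(recommend("u1", likes))  # "phone" ranks first
```

Because u2 shares two of u1's three items, u2's unseen item "phone" outscores anything liked only by the dissimilar user u3.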
Web technologies: involvement during a reversibility phase for a well-known French web application, written in Java EE and JavaScript, dedicated to producing rich real-time content about traffic jams, accidents, roadworks, etc. Design of new architectures for developing the application's next version.
Web technologies: technical audits to design the new n-tier architectures of existing applications for the yield-management business domain (optimizing train ticket prices). Technologies: AngularJS framework, jQuery and jqPlot for the GUIs, web services, JBoss AS server configured for high availability, rule engines such as Drools and BRMS to automatically compute the recurrent, simple business rules, etc.
Investigation of an evolution of a supervision platform for banking terminals: audits, studies and cost estimates for new functional and technical scenarios. Recommendations on the technologies to use for an overhaul of the platform.
Support and compliance of operational processes and action plans: keeping project plans and operational processes compliant in order to deliver the statistical business applications on time (technologies: Java EE, customer frameworks, ClearCase, Maven, SonarQube, WebLogic, Oracle, SAS, Unix). Design and development of an SSO launcher for SAS Enterprise Guide V4.3: a launcher deployed in all the French agencies, providing automatic authentication for end users such as the statisticians. Design and development of a reliable integration chain for SAS, Cobol and Unix components: a solution similar to a continuous-integration system, automatically preparing each SAS, Cobol and Unix component for a target environment such as production.
Quotation for a successfully won call for tenders: creation of a Java EE web application for managing European historical heritage (method used: Use Case Points).
Interface between the business owner and the project owner. Definition of a specification to start, as soon as possible, a Java EE web application project to facilitate job searches. Comparison of the cost estimates using the Use Case Points method.
Design of new architectures and specifications around the Java ESB used to transfer business data in a secure and reliable way (solution based on Java OSGi and the lightweight Java ESB Apache Camel).
Architecture design, opportunity studies, audits, feasibility studies and cost estimates.
French leader and coordinator of a European R&D project called Usenet (ITEA2 consortium), aimed at creating a new European standard in the M2M (Machine to Machine) domain. Studies around Android.
Design and development (for the Completel-Bouygues provider) of a TALEND-based Java ETL application to automatically transfer the monthly invoices coming from the clients. Design and development (for the CNAMTS client) of a TALEND-based Java ETL application to import business data coming from a fleet-management system into the SIEBEL and GLPI systems.
Cost estimates and technical studies of solutions in the GPS localization and fleet-management domains. Study on evolving a client-server architecture to an n-tier Java EE architecture. Advice on systems for access control and time management.
Technical support activities to help the project teams. Implementation of an ETL application (based on the Oracle Java Sunopsis ETL) for real-time synchronization and bidirectional communication between many heterogeneous databases. Specifications and development for a Java EE web PDM (Product Data Management) application. Technologies: Rational Rapid Developer, WebSphere, Tomcat, Oracle; load testing with IBM Workload Simulator. Creation of specific software able to automatically install all the Oracle 10g databases in "silent" mode (technologies used: InstallShield, Java, Ant). Development using Java EJBs on JBoss, WebLogic and WebSphere servers.
Server-side Java development in the call-center domain. Development of a pure-Java CTI (Computer Telephony Interface) server for Alcatel-Lucent PABX and Genesys middleware, providing high availability for hundreds of connected operators using a CRM (Customer Relationship Management) application. Technologies used: Java, Siebel CRM, Genesys CTI, Oracle, MySQL, SQL Server, MQSeries, SWIFT, LDAP. Main clients: CNAMTS, MGEN GROUP, CIC, Groupama, URSSAF, CNCA.