Loic Ambamany
About
Loic Ambamany is from New York, New York, United States. Loic has worked in the following industries: Financial Services, Staffing & Recruiting, Banking, Hospital & Health Care, Design, and Information Technology & Services. Loic is currently a Software Engineer on the Quantitative Model Platform at Wells Fargo, located in New York, New York, United States. In his previous role as a Data Pipeline Engineer in the Cybersecurity & Data Management Group at JP Morgan, Loic worked in the Greater New York City Area until Oct 2019. Before joining JP Morgan, Loic was a Hadoop Solutions Architect / Data Engineer with New York-Presbyterian Data Science in the Greater New York City Area. Prior to that, Loic was a Big Data / Java Software Engineer at Citi, based in the Greater New York City Area, from Jan 2015 to Jan 2016. Loic started working as a Software Engineer at IBM Application Services in Jan 2010. From Jan 2010 to Jan 2010, Loic was a software engineer at InSource. Before that, Loic was a Software Engineer at Spherion from Jan 2010 to Jan 2010. Loic began working as a software engineer at Fusion in Jan 2010.
Loic Ambamany's current jobs
Model Validation Compute Platform data pipelines on CPU and GPU computing, and machine learning models. Big Data / machine learning applications using Spark, Hive, Kafka, H2O cluster deployments, TensorFlow, Dask, and the Hadoop stack. Java, Python, Scala, PySpark, and Jupyter notebooks on Conda and Kubernetes deployments. Development and support of XGBoost, decision tree, classification, and regression ML models.
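The role above names decision-tree and classification model development. As a toy illustration of the core idea behind such trees (XGBoost builds ensembles of them), the sketch below finds the single-feature split that minimizes Gini impurity; the data and function names are invented for the example.

```python
# Toy single-feature decision-tree split, illustrating the classification
# models named above. Data and names are invented for illustration only.

def gini(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

def best_split(xs, ys):
    """Threshold on one feature minimizing weighted Gini impurity
    of the two resulting partitions."""
    best = (None, float("inf"))
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        if score < best[1]:
            best = (t, score)
    return best

xs = [1.0, 2.0, 3.0, 10.0, 11.0, 12.0]
ys = [0, 0, 0, 1, 1, 1]
threshold, impurity = best_split(xs, ys)
```

A library such as XGBoost repeats this kind of split search greedily over many features and many trees, each new tree fitting the residual errors of the ensemble so far.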
Loic Ambamany's past jobs
Designed, built, implemented, and maintained real-time and batch pipelines for JP Morgan data lakes, using Java, Scala, Kafka, Spark Streaming, Hive, and HBase as supporting tools for use-case pipelines. AWS deployments with Lambda, EMR, Redshift, Glue, and S3.
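The pipelines above follow the micro-batch pattern that Spark Streaming popularized: events arrive on a stream, are aggregated per fixed-size batch, and the results are upserted into a sink. The sketch below mimics that flow in plain Python; the list stands in for a Kafka topic and the dict for a Hive/HBase table, and the event schema is invented for illustration.

```python
# Minimal micro-batch aggregation sketch (Spark Streaming style).
# The events list stands in for a Kafka topic; the sink dict stands in
# for a Hive/HBase table. Names and schema are invented for illustration.

from collections import defaultdict

def micro_batches(events, batch_size):
    """Yield the stream in fixed-size micro-batches."""
    for i in range(0, len(events), batch_size):
        yield events[i:i + batch_size]

def run_pipeline(events, batch_size=3):
    sink = defaultdict(int)
    for batch in micro_batches(events, batch_size):
        # Per-batch aggregation: count events by key, then upsert.
        counts = defaultdict(int)
        for key, value in batch:
            counts[key] += value
        for key, total in counts.items():
            sink[key] += total
    return dict(sink)

events = [("login", 1), ("click", 1), ("login", 1),
          ("click", 1), ("click", 1), ("login", 1)]
totals = run_pipeline(events)
```

In a real deployment the per-batch step runs distributed across executors and the upsert is an idempotent write to the sink, but the batch-then-merge shape is the same.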
Hadoop systems architecture: deployment and implementation of Big Data tools on clusters of 25+ nodes holding terabytes of data, including HDFS, HBase, Spark, Kafka, MapReduce, Oozie, and YARN. Use cases are deployed on healthcare data systems and real-time systems. Daily duties include decision making on setting up raw-data pipelines and implementation guidelines for data transformation using Scala and Java, as well as cluster setup, configuration changes, and management of production clusters on both Cloudera and Hortonworks platforms.
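MapReduce, named in the stack above, boils down to three phases: a map phase emits key/value pairs, a shuffle groups pairs by key, and a reduce phase folds each group. The single-process sketch below shows those phases for a word count; a real Hadoop job runs the same logic distributed across the cluster's nodes.

```python
# Toy single-process MapReduce word count, illustrating the model
# named in the stack above. Inputs are invented for the example.

from itertools import groupby
from operator import itemgetter

def map_phase(lines):
    """Map: emit (word, 1) for every word in every input line."""
    for line in lines:
        for word in line.split():
            yield (word, 1)

def shuffle(pairs):
    """Shuffle: group the emitted pairs by key (word)."""
    return groupby(sorted(pairs, key=itemgetter(0)), key=itemgetter(0))

def reduce_phase(grouped):
    """Reduce: sum the counts within each group."""
    return {key: sum(v for _, v in vals) for key, vals in grouped}

lines = ["spark kafka spark", "kafka hdfs"]
counts = reduce_phase(shuffle(map_phase(lines)))
```

The shuffle is the expensive step on a real cluster, since it moves every intermediate pair across the network to the reducer responsible for its key.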