Karthik Subramanian
About
Karthik Subramanian is from the Greater Minneapolis-St. Paul Area. Karthik works in the following industries: "Computer Software", "Higher Education", "Mechanical Or Industrial Engineering", and "Information Technology & Services". Karthik is currently a Research Specialist at 3M. In Karthik's previous role as a Senior Data Scientist at 3M, Karthik worked in the Greater Minneapolis-St. Paul Area until Apr 2021. Before that, Karthik was a Data Scientist at 3M, also based in the Greater Minneapolis-St. Paul Area. Prior to joining 3M, Karthik was a Graduate Research Assistant at the Harris School of Public Policy at the University of Chicago, based in Chicago, Illinois, United States, from Jan 2019 to Mar 2020. Karthik started working as a Graduate Teaching Assistant at The University of Chicago Graham School in Chicago, Illinois, in Oct 2018. From Aug 2018 to Dec 2018, Karthik was a Graduate Research Assistant at The University of Chicago Graham School, based in Chicago. Prior to that, Karthik was a Graduate Research Assistant at the Harris School of Public Policy at the University of Chicago, based in the Greater Chicago Area, from Apr 2018 to Jul 2018. Karthik started working as an Associate - Data Science at Cognizant in the Bengaluru Area, India, in Jan 2017.
Karthik Subramanian's current jobs
Karthik Subramanian's past jobs
I worked in 3M's Corporate Research Systems Lab as a data scientist. As part of this role, I was responsible for:
1. Reducing the deployment time of AI/ML algorithms to manufacturing shop floors from months to days.
2. Researching and building process anomaly detection algorithms to improve plant operations and reduce downtime, using AI/ML and Six Sigma principles (see the sketch below).
3. Connecting with stakeholders regularly to confirm that the algorithms met the core business requirements, and incorporating their feedback.
Technology stack: AWS SageMaker, Databricks, Horovod, Spark, Python, R
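The anomaly detection work in item 2 boils down to flagging unusual sensor readings in process data. Below is a minimal, hypothetical sketch of that idea using scikit-learn's IsolationForest on simulated readings; the simulated data, feature choices, and threshold are illustrative only, not the production pipeline that ran on AWS SageMaker, Databricks, and Spark.

```python
# Hypothetical process anomaly detection sketch (not the production 3M code).
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Simulated shop-floor sensor readings: [temperature, pressure].
normal = rng.normal(loc=[70.0, 1.0], scale=[2.0, 0.05], size=(1000, 2))
faulty = rng.normal(loc=[85.0, 1.4], scale=[3.0, 0.10], size=(20, 2))
readings = np.vstack([normal, faulty])

# Unsupervised detector; contamination is the expected share of anomalies.
model = IsolationForest(contamination=0.02, random_state=0)
labels = model.fit_predict(readings)  # -1 = anomaly, 1 = normal

print(f"Flagged {np.sum(labels == -1)} of {len(readings)} readings as anomalous")
```

In a plant setting, a detector like this is typically wrapped in a scheduled job that scores new readings as they arrive and raises alerts for operators.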
I worked with Prof. Will Howell and Mr. Marc Farinella for the Center for Effective Government to study anti-democratic themes in political campaigns. We studied public campaign announcements and communications made by election candidates during the 2018 mid-term elections.
PART A - Developed a comprehensive political-campaign listening platform to gather data from multiple sources
1. Created a highly scalable and generalizable web crawler to crawl ~1000 candidate campaign websites using the Scrapy and Selenium libraries (see the crawler sketch below).
2. Set up data streaming to collect data from the Twitter API (tweepy library), the Facebook Graph API (facebook-sdk library), and YouTube video streams with captions (youtube-dl).
PART B - Natural Language Processing
1. Studied the collected text and developed models using NLP techniques such as n-grams, dependency parsing, and semantic role labeling, and visualized the top keywords/phrases associated with a particular theme.
2. Evaluated the use of deep learning algorithms such as LSTMs to build a supervised classifier for themes.
https://harris.uchicago.edu/research-impact/centers/project-political-reform
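As a rough illustration of the crawler in PART A, here is a minimal, hypothetical Scrapy spider. The spider name, the placeholder start_urls, and the extracted fields are assumptions for the example; the actual crawler covered ~1000 candidate sites and also used Selenium for JavaScript-heavy pages.

```python
# Hypothetical campaign-website spider sketch (Scrapy only; Selenium omitted).
import scrapy


class CampaignSpider(scrapy.Spider):
    name = "campaign"
    # Placeholder URL; the real list of candidate campaign sites is not shown.
    start_urls = ["https://example.com/candidate-a"]

    def parse(self, response):
        # Emit the page text so a downstream NLP pipeline (PART B) can use it.
        yield {
            "url": response.url,
            "title": response.css("title::text").get(),
            "paragraphs": response.css("p::text").getall(),
        }
        # Recursively follow links to reach announcement and press pages.
        for href in response.css("a::attr(href)").getall():
            yield response.follow(href, callback=self.parse)
```

A spider like this can be run with `scrapy runspider campaign_spider.py -o pages.json` to dump the crawled text for later processing.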
Teaching Assistant for the Data Analytics for Business Professionals (DABP) program. I taught the following courses:
• Advanced Analytics and Machine Learning
• Data Visualization and Storytelling
• Data Understanding and Preparation
• Building and Managing Analytics-Driven Organizations
As a Teaching Assistant, I conducted weekly review sessions, graded assignments, and held 1:1 sessions with students to help them with assignments and project work.
I worked in the Professional Development Strategy department at the UChicago Graham School. As a Research Assistant, I:
1. Conceptualized and developed sentiment analytics models on student experience surveys to analyze which programs and courses worked well for students and what holistic factors contributed to a student's overall impression of a course and its faculty (a small sketch follows below).
2. Performed a thorough analysis to understand the scope for new courses to be introduced into Graham's curriculum.
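As a small illustration of item 1, the sketch below scores made-up survey comments with NLTK's VADER sentiment analyzer. The comments and the choice of VADER are assumptions for the example, not the actual models built on the Graham School survey data.

```python
# Hypothetical survey-sentiment sketch using NLTK's VADER analyzer.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)

comments = [
    "The instructor explained the material clearly and the projects were useful.",
    "Too much material was crammed into too few weeks.",
]

analyzer = SentimentIntensityAnalyzer()
for comment in comments:
    scores = analyzer.polarity_scores(comment)  # neg/neu/pos/compound scores
    print(f"{scores['compound']:+.2f}  {comment}")
```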
I worked under Jane Thu Giang Le on a program to identify potential donors for the school based on shared ethics and values. As a Research Scientist, I was responsible for:
1. Implementing a depth-first-search Python routine to recursively scrape web data on potential donors.
2. Building a topic detection and entity tagging model using Python and Gensim to classify the different types of donors (see the topic-model sketch below).
3. Creating an automated natural language pipeline to perform text scraping, ingestion, cleaning, processing, and reporting.
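To illustrate the topic detection in item 2, here is a minimal, hypothetical Gensim LDA example on toy documents. The documents, the number of topics, and the whitespace tokenization are placeholders; the real pipeline worked on scraped donor profiles and also included entity tagging, which is omitted here.

```python
# Hypothetical Gensim topic-model sketch (entity tagging omitted).
from gensim import corpora, models

documents = [
    "alumni donor endowed scholarship for public policy students",
    "foundation grant supporting civic education research",
    "corporate sponsor funding data science fellowships",
]

# Tokenize and build the bag-of-words corpus that Gensim expects.
texts = [doc.lower().split() for doc in documents]
dictionary = corpora.Dictionary(texts)
corpus = [dictionary.doc2bow(text) for text in texts]

# Train a small LDA model and inspect the discovered topics.
lda = models.LdaModel(corpus, num_topics=2, id2word=dictionary, passes=20, random_state=0)
for topic_id, terms in lda.print_topics(num_words=5):
    print(topic_id, terms)
```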
• Developed a health monitor for electric insulators using AI image recognition techniques in Python for a leading US energy utilities company, helping reduce manual monitoring efforts by 50%.
• Developed an anomaly detection model using AI image recognition techniques to identify broken electric poles and vegetation for a leading US energy utilities company. This helped the customer gauge the damage caused by natural disasters.
• Created a number plate recognition system using image and text recognition techniques for a US energy utilities company, resulting in a 2% accuracy increase over a baseline model.
• Implemented an automated facial respiratory-mask detection model for a major US life-sciences company using OpenCV and dlib, reducing manual effort by 40%.
• Part of AI Research and Development to help identify new technologies and techniques for org-wide adoption.
• Part of the initial kickoff and consulting activities for an AI project for a joint venture between a leading US internet company and a leading US life-sciences company.
• Implemented an email classification model using RNN/LSTM techniques, with feature engineering based on Stanford GloVe vectors (see the classifier sketch below).
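For the last bullet, the sketch below shows the general shape of an LSTM text classifier that uses pre-trained GloVe vectors as a frozen embedding layer, written with TensorFlow/Keras. The vocabulary size, sequence length, number of classes, and the all-zeros placeholder matrix are assumptions for the example; in practice the matrix is filled from a GloVe file (e.g. glove.6B.100d.txt) and the model is trained on tokenized, padded emails.

```python
# Hypothetical LSTM + GloVe email classifier sketch (not the production model).
import numpy as np
import tensorflow as tf

vocab_size, embed_dim, max_len, num_classes = 10_000, 100, 200, 5

# Placeholder for the GloVe matrix: row i holds the vector for word index i.
embedding_matrix = np.zeros((vocab_size, embed_dim), dtype="float32")

embedding = tf.keras.layers.Embedding(vocab_size, embed_dim, trainable=False)
model = tf.keras.Sequential([
    tf.keras.Input(shape=(max_len,)),
    embedding,                     # frozen pre-trained word vectors
    tf.keras.layers.LSTM(64),      # sequence encoder
    tf.keras.layers.Dense(num_classes, activation="softmax"),
])
embedding.set_weights([embedding_matrix])  # load the GloVe vectors

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# model.fit(padded_token_ids, labels, ...) would train on the email corpus.
```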
• Created BRAVO, Cognizant's proprietary platform for Big Data Reference, Validation and Optimization, which generated $9.5M in revenue for the practice in 2015-2016.
• Built and deployed a German sentiment analytics model for a major US telecom company using R and Java, improving accuracy by 5% over a baseline model.
• Developed metrics and implemented a predictive model monitoring solution using R and Java (see the monitoring sketch below).
• Responsibilities included creating metrics and modules for model validation and efficacy, creating natural language routines for the BRAVO platform, and customer engagements.
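As a sketch of the model-monitoring idea in the third bullet, the snippet below computes a population stability index (PSI) between a model's score distribution at deployment and in production. PSI and the 0.2 alert threshold are common monitoring conventions used here for illustration; the actual solution was implemented in R and Java, and its specific metrics are not shown here.

```python
# Hypothetical model-monitoring sketch: score-distribution drift via PSI.
import numpy as np

def population_stability_index(expected, actual, bins=10):
    """Compare two score distributions; larger values indicate more drift."""
    edges = np.histogram_bin_edges(expected, bins=bins)
    e_counts, _ = np.histogram(expected, bins=edges)
    a_counts, _ = np.histogram(actual, bins=edges)
    # Convert counts to proportions, clipping to avoid log(0) and divide-by-zero.
    e_pct = np.clip(e_counts / e_counts.sum(), 1e-6, None)
    a_pct = np.clip(a_counts / a_counts.sum(), 1e-6, None)
    return float(np.sum((a_pct - e_pct) * np.log(a_pct / e_pct)))

rng = np.random.default_rng(0)
baseline_scores = rng.beta(2, 5, size=5000)  # scores at deployment time
current_scores = rng.beta(2, 3, size=5000)   # scores observed in production

psi = population_stability_index(baseline_scores, current_scores)
print(f"PSI = {psi:.3f} ({'investigate drift' if psi > 0.2 else 'stable'})")
```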