Lalit Pathak
About
Lalit Pathak is from Ireland and works in the Technology, Information and Internet industry. Lalit is currently Business Intelligence Manager at Jobbio, located in Ireland. In a previous role as Data Manager at Jobbio, Lalit worked in Dublin, Leinster, Ireland until Aug 2022. Before that, Lalit was a Salesforce Admin/Consultant at Jobbio in Ireland. Earlier, Lalit was a Business Intelligence Data Warehouse Engineer at Accenture, based in the Pune Area, India, from Sep 2016 to Sep 2018, having started as an Associate Software Engineer at Accenture in the Mumbai Area, India, in Jan 2016.
Lalit Pathak's current jobs
Lalit Pathak's past jobs
Responsible for maintaining project plans, coordinating resources, executing deliverables, and managing stakeholder relationships.
Worked with the Business Development team to scope implementation and support projects.
Coordinated and managed external system integrations and data migrations into the Salesforce.com organization.
Designed the Salesforce.com object structure and configured standard and custom objects, security models, fields, page layouts, workflow rules, reports, and dashboards.
Gathered requirements, documented business process flows, researched third-party applications, and created stakeholder-specific training documentation.
Worked closely as a business analyst with the sales team, performed detailed analysis of business and technical requirements, and designed solutions by customizing standard Salesforce.com (SFDC) objects.
Experienced in fit-gap analysis between requirements and the SFDC application.
Created email templates for the sales, customer success, marketing, and partnership teams.
Responsible for the architecture of Salesforce modules, both built internally and using out-of-the-box functionality.
Developed a prototype of the partnership dashboards as a proof of concept (POC) in Google Data Studio and Google Sheets.
Delivered technical and end-user training to internal administrators, power users, executive teams, sales teams, customer service representatives, and marketing teams.
Saved the company over $50,000 per year in unnecessary license expenses and increased sales by 60% by extracting actionable insights and generating live alerts from Salesforce data (the alerting pattern is sketched below).
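The live-alert item is the most code-like claim in this role, so here is a minimal, hypothetical sketch of what such an alert can look like in Python. It is not Jobbio's actual implementation: the instance URL, token handling, SOQL query, Slack webhook, and threshold are all assumptions; only the standard Salesforce REST query endpoint and the Slack incoming-webhook pattern are real.

```python
"""Hypothetical sketch of a 'live alert' on Salesforce data.

Polls the standard Salesforce REST query endpoint and pushes a Slack
message when the number of untouched leads crosses a threshold.
Instance URL, access token, SOQL, and threshold are illustrative only.
"""
import os
import requests

SF_INSTANCE = os.environ["SF_INSTANCE_URL"]      # e.g. https://example.my.salesforce.com (assumed)
SF_TOKEN = os.environ["SF_ACCESS_TOKEN"]         # OAuth access token obtained elsewhere
SLACK_WEBHOOK = os.environ["SLACK_WEBHOOK_URL"]  # Slack incoming-webhook URL
STALE_LEAD_THRESHOLD = 50                        # assumed alerting threshold


def count_stale_leads() -> int:
    """Count open leads with no activity in the last 7 days via a SOQL aggregate query."""
    soql = ("SELECT COUNT(Id) total FROM Lead "
            "WHERE IsConverted = false AND LastActivityDate < LAST_N_DAYS:7")
    resp = requests.get(
        f"{SF_INSTANCE}/services/data/v56.0/query",
        params={"q": soql},
        headers={"Authorization": f"Bearer {SF_TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["records"][0]["total"]


def alert(count: int) -> None:
    """Send a plain-text alert to a Slack incoming webhook."""
    requests.post(
        SLACK_WEBHOOK,
        json={"text": f"{count} open leads have had no activity in the last 7 days"},
        timeout=30,
    )


if __name__ == "__main__":
    stale = count_stale_leads()
    if stale > STALE_LEAD_THRESHOLD:
        alert(stale)
```

In practice a scheduler (cron or similar) would run a script like this on an interval, which is what makes the alert "live".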
Used Data Loader for insert, update, and bulk import or export of data from Salesforce.com objects, and used it to read, extract, and load data from comma-separated values (CSV) files.
Created users, roles, and profiles, and defined object- and field-level security.
Migrated data from the old legacy system into Salesforce and built the new system in Salesforce (a typical pre-processing step for such loads is sketched below).
Created custom formula fields, email services, approval processes, and field updates according to application and business requirements.
Implemented picklists, dependent picklists, lookups, master-detail relationships, and validation and formula fields on custom objects.
Worked on standard objects such as Accounts, Contacts, Leads, Campaigns, and Opportunities.
Created custom objects, tabs, components, and reports.
Created standalone, bundle, and option products, along with pricing rules and constraint rules for the different product types.
Created reports and dashboards for the managers' home page and restricted access to authorized users only.
Defined product pricing, the opportunity management flow, and quoting with the help of the product team.
Integrated and maintained data from third-party lead generation applications such as Snovio, LinkedIn Sales Navigator, LeadIQ, and LeadIRO for sales team members.
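As an illustration of the migration items above, the following is a minimal sketch of the pre-processing step that commonly precedes a Data Loader insert or upsert: reading a legacy CSV export, renaming columns to Salesforce field API names, and writing a load-ready file. The file names, column mapping, and the Legacy_Id__c external ID field are hypothetical, not the actual Jobbio migration.

```python
"""Hypothetical pre-processing step ahead of a Salesforce Data Loader run.

Reads a legacy CSV export, cleans and renames fields to Salesforce API names,
and writes a load-ready CSV. Field names and file names are illustrative only.
"""
import csv

# Assumed mapping from legacy column names to Salesforce Contact field API names.
FIELD_MAP = {
    "first_name": "FirstName",
    "last_name": "LastName",
    "email_address": "Email",
    "legacy_id": "Legacy_Id__c",  # assumed external ID field used for the upsert
}


def transform(row: dict) -> dict:
    """Map legacy columns to Salesforce fields and normalize obvious issues."""
    out = {sf_field: (row.get(legacy, "") or "").strip()
           for legacy, sf_field in FIELD_MAP.items()}
    out["Email"] = out["Email"].lower()
    return out


with open("legacy_contacts.csv", newline="", encoding="utf-8") as src, \
     open("contacts_for_dataloader.csv", "w", newline="", encoding="utf-8") as dst:
    writer = csv.DictWriter(dst, fieldnames=list(FIELD_MAP.values()))
    writer.writeheader()
    for row in csv.DictReader(src):
        cleaned = transform(row)
        if cleaned["Email"]:  # skip rows with no usable email (assumed business rule)
            writer.writerow(cleaned)
```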
Client: Nationwide Building Society
Involved as an application developer in the implementation of a data lake and data warehouse to uplift the client's information management capabilities.
Created complex SQL queries to fetch data from various relational databases, flat files, and the data lake.
Developed Informatica mappings and workflows using transformations such as Expression, Filter, Joiner, and Lookup to extract data from the legacy system and other sources, cleansed, standardized, and validated the data according to business rules, and migrated clean, consistent data into the data warehouse for the reporting team.
Improved performance and productivity by 50% by implementing and automating Slowly Changing Dimension Type 2 (SCD-2) with dynamic (run-time reusable) mappings in Informatica 10.1.1 (the SCD-2 rule is sketched below).
Migrated, stored, and processed terabytes of structured data from the legacy system and flat files using Hadoop ecosystem components such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, MapReduce, and Hive, along with shell scripting.
Created tables and views in Hive using a variety of data types, delimiters, and file formats as per business requirements, and regularly tuned Hive query performance to improve data processing and retrieval.
Supported unit testing of ETL code to ensure it was delivered and running in the system integration test (SIT) and production environments, and performed all activities related to migrating ETL components from the DEV environment to SIT and production, using Team Foundation Version Control for source code management.
Participated in and contributed to quality assurance walkthroughs of the ETL components for the QA and testing teams, and analyzed and resolved all data-related SIT and production defects in the Application Lifecycle Management tool (HP ALM).
Maintained reference data with the help of Informatica MDM and prepared technical design documents.
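The SCD-2 work above was delivered with Informatica dynamic mappings, which cannot be reproduced in a short snippet, so the sketch below shows only the Type 2 rule itself in plain Python: when a tracked attribute changes, the current dimension row is expired and a new current version is inserted. The field names and business key are assumptions.

```python
"""Plain-Python illustration of the Slowly Changing Dimension Type 2 rule.

When a tracked attribute changes, the current dimension row is closed
(valid_to set, is_current cleared) and a new current version is appended,
so history is preserved. Field names are illustrative only.
"""
from datetime import date


def apply_scd2(dimension: list, incoming: dict, business_key: str = "customer_id") -> None:
    """Apply one incoming source record to an in-memory dimension table."""
    today = date.today().isoformat()
    current = next((r for r in dimension
                    if r[business_key] == incoming[business_key] and r["is_current"]), None)

    if current and all(current[k] == v for k, v in incoming.items()):
        return  # no change in tracked attributes: keep the current version as-is

    if current:
        # Expire the existing version instead of overwriting it.
        current["is_current"] = False
        current["valid_to"] = today

    dimension.append({**incoming, "valid_from": today, "valid_to": None, "is_current": True})


# Example: an address change produces a second, current version of the same customer.
dim = [{"customer_id": 1, "address": "Pune",
        "valid_from": "2017-01-01", "valid_to": None, "is_current": True}]
apply_scd2(dim, {"customer_id": 1, "address": "Mumbai"})
print(dim)
```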
Client: Sanofi
Collaborated with and organized meetings with functional consultants to understand the client's requirements for migrating data from the legacy system into Veeva CRM, which is used for engagement with healthcare professionals.
Integrated customer data with the CRM application using data integration tools such as Informatica PowerCenter, TIBCO, and Hadoop ecosystem components.
Worked on functional testing as well as ETL testing, and maximized test efficiency by designing standardized SQL/Hive queries, test cases, and test plans that enabled execution of multiple test cases at a time in HP ALM (the reconciliation pattern is sketched below).
Reduced defect-fix turnaround time by 2 days by conducting root cause analysis and recommending the necessary fixes for identified defects.
Organized knowledge transfer sessions for development team members to help them understand the client's functional requirements.
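The standardized-query idea above is essentially a data-driven test harness: many reconciliation checks expressed as data, one driver executing them all. The sketch below illustrates that pattern against an in-memory SQLite database so it stays self-contained; the real work used SQL/Hive queries managed through HP ALM, and the table and column names here are assumptions.

```python
"""Hypothetical sketch of a standardized, data-driven ETL reconciliation check.

Each test case is a name plus a source query and a target query; a single
driver runs them all and reports pass/fail, which is what makes it cheap to
add more cases. Uses in-memory SQLite purely so the example is runnable.
"""
import sqlite3

# Test cases as data: (name, source count query, target count query). Names assumed.
TEST_CASES = [
    ("hcp_row_count", "SELECT COUNT(*) FROM stg_hcp", "SELECT COUNT(*) FROM crm_hcp"),
    ("active_hcp_count",
     "SELECT COUNT(*) FROM stg_hcp WHERE status = 'ACTIVE'",
     "SELECT COUNT(*) FROM crm_hcp WHERE status = 'ACTIVE'"),
]


def run_checks(conn: sqlite3.Connection) -> list:
    """Run every reconciliation check and return (name, passed) pairs."""
    results = []
    for name, source_sql, target_sql in TEST_CASES:
        source_count = conn.execute(source_sql).fetchone()[0]
        target_count = conn.execute(target_sql).fetchone()[0]
        results.append((name, source_count == target_count))
    return results


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE stg_hcp (id INTEGER, status TEXT);
        CREATE TABLE crm_hcp (id INTEGER, status TEXT);
        INSERT INTO stg_hcp VALUES (1, 'ACTIVE'), (2, 'INACTIVE');
        INSERT INTO crm_hcp VALUES (1, 'ACTIVE'), (2, 'INACTIVE');
    """)
    for name, passed in run_checks(conn):
        print(f"{name}: {'PASS' if passed else 'FAIL'}")
```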