
IT Analyst - Data Analytics

Location:
Iselin, NJ, 08830
Posted:
March 04, 2024


Dhana Latha

IT Analyst and ETL Developer

Mobile: +1-732-***-****

Email: ad33z8@r.postjobfree.com

SUMMARY:

Motivated and enthusiastic IT Analyst and Data Analytics professional with 8 years of experience across data integration and data warehousing, using ETL tools such as Informatica PowerCenter 10.2/9.6/9.1/8.6 and Informatica Cloud.

Involved in end-to-end BI business improvement implementations (analysis of source system data, ETL Informatica job creation, scheduling, and reporting).

•Domain expertise in media, marketing, banking, and healthcare. Skilled in working with cross-functional teams on problem solving with prototypes, project development, and process improvement in dynamic environments; meeting deadlines; providing business insights; and designing, developing, and executing data-driven, action-oriented solutions with key performance indicators to solve complex business problems, supported by strong communication skills.

•Applied expertise in finance, accounting, and data storage to design and implement efficient data storage structures and schemas using industry-standard design patterns.

•Developed robust data models that facilitated accurate and reliable reporting of financial information, ensuring compliance with regulatory requirements and improving data integrity.

•A seasoned Informatica developer with extensive expertise in Informatica PowerCenter and Informatica Intelligent Cloud Services (IICS) and a proven track record in designing, developing, and optimizing workflows to facilitate seamless data exchange among various platforms, including Oracle databases, Salesforce, data lakes, and operational or warehouse data stores.

•Demonstrated exceptional documentation skills across operational support, production support, and database maintenance, creating comprehensive technical documentation and schematics for relational databases and ensuring clarity and accuracy in database design and implementation. This documentation served as a valuable resource for team members and stakeholders, facilitating efficient collaboration, troubleshooting, and knowledge sharing.

Proficient in understanding business requirements and translating them into technical requirements.

Expert in all phases of the Software Development Life Cycle (SDLC): project analysis, requirements, design documentation, development, unit testing, user acceptance testing, implementation, and post-implementation support and maintenance.

Extensively worked on ETL processes consisting of data sourcing, mapping, transformation, conversion, and loading.

Extensive experience in data profiling, data migration from various legacy sources and relational systems to OLAP and decision support target systems.

Strong expertise in relational database systems like Oracle and SQL Server. Strong knowledge of writing simple and complex queries in various databases, including Oracle, MySQL, DB2, and SQL Server.

Worked with different non-relational data sources such as flat files over SFTP, XML files, and mainframe files.

Proficiency in developing complex mappings using Informatica (PowerCenter, Big Data, and Cloud), Change Data Capture (CDC), data cleansing, and Slowly Changing Dimensions (SCD) Types 1 & 2 (a minimal SQL sketch of the Type 2 pattern appears at the end of this summary).

Practical knowledge of Data warehouse concepts, Data modelling principles - Star Schema, Snowflake, Normalization/De-normalization.

Experience in Performance Tuning of sources, targets, mappings, transformations, and sessions.

Experience working on large data sets with version control systems like Git; used source code management client tools such as Git Bash, GitHub, and Git GUI, along with UNIX/Linux shell scripting and other command-line applications.

Worked on and monitored agile development and the design of statistical and data models for the client; coordinated with end users on design. Created JIRA stories and worked extensively with the JIRA dashboard.

Highly proficient in processing tasks, scheduling sessions, importing/exporting repositories, and managing users, groups, associated privileges, and folders.

Carried out data loading activities to various cloud-based target systems like SFDC.

Excellent verbal and written communication and interpersonal skills; an independent, self-motivated, and accountable team player with strong analytical and critical-thinking abilities; experienced in providing technical guidance, architecture input, quality assurance, and coordination, with a consistent, service-oriented approach to both individual and team work.
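
As context for the SCD bullet above, here is a minimal SQL sketch of the Type 2 pattern (expire the changed current row, then insert a new current version). All names (dim_customer, stg_customer, dim_customer_seq, the tracked columns) are illustrative placeholders, not from any specific project; in Informatica this logic is typically built with Lookup and Update Strategy transformations rather than hand-written SQL.

-- Hypothetical SCD Type 2 load; table, sequence, and column names are illustrative.
-- Step 1: close out current dimension rows whose tracked attributes changed.
UPDATE dim_customer d
   SET d.eff_end_date = TRUNC(SYSDATE) - 1,
       d.current_flag = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.customer_id = d.customer_id
                  AND (s.customer_name <> d.customer_name
                       OR s.customer_tier <> d.customer_tier));

-- Step 2: insert a new "current" version for changed keys (closed above)
-- and for brand-new keys (which have no current row at all).
INSERT INTO dim_customer
       (customer_key, customer_id, customer_name, customer_tier,
        eff_start_date, eff_end_date, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.customer_name, s.customer_tier,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_customer d
                    WHERE d.customer_id = s.customer_id
                      AND d.current_flag = 'Y');

Note the simplification: the <> comparisons ignore NULLs in tracked columns; a production load would wrap them in NVL or a checksum comparison.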

EDUCATION DETAILS:

M.Tech in Advanced Systems, JNTU (Jawaharlal Nehru Technological University, Autonomous), Hyderabad, India - 2013-2015.

TECHNICAL SKILLS:

ETL Tools: Informatica PowerCenter 10.2/9.6/9.1/8.6, Informatica Cloud, and Informatica Big Data.

Operating Systems: Windows Server 2008/2003, UNIX/Linux.

Data Modelling: Star-Schema Modelling, Snowflake-Schema Modelling.

Databases: Oracle 11g, SQL Server, DB2, NoSQL, PostgreSQL, Snowflake. Languages: SQL, PL/SQL.

Reporting and Visualization: Power BI, BO (SAP BusinessObjects), and OBIEE.

Scheduling: ESP.

PROFESSIONAL EXPERIENCE:

Client: Horizon Blue Cross Blue Shield - Location: New Jersey (July 2023 - Present).

Technologies: Informatica PowerCenter, Oracle, PL/SQL, ESP scheduling tool, and GitHub.

Description: Horizon Blue Cross Blue Shield of New Jersey (Horizon BCBSNJ) is a major health insurance provider in New Jersey. ETL jobs are a crucial part of its data management and analytics infrastructure, playing a vital role in managing and leveraging the organization's data assets to support various business functions, including claims processing, member services, provider network management, and strategic decision-making.

Roles and Responsibilities:

•Pulled data from various sources such as databases, flat files, and APIs; for Horizon BCBSNJ this included claims data, member information, provider data, financial records, and more.

•Transformed and loaded data into target systems such as a data warehouse, data lake, or analytical database; Horizon BCBSNJ uses this consolidated data for reporting, analytics, business intelligence, and other purposes.

•Extensively worked on complex mappings which involved slowly changing dimensions.

•Developed several complex mappings in Informatica using a variety of transformations.

•Worked extensively on Informatica transformations like Source Qualifier, Expression, Filter, Router, Aggregator, Lookup, Update strategy, Sequence generator and Joiners.

•Deployed reusable transformation objects such as Mapplets to avoid duplication of metadata, reducing the development time.

•Worked on developing a Change Data Capture (CDC) mechanism using Informatica PowerExchange for some of the interfaces, based on the requirements and limitations of the project.

•Implemented performance and query tuning on all Informatica objects using SQL Developer (an illustrative tuning sketch follows this list).

•Worked in the ETL Code Migration Process from DEV to QA and to PRODUCTION.

•Created the design and technical specifications for the ETL process of the project.

•Responsible for mapping and transforming existing feeds into the new data structures and standards.

•Created different parameter files and used them when starting sessions via the pmcmd command to change session parameters, mapping parameters, and variables at runtime.

•Performed job scheduling, monitoring, and performance optimization: because Horizon BCBSNJ deals with large volumes of sensitive healthcare data, optimizing ETL processes for performance, scalability, and reliability is critical. This involved tuning database queries and optimizing data transformation logic.
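
To make the query-tuning bullet concrete: a typical SQL Developer workflow is to read the optimizer plan and then address the access path. The claims/members tables and columns below are hypothetical placeholders, not Horizon BCBSNJ schema.

-- 1. Inspect the optimizer plan for a slow query (illustrative tables).
EXPLAIN PLAN FOR
SELECT c.claim_id, c.claim_amount, m.member_name
  FROM claims c
  JOIN members m ON m.member_id = c.member_id
 WHERE c.service_date >= DATE '2023-07-01';

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

-- 2. If the plan shows a full table scan on the filtered column, an index
--    on the predicate (plus the join key) is one common fix; weigh it
--    against the extra write cost before creating it.
CREATE INDEX ix_claims_svc_date ON claims (service_date, member_id);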

Tata Consultancy Services, Hyderabad, India (January 2016 - May 2023)

Client: Thomson Reuters

Informatica ETL Developer

Technologies: Informatica PowerCenter, SFDC Workbench, Oracle, PL/SQL, ESP scheduling tool, and GitHub.

Title: HighRadius Collections (HRC) (January 2022 - May 2023)

Description: HighRadius Collections is a project designed around a to-and-fro file mechanism: data is sourced from Salesforce, staged in the EDW, and loaded into HRC; changes captured in HRC are then updated back into SFDC using ETL.

Roles and Responsibilities:

●Gathered requirements from the target systems and worked with all four ERP systems to understand their architecture, fitting the data into a Data Lake consumed by the single target, HighRadius.

●Created ETL design documents and built PowerCenter mappings, sessions, and workflows.

●Worked on performance-improvement techniques in ETL jobs and SQL query optimization via parallel partitioning and pushdown optimization.

●Designed and developed end-to-end data integration workflows using Informatica Cloud Data Integration, enabling seamless data flow across cloud and on-premises applications.

●Implemented data mappings and transformations to ensure data accuracy, consistency, and adherence to business rules and requirements.

●Collaborated with business stakeholders to understand data integration needs and translated them into technical solutions, resulting in improved data-driven decision-making processes.

●Provided hands-on expertise in every stage of delivering data science solutions: data collection, data cleaning, model development, model validation, and visualization.

●Led the integration of multiple cloud applications with on-premises databases, facilitating real-time data synchronization and enabling comprehensive analytics.

●Conducted data quality and profiling assessments to identify data issues, recommending data cleansing and enrichment strategies, resulting in improved data integrity.

●Assisted in the deployment and maintenance of APIs to streamline data access and integration processes for various business units.

●Worked with different non-relational sources, such as SFTP file transfer operations and XML files.

●Worked with JIRA as the ticketing tool and used Jenkins for code deployment from DEV to QA.

●Created PL/SQL packages, functions, cursors, indexes, views, materialized views.

●Worked on project documentation covering all the jobs involved in the projects and their dependencies.

●Gained exposure to GitHub for version control.

●Experienced in both working with QA teams for integration testing as well as being responsible for testing own work.

●Created and updated design and systems documentation for developed or modified services or programs.

●Worked on ETL Testing which includes Unit testing, Regression Testing, writing Test cases, validating the test cases and performing sanity checks before prod migrations.

●Created technical design for the back-end PL/SQL based on business requirement documents and the functional system design.

●Created database objects such as tables, views, synonyms, materialized views, stored procedures, and packages using Oracle tools like PL/SQL Developer.

●Coordinated with the front-end design team to provide them with the necessary stored procedures and packages and the necessary insight into the data.

●Built complex queries using Oracle and SQL Server Integration Services (SSIS) and wrote stored procedures using PL/SQL.

●Used ref cursors and collections for accessing complex data resulting from joins across a large number of tables.

●Involved in moving data from flat files to staging area tables using SQL*Loader.

●Extensively used FORALL and BULK COLLECT to fetch and load large volumes of data (see the package sketch after this list).

●Performed SQL and PL/SQL tuning and application tuning.

●Created ETL and data warehouse standards documents: naming standards, ETL methodologies and strategies, standard input file formats, and data cleansing and preprocessing strategies.

●Created mapping documents with detailed source-to-target transformation logic, source data column information, and target data column information.

●Developed Cloud integration parameterized mapping templates (DB and table object parameterization) for Stage, Dimension (SCD Type 1, SCD Type 2, CDC, and incremental load), and Fact load processes.
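
A condensed sketch of the PL/SQL patterns listed above (packages, ref cursors, BULK COLLECT with FORALL). Every name here (collections_etl, src_invoices, stg_invoices, invoices, accounts) is an illustrative placeholder, not code from the HRC project.

CREATE OR REPLACE PACKAGE collections_etl AS
  TYPE rc IS REF CURSOR;  -- lets callers stream joined result sets
  FUNCTION open_invoices(p_account_id IN NUMBER) RETURN rc;
  PROCEDURE load_stage;
END collections_etl;
/

CREATE OR REPLACE PACKAGE BODY collections_etl AS

  -- Ref cursor over a multi-table join, as used for complex result sets.
  FUNCTION open_invoices(p_account_id IN NUMBER) RETURN rc IS
    c rc;
  BEGIN
    OPEN c FOR
      SELECT i.invoice_id, i.amount_due, a.account_name
        FROM invoices i
        JOIN accounts a ON a.account_id = i.account_id
       WHERE i.account_id = p_account_id;
    RETURN c;
  END open_invoices;

  -- Batched load: BULK COLLECT caps memory use, and FORALL binds each
  -- batch in a single round trip instead of row-by-row inserts.
  PROCEDURE load_stage IS
    TYPE t_inv IS TABLE OF src_invoices%ROWTYPE;
    l_rows t_inv;
    CURSOR c_src IS SELECT * FROM src_invoices;
  BEGIN
    OPEN c_src;
    LOOP
      FETCH c_src BULK COLLECT INTO l_rows LIMIT 1000;
      EXIT WHEN l_rows.COUNT = 0;
      FORALL i IN 1 .. l_rows.COUNT
        INSERT INTO stg_invoices VALUES l_rows(i);
    END LOOP;
    CLOSE c_src;
    COMMIT;
  END load_stage;

END collections_etl;
/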

Tata Consultancy Services, Hyderabad, India

Client: Thomson Reuters

IT Analyst and ETL Developer

Technologies: Informatica PowerCenter, Oracle, PL/SQL, ESP scheduling tool, and GitHub.

Title: TRTA_LOTUS (DSVE): Data Stage and Validation Environment (May 2020 - January 2022)

Description: The objective of DSVE in the Lotus Program is to provide a flexible environment to stage, cleanse, validate, and transform data from legacy systems so that it fits the target systems during migration. It also provides the ability to perform data profiling and analytics to identify data quality issues in the source systems.

Roles and Responsibilities:

●Spearheaded the design and development of intricate Informatica workflows in Informatica Big Data, enabling efficient interaction with the organization's Data Lake infrastructure.

●Implemented robust data exchange mechanisms between Oracle databases, Salesforce, and other operational data stores, ensuring data integrity and seamless integration.

●Leveraged expertise in Informatica PowerCenter and IDQ to optimize data transformations, ensuring high-quality and reliable data processing.

●Collaborated with cross-functional teams to analyze business requirements and translate them into technical designs for scalable and flexible data workflows.

●Captured the requirements arising out of new change requests from the client and existing defects logged in JIRA.

●Experience in designing/implementing APIs and API integrations.

●Provide effort estimates to design and implement ETL code to fulfil the overall business requirements.

●Involved in data modeling and design of the data warehouse in star schema methodology with conformed granular dimensions and fact tables (see the star-schema sketch after this list).

●Implemented conceptual, normalized/denormalized logical, and physical data models to design OLTP and data warehouse environments.

●Develop and unit test the mappings created to meet the business requirements.

●Performed extraction, transformation and loading of data from RDBMS tables and Flat File sources into Oracle 11g RDBMS in accordance with requirements and specifications.

●Designed and developed mapplets and re-usable transformations and used them in different mappings.

●Experience developing and supporting complex data warehouse transformations, creating reusable transformations and mapplets within mappings.

●Tuned the performance of mappings by following Informatica best practices and applied several methods to get best performance by decreasing the run time of workflows.

●Developed conceptual, normalized logical, and physical data models to design OLTP and data warehouse environments.

●Used ETL processes to load data from multiple sources to the staging area (Oracle 10g) using Informatica PowerCenter 10.0; created mapplets and used them in different mappings.

●Performed performance tuning using Informatica partitioning and was involved in database tuning; wrote UNIX shell scripts for various purposes in the project.

●Extensively used Informatica's Big Data solutions, which address the complexities of handling and harnessing the potential value of Big Data in the organization, empowering users to derive insights and make data-driven decisions.
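
To make the star-schema bullet concrete, a minimal sketch of the kind of design described: a fact table keyed by surrogate keys into conformed dimensions. The subscription subject area and all names are hypothetical, not the DSVE model.

-- Conformed dimensions with surrogate keys (illustrative names).
CREATE TABLE dim_product (
  product_key   NUMBER       PRIMARY KEY,  -- surrogate key
  product_id    VARCHAR2(30) NOT NULL,     -- natural/business key
  product_name  VARCHAR2(100),
  product_line  VARCHAR2(50)
);

CREATE TABLE dim_date (
  date_key      NUMBER       PRIMARY KEY,  -- e.g. 20240304
  calendar_date DATE         NOT NULL,
  fiscal_period VARCHAR2(10)
);

-- Fact table: one row per subscription event, additive measures only.
CREATE TABLE fact_subscription (
  date_key    NUMBER NOT NULL REFERENCES dim_date (date_key),
  product_key NUMBER NOT NULL REFERENCES dim_product (product_key),
  quantity    NUMBER,
  revenue_amt NUMBER(12,2)
);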

Tata Consultancy Services, Hyderabad, India

Client: Thomson Reuters

Informatica ETL Developer

Technologies: Informatica PowerCenter, Oracle, PL/SQL, ESP scheduling tool, and GitHub.

Title: CEBOM: Canada Enterprise Back-Office Migration (February 2019 - May 2020)

Description: The scope of this project is to migrate the entire existing and new data of the Canada back office into S/4HANA through the EDW (Enterprise Data Warehouse). Informatica acts as an integration layer between CEBO (DB2 iSeries) and SAP. The migration involves customer, product, and subscription data.

Roles and Responsibilities:

●Attended lead conversion mapping sessions and documented them.

●Coordinated with dependent teams to identify the source fields and mapping rules as per the solution.

●Created/modified transformation programs and performed testing.

●Executed data loads.

●Supported the validation process: provided validation files and automated validation to the extent possible (a reconciliation-query sketch follows this list).

●Performed performance tuning and scheduling of the workflows.

●Documented the changes as part of CRs in design documents, unit testing, and integration testing.

●Coordinated with the production support team during the implementation of the task.

●Provided maintenance for the implemented task; prepared design, run, and support documents.

●Involved in the entire product life cycle across the end-to-end phases of the data warehouse (design, loading, and testing), including early lifecycle support and production support.
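
One way the automated-validation step can look, as a hedged sketch: reconcile row counts, then diff the migrated attributes. The src_customers and tgt_customers names stand in for the DB2-sourced and S/4HANA-bound tables; the real validation files and rules are project-specific.

-- Row-count reconciliation between source and target (illustrative names).
SELECT (SELECT COUNT(*) FROM src_customers) AS src_rows,
       (SELECT COUNT(*) FROM tgt_customers) AS tgt_rows
  FROM dual;

-- Attribute-level drift: rows in source that are missing or altered in
-- target. Swap the operands to catch rows present only in the target.
SELECT customer_id, customer_name, country_code FROM src_customers
MINUS
SELECT customer_id, customer_name, country_code FROM tgt_customers;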

Tata Consultancy Services, Hyderabad, India

Client: Thomson Reuters

Informatica ETL Developer

Technologies: Informatica PowerCenter, Oracle, PL/SQL, ESP scheduling tool, and GitHub.

Title: Legal Commissions Stabilization (June 2018 - February 2019)

Description: All commissions data generated when a sales rep sells a product or service is handled from SAP to the EDW. Data is extracted from DB2, transformed according to business rules, and loaded into the EDW layer, on top of which reporting is done.

Roles and Responsibilities:

●Requirement gathering, data analysis, and development of mappings, sessions, and workflows for data extraction, transformation, and loading from multiple sources into the target.

●Interact with Business analysts to understand the user requirements.

●Prepare Technical Design document for the identified components.

●Perform and capture Unit Testing data.

●Code migration and environment setup for SIT.

●Support SIT and UAT testing.

●Interact with the business team, users, and support team to get approvals for the current production release.

●Raise change request for Production deployment with all necessary approvals.

●Attend Change Approval Board (CAB) meetings to seek release approvals from architecture and management teams.

●Deploy code in Production environment and perform dry run within stipulated change window.

●Communicate the success of each implementation step to all the stakeholders.

●Schedule Informatica jobs in ESP for automated run.

Tata Consultancy Services, Hyderabad, India

Client: Thomson Reuters

Informatica ETL Developer

Technologies: Informatica PowerCenter, Oracle, PL/SQL, ESP scheduling tool, and GitHub.

Title: TR-EDW Enhancements: (January 2016-June2018)

Description: Worked on various complex enhancement projects and many JIRA tickets, covering root-cause analysis, fixes, design changes to existing systems, and data discrepancies across environments; understood the functionality and job dependencies behind each business requirement and the changes required in the existing production environment, and delivered the work within the timelines.

Roles and Responsibilities:

●Requirement gathering, data analysis, and development of mappings, sessions, and workflows for data extraction, transformation, and loading from multiple sources into the target.

●Performance tuning and scheduling of the Workflows.

●Validation of data provided according to requirements.

●Involved in the entire product life cycle across the end-to-end phases of the data warehouse (design, loading, and testing), including early lifecycle support and production support.

●Documented the changes as part of CRs in design documents, unit testing, and integration testing.

●Coordinated with the production support team during the implementation of the task.

●Provided maintenance for the implemented task; prepared design, run, and support documents.

●Proficiency in developing complex mappings using Informatica with Change Data Capture (CDC), data cleansing, and Slowly Changing Dimensions (SCD) Types 1 & 2 (a minimal Type 1 sketch follows this list).

●Performed data warehouse design and data modelling: loading data into star schema and snowflake structures with normalization/de-normalization.

●Worked on performance tuning of sources, targets, mappings, transformations, and sessions.

●Carried out support and development activities in a relational database environment; designed tables and views in relational databases and used SQL proficiently in database programming with MS SQL Server and Oracle 10g.

●Expertise in configuration, performance tuning, and installation of Informatica, and in integrating various APIs using Informatica Cloud data integration processes.
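
For contrast with the Type 2 sketch in the summary, a minimal SCD Type 1 load: overwrite attributes in place and keep no history. The dim_customer/stg_customer names are again illustrative placeholders.

-- SCD Type 1: update tracked attributes in place; insert brand-new keys.
MERGE INTO dim_customer d
USING stg_customer s
   ON (d.customer_id = s.customer_id)
 WHEN MATCHED THEN UPDATE
      SET d.customer_name = s.customer_name,
          d.customer_tier = s.customer_tier
 WHEN NOT MATCHED THEN INSERT
      (d.customer_id, d.customer_name, d.customer_tier)
      VALUES (s.customer_id, s.customer_name, s.customer_tier);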


