
Informatica SQL Developer

Location:
Greenville, SC
Posted:
May 19, 2021


Resume:

Saivishal Shanmugam Phone: 619-***-**** Email: admjqu@r.postjobfree.com

Background Summary:

Over 12 years of IT experience in the analysis, design, development, implementation, and troubleshooting of Data Mart / Data Warehouse applications using ETL tools such as Informatica PowerCenter 10.x/9.x/8.x/7.x/6.x/5.x.

Skilled, certified professional with over 2 years of RPA process implementation using UiPath.

Expertise in building Enterprise Data Warehouses (EDW), Operational Data Stores (ODS), Data Marts, and Decision Support Systems (DSS) using multidimensional models (Kimball and Inmon) and Star and Snowflake schema designs addressing Slowly Changing Dimensions (SCDs).

Experience in developing and maintaining logical and physical Data models for Data Warehouse applications using tools like Erwin and ER/Studio.

Extensive integration experience with ERP systems (SAP, i2) and EAI (JMS real-time systems) using Informatica PowerExchange for JMS, PowerExchange for SAP, and PowerConnect for SAP R/3 and SAP BW, including system architecture and interfaces covering versions 6.0 to 8.6.0.

Involved in all phases of the data warehouse project life cycle. Designed and developed ETL architecture to load data from sources such as DB2 UDB, Oracle, flat files, XML files, Teradata, Sybase, and MS SQL Server into Oracle, Teradata, XML, and SQL Server targets.

Expertise in implementing complex business rules by creating robust mappings, mapplets, shortcuts, and reusable transformations using transformations such as Unconnected and Connected Lookup, Router, Filter, Expression, Aggregator, Joiner, and Update Strategy.

Extensively worked with the Informatica tools: Admin Console, Repository Manager, Designer, Workflow Manager, and Workflow Monitor.

Extensive experience implementing CDC using Informatica PowerExchange 8.x/7.x and Salesforce.

Hands-on experience identifying and resolving performance bottlenecks at various levels: sources, targets, mappings, and sessions. Expertise in session partitioning and in tuning session and lookup/aggregator/joiner caches for performance enhancements.

Extensively worked with Oracle PL/SQL stored procedures, functions, and triggers, and was involved in query optimization.

Profound knowledge of the Teradata database architecture. Developed load and unload jobs using Teradata utilities such as FastExport, FastLoad, and MultiLoad (MLoad).

Extensive knowledge of Teradata SQL Assistant. Developed BTEQ scripts to load data from the Teradata staging area to the data warehouse, and from the data warehouse to data marts for specific reporting requirements. Tuned existing BTEQ scripts to enhance performance.
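As an illustration of the staging-to-warehouse BTEQ loads described above, a minimal sketch follows. All database, table, and logon names are hypothetical placeholders; a real script would follow project standards. The shell here only generates the BTEQ script, which would then be run with the bteq utility.

```shell
# Generate a BTEQ script that moves rows from a staging table to the
# warehouse. The quoted heredoc keeps ${TD_PASSWORD} literal so BTEQ can
# resolve it at run time. Object names (STG.SALES_STG, DW.SALES_FACT)
# are hypothetical.
cat > load_sales.bteq <<'EOF'
.LOGON tdprod/etl_user,${TD_PASSWORD}
.SET ERROROUT STDOUT

INSERT INTO DW.SALES_FACT (sale_id, store_id, sale_dt, amount)
SELECT sale_id, store_id, sale_dt, amount
FROM   STG.SALES_STG
WHERE  load_dt = CURRENT_DATE;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF
.QUIT 0
EOF

echo "Generated $(wc -l < load_sales.bteq) lines of BTEQ"
# The generated file would be executed as: bteq < load_sales.bteq
```

The `.IF ERRORCODE <> 0 THEN .QUIT 8` line is what lets a calling shell script detect a failed load via the bteq exit status.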

Documented design procedures, source-to-target mapping specifications, operating instructions, test plans/procedures, and troubleshooting procedures for ease of application maintenance.

Good experience in performing and supporting Unit testing, System Integration testing, UAT and production support for issues raised by application users.

Strong in UNIX shell and Perl scripting. Developed UNIX scripts using the pmcmd utility and scheduled ETL loads using cron, Maestro, and Control-M.
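A minimal sketch of the kind of pmcmd wrapper and cron scheduling described above. The service, domain, user, folder, and workflow names are hypothetical; the command is echoed rather than executed so the sketch is self-contained.

```shell
#!/bin/sh
# Wrapper of the kind used to kick off an Informatica workflow via pmcmd.
# All names below are hypothetical placeholders.
INFA_USER="etl_ops"
INFA_SERVICE="IS_PROD"
INFA_DOMAIN="Domain_PROD"
FOLDER="DW_LOADS"
WORKFLOW="wf_daily_sales_load"

# -pv names an environment variable holding the password, so no secret
# appears in the crontab or the script itself.
CMD="pmcmd startworkflow -sv $INFA_SERVICE -d $INFA_DOMAIN \
-u $INFA_USER -pv INFA_PASSWD -f $FOLDER -wait $WORKFLOW"

# Dry run: echo the command. A real wrapper would eval "$CMD" and check $?.
echo "$CMD"

# Typical crontab entry to run the wrapper at 2:30 AM daily:
# 30 2 * * * /opt/etl/bin/run_wf_daily_sales_load.sh >> /var/log/etl/daily_sales.log 2>&1
```

The `-wait` flag makes pmcmd block until the workflow finishes, so the cron job's exit status reflects the load's success or failure.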

Excellent technical and professional client-interaction skills. Interacted with technical, functional, and business audiences across different phases of the project life cycle.

Versatile team player with excellent analytical, presentation and interpersonal skills with an aptitude to learn new technologies.

Technical Skills:

Databases

Oracle 11g/10g/9i/8i/8.0/7.0, MS SQL Server 2000, DB2, MS Access 97/2000, Sybase, Teradata V2R3.

ETL Tools

Informatica PowerCenter 10.2/9.6/9.1/8.6/8.5/8.1.2/8.1.1/7.1.1/6.2/5.1, Informatica PowerMart 6.2/6.0/5.1, SSIS

Operating systems

DOS, Windows 9x/NT/2000/XP/XP SP2/XP Professional, Windows Vista, Linux, AIX 5.3/6.0

RPA Tools

UiPath 2018.4

Programming Skills

C++, shell scripting (K shell, C shell), PL/SQL, C#, .NET 3.0/3.5, Perl, Java (Eclipse and NetBeans IDEs), JavaScript, CSS.

Methodologies

Data Modeling – Logical / Physical

Dimensional Modeling – Star / Snowflake

Tools & Utilities

SQL*Plus, AQT, TOAD, SQL Developer 5.0, NetBeans 5.x/6.x, Microsoft Visual Studio 2005/2008, Cygwin, AutoSys, BTEQ, Facets, SQL*Loader, SSIS/SSRS/SSAS

BI Tools

Tableau, Cognos 8, Query Studio, Report Studio, Cognos Impromptu 7.1 MR3, Cognos PowerPlay/Transformer 7.1/7.0

Web

HTML, XML, PHP, Visual Web JSF.

EDUCATION:

National University, San Diego, CA Aug’ 07 - Dec’ 08

(Master of Science in Wireless Communications) GPA – 3.64/4.0

Jawaharlal Nehru Technological University, INDIA Aug’ 03 – May’ 07

(Bachelor's in Electronics & Communication Engineering) GPA – 3.3/4.0

HONORS:

UiPath Developer Certification.

Learning SQL Programming Certification.

Agile Marketing Foundations Certification.

Received student scholarship for my master’s at National University, San Diego, CA.

PROFESSIONAL EXPERIENCE:

Company: BMW, Woodcliff Lake, NJ May ’19 – Present

Role: Sr. System Analyst / Software Developer.

Description:

BMW Group is now one of the ten largest car manufacturers in the world and, with its BMW, MINI and Rolls-Royce brands, possesses three of the strongest premium brands in the car industry. The group also has a strong market position in the motorcycle sector and operates a successful financial services business.

Responsibilities:

Developed the FRD (Functional Requirements Document) and data architecture document and communicated them to the relevant stakeholders. Conducted impact and feasibility analysis.

Collaborated with end users, project stakeholders, and support partners to identify needs, goals, metric measurements, and business models, resulting in easy-to-understand interfaces that enable end users to quickly identify the key themes within the data.

Responsible for implementing the full life cycle of RPA solutions: identifying automation opportunities, gathering requirements, and implementing, testing, and deploying targeted automation solutions using industry-leading RPA tools.

Gathered customers' business requirements and documented them clearly so that our RPA consulting teams could deploy automation projects in an optimal manner.

Provided business-analysis expertise to the Sales team during the RFI/RFP and POC stages and helped draft the RFI/RFP responses.

Supported customers in implementing the changes required to make effective use of the automation.

Developed objects and workflows per client requirements; performed tool evaluation and feasibility analysis to help select the automation tool for the RPA implementation.

Worked with Business Teams to create PDD (Process Definition Document) and developed separate SDD (Solution Design Documents)

Developed robots using UiPath to automate and accelerate 80% of standard business processes, with minimal human intervention for the remaining non-standard processes.

Developed robots using UiPath Studio, identifying and debugging errors using error handlers. Participated in online meetings to demonstrate the capabilities of RPA for ongoing projects and potential clients.

Configured UiPath processes and objects using core workflow principles so that they are efficient, well structured, maintainable, and easy to understand for future requirements.

Automated several processes and business activities across various business systems, and provided knowledge transfer, support, and assistance to the application development and support teams.

Worked with OCR, ICR, and machine learning for designs in UiPath Studio, and with the UI team to design the appropriate icons and displays for the screens.

Monitored and troubleshot the Studio environment through Orchestrator. Automations included logging into websites, searching them, submitting web forms, and updating records on a website.

Automated the data transfers, including importing/exporting data between applications or files using UiPath

Involved in code review and fine tuning the code for performance related issues and worked on debugging application for fixing bugs and Production support

Maintained several RPA bots as daily batch processes, ensuring quality and high performance for the organization's internal and vendor applications.

Worked with different teams in making changes in developed processes as per the Technical Design Document (TDD) to meet the defined requirements

Delivered demos, technical trainings and support for new/existing clients

Worked with test teams during the Product test and UAT phases to fix assigned bugs

Interacted with the Business users to identify the process metrics and various key dimensions and measures.

Integrated Tableau/Informatica with SQL Server, SAP BW, SAP HANA, and Hadoop for data transformation and for creating various complex worksheets (reports).

Researched new cloud technologies and prototyped solutions that can be leveraged to decrease costs and increase performance.

Leveraged AWS SDKs to interact with AWS services from the application.

Worked on SQL optimization and Identified bottlenecks and performance tuned the Informatica mappings/sessions.

Extracted data from various data sources such as SAP, Oracle, Flat files, DB2, transformed, and loaded into targets using Informatica.

Upgraded Informatica PowerCenter from version 9.x to 10.x.

Used Informatica debugging techniques to debug mappings, and used session log files and bad files to trace errors that occurred while loading.

Developed disaster recovery and failover strategies for the data integration environment.

Extensively worked on UNIX shell scripts for automation (auto-restart of servers, disk space utilization, Informatica server monitoring, and UNIX file system maintenance and cleanup) and on scripts using the Informatica command-line utilities.

Creation and maintenance of Informatica users and privileges.

Migration of Informatica Mappings/Sessions/Workflows from Dev, QA to Prod environments.

Created Groups, roles, privileges and assigned them to each user group.

Worked on SQL queries to query the Repository DB to find the deviations from Company’s ETL Standards for the objects created by users such as Sources, Targets, Transformations, Log Files, Mappings, Sessions and Workflows.
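The repository standards checks described above can be sketched as a generated SQL query against the Informatica repository's MX views. The `m_` naming-standard rule and file names are hypothetical examples; a real check would encode the company's own ETL standards.

```shell
# Generate a SQL check that flags mappings violating an assumed "m_"
# naming standard, using the Informatica repository MX view
# REP_ALL_MAPPINGS. Prefix rule and file names are hypothetical.
cat > check_mapping_names.sql <<'EOF'
SELECT subject_area, mapping_name
FROM   rep_all_mappings
WHERE  mapping_name NOT LIKE 'm\_%' ESCAPE '\'
ORDER  BY subject_area, mapping_name;
EOF

echo "Wrote $(wc -l < check_mapping_names.sql) lines"
# Run against the repository DB as, e.g.:
#   sqlplus -s repo_user@REPDB @check_mapping_names.sql
```

Querying the MX views rather than the underlying OPB_ tables keeps the check stable across PowerCenter versions.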

Functioned as the primary liaison between the business line, operations, and the technical areas throughout the project cycle.

Environment: Informatica PowerCenter 10.x/9.x, AWS, Bitbucket, Jira, Confluence, Jenkins, SAP HANA, Hadoop, SAP R/3, SAP BW, RPA (UiPath), DB2, Erwin, TOAD, Tableau, PL/SQL, flat files, XML, SQL Server, Oracle 12c/11g/10g/9i, UNIX, SUSE Linux, Mainframe JCLs.

Company: System Soft Technologies.

Client: BMW, Woodcliff Lake, NJ May ’16 – May ’19

Role: Sr. ETL Architect/ Software Developer /System Analyst.

Responsibilities:

Interacted with the business users to identify the process metrics and various key dimensions and measures.

Developed the FRD (Functional Requirements Document) and data architecture document and communicated them to the relevant stakeholders. Conducted impact and feasibility analysis.

Collaborated with end users, project stakeholders, and support partners to identify needs, goals, metric measurements, and business models, resulting in easy-to-understand interfaces that enable end users to quickly identify the key themes within the data.

Integrated Tableau/Informatica with SQL Server, SAP BW, SAP HANA, and Hadoop for data transformation and for creating various complex worksheets (reports).

Researched new cloud technologies and prototyped solutions that can be leveraged to decrease costs and increase performance.

Created users, groups and gave read/write permissions on the respective Folders of repository.

Designed and developed complex ETL mappings making use of transformations like Source Qualifier, Application Source Qualifier, Joiner, Update Strategy, Connected Lookup and unconnected Lookup, Rank, Expression, Router, Filter, Aggregator and Sequence Generator transformations.

Created deployment groups in one environment for the Workflows, Worklets, Sessions, Mappings, Source definitions, Target definitions and imported them to other environments.

Prepared test plans and test scripts to test the data based on the business requirements.

Worked on dimensional modeling to design and develop STAR schemas by identifying the facts and dimensions. Designed logical models as per business requirements using Erwin.

Extracted data from various data sources such as SAP, Oracle, Flat files, DB2, transformed, and loaded into targets using Informatica.

Upgraded Informatica PowerCenter from version 9.x to 10.x.

Used Informatica debugging techniques to debug mappings, and used session log files and bad files to trace errors that occurred while loading.

Developed disaster recovery and failover strategies for the data integration environment.

Participated in migration of ETL code from development repository to testing repository and then to production repository.

Extensively worked on UNIX shell scripts for automation (auto-restart of servers, disk space utilization, Informatica server monitoring, and UNIX file system maintenance and cleanup) and on scripts using the Informatica command-line utilities.

Creation and maintenance of Informatica users and privileges.

Migration of Informatica Mappings/Sessions/Workflows from Dev, QA to Prod environments.

Created Groups, roles, privileges and assigned them to each user group.

Worked on SQL queries to query the Repository DB to find the deviations from Company’s ETL Standards for the objects created by users such as Sources, Targets, Transformations, Log Files, Mappings, Sessions and Workflows.

Performed performance tuning, including collecting statistics, analyzing explain plans, and determining which tables needed statistics. Increased performance by 20% for one of the FT Informatica mappings.

Backed up the team wherever required, made sure the team was aware of the complexities of the stories, and challenged the team on requirements during the estimation process.

Created and debugged the Stored Procedures, Functions, Packages, Cursors and triggers using PL/SQL developer.

Development of PL/SQL blocks and multi-thread Pro*C programs for ETL purposes.

Collaborated with DBAs and the operations team to collect statistics on the original query and analyze them to set up a test environment.

Used the Informix SET EXPLAIN utility to collect detailed SQL query plans and execution statistics.

Developed UNIX shell scripts to generate parameter files and executed Oracle procedures as batch jobs.

Automated UNIX shell scripts to verify the count of records added each day by the incremental data load for a few of the base tables, in order to check for consistency.
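The daily row-count check above can be sketched as follows. In production the counts would come from SQL (e.g. a `sqlplus -s` one-liner); here they are read from sample count files so the sketch is self-contained. The table name, file names, and alert rule are hypothetical.

```shell
# Sketch of a daily row-count consistency check for an incremental load.
# Sample data stands in for counts that would really come from the DB.
printf 'SALES_FACT 100000\n' > counts_prev.txt
printf 'SALES_FACT 100500\n' > counts_curr.txt

prev=$(awk '$1=="SALES_FACT"{print $2}' counts_prev.txt)
curr=$(awk '$1=="SALES_FACT"{print $2}' counts_curr.txt)
added=$((curr - prev))

# Alert when the incremental load added nothing or the table shrank.
if [ "$added" -le 0 ]; then
  status="ALERT: SALES_FACT added $added rows"
else
  status="OK: SALES_FACT added $added rows"
fi
echo "$status"
```

A real version would loop over all monitored base tables and mail the ALERT lines to the support distribution list.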

Evaluated all the roadblocks via performance tuning and optimization of Informatica mappings and sessions using features like partitions and data/index cache to manage very large volume of data.

Prepared a run book for the daily batch loads, documenting job dependencies and how to restart a failed job, for ease of handling job failures during loads.

Support sales team in presenting Tableau Practice capabilities to prospects and build credibility.

Participated in Unit testing, User Acceptance testing to check whether the data is loading into target, which were extracted from different source systems according to the user requirements.

Documented ETL test plans, test cases, test scripts, test procedures, assumptions, and validations based on design specifications for unit testing, system testing, expected results, preparing test data and loading for testing, error handling and analysis.

Responsible for regression testing ETL jobs before test to production migration.

Created various documents including high-level design documents, mapping documents and knowledge transfer documents.

Automated the release pipeline to achieve zero-touch deployments using Jenkins and Nexus.

Applied code-level application security (IAM roles, credentials, encryption, etc.).

Created a Redshift cluster and performed data warehousing on it.

Created various S3 buckets and wrote Lambda functions to move files from S3 to instances and from an FTP server to S3.
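The S3 file moves above can be sketched with the AWS CLI; in the actual setup they were performed by Lambda functions triggered on the buckets. Bucket and path names are hypothetical, and the commands are echoed (dry run) so the sketch runs without AWS credentials.

```shell
# Dry-run sketch of the S3 moves: bucket creation, S3 -> instance copy,
# and archival move. All names are hypothetical placeholders.
LANDING_BUCKET="s3://acme-etl-landing"
ARCHIVE_BUCKET="s3://acme-etl-archive"
FILE="sales_20210519.csv"

run() { echo "+ $*"; }   # swap the echo for real execution when wired up

run aws s3 mb "$ARCHIVE_BUCKET"                          # create bucket once
run aws s3 cp "$LANDING_BUCKET/$FILE" /data/incoming/    # S3 -> instance
run aws s3 mv "$LANDING_BUCKET/$FILE" "$ARCHIVE_BUCKET/" # archive after load
```

The Lambda variant would perform the same copy/move via the AWS SDK, triggered by an S3 object-created event instead of a schedule.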

Functioned as the primary liaison between the business line, operations, and the technical areas throughout the project cycle.

Environment: Informatica PowerCenter 10.x/9.x, AWS, Bitbucket, Jira, Confluence, Jenkins, SAP HANA, Hadoop, SAP R/3, SAP BW, DB2, Erwin, TOAD, Tableau, PL/SQL, flat files, XML, SQL Server, Oracle 12c/11g/10g/9i, UNIX, SUSE Linux, Mainframe JCLs.

Company: ATOS IT SOLUTIONS AND SERVICES INC.

Client: BMW, Woodcliff Lake, NJ Jan ’15 – Apr ’16

Role: Sr. Software Developer


Responsibilities:

Interacted with technical, functional, and business audiences across different phases of the project life cycle.

Designed and developed conceptual, logical, and physical data models of the star schema using Visio.

Designed and developed complex ETL mappings by making use of transformations like Source Qualifier, Joiner, Update Strategy, Connected Lookup and unconnected Lookup, Rank, Expression, Router, Filter, Aggregator and Sequence Generator transformations.

Implemented Change Data Capture (CDC) using Power exchange.

Developed Informatica mappings, sessions and workflows as per the business rules and loading requirements.

Created workflows using various tasks such as Session, Event-Raise, Event-Wait, Decision, Email, Command, Worklet, and Assignment, and worked on scheduling the workflows.

Used mapping parameters and variables.

Prepared mapping specification document, which gives the data flow and transformation logic for populating each column in the data warehouse table.

Used debugger to analyze the data flow between source and target to fix the data issues.

Developed PL/SQL procedures, functions to facilitate specific requirement.

Optimized and tuned SQL queries and PL/SQL blocks to eliminate full table scans and reduce disk I/O and sorts.

Implemented audit and reconcile process to ensure Data warehouse is matching with the source systems in all reporting perspectives.

Prepared the Standard Operating Procedure (Knowledge Transfer) document, which provides necessary information, required for the Maintenance and Operation of the application.

Created the release requests for QA Builds to include all the release requirements and involved in the implementation of QA, UAT and Production releases.

Developed UNIX shell scripts that send reports to the client over FTP and generate a log file that keeps a history of the FTP transfers.

Provided data loading, monitoring, and system support and worked on data issues raised by end user during its production support phase.

Performance tuned Informatica mappings using Pushdown Optimization, Variables and Dynamic Cache.

Hands-on experience with a mainframe scheduling tool, scheduling workflows by running shell scripts through it.

Used Connect:Direct to transfer files from the mainframe to the UNIX server.

End-to-end ETL development of the Marketing Data Warehouse.

Environment: Informatica PowerCenter 7.0/8.6/9.0, DB2, Erwin, TOAD, PL/SQL, flat files, XML, SQL Server, Oracle 11g/10g/9i, UNIX, SUSE Linux, Mainframe JCLs.

Company: Verans Business Solutions Inc.

Client: ESTEE LAUDER, NY Dec’11- Dec’14

Role: Programmer Analyst

Description:

The Estée Lauder Companies Inc. is one of the world's leading manufacturers and marketers of quality skin care, makeup, fragrance and hair care products. Founded in 1946 by Mrs. Estée Lauder, our Company's products are sold in over 150 countries and territories under the following brand names: Estée Lauder, Aramis, Clinique, Prescriptives, Lab Series, Origins, M•A•C, Bobbi Brown, Tommy Hilfiger, Kiton, La Mer, Donna Karan, Aveda, Jo Malone, Bumble and bumble, Darphin, Michael Kors, American Beauty, Flirt!, GoodSkin™ Labs, Grassroots™ Research Labs, Sean John, Missoni, Tom Ford, Coach, Ojon, Smashbox and Ermenegildo Zegna.

Responsibilities:

Member of core ETL team involved in gathering requirements, performing source system analysis and development of ETL jobs to migrate the data from the source to the target DW.

Analyzed the business requirement document and created functional requirement document mapping all the business requirements.

Designed and developed ETL mappings using transformation logic for extracting data from various source systems, including SAP.

Designed and developed complex ETL mappings making use of transformations like Source Qualifier, SAP BAPI, Application Source Qualifier, Joiner, Update Strategy, Connected Lookup and unconnected Lookup, Rank, Expression, Router, Filter, Aggregator and Sequence Generator transformations.

Created users and groups and granted read/write permissions on the respective repository folders.

Created deployment groups in one environment for the Workflows, Worklets, Sessions, Mappings, Source definitions, Target definitions and imported them to other environments.

Developed Informatica mappings, sessions and workflows for all mappings.

Used mapping parameters and variables.

Automated the load process using UNIX shell scripts.

Worked on SQL optimization. Identified bottlenecks and performance tuned the Informatica mappings/sessions.

Developed process for error handling and auto reloading.

Created reusable objects in Informatica for easy maintainability and reusability.

Performed the data validations and control checks to ensure the data integrity and consistency.

Extensively used debugger to trace errors in the mapping.

Installed and integrated the QTP test environment with Adobe Flex 3.0 add-in for testing Flash objects.

Involved in developing test plans and test scripts to test the data based on the business requirements.

Upgraded Informatica PowerCenter from version 8.x to 9.x.

Developed UNIX and Perl scripts for pre-load data manipulation.

Used SAP Solution Manager to update status and document the technical specs.

Mentored ETL team.

Responsible for regression testing ETL jobs before test to production migration.

Supported migration of ETL code from development to QA and then from QA to production.

Environment: Informatica PowerCenter 8.6/9.0, DB2, SAP R/3, SAP BW, Solution Manager, Erwin, TOAD, PL/SQL, flat files, Oracle 10g/9i, UNIX, Control-M, Mercury Quality Center, HP Quality Center, .NET, Agile methodology, UML, XML, HP LoadRunner.

Company: Verans Business Solutions Inc.

Client: BMW, Woodcliff Lake, NJ May ’10 – Dec ’11

Role: Programmer Analyst


Responsibilities:

Worked closely with the end users in writing the functional specifications based on the business needs.

Analyzed the source data coming from Oracle. Coordinated with Data Warehouse team in developing Relational Model.

Worked closely with Business Analyst and user decision makers to develop the transformation logic to be used in Informatica.

Used Informatica Repository Manager to create Repositories and Users and to give permissions to users.

Created FTP connections and database connections (ODBC) for the sources and targets.

Created Logical and Physical data models using Erwin.

Integrated various data sources using Salesforce.

Designed and developed mappings using various transformations such as Source Qualifier, Expression, Lookup, Aggregator, Router, Rank, Filter, and Sequence Generator. Designed a star schema and created fact tables and dimension tables for the warehouse and data marts.

Created Update Strategy and Stored Procedure transformations to populate targets based on business requirements.

Created and Configured Workflows, Worklets and Sessions to transport the data to target warehouse tables using Informatica Workflow Manager.

Scheduled the sessions to extract, transform and load data into warehouse database as per Business requirements.

Loaded the Oracle data into DB2 and created all the Oracle objects in DB2 UDB.

Used shell scripts to handle the journal files from the DB2 source, verifying that all files arrived in the right format before kicking off the Informatica load.

Used the Adaptive Server query optimizer to analyze the costs of query plans.

Developed Procedures and Functions in PL/SQL.

Implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance.

Created Unit Test plans and involved in primary unit testing of mappings.

Provided data loading, monitoring, system support, and general troubleshooting for all the workflows in the application during its production support phase.

Involved in various meetings, seminars, presentations and group discussions.

Environment: Informatica PowerCenter 8.6, Mainframes, Erwin, SQL Server, UNIX, TOAD 8.6.1.0, PL/SQL, flat files, Oracle 9i, DB2 UDB, Java Platform, BEA WebLogic, RUP methodology, UML, WinRunner, QuickTest Professional, Rational RequisitePro, IBM Lotus, Agile methodology, XML, MS Visio, MS Project, MS Office.

Company: Zuven Technologies Inc.

Client: Super Media, Dallas, TX Jul ’09 – Mar ’10

Role: Programmer Analyst

Description:

SuperMedia (formerly Idearc Media) is based in Dallas, TX. As an Informatica Developer for this large marketing institution, I was involved in the design and development of a data warehouse covering SuperMedia's advertising products and services, including the SuperGuarantee(SM) and SuperTradeExchange programs, Verizon SuperYellowPages, FairPoint SuperYellowPages, Superpages.com, EveryCarListed.com(SM), Switchboard.com(SM), LocalSearch.com(SM), Superpages Mobile(SM), and SuperpagesDirect direct-mail products.

Responsibilities:

Member of core ETL team involved in gathering requirements, performing source system analysis and development of ETL jobs to migrate the data from the source to the target DW.

Analyzed the business requirement document and created functional requirement document mapping all the business requirements.

Designed and Developed ETL mappings using transformation logic for extracting the data from various sources systems.

Used Oracle Application Express (APEX), a rapid development tool, for data-centric web applications.

Designed and developed complex ETL mappings making use of transformations like Source Qualifier, Application Source Qualifier, Joiner, Update Strategy, Connected Lookup and unconnected Lookup, Rank, Expression, Router, Filter, Aggregator and Sequence Generator transformations.

Developed Informatica mappings, sessions and workflows for all mappings.

Used mapping parameters and variables.

Automated the load process using UNIX shell scripts.

Extracted various data using Informatica 8.6 with the Salesforce adapter.

Worked on updating Salesforce using external IDs.

Implemented Change Data Capture (CDC) using salesforce.com.

Created various objects in Salesforce.

Used Apex Explorer to query Salesforce.

Worked on SQL optimization. Identified bottlenecks and performance tuned the Informatica mappings/sessions.

Created reusable objects in Informatica for easy maintainability and reusability.

Performed the data validations and control checks to ensure the data integrity and consistency.

Extensively used debugger to trace errors in the mapping.

Involved in developing test plans and test scripts to test the data based on the business requirements.

Ensured test data was created via QTP automation scripts from the appropriate pharmacy applications, such as Saber Sonic Web1/Web2, Interact, and Self-Service web check-in, which are based on Java technology.

Worked on data conversion and data migration to support back-end testing using SQL Server 2008.

Developed UNIX scripts for pre-load data manipulation.

Responsible for regression testing ETL jobs before test to production migration.

Supported migration of ETL code from development to QA and then from QA to production.

Environment: Informatica PowerCenter 8.6, Erwin, TOAD, PL/SQL, flat files, Oracle 11g/10g/9i, UNIX, APEX, Salesforce.com, AutoSys, Mercury Quality Center, MS Outlook, RUP, MS Visio, MS Project, MS Office, VBScript, Oracle, PL/SQL, Java, WebLogic.

Company: Zuven Technologies Inc.

Client: ALSAC/St. Jude, Memphis, TN May ’09 – Jul ’09

Role: Programmer Analyst

Description:

ALSAC/St. Jude, the fund-raising organization of St. Jude Children's Research Hospital, is one of the nation's largest health care charities and raises money solely for St. Jude. ALSAC/St. Jude provides intelligent solutions to public and private hospitals, giving them insight into their day-to-day activities. This results in better, lower-cost health care.

Responsibilities:

Interacted with business users and business analyst to understand reporting requirements and prepared system requirement specification document.

Worked with Reporting


