SQL Developer

Location:
Madison, WI
Salary:
76,800
Posted:
May 28, 2018

Resume:

Kiran

234-***-****

ac5ncc@r.postjobfree.com

Experience Summary:

3+ years of extensive experience in Oracle SQL, advanced PL/SQL, and data warehouse development.

Expertise in the Software Development Life Cycle (SDLC) of client implementations, including requirements gathering, business analysis, design, development, testing, technical documentation, and support.

Experience in writing packages, stored procedures, functions, triggers, and materialized views in Oracle 11g/10g/9i databases.
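
As a brief illustration of this kind of work, a minimal package sketch (all object and column names here are hypothetical):

    CREATE OR REPLACE PACKAGE emp_pkg AS
      PROCEDURE give_raise(p_emp_id IN NUMBER, p_pct IN NUMBER);
    END emp_pkg;
    /
    CREATE OR REPLACE PACKAGE BODY emp_pkg AS
      PROCEDURE give_raise(p_emp_id IN NUMBER, p_pct IN NUMBER) IS
      BEGIN
        -- Apply a percentage raise to one employee (hypothetical table)
        UPDATE employees
           SET salary = salary * (1 + p_pct / 100)
         WHERE employee_id = p_emp_id;
      END give_raise;
    END emp_pkg;
    /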

Strong experience in developing SQL/PL/SQL scripts and programs for data analysis and Extraction, Transformation, and Loading (ETL).

Worked with Collections, Records, Dynamic SQL and Exception Handling.

Experience with tools like SQL Server Management Studio, SQL Server 2005/2008 Integration Services (SSIS), and Reporting Services (SSRS).

Used SQL and PL/SQL performance tuning techniques including creating indexes, partitioning tables, providing optimizer hints, and analyzing query execution plans with Explain Plan, SQL Trace, and TKPROF.
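
A minimal sketch of the Explain Plan workflow noted above, assuming a hypothetical orders table and ord_cust_ix index:

    EXPLAIN PLAN FOR
    SELECT /*+ INDEX(o ord_cust_ix) */ o.order_id, o.total
      FROM orders o
     WHERE o.customer_id = :cust_id;

    -- Display the optimizer's execution plan
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);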

Worked with tools like PL/SQL Developer, SQL Developer, SQL*Loader, TOAD, and SQL*Plus.

Used the vi editor to write UNIX Korn and Bash shell scripts; experience with UNIX utilities such as sed and awk.

Hands-on experience with issue-tracking products such as JIRA and HP Quality Center.

Possess excellent skills in automation testing using HP/Mercury QuickTest Professional (QTP) and Quality Center.

Expertise in implementing complex business rules by creating robust mappings, mapplets, sessions, and workflows using Informatica PowerCenter.

Experience in reviewing business requirement documents and software requirement documents, and in preparing and executing test cases and test scripts.

Involved in Data Quality Analysis to determine the cleansing requirements.

Experience in data modeling using Erwin, including star schema and snowflake modeling, fact and dimension tables, and physical and logical modeling.

Good understanding and experience in Data modeling, ER Diagrams, Normalization and De-normalization of Tables.

Experience in Waterfall and Scrum (Agile) methodologies.

Possess excellent communication, interpersonal, and presentation skills.

Excellent team player with the ability to work independently and interact with people at all levels.

Area of Expertise

Languages

PL/SQL, SQL, UNIX Shell Scripting, Java, HTML, XML, CSS, JavaScript, T-SQL, Python, PostgreSQL

Operating Systems

UNIX (Sun Solaris), Linux, Windows 7/8/10/XP/2000/98.

Databases

Oracle 9i/10g/11g, Sybase, SQL Server

Development Tools

TOAD, SQL Developer, SQL*Plus, PL/SQL Developer 7.1.5, PuTTY

ETL Tools

Informatica PowerCenter 9.1.6/8.6.1/8.5.1, SSIS, Sunopsis, ODI

Other tools and Software

JIRA, SVN, ClearCase, Citrix, Erwin 7.1, Autosys, Tableau 7.0, Power BI, Microsoft Office, Eclipse, Quality Center, QTP, SSRS, GIT

Professional Experience:

Dean Health, Madison, WI Jul 2017 – Present

ETL Developer

Responsibilities:

Created and modified PL/SQL packages, triggers, procedures, functions, and cursors.

Created/updated tables, indexes, materialized views, synonyms, and sequences per requirements.

Analyzed the source (Oracle 9i) and target (Oracle 11g) databases before and after migration.

Responsible for issue analysis and for creating functional specification and design documents; after unit testing, moved code to the QA environment and, following user acceptance, to production, participating in all stages of the SDLC.

Used PL/SQL to Extract, Transform, and Load (ETL) data into the data warehouse and developed PL/SQL scripts in accordance with the necessary business rules and procedures.

Extracted, Transformed and Loaded data into Oracle database using Informatica mappings and complex transformations (Aggregator, Joiner, Lookup, Update Strategy, Source Qualifier, Filter, Router and Expression).

Used the DBMS_SQLTUNE.REPORT_SQL_MONITOR function to generate SQL monitoring reports and tune queries.
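
For illustration, such a report can be pulled as below; the SQL_ID shown is a placeholder, not a real one:

    SELECT DBMS_SQLTUNE.REPORT_SQL_MONITOR(
             sql_id       => 'an5kpxg2nhy6w',  -- placeholder SQL_ID
             type         => 'TEXT',
             report_level => 'ALL') AS report
      FROM dual;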

Tuned complex SQL queries using Explain Plan to improve application performance.

Automated testing activities on Hadoop using shell scripting and the Tosca tool.

Generated periodic reports based on statistical analysis of data from various time frames and divisions using SQL Server Reporting Services (SSRS).

Involved in debugging Informatica mappings, testing of Stored Procedures and Functions, Performance and Unit testing of Informatica Sessions, Batches and Target Data.

Created records, tables, objects, and collections (nested tables and VARRAYs), and implemented error handling.
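
A minimal sketch of nested table and VARRAY usage (all names are illustrative):

    DECLARE
      TYPE name_tab  IS TABLE OF VARCHAR2(100);  -- nested table type
      TYPE grade_arr IS VARRAY(5) OF NUMBER;     -- varray type
      v_names  name_tab  := name_tab('Kim', 'Lee');
      v_grades grade_arr := grade_arr(1, 2, 3);
    BEGIN
      v_names.EXTEND;                  -- grow the nested table by one slot
      v_names(v_names.LAST) := 'Ram';
      DBMS_OUTPUT.PUT_LINE(v_names.COUNT || ' names, ' || v_grades.COUNT || ' grades');
    END;
    /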

Partitioned Tables using Range Partitioning, List Partitioning and created local indexes to increase the performance and to make the database objects more manageable.
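
For example, range partitioning with a local index can be set up as in this sketch (hypothetical table and column names):

    CREATE TABLE sales_fact (
      sale_id   NUMBER,
      sale_date DATE,
      amount    NUMBER(12,2)
    )
    PARTITION BY RANGE (sale_date) (
      PARTITION p2017 VALUES LESS THAN (DATE '2018-01-01'),
      PARTITION p2018 VALUES LESS THAN (DATE '2019-01-01'),
      PARTITION pmax  VALUES LESS THAN (MAXVALUE)
    );

    -- Local index: one index segment per partition
    CREATE INDEX sales_fact_dt_ix ON sales_fact (sale_date) LOCAL;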

Performed data analysis and data profiling using SQL and Informatica Data Explorer on various source systems, including Oracle and Teradata.

Extensively worked with BULK COLLECT and bulk (FORALL) inserts, updates, and deletes for loading, updating, and deleting large volumes of data.
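
A minimal sketch of the BULK COLLECT / FORALL pattern, assuming hypothetical stg_orders and orders tables with identical structure:

    DECLARE
      CURSOR c_src IS SELECT * FROM stg_orders;
      TYPE t_rows IS TABLE OF stg_orders%ROWTYPE;
      v_rows t_rows;
    BEGIN
      OPEN c_src;
      LOOP
        FETCH c_src BULK COLLECT INTO v_rows LIMIT 1000;  -- fetch in batches
        EXIT WHEN v_rows.COUNT = 0;
        FORALL i IN 1 .. v_rows.COUNT                     -- one bulk DML per batch
          INSERT INTO orders VALUES v_rows(i);
      END LOOP;
      CLOSE c_src;
      COMMIT;
    END;
    /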

Used Oracle analytic functions such as RANK, DENSE_RANK, LEAD, LAG, LISTAGG, and ROW_NUMBER for ranking and ordering data.
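
An illustrative query of this kind, assuming a standard employees table:

    SELECT department_id, employee_id, salary,
           RANK()      OVER (PARTITION BY department_id ORDER BY salary DESC) AS sal_rank,
           LAG(salary) OVER (PARTITION BY department_id ORDER BY hire_date)   AS prev_salary,
           LISTAGG(last_name, ', ') WITHIN GROUP (ORDER BY last_name)
             OVER (PARTITION BY department_id)                                AS dept_names
      FROM employees;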

Used SQL*Loader to load data from Excel files into temporary tables and developed PL/SQL programs to load the data from the temporary tables into base tables.

Involved in managing and reviewing Hadoop log files.

Used PuTTY and Secure Shell (SSH) to connect to UNIX machines.

Developed Informatica Workflows and sessions associated with the mappings using Workflow Manager.

Developed complex reports using multiple data providers, user-defined objects, charts, and synchronized queries, and created a star schema in SSAS to develop ad-hoc client reports using SSRS in MS SQL Server 2005.

Experience in deploying forms and reports on the web.

Experience in developing forms based on views, tables and procedures in tabular and form layouts.

Worked on several UNIX/Linux wrapper shell scripts (ksh, csh, sh).

Wrote shell scripts to process files by calling PL/SQL packages and SQL scripts.

Used SQL*Loader to load input files from various external systems into database staging tables for further processing.

Actively participated in release, deployment, data migration, and production cut-over activities.

Responsible for creating PL/SQL programs and UNIX scripts for data validation and data conversion.

Prepared UNIX shell scripts to transfer files from FTP to SFTP.

Environment: Oracle 10g, TOAD, Windows XP, PL/SQL, SQL, Cognos 8.4, UNIX Shell Scripting, Informatica, PuTTY, HP Quality Center, SSRS, SSAS

Nationwide Insurance, Columbus, OH May 2016 – Nov 2016

MS SQL Developer

Responsibilities:

Gathered business requirements and interacted with users, the project manager, and the QA team to better understand the business processes.

Documented system requirements, technical specifications, use cases, activity diagrams, workflow diagrams, and release notes.

Partitioned tables using range and list partitioning and created local indexes to increase performance and make database objects more manageable.

Created reusable ETL & Reporting templates using SSIS.

Converted all existing SSIS packages from MS SQL Server 2008 to 2012.

Used SQL Developer/TOAD to create all types of PL/SQL objects: tables, indexes, packages, triggers, sequences, and stored procedures.

Used SSRS to create, execute, and deliver tabular, matrix, and chart reports; debugged and deployed reports in SSRS.

Used DTS/SSIS and T-SQL stored procedures to transfer data from OLTP databases to a staging area and then into data marts, and worked with XML data.

Designed and developed efficient SSIS packages for processing fact and dimension tables with complex transforms.

Developed ad-hoc reports, drill-through reports, drill-down reports, parameterized reports, and other report models using MS SQL Server 2005 SSRS.

Involved in Performance tuning of SSIS Package to reduce overall ETL execution time.

Used BULK COLLECT and FORALL in stored procedures to improve performance and make the application run faster.

Working knowledge of Tableau Server and its administration.

Experience working in onshore/offshore development models.

Implemented CTAS (Create Table As Select) and Exchange Partition approaches to optimize the archive process.
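
A sketch of the partition-exchange archive approach described above, assuming an orders table with a p2015 partition (names are hypothetical):

    -- Empty table with the same structure as the partitioned table
    CREATE TABLE orders_arch AS SELECT * FROM orders WHERE 1 = 0;

    -- Swap the old partition's data into the archive table (metadata-only operation)
    ALTER TABLE orders
      EXCHANGE PARTITION p2015 WITH TABLE orders_arch
      INCLUDING INDEXES WITHOUT VALIDATION;

    -- The partition is now empty and can be dropped
    ALTER TABLE orders DROP PARTITION p2015;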

Tuned performance of Informatica session for large data files by increasing block size, data cache size, sequence buffer length and target-based commit interval.

Worked on user-defined and system-defined exceptions to handle different types of errors, such as NO_DATA_FOUND and ROWTYPE_MISMATCH, and mapped Oracle error codes to named exceptions with PRAGMA EXCEPTION_INIT.
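
A minimal sketch of this style of exception handling (table names and key values are hypothetical):

    DECLARE
      e_child_rows EXCEPTION;
      PRAGMA EXCEPTION_INIT(e_child_rows, -2292);  -- ORA-02292: child record found
      v_name employees.last_name%TYPE;
    BEGIN
      SELECT last_name INTO v_name FROM employees WHERE employee_id = 999;
      DELETE FROM departments WHERE department_id = 10;
    EXCEPTION
      WHEN NO_DATA_FOUND THEN
        DBMS_OUTPUT.PUT_LINE('No such employee.');
      WHEN e_child_rows THEN
        DBMS_OUTPUT.PUT_LINE('Department still has child rows.');
    END;
    /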

Generated Power BI reports based on user requirements.

Used PL/SQL tables and REF CURSORs to process huge volumes of data, and used bulk collect and bulk bind for mass updates as a performance improvement.

Developed ETL process using stored procedures, functions and packages to migrate the data from SQL Server to Oracle Database.

Generated Power BI Dashboards with filters, quick filters, context filters, global filters, parameters and calculated fields on reports.

Generated ad-hoc reports in Excel Power Pivot and shared them with decision makers via Power BI for strategic planning.

Worked extensively with Composite data types, Cursors and Dynamic SQL.

Created PL/SQL program units that issue DML and DDL statements using Dynamic SQL.
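
For illustration, a minimal dynamic SQL sketch mixing DDL and bind-variable DML (object names are hypothetical):

    DECLARE
      v_table VARCHAR2(30) := 'orders_arch';
    BEGIN
      -- DDL must go through dynamic SQL inside PL/SQL
      EXECUTE IMMEDIATE 'TRUNCATE TABLE ' || v_table;
      -- DML with a bind variable
      EXECUTE IMMEDIATE
        'INSERT INTO ' || v_table ||
        ' SELECT * FROM orders WHERE order_date < :cutoff'
        USING DATE '2016-01-01';
      DBMS_OUTPUT.PUT_LINE(SQL%ROWCOUNT || ' rows archived');
    END;
    /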

Used techniques such as direct-path loads for faster data loading and created filters in control files to pre-process data as part of performance tuning.

Created B-tree indexes, function-based indexes, and bitmap indexes on tables and columns to minimize query time and achieve better performance.
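
Illustrative DDL for the three index types (all names are hypothetical):

    CREATE INDEX ord_cust_ix ON orders (customer_id);              -- B-tree
    CREATE INDEX emp_uname_ix ON employees (UPPER(last_name));     -- function-based
    CREATE BITMAP INDEX ord_status_bix ON orders (status);         -- bitmap, for low-cardinality columns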

Implemented various UNIX scripts to generate data in CSV and TXT formats.

Environment: MS BI (SSIS), SQL Server, T-SQL, Power BI, SSRS, SSAS, Tableau 7.0, UNIX Shell Scripting.

Crown Solutions, Hyderabad, India Nov 2013 – Dec 2015

SQL Developer

Responsibilities:

Gathered and analyzed new business requirements for major developments, performed impact analysis of proposed changes, and participated in system-level requirements activities.

Created test case scenarios, executed test cases, and maintained defects in Remedy.

Tested reports and ETL mappings from source to target.

Implemented various automated UNIX shell scripts to invoke PL/SQL anonymous blocks, Stored Procedures/Functions/Packages.

Reported bugs and tracked defects using HP/ALM Quality Center.

Developed Shell and PL/SQL scripting for regular maintenance and production support to load the data in regular intervals.

Prepared dashboards using calculations, parameters, calculated fields, groups, sets, and hierarchies in Tableau.

Developed complex SQL scripts to create a BI layer on the data warehouse for Tableau reporting.

Developed mappings/sessions using Informatica PowerCenter 8.6/7.1 for data loading and developed mappings to load data into slowly changing dimensions (SCDs).

Wrote complex SQL scripts using joins, subqueries, and correlated subqueries.
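
An example of a correlated subquery of this kind, assuming a standard employees table:

    -- Employees earning above their department's average salary
    SELECT e.employee_id, e.last_name, e.salary
      FROM employees e
     WHERE e.salary > (SELECT AVG(i.salary)
                         FROM employees i
                        WHERE i.department_id = e.department_id);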

Worked extensively on setting up batch processes using the Autosys scheduler.

Conducted black-box testing (functional, regression, and data-driven) and white-box testing (unit and integration).

Created and maintained the shell scripts and Parameter files in UNIX for the proper execution of Informatica workflows in different environments.

Created/modified JIL scripts and used the Autosys client (4.5.1, 11.1, 11.3) to monitor batch jobs efficiently.

Wrote SQL and PL/SQL scripts to perform database testing and to verify data integrity.
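
A minimal sketch of such integrity checks (source/target table names are hypothetical):

    -- Row counts should match between source and target
    SELECT (SELECT COUNT(*) FROM src_orders) AS src_cnt,
           (SELECT COUNT(*) FROM tgt_orders) AS tgt_cnt
      FROM dual;

    -- Orphaned child rows indicate broken referential integrity
    SELECT o.order_id
      FROM tgt_orders o
     WHERE NOT EXISTS (SELECT 1 FROM tgt_customers c
                        WHERE c.customer_id = o.customer_id);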

Automated functional, GUI, and regression testing by creating scripts in QTP/UFT.

Extracted test data from tables and loaded it into SQL tables.

Tested ETL Informatica mappings and other ETL processes (data warehouse testing).

Troubleshot production problems and worked effectively with other team members to identify and address problems.

Prepared and supported the QA and UAT test environments.

Environment: Oracle 9i, Autosys, SQL*Loader, SVN Subversion (version control), UNIX Shell Scripting, ALM/HP Quality Center 11, QTP/UFT, Tableau 7.0

Education:

Master’s in Computer and Information System from Kent State University.

Bachelor’s in Electronics and Communication Engineering from Jawaharlal Nehru Technological University.


