NIMMA REDDY
Phone: 502-***-****
Email: *******@*****.***
Career Summary
More than 16 years of experience in analysis, planning, architecture, design, development, maintenance, and management, covering data integration, data modeling, enterprise data warehouses (EDW), data marts, data mining, and the creation of Power BI reports and analysis cubes for strategic business solutions.
Excellent professional skills in data integration with SSIS (ETL); creation of reports and dashboards using the visualization tools Power BI, SSRS, and Tableau; and data modeling, data warehouses, data marts, and SSAS cubes.
Very strong foundation in SQL programming: creation of tables, views, indexes, stored procedures, and user-defined functions, and complex queries for data validation, cleansing, troubleshooting, performance tuning, and more.
Tools and technologies: SQL Server, SSIS, SSRS, SSAS, Power BI, Oracle, MySQL, Snowflake, DB2, T-SQL, PL/SQL, C#, Tableau, Git, Azure DevOps, MS Teams.
Familiar with the Azure cloud environment: Azure SQL, Azure Data Lake, Azure Data Factory, Azure Databricks, and Azure Synapse.
Excellent understanding of the Software Development Life Cycle (SDLC). Very comfortable coordinating with other teams and deploying code through the Dev, QA, UAT, and Prod environments. Implement best practices and adopt company standards.
Good interpersonal and communication skills. Self-motivated, experienced with Agile methodology, with a keen ability to grasp new systems and technologies quickly.
Professional Experience
Republic Bank, Louisville, KY Sept 2015 – Present
Environment: SQL Server, SSIS, SSRS, SSAS, Power BI, Snowflake, Git, Azure DevOps, Outlook, SharePoint, MS Teams, Azure cloud environment, Tableau.
Working as: Business Intelligence Architect
Accomplishments
Planned, designed, and architected the enterprise reporting foundation. Loaded data received from the bank's various business partners into SQL Server databases using the ETL tool SSIS and other bulk-load methods. Architected, designed, coordinated, and modeled the enterprise data warehouse (DW), involving multiple fact and dimension tables in a star schema, for the bank's different lines of business, including conventional banking and non-conventional lines such as Prepaid Card Services, Credit Solutions, and Tax Refund Solutions. Built processes to pull data from ODS (Operational Data Store) servers and other business partners and load it into the warehouse on a daily basis. Extracted data using complex algorithms. Created tables, views, stored procedures, indexes, and triggers as required. Wrote numerous complex SQL queries to validate and cleanse the data and to pull it for various purposes, including creating datasets for Power BI reports, SSRS reports, and warehouse loading. Worked on performance tuning and optimization techniques.
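For illustration, a minimal T-SQL sketch of the kind of star-schema structure described above; all table and column names here are hypothetical, not the bank's actual warehouse design.

-- Illustrative star-schema sketch; all object names are hypothetical.
CREATE TABLE dbo.DimAccount (
    AccountKey     INT IDENTITY(1,1) PRIMARY KEY,
    AccountNumber  VARCHAR(20) NOT NULL,
    LineOfBusiness VARCHAR(50) NOT NULL,   -- e.g. Prepaid Card Services, Tax Refund Solutions
    OpenDate       DATE NULL
);

CREATE TABLE dbo.DimDate (
    DateKey       INT PRIMARY KEY,         -- yyyymmdd
    CalendarDate  DATE NOT NULL,
    CalendarYear  SMALLINT NOT NULL,
    CalendarMonth TINYINT NOT NULL
);

CREATE TABLE dbo.FactDailyBalance (
    DateKey          INT NOT NULL REFERENCES dbo.DimDate(DateKey),
    AccountKey       INT NOT NULL REFERENCES dbo.DimAccount(AccountKey),
    EndOfDayBalance  DECIMAL(18,2) NOT NULL,
    TransactionCount INT NOT NULL,
    CONSTRAINT PK_FactDailyBalance PRIMARY KEY (DateKey, AccountKey)
);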
Completely owned the data modeling and designed and built SSAS multidimensional and Tabular cubes; wrote MDX and DAX queries to extract aggregate values for various purposes. Created a data mart for intermediate processing to feed data to the CRM (Customer Relationship Management) application through an API.
Created Power BI and Tableau reports, dashboards, KPIs, and scorecards with multiple advanced features by pulling data from the warehouse and the SSAS cubes. At times ran queries directly against the live ODS servers to create datasets so that reports could show a live data snapshot without latency. Reports were deployed to the Azure portal, with some hosted on premises and a few deployed to the Azure cloud environment. Achieved 100% end-to-end automation, including loading data into the database and refreshing reports on a schedule. The reports help many key business decision makers and drive results.
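A minimal sketch of a reporting view of the kind used to feed Power BI and SSRS datasets, built on the hypothetical star-schema tables sketched above; names are illustrative only.

CREATE VIEW dbo.vw_DailyBalanceByLineOfBusiness
AS
-- Hypothetical reporting view: aggregates the sketched fact table by date and line of business.
SELECT
    d.CalendarDate,
    a.LineOfBusiness,
    SUM(f.EndOfDayBalance)  AS TotalBalance,
    SUM(f.TransactionCount) AS TotalTransactions
FROM dbo.FactDailyBalance AS f
JOIN dbo.DimDate    AS d ON d.DateKey    = f.DateKey
JOIN dbo.DimAccount AS a ON a.AccountKey = f.AccountKey
GROUP BY d.CalendarDate, a.LineOfBusiness;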
Constantly coordinate and collaborate with, and maintain very good relationships across, other business teams such as business analysts, change management, QA, DBAs, business product owners, and operations, along with my own IT team. Documented the processes and trained the team to operate and keep jobs running smoothly on a regular basis.
Humana, Louisville, KY Jul 2014 – Sep 2015
Environment: SQL Server, T-SQL, SSIS, SSRS, Oracle, PL/SQL, Informatica, Windows, UNIX, Microsoft Office, Outlook, Lync, SharePoint, $Universe, Tableau.
Worked as: Data Services Tech Lead.
Accomplishments
Humana is the third largest health insurance company in the USA. Worked in the Enterprise Data and Analytics (EDA) team on planning, design, and architecture for loading input feeds from various business units and vendors into the Enterprise Data Warehouse (EDW), and on data extraction from the EDW for reporting. ETL involved cleansing, mapping, and producing data in various forms such as flat files and Excel files. Performed data modeling and analysis and built measures and dimensions to enable generating reports for business decisions. Informatica was used for ETL and reports were created in Tableau. I led the following projects during this period.
Intelligent RX (IRX) Project:
This project identifies clinical data in the EDW database, extracts it, and creates modules for Membership, Member Coverage, Drug Claims, Medical Claims, and Lab Claims, then sends key data to Anvita (a Humana vendor) to feed their Intelligent Rx engine.
The Anvita engine provides services for live pharmacy (Rx) filling through Walgreens Pharmacy, CVS Pharmacy, and others. Its main purpose is to avoid duplicate fills of the same prescription medicine for the same patient during the same period, which can happen when a patient sees multiple providers (doctors) for similar health problems; this affects many patients and can cause serious adverse effects. This was brand-new work and a fairly large project. I took ownership and led it: actively participated in gathering requirements and discussing specifics with the business (Rx) team, our own IT team, and the Anvita (Humana vendor) team; finished the design and the project plan and obtained approval from management; and coordinated, led, and supported the offshore and onshore development teams. Monitored and reviewed the work at various stages to ensure the project completed successfully, with a quality product delivered to standards and on schedule. Because the data pulled for each module was huge (several hundred million rows, approaching a billion), we split the data into multiple flat files dynamically and exported them to Anvita through a secured electronic data transfer method. Extraction, export, and loading onto the Anvita system ran as a weekly job. Given the data volume, we initially experienced issues such as disk space, network, overwriting/overloading, and file size problems, but we eventually developed methods to overcome them and managed to load with no or very little (a fraction of a second) interruption to the services.
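A minimal T-SQL sketch of one way to split a very large extract into a fixed number of file-sized buckets, as described above; the actual process was built in Informatica, and the table and column names here are hypothetical.

-- Hypothetical sketch: bucket a large weekly claims extract into N chunks,
-- one chunk per output flat file. Illustrates the splitting idea only.
DECLARE @FileCount INT = 20;  -- number of flat files to produce

SELECT
    NTILE(@FileCount) OVER (ORDER BY c.ClaimId) AS FileBucket,
    c.ClaimId,
    c.MemberId,
    c.ServiceDate,
    c.DrugCode
INTO #ClaimsToExport
FROM dbo.DrugClaims AS c        -- hypothetical source table
WHERE c.ServiceDate >= DATEADD(DAY, -7, CAST(GETDATE() AS DATE));

-- Each downstream export task then selects WHERE FileBucket = @n
-- and writes that slice to its own flat file for transfer.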
SRAE (Seriously Reportable Adverse Events) Reports and AHRQ (Agency for Healthcare Research & Quality) Reports
This project delivered enhancements to the SRAE and AHRQ reports. The people who had previously worked on these reports were no longer on the team, so I stepped into the new area, studied the existing processes, and helped the developers (contractors) build the logic for the enhancements. The process is reasonably large, involving many modules and several SQL scripts and procedures touching numerous tables. We took extra care not to break any existing logic while adding the new enhancement code. SIT, regression testing, and UAT went well.
Online SQR (Stars Quality Report)
This project loads an additional feed of supplemental data from the PPI (Physician Performance Interactive) team; the PPI feeds now cover Medical and Vision. I led the project and, working through offshore contractors, completed loading the feeds into the database with validations and mappings. We also generated reports and automated the process of sending them to the business team.
Republic Bank, Louisville, KY Apr 2014 – July 2014
Environment: SQL Server, SSIS, SSRS, SSAS, IBM AS400/iSeries, DB2, Visual Studio, C#, VB.Net, Windows, Microsoft Office, Outlook, TFS (Source Control).
Worked as: Data Architect
Accomplishments
Planned, designed, and architected the enterprise reporting. Accessed DB2 data and performed data cleansing. Retrieved data from the DB2 database through the ETL tool SSIS and loaded it into SQL Server. Did SQL programming, created stored procedures, and wrote various queries to validate data. Built new SSIS packages and maintained the existing processes that download data from different business units and move it to other systems. Researched production issues and provided solutions to resolve data issues by cleansing and reloading. Was involved in the merger of Republic Bank (the acquired Florida bank) with Republic Bank (Corporate); proposed and helped implement a cross-reference (X-Ref) table concept for merging the data so that accounts and their history at both banks were unaffected while moving forward with a common ID per entity in the corporate bank ledgers. Performed job monitoring, maintenance, and troubleshooting to ensure smooth running of jobs on a daily basis.
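A minimal sketch of the X-Ref idea mentioned above; the object names and merge logic are hypothetical and only illustrate mapping each legacy account to a common corporate ID.

-- Hypothetical X-Ref (cross-reference) table: maps legacy account IDs from each
-- source bank to one common corporate ID, so history on both sides stays intact.
CREATE TABLE dbo.AccountXRef (
    CorporateAccountId INT         NOT NULL,   -- common ID used going forward
    SourceBank         VARCHAR(20) NOT NULL,   -- e.g. 'FloridaBank' or 'Corporate'
    LegacyAccountId    VARCHAR(20) NOT NULL,   -- ID as it existed in the source system
    CONSTRAINT PK_AccountXRef PRIMARY KEY (SourceBank, LegacyAccountId)
);

-- History from either bank can then be reported under the common ID:
SELECT x.CorporateAccountId,
       h.TransactionDate,
       h.Amount
FROM dbo.LegacyTransactionHistory AS h        -- hypothetical history table
JOIN dbo.AccountXRef              AS x
  ON x.SourceBank      = h.SourceBank
 AND x.LegacyAccountId = h.AccountId;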
AccentF(x), Louisville, KY Feb 2005 – Apr 2014
Environment: SQL Server, SSIS, Tableau, Visual Studio, C#, VB.Net, Active-Batch
Worked as: Data Integration Specialist.
Accomplishments
Did SQL programming and created stored procedures. Built SSIS packages and maintained the processes that import data from clients and move it to its destinations. Pulled data from ODS servers and other data sources and refreshed the data warehouse on a regular basis. Created Tableau reports, deployed and published them to the server, and made them available to stakeholders as required. Worked on KPIs, dashboard reporting, and customer specialist scorecards for the sales and finance teams. Presented reports to business leaders, monitored jobs, and troubleshot production issues. Developed processes and implemented them in the Active-Batch job scheduling tool to refresh the Tableau reports automatically on a schedule.
ACCENT Marketing, Jeffersonville, IN Jan 2005 – Jan 2014
Worked as: Client Server Analyst/Developer
Environment: SQL Server, SSIS, SSRS, Visual Studio, C#, VB.Net, IBM AS400/iSeries, DB2, COBOL, CL, Windows, Microsoft Office, Outlook, Cognos, Active-Batch
Accent Marketing ran many marketing campaigns for clients such as Whirlpool, GE, Maytag, Amana, Samsung, Humana, and Sprint PCS. One of them sold extended-warranty service plans (ESP) for home appliances such as refrigerators, washers/dryers, cooking ranges, dishwashers, wine coolers, and TVs. The work also involved tracking sales and money-movement activity.
Accomplishments
Worked on planning, database design, and application architecture. Did SQL programming: created tables, views, user-defined functions, indexes, and several stored procedures, and tuned them for high performance and scalability. Wrote numerous complex queries to validate and manipulate the data in various tables, and SQL queries ranging from simple to very complex, including multiple joins to obtain grouped values.
Heavily involved in the ETL (Extract, Transform, Load) process. Built numerous SSIS packages, many of them complex, with data merging, mapping, and several validations to move data to and from remote systems and for internal data transfers. Downloaded files via FTP/SFTP from remote business partners' systems and our own business units.
Dedicated my time to a large migration project from the IBM AS400/DB2 platform to Microsoft SQL Server. During this period, I owned the projects to migrate the processes and applications: gathered requirements, communicated with the team, managed and coordinated with the other teams involved, and successfully deployed and maintained the results.
Was involved in backend programming and data management on DB2 for IBM iSeries. Coded in SQL, COBOL, RPG, CL, and Query Manager. Developed common stored procedures in DB2 for use in Windows apps and websites, which helped achieve a long-term goal of implementing common data-sync logic.
Process Automation
Many manual processes were automated and made to run smoothly, hands free, even on weekends and holidays, improving productivity and reducing manual hours and process time. A few of these are listed here.
Problem: Some processes involved multiple tasks with data spread across several platforms/environments, such as SQL Server, DB2, Excel files, text files, and clients' FTP sites. We also used third-party tools to access data on the vendor side for services such as National Change of Address (NCOA), phone append, and address standardization (CASS). Various people owned these tasks individually and maintained and ran their own pieces manually every day.
Solution: Stepped forward and automated the processes by integrating all the pieces. Added validation checks after each task and ensured they ran without any manual involvement.
Problem: Another complex process, the "outbound call campaign," had many tasks: pulling data from DB2 tables, exporting it to a website for scrubbing phone numbers against the national Do-Not-Call (DNC) and Do-Not-Email lists, importing the returned results file the next day and uploading it to the database to filter out DNC-flagged records, loading the data to a phone dialer (a separate environment), and finally loading it to a SQL Server table to record sales information. All these steps were performed manually at different times by different people; it became repetitive work every week, consumed a good amount of time, and was error prone because of the manual handling.
Solution: Volunteered to automate the whole process so it runs smoothly on a scheduler. It saved time and, moreover, provided peace of mind that leads load to the dialer on time, which is critical in a call-center business.
Problem & Solution: While automating, added flags and checks and monitored all tasks to ensure successful completion. If any process did not complete successfully, it sent a message to the production team with information about where it was held up; after the data errors were fixed, it could resume running from the point where it stopped.
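A minimal sketch of the kind of flag-and-checkpoint control table described above; all object names are hypothetical.

-- Hypothetical job-control table: each task records its status so a failed run
-- can alert the production team and later resume from where it stopped.
CREATE TABLE dbo.JobTaskStatus (
    RunDate   DATE         NOT NULL,
    TaskOrder INT          NOT NULL,                     -- execution sequence
    TaskName  VARCHAR(100) NOT NULL,
    Status    VARCHAR(20)  NOT NULL DEFAULT 'Pending',   -- 'Pending', 'Succeeded', 'Failed'
    Message   VARCHAR(400) NULL,
    UpdatedAt DATETIME2    NOT NULL DEFAULT SYSDATETIME(),
    CONSTRAINT PK_JobTaskStatus PRIMARY KEY (RunDate, TaskName)
);

-- On restart, resume with the first task of today's run that has not yet succeeded.
SELECT TOP (1) TaskName
FROM dbo.JobTaskStatus
WHERE RunDate = CAST(GETDATE() AS DATE)
  AND Status <> 'Succeeded'
ORDER BY TaskOrder;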
Problem: Loading bonus/incentive data for call-center reps was a manual process involving about 30 to 40 Excel files from various business centers every week. Some of the data was in Excel files while other pieces were provided through email, and the formats were inconsistent with each other, so it took 10 to 15 hours every week to put the data together and load it into database tables.
Solution: Came up with the idea of a common template for submitting the bonus data and developed a process that loads all the files automatically, one after another, directly from the network folder. This reduced the task time to less than 10 minutes, for which the finance team greatly appreciated me.
Education and Awards
BS in Technology, JN Technological University, Hyderabad, India 1984-1989
MS in Engineering and Technology, Osmania University, Hyderabad, India 1990-1992
Genuine Leader (GL), Most Valuable Person (MVP) Awards by CEO, Accent, IN
Tiger-Star Award by Vice-President, Accent, IN
Most Valuable Person (MVP) Appreciation Award by Director of IT, Accent, IN