VENKAT MUNJETI
Contact#408-***-****
********@*****.***
OBJECTIVE
Seeking a challenging position involving RDBMS (OLTP), OLAP, data mart/data warehouse, Internet/intranet and client/server technologies. I have worked at companies ranging from small firms to large corporations such as eBay, Intel, Symantec and FUJIFILM Dimatix, Inc.
Over 15 years of experience in the IT field and over 8 years of DBA experience in domains such as Finance, Health Care, Mortgage, Networking, Digital Print Media, Antivirus Software, Wafer Manufacturing and E-commerce.
SUMMARY
Over 8 years’ experience in architecting, designing, implementing and supporting OLTP and business intelligence solutions.
Over 8 years of SQL Server DBA experience on various projects.
Strong experience in Replication Services, Database modeling, DB Cluster Node Management and Performance Tuning.
Over 2 years' experience with SQL Server 2012/2014 Always On technologies.
Good experience with SQL Server Service Broker.
Over 12 years of experience in T-SQL coding and 10 years in PL/SQL coding.
Over 4 years of Eyelit MES (Manufacturing Execution System) software and database integration experience across various projects.
Over 4 years of experience with AWS cloud computing platform, and its many dimensions of scalability - including but not limited to: VPC (Virtual Private Cloud), EC2, load-balancing with ELB, messaging with SQS (and scalable non-AWS alternatives), autoscaling architectures, using EBS under high I/O requirements, custom monitoring metrics/analysis/alarms via CloudWatch.
Over 10 years using VMware clients and servers.
MS TFS 2012/2015 migration experience.
Strong experience in data migration: SQL Server 2000/DB2 → SQL Server 2005/SAP and Oracle 10g; SQL Server 2005 → SQL Server 2008 R2 Enterprise.
Strong working knowledge of SQL Server 2008/2005/2000 in clustered server environments.
Over a year of QlikView administration experience.
Strong working knowledge of DBA Roles like Backup, Restore, Export, Import, Data Archiving and setting up Proxy accounts for SSIS etc.
Expertise working with cursors, triggers, User defined functions, DTS Packages, Stored procedures, Reporting services (SSRS) and Report builder, Integration services, Analysis services
Experience in User support, requirement specifications, documentation, third party software integration, implementation, troubleshooting and mentoring
Excellent logical and analytical skills with the ability to work under pressure. Strong communication skills and an ability to work individually or part of a team
COMPUTER SKILLS
Operating Systems: Windows NT 4.0/2000/XP/2003/2008/Windows 7, UNIX
Languages: SQL, T-SQL, PL-SQL, MDX, C#, ADO.NET, ASP.NET, VB.NET
Databases: MS SQL Server 2008/2005/2000, Teradata, Oracle 9i/10g/11g, MySQL, MS Access, MongoDB
IDE/Tools: Visual Studio 2008/2005/2003, Business Intelligence Development Studio (BIDS)
Web Servers: IIS 6.0/5.0, GlassFish
Database Design Tools: Erwin, MS-VISIO
Version Control Tools: TFS 2008/2010/2015, Visual Source Safe 2005/6.0, Source Gear Vault, Clear Case, PerForce (P4V)
Debuggers: Visual Studio Debugger, Bugzilla, HP Quality Center, Jira
Reporting Tools: Crystal Reports 10/9/8.5, MS Reporting Services (SSRS), Qlikview 10.0, Tableau 9.2
Communication Tools: MS-Office Live meeting 2007, WEBEX, SHAREPOINT SERVER
3rd-Party DB Monitoring Tools/Controls: TOAD for Oracle, HyperBac (DB backup tool), Hyper-V, VMware, FTP Elite, RMAN, Red Gate software, Idera software, Quest software, SQL Profiler
Bug Tracking: BugZilla 2.0, PVCS Tracker, TRACKRECORD, HP Quality Center, JIRA
Position: Sr. SQL Server DBA
Client: FUJIFILM Dimatix Inc., Santa Clara, CA.
Aug ’11 – Present
FUJIFILM Dimatix, Inc., a wholly owned subsidiary of FUJIFILM Corporation, is the world's premier provider of piezoelectric inkjet print heads. Our products are designed to accurately dispense a wide range of fluid types in micro-amounts for demanding imaging and fabricating applications. With more than 27 years of experience advancing the performance of drop-on-demand dispensing devices and applications, FUJIFILM Dimatix is the only developer and manufacturer in this industry with the technology, know-how, and creativity to lead the industry into the future.
Projects:
Here is the list of my job duties and accomplishments at FDMX, Santa Clara, CA.
1) Identified old production DB and hardware issues, proposed a new solution, and implemented it successfully. Data was migrated from a standalone SQL Server 2005 physical server to a SQL Server 2008 R2 Enterprise cluster (Active/Passive) with RAID 10 and an EMC SAN configuration.
2)WebFab web application hosted and maintained on AWS
3) Installed and configured SQL Server Reporting Services (SSRS) and SSAS. Migrated all Crystal Reports to SSRS to improve performance and reduce cost. Built multidimensional cubes using SSAS and wrote advanced MDX calculations for measures and KPIs.
4) Supporting DEV, QA and TEST environments
5) Daily database backups, replication and SQL Agent job monitoring.
6) Ensured high availability of data and servers.
7) Migrated our Test/Dev environment to SQL Server 2012 Always On. Availability groups maximize the availability of a set of user databases for an enterprise: each availability group supports a failover environment for a discrete set of test/dev databases, known as availability databases, that fail over together. Periodically checked performance and tuned as needed.
8) Introduced Reliance MOC (Management Of Change Control) and successfully integrated with Development, Test and production environment.
9) Successfully upgraded various versions of the Eyelit MES (Manufacturing Execution System) software for FDMX front-end fab (production) operations:
Eyelit Version 4.16 to 5.1.01
Eyelit Version 5.1.01 to 5.2.04
Support the MES on all production and test issues.
Developed and deployed a heap-alerting system in Eyelit MES version 5.1.01.
Maintenance of all MES clients, such as MES Modeler, Operator and Control Center.
10) Developed SSIS packages and SSRS reports based on requirements.
Developed and maintained custom MES reports using SQL Server Report Builder. All reports listed below were developed and deployed on SSRS and successfully serve all business needs in production, QA and maintenance:
Tool Time In State
Photo Rework Report at layer and rework code level
Chemical Bottle Usage
Scrap Report.
Tool Usage Report
Preventive Maintenance Report
Wafer Data Collection report
Step Yield Report for Back End Assembly
11) Continuous support on production, Development and QA Databases.
12) Actively working on VSS-to-TFS 2012 migration, SharePoint 2010-to-2013 migration and SAP requirements gathering.
13) Continuous deployment and administrative support for the WebFab web application, an interactive application used for fixtures/modules.
14) Continuous support for all production DB maintenance jobs, upgrades and DB performance issues.
15) Successfully migrated resource intensive Crystal reports into SQL Server Reporting Services.
16) Actively attend new SQL Server technology seminars to stay current.
Physical implementation of databases, storage structures, high-availability solutions, replication, and disaster recovery solutions.
OLTP Design (conceptual/logical/physical), development, implementation, and maintenance phases
OLTP code and design reviews; ensured database security, integrity and performance.
Providing inputs in defining database standards and best practices for advanced monitoring and troubleshooting, performance tuning and automation
Provide guidance and direction to Level-2 Production database operations team.
Strong experience in various SQL Server technologies: data mirroring/replication, SSIS, and configuration of SSRS and SSAS.
Strong experience in SQL Server performance monitoring and troubleshooting, SQL scripting, and security standards management.
Recommend, implement, and maintain best practices and corporate standards.
Monitor and maintain all SQL Server instances (Production, Staging, QA & Development) and all SQL Server subsystems (SQL Agent, SSIS, etc.)
Backup and recovery processes for all SQL Server databases
Work as a team member and individually
Troubleshooting, problem analysis, and resolution
Fulfill ad hoc query and report requests from other departments
Experience with multiple 1 TB+ databases
Experience with 10-20+ production database servers
Experience with remote disaster recovery and business continuance sites
Windows Clustering as it relates to SQL
SAN knowledge, MySQL, LINUX
12+ years of experience working with OLTP environments that are 24/7
SQL Server performance tuning
Strong skills in T-SQL programmable object development, table design, and index design
Strong skills in database performance analysis and tuning
Team player with strong communication skills
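The SQL Server 2012 Always On migration described in item 7 above follows the standard availability-group setup. A minimal T-SQL sketch, with hypothetical server, database and endpoint names:

```sql
-- Sketch only: assumes a WSFC cluster, HADR enabled on both instances,
-- databases in FULL recovery with a current full backup, and database
-- mirroring endpoints already created on port 5022.
-- All names (AG_TestDev, TestDB1, DevDB1, NODE1/NODE2) are hypothetical.
CREATE AVAILABILITY GROUP AG_TestDev
FOR DATABASE TestDB1, DevDB1
REPLICA ON
    N'NODE1' WITH (ENDPOINT_URL = N'TCP://node1.corp.local:5022',
                   AVAILABILITY_MODE = SYNCHRONOUS_COMMIT,
                   FAILOVER_MODE = AUTOMATIC),
    N'NODE2' WITH (ENDPOINT_URL = N'TCP://node2.corp.local:5022',
                   AVAILABILITY_MODE = SYNCHRONOUS_COMMIT,
                   FAILOVER_MODE = AUTOMATIC);

-- Run on the secondary replica to join it to the group:
ALTER AVAILABILITY GROUP AG_TestDev JOIN;
```

With SYNCHRONOUS_COMMIT and AUTOMATIC failover, the availability databases listed in the group fail over together, which is the behavior described above for the test/dev environment.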
Position: Sr. SQL Server DBA
Client: Symantec, Mountain View, CA.
Projects: 1) Successfully completed data migration from SQL Server 2005 to SQL Server 2008 R2
2) Auto-renewal project migration from MS Access to SQL Server 2008
3) SPA (Store Performance Analysis) integration and automation
4) Daily database backups, replication and SQL Agent job monitoring
5) Ensured high availability of data and servers
6) Developed SSIS packages for daily data loads from various sources such as the Oracle enterprise data warehouse, FTP files, etc.
Aug’10 -- July’11
Project Description:
I worked with the eBIO DST (worldwide e-commerce business intelligence Data Strategy) team as a SQL DBA on the projects mentioned here.
Roles and Responsibilities:
Monitoring SQL Agent jobs, Performance bottlenecks, configuring databases, backup/restore Databases.
Successfully migrated SSIS packages, stored procedures, SQL reports from SQLSERVER2005 to SQLSERVER2008 R2.
Installed and configured SQL Server Reporting Services (SSRS) and SSAS. Migrated all Crystal Reports to SSRS to improve performance and reduce cost. Built multidimensional cubes using SSAS and wrote advanced MDX calculations for measures and KPIs.
Configuring DB file groups on SQL SERVER 2008 R2 to deliver better performance.
Set up alerts on all production databases to ensure sufficient free space.
Configured DB mail and set up email alerts by using it.
Windows server administration
Used dynamic SQL to reuse code within stored procedures. Implemented transactions (BEGIN/ROLLBACK/COMMIT) as needed. Used key dynamic SQL features such as parameter definitions to declare OUTPUT parameters and retrieve them via sp_executesql.
Used INFORMATION_SCHEMA views, sys.triggers and sys.sql_dependencies for various database application needs: verifying common/key columns between tables, listing the triggers on a table and finding an object's dependencies.
Implemented different types of Replication Models like Snapshot, Merge and Transactional using SQL SERVER2008.
Used performance tools such as the Index Tuning Wizard, SQL Profiler and Windows Performance Monitor for monitoring and tuning MS SQL Server performance.
Used Common Table Expressions (CTEs) to avoid subqueries.
Added error handling techniques like TRY-CATCH in the procedures.
Used Jira to track bugs.
Involved in the Database design and creation of tables, stored procedures, User Defined Functions, SSIS packages using SQL Server 2008.
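The sp_executesql pattern with a parameter definition and an OUTPUT parameter, as mentioned in the dynamic SQL bullet above, can be sketched as follows (table and column names are hypothetical):

```sql
DECLARE @sql      NVARCHAR(MAX),
        @params   NVARCHAR(200),
        @rowCount INT;

-- Parameterized dynamic SQL: the plan is cached and reused across calls.
SET @sql    = N'SELECT @cnt = COUNT(*) FROM dbo.Orders WHERE OrderDate >= @since;';
SET @params = N'@since DATETIME, @cnt INT OUTPUT';

EXEC sp_executesql @sql, @params,
     @since = '2010-08-01',
     @cnt   = @rowCount OUTPUT;   -- value flows back through the OUTPUT parameter

SELECT @rowCount AS OrdersSinceAug2010;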
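Replacing a correlated subquery with a CTE, as in the bullet above, typically looks like this (dbo.Sales is a hypothetical table):

```sql
-- The aggregate is computed once in the CTE instead of once per outer row.
;WITH RegionTotals AS
(
    SELECT Region, SUM(Amount) AS TotalAmount
    FROM   dbo.Sales
    GROUP BY Region
)
SELECT s.Region, s.Amount, rt.TotalAmount
FROM   dbo.Sales    AS s
JOIN   RegionTotals AS rt
       ON rt.Region = s.Region;
```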
Position: Sr. SQL Server Developer/DBA
Client: Intel, Santa Clara, CA.
Projects: Archive Project, Delta Refresh Project and Sample Management Solutions (SMS)
Jan’10- Aug’10
Project Description:
I supported various projects in the RDMS (Revenue and Demand Management Solutions) group: the Archive project, the Delta Refresh project, the SMS (Sample Management Solution) project, and performance tuning.
Archive Project:
Intel has heterogeneous data (from various sources) that is loaded into production daily by AutoSys jobs. A huge volume of data had accumulated in production, slowing application runtimes and job performance.
The primary objective of this project was to move historical data from the production server to an archive server, cutting application and job runtimes while keeping the historical data available on the archive server on demand.
We have done this job in two steps.
1) One-time cutover: identified the historical data in various databases as of a one-time cutover date. Part of the process was to take a backup from the production server and restore it on the archive server.
After that, we ran all the delete procedures/scripts on the production server based on the cutover date, coordinating with SAs and app owners before and after this process to make sure everything was OK.
The entire process was tested in all environments (DEV, ITT, UTT and CONS) before we pushed it to production.
2) Monthly: take a backup of production each month and refresh it on the archive server.
Delta Refresh Project: the objective of this project was to eliminate a full batch load on every run. Instead, only the data modified or added since the last run (based on the CREDATE/LASTMODIFIED DATE columns) is loaded, and the timestamp is updated on the EDW (Enterprise Data Warehouse).
This cuts data-load job runtimes significantly.
Sample Management Solution (SMS): Intel provides various sample products to customers to integrate with new or existing customer products and see how the samples perform. This is a huge inventory, and Intel wanted to manage it as a separate product; that is how this project started. I was involved in developing the SHIP TO/SOLD TO management part of the back end.
SHIP TO and SOLD TO have three common steps:
1) Activate a new customer in the SMS system
2) Modify an existing customer from ACTIVE to INACTIVE, or vice versa, based on the business rules
3) Delete the customer data from SMS if it has been sitting in DQ (Data Quality Issue) status for more than 7 days
Roles and Responsibilities:
Monitoring SQL jobs, Performance bottlenecks, configuring databases, backup/restore DBs.
Used dynamic SQL to reuse code within stored procedures. Implemented transactions (BEGIN/ROLLBACK/COMMIT) as needed. Used key dynamic SQL features such as parameter definitions to declare OUTPUT parameters and retrieve them via sp_executesql.
Used OPENQUERY commands in stored procedures to update data in Teradata.
Used TFS (Team Foundation Server) to maintain all SQL code and to build each release in various environments; participated in the build/release process.
Installed and configured SQL Server Reporting Services (SSRS) and SSAS. Migrated all Crystal Reports to SSRS to improve performance and reduce cost. Built multidimensional cubes using SSAS and wrote advanced MDX calculations for measures and KPIs.
Used INFORMATION_SCHEMA views, sys.triggers and sys.sql_dependencies for various database application needs: verifying common/key columns between tables, listing the triggers on a table and finding an object's dependencies.
Implemented different types of Replication Models like Snapshot, Merge and Transactional using SQL SERVER2008.
Used performance tools such as the Index Tuning Wizard, SQL Profiler and Windows Performance Monitor for monitoring and tuning MS SQL Server performance.
Used Common Table Expressions (CTEs) to avoid subqueries.
Added error handling techniques like TRY-CATCH in the procedures.
Used HP Quality Center to track bugs and produce consolidated bug reports.
Involved in the Database design and creation of tables, stored procedures, User Defined Functions, SSIS packages using SQL Server 2008.
Scheduled the reports to run daily/weekly and delivered them to sales and business teams via email. All reports were generated using Microsoft SQL Server Reporting Services.
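The OPENQUERY updates against Teradata mentioned above use the linked-server pass-through form. A sketch, with a hypothetical linked-server name and table:

```sql
-- OPENQUERY sends the inner SELECT to the linked server; the UPDATE is
-- then applied to the rows that query returns on the remote (Teradata) side.
UPDATE OPENQUERY(TERADATA_LNK,
                 'SELECT SampleStatus FROM edw.SampleShip WHERE SampleId = 42')
SET SampleStatus = 'SHIPPED';
```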
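The metadata queries described above (common columns between tables, triggers on a table) can be sketched as follows, with hypothetical table names:

```sql
-- Columns shared by two tables, via INFORMATION_SCHEMA.
SELECT a.COLUMN_NAME
FROM   INFORMATION_SCHEMA.COLUMNS AS a
JOIN   INFORMATION_SCHEMA.COLUMNS AS b
       ON b.COLUMN_NAME = a.COLUMN_NAME
WHERE  a.TABLE_NAME = 'Orders'
  AND  b.TABLE_NAME = 'OrderHistory';

-- Triggers defined on a given table, via the sys.triggers catalog view.
SELECT t.name AS trigger_name
FROM   sys.triggers AS t
WHERE  t.parent_id = OBJECT_ID('dbo.Orders');
```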
Position: Sr. SQL Server DBA
Client: FNC Inc
Project: CDNA (Collateral Data and Analytics) for Mortgage Industry
Oct’09- Jan’10
Project Description:
FNC collects daily mortgage data from various clients (Fannie Mae, Freddie Mac, BOA, etc.), processes it through the Data Factory and pushes it into the NCD (National Collateral Data). FNC's Data and Analytics group generates products from this NCD data, such as Data Express, Value Sight and the Appraisal Score (GAAR, Generally Applied Appraisal Rules, score). One of the market-leading analytics products, the GAAR engine, runs uploaded appraisals and generates reports at the client's request. Based on these reports, a lender can decide whether to approve the loan for a given property, and the client can get the closest match (with real comparable data) for that property.
Roles and Responsibilities:
Monitoring SQL jobs, Performance bottlenecks, configuring databases, backup/restore DBs.
Wrote ETL scripts using SSIS to process the lender data, which comes in XLS, XML and PDF formats, and load it into the Data Factory.
Configuring file groups on SQL SERVER 2008 R2 to deliver better performance.
Set up alerts on all production databases to ensure sufficient free space.
Configured DB mail and set up email alerts using it.
Windows server administration
Performed data mining to support mortgage statistical data analytics.
Used SSRS to generate reports for business needs, e.g. how single-family/condo property sales are doing at the metro-area/county/ZIP level.
Implemented different types of Replication Models like Snapshot, Merge and Transactional using SQL SERVER2008.
Using performance tools like Index Tuning Wizard, SQL Profiler, and Windows Performance Monitor for Monitoring and Tuning MS SQL Server Performance.
Developed various reports using SQL SERVER REPORTING SERVICES and SSAS and rolled out to all FNC mortgage clients.
Used Geo spatial functional calculations to get best comparable based on client given property details.
Used a tally table to avoid looping and improve performance.
Used CTEs (common table expressions) to avoid subqueries.
Added error handling techniques like TRY-CATCH in the process.
Used the VS2005 Team Foundation for Database Professionals tool for version control.
Used the internal Raidster tool to track bugs and produce consolidated bug reports.
Involved in the Design of Database and creation of tables, stored procedures, User Defined Functions, DTS packages using SQL Server 2005, 2000.
Scheduled the reports to run weekly and set them up to be delivered to sales and marketing teams via email. All reports were generated using Microsoft SQL Server Reporting Services.
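The Database Mail alerts mentioned above are sent with msdb's sp_send_dbmail. A sketch, with a hypothetical profile name and recipient:

```sql
-- Assumes a Database Mail profile named 'DBA_Alerts' already exists.
EXEC msdb.dbo.sp_send_dbmail
     @profile_name = N'DBA_Alerts',
     @recipients   = N'dba-team@example.com',
     @subject      = N'Low free space on production database',
     @body         = N'Data file free space has dropped below the alert threshold.';
```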
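The geospatial comparable-property calculations mentioned above rely on the SQL Server 2008 geography type. A minimal sketch with hypothetical coordinates:

```sql
-- geography::Point(latitude, longitude, SRID); 4326 = WGS 84.
DECLARE @subject geography = geography::Point(34.3665, -89.5192, 4326),
        @comp    geography = geography::Point(34.3700, -89.5200, 4326);

-- STDistance returns meters for SRID 4326, so candidate comparables
-- can be ranked by distance from the subject property.
SELECT @subject.STDistance(@comp) AS DistanceInMeters;
```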
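A tally (numbers) table, as used above to avoid looping, can generate a set such as a contiguous date range in one set-based query (the dates here are hypothetical):

```sql
-- One row per day for a year, with no WHILE loop or cursor.
;WITH Tally AS
(
    SELECT TOP (365)
           ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) - 1 AS n
    FROM   sys.all_columns          -- any sufficiently large row source
)
SELECT DATEADD(DAY, n, '2009-01-01') AS CalendarDate
FROM   Tally;
```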
Position: Sr. SQL Server Developer/Application DBA
Client: eBay, Inc., San Jose, CA
Project: Share of Voice (SOV)
June’07- April’09
SOV initially started to monitor how eBay keywords were doing in natural search at the search-engine level within the USA; later it was extended to all eBay international sites.
At a high level, SOV involves 5 steps:
1) Keywords Feeding uses the Win forms application
2) Scraping
3) Summarization
4) Reporting
5) Push data from the transactional database to Data warehouse using ETL
Scraper: using the keyword bucket at entity/country level, the scraper sends HTTP requests to the search engine through a pool of proxies. The search engine returns all natural search results; the scraper parses them and loads them into the database.
We used the proxies in an innovative way, tying each application server not to one proxy but to all of them. The technique used in SOV distributes the load across all proxies; it not only randomizes but also widely distributes the HTTP requests, thus avoiding detection by Google.
Once the data is populated in the database, SQL Server Reporting Services generates various reports for eBay at site/country level. These reports use stored procedures, functions and views written in T-SQL. All reports are delivered as Excel sheets to the business users via email.
All transactional keyword data is then pushed to the data warehouse using SSIS (ETL) for further analysis by business users.
Achievements: Successfully rolled out SOV project to 30 eBay international sites.
Roles and Responsibilities:
Monitoring SQL jobs, Performance bottlenecks, configuring databases, backup/restore DBs.
Wrote complex ETL scripts using SSIS to push the SOV data from the transactional database to DW on a regular basis. Involved in database design and implementation.
Did Data mining to help Data warehouse/Data mart team for further analysis of keyword data
Involved in Star Schema design and Implementation.
Used SSAS to generate Reports for Business needs
Implemented different types of Replication Models like Snapshot, Merge and Transactional.
Using performance tools like Index Tuning Wizard, SQL Profiler, and Windows Performance Monitor for Monitoring and Tuning MS SQL Server Performance.
Developed about 120 reports using SSRS and SSAS and rolled them out to 30 eBay international sites.
Added error handling such as TRY-CATCH to T-SQL procedures. Used ClearCase for version control.
Used Bugzilla to track bugs and produce consolidated bug reports.
Involved in the Design of Database and creation of tables, stored procedures, User Defined Functions, DTS packages using SQL Server 2005, 2000.
Scheduled the reports to run weekly and set them up to be delivered to sales and marketing teams via email. All reports were generated using Microsoft SQL Server Reporting Services.
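The TRY-CATCH error handling mentioned above typically wraps a transaction like this (the table name is hypothetical):

```sql
BEGIN TRY
    BEGIN TRAN;
    UPDATE dbo.Keywords SET IsActive = 0 WHERE SiteId = 77;  -- hypothetical
    COMMIT TRAN;
END TRY
BEGIN CATCH
    -- Roll back whatever the TRY block left open, then surface the error.
    IF @@TRANCOUNT > 0 ROLLBACK TRAN;
    SELECT ERROR_NUMBER() AS ErrorNumber, ERROR_MESSAGE() AS ErrorMessage;
END CATCH;
```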
Environment: Visual Studio .NET 2005, .NET Framework 2.0, C#, ASP.NET, ADO.NET, Perl, SQL Server 2005, Teradata, MySQL, IIS 6.0, JavaScript, XML, Windows 2003, ClearCase.
Position: Sr. SQL Server Developer/DBA
Client: Flash Seats LLC, Campbell, CA, USA
May’06 – May’07
Flash Seats (www.flashseats.com) is a new and exciting option introduced to Cavaliers season-ticket holders for the 2006-07 season. The concept, similar to the e-ticket in the airline industry, allows you to manage your season tickets via the Internet. Flash Seats gives Cavaliers fans a secure, anonymous, online marketplace where they can buy, sell and transfer Cavaliers game seats safely and conveniently. Since buyers and sellers remain anonymous, privacy is always protected. Flash Seats handles the payment transactions, and its paperless ticketing eliminates paper tickets: fans swipe any form of electronic ID (credit card, driver's license, etc.) at the gate and then enter the arena. No lost or stolen tickets.
Projects delivered successfully:
1) Account Management 2) Account Adjustment 3) Sales Activity and Reports 4) Dash Board summary 5) Venue Web Search 6) Ticket Holder Info Display 7) Invoice 8) Verify and Add ID 9) Admin Search 10) Bug Fixing
All these projects were well designed and developed using an n-tier architecture. We used the MVC and Facade patterns in admin tools such as Account Adjustment and Account Management, taking advantage of the fact that multiple views and controllers can interface with the same model under MVC.
Using the MVC pattern we separated all objects into three categories:
1) Models for maintaining data 2) Views for displaying all or a portion of the data
3) Controllers for handling events that affect the model, the view(s), or both.
Roles & Responsibilities:
Monitoring SQL jobs, Performance bottlenecks, configuring databases, backup/restore DBs.
Involved in database design and implementation
Involved in the Design of Database and creation of tables, stored procedures,
User Defined Functions, DTS packages using SQL Server 2005, 2000.
Designed and implemented key Sales Activity reports showing week-over-week sales performance, and Game Day reports giving the number of tickets redeemed at each entrance every 10 minutes at the venue. Scheduled the reports to run weekly and set them up to be delivered to sales and marketing teams via email. All reports were generated using Microsoft SQL Server Reporting Services.
Used SSAS to generate Reports for Business needs
Implemented different types of Replication Models like Snapshot, Merge and Transactional based on needs.
Using performance tools like Index Tuning Wizard, SQL Profiler, and Windows Performance Monitor for Monitoring and Tuning MS SQL Server Performance.
Optimized data-retrieval and data-modification queries against tables containing huge amounts of data, and created indexes to improve query performance.
Involved in design and development using .NET Framework 2.0, ASP.NET, C#, VB.NET, SQL Server 2005, WinForms, WebForms and web services. Used typed/untyped datasets, DataGrids, GridView, user controls, web controls and read-through caching. Wrote a connection-manager class to manage database connections efficiently.
Wrote Business Objects, Utility classes.
Environment: Visual Studio .NET 2005, .NET Framework 2.0, C#, ASP.NET, VB.NET, ADO.NET, web services, SQL Server 2005, Teradata, IIS 6.0, JavaScript, XML, Windows 2003, SourceGear Vault.
Position: Sr. SQL Database Developer/DBA
Client: MegaPath Networks Inc. Pleasanton, CA, USA.
Project: Regression Application Tool Nov’05 – April’06
This tool is designed as an intranet-based application. It has its own database, which holds all stored procedures, functions and tables; the application uses this database to perform ADD/COPY/COMPARE test-case actions.
It has an authentication page and uses Windows security. Authorized users can access the ADD-a-test-case page, which takes a server and database name for source and target, in addition to a test-case number, description and one or more service-location numbers.
It pulls all the information about the specified service-location number(s) from the source and applies algorithms before inserting into the target database. If the user gave valid inputs, it adds the test case to the target database. The application uses lookup tables and event-log tables to capture the history of each event: who did what, and when. Each test case is identified by a unique test-case number.
After a test case is successfully added, the user can copy it to a cross-server or local database, compare results between source and target to see whether it passed or failed, and generate a comprehensive report for ADD/COPY/COMPARE test cases using Crystal Reports 10.0.
Roles & Responsibilities:
Wrote complex ETL scripts using SSIS to push the data from the transactional database to DW. Involved in database design and implementation.
Did Data mining to help Data warehouse/Data mart team for further analysis of data
Involved in Star Schema design and Implementation.
Used SSAS to generate reports for business needs.
Implemented different types of Replication Models like Snapshot, Merge and Transactional.
Using performance tools like Index Tuning Wizard, SQL Profiler, and Windows Performance Monitor for Monitoring and Tuning MS SQL Server Performance.
Involved in this project from design through release, across all phases (requirements gathering, design, coding, testing and integration), working with the director of Information Systems at MegaPath Networks Inc.
I wrote all necessary SQL stored procedures, functions, triggers and test scripts to meet the requirements. Implemented the user interface in .NET using C#, with Microsoft Application Blocks as a coding reference. JavaScript was used for client-side validation. I performed application deployment and integration testing successfully.
Environment: ASP.NET, ADO.NET, C#, VB.NET, SQL Server 2000/2005, IIS 5.0, HTML, FrontPage 98, JavaScript, Visual Studio .NET, Crystal Reports 10.0, Bugzilla, MS Access 97/2000, MS Office
Position: Sr. Database Developer/DBA
Client: Resolution Health Inc (RHI), San Jose, CA, USA Aug’01 – Oct’05
Project: Mail Order Rule, PMH Rules, Event Rules
The mail order rule addresses medication cost and compliance issues by evaluating each member's active medications for mail-order opportunities. Mail order usually reduces the member's co-pay by enabling them to get a three-month supply of medications for the cost of two co-pays. These programs deliver three months' worth of medication with regular refills, thereby increasing the likelihood that the member will be compliant with the regimen.
The PMH rule identifies members who probably have a certain disorder and stores that as part of the past medical history; the results are available for other rules to use. The event rule identifies patients with a new or recurrent event and saves the date (date of service for the first diagnosis).
All the above rules are designed and developed to help patients take precautionary measures and save money on their health care.
Client Toolkit:
This is a web-based application designed for health care companies and RHI internal use. It runs rules on patient data through the treatment monitoring system and generates results; based on these results, messages were sent to either the patient or the provider.
Using this application, a health care company can see:
Patient treatment history
Current drugs the patient is on
What messages were sent for a specific patient and provider on a specific rule
Enter feedback for various