
Engineer Sql Server

Location:
Maylands, WA, Australia
Posted:
November 06, 2015


RESUME

NAME Alok Kumar Singh

QUALIFICATION Bachelor of Technology, Uttar Pradesh Technical University, 2007

OPERATING SYSTEMS Windows XP, Windows 7, Windows Server 2003/2008/2012

LANGUAGES C#, PHP, MATLAB/Simulink, JavaScript, jQuery, XML, WCF, MVC, SharePoint

RDBMS/DBMS SQL Server, MySQL

SUMMARY

Microsoft Certified Engineer with 8.5 years of experience in Project Management, Software Design and Development, specializing in Microsoft Technologies

Excellent technical knowledge of and experience in developing object-oriented enterprise applications and distributed application programming (ASP.NET, Web Services, COM/DCOM, SOAP, XML, C#.NET, ODBC/OLEDB and ADO.NET)

Extensive Experience in User Interface development (ASP.NET, ASP, MVC) using ASP.NET design patterns.

Good working experience in databases like SQL Server 2012, SQL Server 2008, SQL Server 2005, SQL Server 2000, MySQL.

Good working experience with SQL Server Integration Services (SSIS) and DTS packages.

Work experience includes various application servers such as IIS 5.0, IIS 6.0 and IIS 7.5

Worked extensively on application integration frameworks and Web Services using XML, SOAP, UDDI, WSDL and RPC

Good experience working with offshore and onsite teams.

Excellent working knowledge in SOA Architecture.

Worked extensively on Agile SDLC Methodologies.

Good working experience in reporting tools like Crystal Reports 2008, Crystal Reports XI, Crystal Reports 10, Active Reports, Microsoft Access Reports, Excel Charts.

Good working experience in reporting services using Crystal SDKs and SQL Server Reporting Services (SSRS)

Good working experience in the development tools (Visual Studio 2012, 2010, 2005, 2003, .NET Framework 4.5, 3.5, 2.0, 1.1, Visual Studio 6.0)

Good working experience in Visual SourceSafe 6.0, Microsoft Team Foundation Server and Visio 2003

Committed to excellence, self-motivator, quick-learner, diligent, team-player and a prudent developer with strong problem-solving skills and good communication skills.

EXPERIENCE SUMMARY

HCL Australia Services Pty Limited – Project Lead – April 13 – Till date

Location worked (address): WorleyParsons, Level 3, 250 St George Terrace, Perth WA 6000

HCL Technologies Limited – Lead Engineer - Feb. 12 – April 13

Location worked (address): HCL Technologies Ltd, Special Economic Zone, Software Tower-II, Plot 3A, Sector-126, Noida-201304 (U.P.)

Buzzworks Loyalty Solutions Pvt Ltd: Project Lead - Jan. 11 – Feb. 12

Location worked (address): Austin Town, Bangalore

Indicus Netlabs Pvt Ltd: Software Engineer - Sep. 08 – Jan. 11

Location worked (address): Nehru House, ITO, New Delhi

VisGain Solutions Pvt Ltd: Software Engineer - Jun. 07 – Sep. 08

Location worked (address): Sadh Nagar, Palam, Delhi

PROJECT DETAILS

1. ASM WorleyParsons Apr 12 – Till Now

WorleyParsons delivers projects, provides expertise in engineering, procurement and construction and offers a wide range of consulting and advisory services. As part of the ASM WorleyParsons engagement, HCL provides application development, maintenance and support for all WorleyParsons in-house and COTS applications across the globe. HCL delivers its services to WorleyParsons from 3 locations: Calgary (Canada), Perth (Australia) and Noida (India).

As a Service Delivery Lead I am responsible for

End-to-end responsibility, from requirement analysis to delivery

Leading a team that supports more than 30 applications

Handling the team (based offshore)

Providing L1-L4 support

Preparing schedules & test plans

Coding

Testing

Resolving defects, troubleshooting, and providing technical and business support to other downstream systems

2. Hardware Abstraction Layer Apr 12 – Jan 13

Our client provides robotic solutions for oil and gas wells. Oil companies identify a location for oil exploration and engage the client's services to carry it out. The client's tools and sensors are embedded devices used in oil exploration. A field engineer (or a group of them) is deputed to the site to perform the exploration. The tool string is inserted into the oil well, and the tools and sensors are powered by the client's equipment. The same electronics control the tools and send sensor data to a computer called the FE laptop. Software applications collect the data sent from the tools into XML files. The client analyses this data and sends a report to its customers. The data is also stored in a database at the client's office.

HCL manages the database and has provided an application for easy data mining.
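For illustration only, the C# sketch below shows the kind of XML-to-database flow described above: reading tool readings out of an XML file and inserting them into SQL Server. The file name, XML element names, connection string and SensorReadings table are hypothetical and are not taken from the client's actual system.

// Illustrative sketch only: hypothetical XML layout, connection string and table schema.
using System;
using System.Data.SqlClient;
using System.Xml.Linq;

class SensorXmlLoader
{
    static void Main()
    {
        // Hypothetical file of the kind produced by the FE laptop applications.
        XDocument doc = XDocument.Load("tool_run_001.xml");

        using (var conn = new SqlConnection("Server=.;Database=ToolData;Integrated Security=true"))
        {
            conn.Open();

            // Assumes each reading looks like <Reading><ToolId/><Timestamp/><Value/></Reading>.
            foreach (XElement reading in doc.Descendants("Reading"))
            {
                using (var cmd = new SqlCommand(
                    "INSERT INTO SensorReadings (ToolId, ReadingTime, Value) VALUES (@tool, @time, @value)",
                    conn))
                {
                    cmd.Parameters.AddWithValue("@tool", (string)reading.Element("ToolId"));
                    cmd.Parameters.AddWithValue("@time", (DateTime)reading.Element("Timestamp"));
                    cmd.Parameters.AddWithValue("@value", (double)reading.Element("Value"));
                    cmd.ExecuteNonQuery();
                }
            }
        }
    }
}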

As a Module Lead I was responsible for

Analysis

Design

Preparing schedules & test plans

Coding

Testing

Resolving defects, troubleshooting, and providing technical and business support to other downstream systems

A 5-member team was involved in the development of the project using C#, MATLAB, WCF and SQL Server 2008.

3. High on Life Feb 11 – Feb 12

HOL (High on Life) gives internal sales employees the freedom to use their loyalty program. The program brings different sales contests, as well as regular sales and incentive programs, under a single umbrella. It contains a pre-built reward catalogue from which employees redeem products based on loyalty points. An employee can track his/her performance, and a supervisor can see detailed reports on their subordinates. The Google charting API is used to generate various performance graphs.

Along with online redemption of products, this application provides the following facilities:

The application is divided into different access levels: agents, supervisors and administrators.

Processes MIS data and calculates loyalty points for the different programs based on it.

Various levels of data searching and refining to track performance.

Real-time graphs and charts generated using the Google Graph API.

Multiple sales contests share the same platform.

User tracking to maintain user demographics.

Backend call centre management to handle the OMS through the call centre.

As a Project Lead I was responsible for

Planning, analysis and writing of the solution design for the requirements

Handling the team involved in development and deployment

Implementing the Google API for real-time graphs

Writing backend logic to import and process MIS data

Writing the password retrieval module

An 8-member team was involved in the development of the project using ASP.NET, C#, jQuery, Google APIs and SQL Server 2008.

4. Worked on two projects simultaneously during this period Sep 08 – Jan 11

a. www.indicus.net

The website is used to publish research summaries, product features and sample data. The PHP-based open-source CMSes WordPress and Joomla are used to display media coverage and blogs. Different economic graphs are generated on a daily basis using the Google Graph API.

Regular data crawling and parsing is used to extract data from the websites of various government and non-government institutions. The open-source Lucene.Net library is used for data crawling and the HTML Agility Pack DLL is used for data parsing.
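As a rough, illustrative sketch of the parsing step (not the project's actual code), the C# snippet below uses the HTML Agility Pack to pull table rows out of a page; the URL and XPath expressions are hypothetical.

// Illustrative sketch only: hypothetical source URL and page structure.
using System;
using HtmlAgilityPack;

class TableScraper
{
    static void Main()
    {
        var web = new HtmlWeb();

        // Hypothetical statistics page containing a simple data table.
        HtmlDocument doc = web.Load("http://example.gov/statistics/annual.html");

        var rows = doc.DocumentNode.SelectNodes("//table//tr");
        if (rows == null) return;   // SelectNodes returns null when nothing matches

        foreach (HtmlNode row in rows)
        {
            var cells = row.SelectNodes("td");
            if (cells == null) continue;

            // Print each row's cells; real code would clean and store the values.
            foreach (HtmlNode cell in cells)
                Console.Write(cell.InnerText.Trim() + "\t");
            Console.WriteLine();
        }
    }
}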

This project includes the following functionality:

Data crawling and parsing from various government and non-government institutions' websites.

Use of the Google Graph API to generate graphs.

Three sub-domains of the website developed in ASP.NET.

Two sub-domains developed in PHP using the open-source Joomla and WordPress CMSes.

Two monthly newsletters sent to almost 100,000 users using PHPList.

Monthly online surveys to capture demographic and financial data.

The India Today-Indicus yearly survey on the Best Company to Work For.

As a Software Engineer I was responsible for

Development and maintenance of the product sub-domain.

Implementation of all open-source components and Google APIs.

Design and development of the Best Company to Work For survey.

Data crawling and parsing.

A 5-member team was involved in the development of the project using ASP.NET, C#, JavaScript, SQL Server, XML, jQuery, Google APIs, Lucene.Net, the HTML Agility Pack DLL and PHP.

b. www.raftaar.in

Raftaar is India's first Hindi search engine. It is a fully crawler-based search engine that categorizes data under different sub-categories. Raftaar uses a Lucene.Net-based crawler to search and parse the data and categorize it into different services.

Extensive data crawling, parsing and indexing is the backbone of Raftaar's services.
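For illustration only, the C# sketch below shows basic indexing and searching with the Lucene.Net 3.0.x API of the sort such a crawler might sit on top of; the field names, index path and sample text are assumptions, and real Hindi content would need a suitable analyzer rather than the StandardAnalyzer used here.

// Illustrative sketch only: field names, paths and text are assumptions.
using System;
using System.IO;
using Lucene.Net.Analysis.Standard;
using Lucene.Net.Documents;
using Lucene.Net.Index;
using Lucene.Net.QueryParsers;
using Lucene.Net.Search;
using Lucene.Net.Store;
using Version = Lucene.Net.Util.Version;

class CrawlerIndexSketch
{
    static void Main()
    {
        var dir = FSDirectory.Open(new DirectoryInfo("raftaar-index"));
        var analyzer = new StandardAnalyzer(Version.LUCENE_30);

        // Index one crawled page as a single Lucene document.
        using (var writer = new IndexWriter(dir, analyzer, true, IndexWriter.MaxFieldLength.UNLIMITED))
        {
            var doc = new Document();
            doc.Add(new Field("url", "http://example.in/news/1", Field.Store.YES, Field.Index.NOT_ANALYZED));
            doc.Add(new Field("content", "parsed page text goes here", Field.Store.YES, Field.Index.ANALYZED));
            writer.AddDocument(doc);
        }

        // Query the "content" field and print the matching URLs.
        using (var searcher = new IndexSearcher(dir))
        {
            var parser = new QueryParser(Version.LUCENE_30, "content", analyzer);
            TopDocs hits = searcher.Search(parser.Parse("text"), 10);
            foreach (ScoreDoc hit in hits.ScoreDocs)
                Console.WriteLine(searcher.Doc(hit.Doc).Get("url"));
        }
    }
}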

This project includes the following functionality:

Data crawling and parsing from various websites.

Categorizing the data by its domain area.

Regularly updating the parsing logic to extract information.

As a Software Engineer I was responsible for

Development and maintenance of 3 sub-domains: education, exam results and local news.

Part of the R&D team exploring new techniques and technologies to improve search and crawling.

Leading the search engine optimization team.

Data crawling and parsing.

A 20-member team was involved in the development of the project using ASP.NET, C#, JavaScript, SQL Server, XML, jQuery, Google APIs, Lucene.Net, the HTML Agility Pack DLL and PHP.

5. Hotel Management System Jun 07 – Sep 08

This project includes the following functionality:

Room booking, inventory management and finance management.

MIS management.

Multi-level reporting

As a Software Engineer I was responsible for

Development and maintenance.

GUI development.

Manual testing.

A 4-member team was involved in the development of the project using C#, SQL Server and Crystal Reports.


