Sumaiya Akhtar
*******.*******@*****.***
Phone: 404-***-****
QUALIFICATION SUMMARY
Over 10 years of experience in software Quality Assurance, Data Analysis, and Application Support, with a strong background in SQL, Oracle, ETL, UNIX and shell scripting, and extensive working experience on the AT&T account.
Proficient in conceptualizing, investigating, and providing technical solutions, with strong back-end and end-to-end (E2E) testing skills.
Certified AWS Cloud Practitioner
TECHNICAL SUMMARY
Databases: Toad for Oracle 10.6, Teradata, SQL Developer, PostgreSQL
Test Management Tools: Rally, HP Quality Center, IBM Rational testing tools, Jira, qTest, Zephyr
Technologies: Business Objects Reporting, MicroStrategy, Oracle Hyperion, Hadoop systems (HDFS), Hive, UNIX shell command line, Elasticsearch, TWS Scheduler, SQL, Selenium Java, Cucumber Framework, TestNG, JMeter, Postman, Newman, Rest-Assured, Apache POI, Reports, AWS, PuTTY, StreamDiff, GitLab
EDUCATION
B.E (Comp. Sc.) HKBK College of Engineering, Bangalore, India
PROFESSIONAL EXPERIENCE
Nov 2018 – Current
Precisely (formerly Cedar Document Technologies)
Senior Software Engineer - Quality Management
Dunwoody GA
Precisely provides private cloud-based management services: communication and digital service platforms for B2B/B2C businesses, along with software solutions. Its offerings include financial statements, digital composition, billing and invoicing, mortgage and loan payment services, and customer communications via e-mail and SMS alerts, including Rapid Communications.
Precisely ingests raw data from clients, transforms the data by adding resources, inserts, attachments, etc.,
and composes the communication to be delivered in the client's preferred mode of delivery.
The current system is in the process of being decoupled and re-implemented as microservices/API services hosted on the AWS cloud.
Roles and Responsibilities:
Worked as part of an Agile Scrum team and supported project work sprint by sprint.
Automated the UI of various client portals using Java Selenium with the TestNG framework and generated test reports using Allure (a minimal Selenium/TestNG sketch follows this list).
Used Rest-Assured with the Cucumber framework for automating APIs (a Rest-Assured step-definition sketch also follows this list).
Performed API testing using Postman and Newman with multiple collection runs.
Performed manual testing of the UI and APIs and raised defects to report issues.
Performed deployments using Jenkins for the UI and back-end Rapid deployments on Linux servers for every release; kept servers healthy by purging old logs and archiving them for later reference.
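The following is a minimal sketch of the kind of Selenium/TestNG UI check described above. The class name PortalLoginTest, the portal URL, element IDs, and credentials are illustrative placeholders rather than actual client details, and it assumes chromedriver is available on the PATH.

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.testng.Assert;
import org.testng.annotations.AfterClass;
import org.testng.annotations.BeforeClass;
import org.testng.annotations.Test;

public class PortalLoginTest {

    private WebDriver driver;

    @BeforeClass
    public void setUp() {
        // Assumes chromedriver is on the PATH.
        driver = new ChromeDriver();
    }

    @Test
    public void loginPageShowsExpectedTitle() {
        // The URL is a placeholder, not an actual client portal.
        driver.get("https://portal.example.com/login");
        Assert.assertTrue(driver.getTitle().contains("Login"),
                "Login page title did not match");
    }

    @Test(dependsOnMethods = "loginPageShowsExpectedTitle")
    public void userCanLogIn() {
        // Element IDs and credentials are illustrative only.
        driver.findElement(By.id("username")).sendKeys("qa_user");
        driver.findElement(By.id("password")).sendKeys("secret");
        driver.findElement(By.id("loginButton")).click();
        Assert.assertTrue(driver.findElement(By.id("dashboard")).isDisplayed(),
                "Dashboard was not displayed after login");
    }

    @AfterClass
    public void tearDown() {
        driver.quit();
    }
}
```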
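A second sketch, below, shows Rest-Assured calls wired into Cucumber step definitions to illustrate the API automation approach. The class CommunicationApiSteps, the base URI, the /communications/{id}/status path, and the "channel" response field are hypothetical, not the actual service contract.

```java
import static io.restassured.RestAssured.given;
import static org.hamcrest.Matchers.equalTo;

import io.cucumber.java.en.Given;
import io.cucumber.java.en.Then;
import io.cucumber.java.en.When;
import io.restassured.RestAssured;
import io.restassured.response.Response;

public class CommunicationApiSteps {

    private Response response;

    @Given("the communication service is reachable")
    public void serviceIsReachable() {
        // Placeholder base URI for an internal endpoint.
        RestAssured.baseURI = "https://api.example.com";
    }

    @When("I request the status of communication {string}")
    public void requestStatus(String id) {
        response = given()
                .header("Accept", "application/json")
                .when()
                .get("/communications/{id}/status", id);
    }

    @Then("the status code is {int}")
    public void statusCodeIs(int code) {
        response.then().statusCode(code);
    }

    @Then("the delivery channel is {string}")
    public void deliveryChannelIs(String channel) {
        // Verifies a JSON field in the response body.
        response.then().body("channel", equalTo(channel));
    }
}
```

A matching Gherkin feature file would drive these steps, e.g. "When I request the status of communication \"12345\"".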
Feb 2010 – Nov 2018
Amdocs
Amdocs - AT&T Account / Alpharetta, GA
Big Data / Data Lake / Hadoop:
Data Lake (DL): The Data Lake is a storage platform for capturing and processing vast amounts of multi-structured data that has typically been cost-prohibitive to store and analyze. It is a cross-platform collaboration environment that is highly scalable with low-latency performance. The Data Lake solution uses the ingestion framework to load data into the stage area and then into the gold area of the HDFS cluster.
Roles and Responsibilities:
* Used the SVN utility to pull new code onto the ST server and the TWS scheduler for running and scheduling jobs.
* Validated the table structure and file structure against the AID/HLD after the ingestion completed.
* Subscribed to the source feed in the subscriber config table with the landing zone path and completed the subscriber Invenio setup for the Data Router.
* Ingested data via source tables or files; used Sqoop to ingest tables into the Hadoop cluster.
* The Logstash process reads the files, enhances the data, and converts them into JSON files.
* Followed the deployment guide to replicate production steps when the project went live.
* Validated counts and data from the file in the landing zone vs. the HDFS stage vs. the Hive stage and Hive gold layers (a Hive JDBC count-comparison sketch follows this list).
* Published data to downstream systems through a Kafka broker and received response files back from them to process during ingestion (a Kafka producer sketch also follows this list).
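The sketch below illustrates the stage-vs-gold count validation described above, using the Hive JDBC driver. The class HiveCountValidator, the jdbc:hive2 URL, credentials, and the stage_db/gold_db table names are placeholders, and it assumes the Hive JDBC driver is on the classpath.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveCountValidator {

    // Placeholder connection string; real host, port, and database would differ.
    private static final String HIVE_URL = "jdbc:hive2://hive-host:10000/default";

    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(HIVE_URL, "qa_user", "")) {
            // stage_db and gold_db table names are illustrative only.
            long stageCount = rowCount(conn, "stage_db.customer_feed");
            long goldCount = rowCount(conn, "gold_db.customer_feed");

            if (stageCount != goldCount) {
                System.out.printf("MISMATCH: stage=%d gold=%d%n", stageCount, goldCount);
            } else {
                System.out.printf("OK: %d rows in both stage and gold%n", stageCount);
            }
        }
    }

    private static long rowCount(Connection conn, String table) throws Exception {
        try (Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM " + table)) {
            rs.next();
            return rs.getLong(1);
        }
    }
}
```

The same pattern extends to column-level data checks by comparing sampled rows between the stage and gold tables.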
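A second sketch shows a minimal Kafka producer of the kind used to publish data to downstream systems. The class DownstreamPublisher, the broker address, topic name, and JSON payload are placeholders, not the actual feed definitions.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class DownstreamPublisher {

    public static void main(String[] args) {
        // Broker address, topic, and payload are placeholders.
        Properties props = new Properties();
        props.put("bootstrap.servers", "kafka-broker:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            ProducerRecord<String, String> record =
                    new ProducerRecord<>("downstream-feed", "record-key", "{\"id\":1}");
            // send() is asynchronous; the callback reports the broker's acknowledgement.
            producer.send(record, (metadata, exception) -> {
                if (exception != null) {
                    System.err.println("Publish failed: " + exception.getMessage());
                } else {
                    System.out.printf("Published to %s-%d@%d%n",
                            metadata.topic(), metadata.partition(), metadata.offset());
                }
            });
        }
    }
}
```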
Business Objects - SAP Business Objects Enterprise XI – Reporting
Reporting tools are widely used to support decision making and measure performance. Companies use them for financial consolidation, for evaluation of strategies and policies, and often just for plain reporting. Today most of these tools are integrated with Business Intelligence tools. Reporting tools allow companies to create attractive reports easily, and with reports containing the right information, people can manage and improve business processes more easily. The SAP BO tool revolves around concepts such as Universe, Class, Object, Filters, and Report.
Responsibilities:
Validated BO Universes - objects/metrics, joins, reports, and other logic applied to the BO Universes; generated and validated BO ad hoc reports.
Wrote SQL queries to create test scripts that validate pre-defined canned reports.
Used the AID to create test cases for objects, joins, and reports; added validations to verify the mapping and checked the data by running the BO-generated SQL against SQL written by the team in the back end (Teradata or Oracle). For joins, verified the cardinality (a JDBC comparison sketch follows the report descriptions below).
Report test cases covered both on-demand and canned reports.
Canned reports: scheduled reports that run automatically, without user intervention, for a given date/month.
On-demand reports: wait for the user to enter values/prompts and run according to the user's request.
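The sketch below illustrates comparing BO-generated SQL against tester-written SQL over JDBC. The class ReportSqlValidator, the Oracle JDBC URL, credentials, and both queries are hypothetical stand-ins; in practice one query comes from the BO report and the other is written from the AID mapping.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ReportSqlValidator {

    public static void main(String[] args) throws Exception {
        // Placeholder Oracle connection details; a Teradata JDBC URL works the same way.
        String url = "jdbc:oracle:thin:@db-host:1521:ORCL";

        // Illustrative queries only: boReportSql stands in for the SQL generated by the BO report,
        // testerSql for the SQL written independently from the AID mapping.
        String boReportSql =
                "SELECT region, SUM(revenue) FROM sales_fact GROUP BY region ORDER BY region";
        String testerSql =
                "SELECT r.region_name, SUM(s.rev_amt) FROM sales s "
              + "JOIN region r ON s.region_id = r.region_id "
              + "GROUP BY r.region_name ORDER BY r.region_name";

        try (Connection conn = DriverManager.getConnection(url, "qa_user", "password");
             Statement s1 = conn.createStatement();
             Statement s2 = conn.createStatement();
             ResultSet boRows = s1.executeQuery(boReportSql);
             ResultSet testRows = s2.executeQuery(testerSql)) {

            int row = 0;
            while (boRows.next() && testRows.next()) {
                row++;
                // Compare the measure column with a small tolerance for rounding.
                if (Math.abs(boRows.getDouble(2) - testRows.getDouble(2)) > 0.001) {
                    System.out.printf("Row %d mismatch: BO=%f vs tester=%f%n",
                            row, boRows.getDouble(2), testRows.getDouble(2));
                }
            }
            System.out.println("Compared " + row + " rows.");
        }
    }
}
```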
Amdocs - T-Mobile Account / Dunwoody, Georgia
End-to-End Testing Analyst
T-Mobile: T-Mobile USA is currently the third-largest wireless carrier in the United States, with approximately 34 million customers. I was involved in complete end-to-end (E2E) testing from the T-Mobile Ihaps systems all the way through the back ends (Samson, HSO, and PAG), and tested all the T-Mobile CARE, Retail, and Web apps.
Responsibilities:
* Experienced in live switch testing with the production HLR; tested scenarios such as activation, creating data usage, throttling, and receiving SMS from engineering systems.
* Highly proficient in writing SQL queries to suit the testing criteria or scenario; used joins and other relational operators to improve query performance.
* Experienced in writing and executing test plans and scripts; used BPT business components to design test cases in QC and the Requirements tab to cover all project requirements and link them to test cases.
* Experienced in creating project exit reports covering all relevant issues and risks after successful project execution.
* Worked proficiently on the UNIX platform to create unbilled usage, run scripts, and run billing for subscribers.
* Used the Defects tab in QC for logging and tracking defects and opened Lifelines with the development/middleware teams to resolve issues.
* Worked on T-Mobile web apps such as TMO.com (the online web portal), MyT-Mobile.com (the online customer self-service portal), and E-Bill (used for invoice presentation). After performing activation from TMO.com, as an E2E tester I also validated the DB for activation charges and MSISDN and tested the welcome SMS from the engineering systems.
* Tested retail apps such as Watson (which gives T-Mobile retail stores access to account creation), CAM/ICAM (which enables customer management), and POS (the retail register system that accepts payments). As an E2E tester, I executed scenarios where I created an account, added/deleted a SOC, and made a payment, then validated the entries in the Samson DB.
* Tested CARE systems such as HSO (used for handset upgrades and exchanges) and CSM (which provides the customer CARE interface to Samson for activations and account maintenance). As an E2E tester, I performed handset upgrades/exchanges and validated the entries in the HSO DB and the charges in Samson.
* Worked on T-Mobile prepaid projects and thoroughly tested the entire prepaid flow: activation, receiving the welcome SMS, and SMS notifications at 80% and 100% tethering usage. Also tested the unlimited international price plans on the production HLR and validated the correct decrement rates. T-Mobile uses a separate DB for prepaid, which is PAG.
* Participated in night-of-deployment (NOD) testing with the business for all the projects I performed E2E testing on.
* Experienced in testing middleware files/flows that feed the back end or DB; the aim was to reduce the number of application integration defects reaching production and increase business confidence prior to deploying a new release.