
Data SQL Server

Location:
Arlington, TX
Posted:
February 22, 2018

Contact this candidate

Resume:

Yamira Roque

ac4ksr@r.postjobfree.com

Summary:

• More than 7 years of IT experience in data analysis, data architecture, application design, dimensional data modeling, and the implementation and testing of enterprise data warehousing, NoSQL databases, enterprise databases and data mining.

• More than 7 years of hands-on experience with modeling (object, logical, and physical models for relational databases) for the PostgreSQL manager.

• Excellent experience in the design of UML diagrams using modeling tools such as Erwin and Visual Paradigm.

• Strong experience in physical modeling for multiple platforms such as PostgreSQL, MariaDB, Oracle, MySQL, SQLite, SQL Server, DB2 and Sybase IQ.

• Expertise in the Analysis, Design and Development of Data warehousing solutions and in developing strategies for Extraction, Transformation and Loading process using Ab Initio, Pentaho, Talend and SSIS.

• Experienced in data analysis and data profiling using complex SQL in various source systems, including PostgreSQL, MariaDB, Oracle, SQL Server, Sybase IQ, Teradata, DB2 platforms.

• Experienced in server monitoring and in enabling and using logs on multiple platforms such as PostgreSQL, MariaDB, Oracle, MySQL, SQLite, SQL Server, DB2, Sybase IQ.

• Solid experience in the Transform, Partition, Departition, Dataset and Database components using Ab Initio, Pentaho and Talend.

• Strong experience in configuring replication processes and data clustering with tools such as Pgpool, Symmetry, SharePlex, MariaDB Galera, Oracle GoldenGate and Reko.

• Strong knowledge of index development, table partitioning, query optimization and tuning of configuration variables on PostgreSQL platforms.
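As an illustration of the partitioning and indexing techniques above, a minimal PostgreSQL 8.x/9.x-style sketch using table inheritance (all table and column names are hypothetical):

```sql
-- Illustrative only: range partitioning by month via table inheritance,
-- the approach available on PostgreSQL 8.x/9.x. All names are hypothetical.
CREATE TABLE measurements (
    id        bigserial,
    logged_at timestamptz NOT NULL,
    reading   numeric
);

-- Child table holds one month; the CHECK constraint lets the planner
-- skip it when constraint_exclusion is enabled.
CREATE TABLE measurements_2018_01 (
    CHECK (logged_at >= DATE '2018-01-01' AND logged_at < DATE '2018-02-01')
) INHERITS (measurements);

-- Index each partition on the filter column to support range queries.
CREATE INDEX measurements_2018_01_logged_at_idx
    ON measurements_2018_01 (logged_at);

-- Enable partition pruning for queries against the parent table.
SET constraint_exclusion = partition;
```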

• Excellent experience in the design of transactional, dimensional and multidimensional data structures using data modeling tools such as Erwin and Visual Paradigm.

• Strong experience in SQL and ETL development including data mapping, data cleansing and data profiling.

• Detailed knowledge of SSAS, SSRS, SSIS, T-SQL, reporting and analysis.

• Experience with the database normalization method defined by E. F. Codd.

• Experience in data warehouse design, data modeling and database administration activities.

• Extensively worked on Cassandra data modeling and database design.

• Experience in incorporating business requirements into conceptual and logical data models using Visual Paradigm, and into physical data models created using forward engineering techniques to generate DDL scripts.

• Experience in writing complex SQL and optimizing queries in various managers: PostgreSQL, MariaDB, Oracle, SQL Server, Teradata, DB2, Sybase IQ and MySQL.

• Extensive experience with OLTP/OLAP systems and E-R modeling, and in developing database schemas such as the Star schema and Snowflake schema used in relational, dimensional and multidimensional modeling.

• Extensive work experience in normalization and denormalization techniques for OLTP and OLAP systems, creating database objects such as tables, constraints (primary key, foreign key, unique, default) and indexes.

• Experience in administering large, multi-terabyte PostgreSQL databases. Strong experience in creating database objects such as tables, views, functions, stored procedures, indexes, triggers and cursors in PostgreSQL.

• Experience in testing and writing SQL and PL/SQL statements - stored procedures, functions, triggers and packages - using various managers: PostgreSQL, MariaDB, Oracle, DB2, Sybase IQ, SQL Server and MySQL.
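A trigger of the kind mentioned above might look like this minimal PL/pgSQL sketch (table and column names are hypothetical):

```sql
-- Illustrative audit trigger in PL/pgSQL; all object names are hypothetical.
CREATE OR REPLACE FUNCTION log_price_change() RETURNS trigger AS $$
BEGIN
    -- Record the change only when the price actually differs.
    IF NEW.price IS DISTINCT FROM OLD.price THEN
        INSERT INTO price_history (product_id, old_price, new_price, changed_at)
        VALUES (OLD.id, OLD.price, NEW.price, now());
    END IF;
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER trg_price_audit
    BEFORE UPDATE ON products
    FOR EACH ROW EXECUTE PROCEDURE log_price_change();
```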

• Experienced in data loading using PL/SQL, SQL Server Integration Services (SSIS) packages, Pentaho Data Integration (PDI), Talend and Oracle Data Integrator (ODI), and in dashboard reports developed using SQL Server Reporting Services (SSRS).

• Basic knowledge in the functioning and development of the following languages: Java, Visual Basic, R, .Net, C, C++, C# and Prolog.

• Experienced in full Software Development Life Cycle (SDLC) starting from collecting Business specifications, Analysis, Design and Development, Testing, documenting the entire life cycle and working with vendors for data acquisition.

• Excellent experience in writing and executing unit, system, and integration test scripts in data warehouse projects. Experience using scheduling tools such as Control-M.

• Experience in working with different operating systems WINDOWS, UNIX, LINUX.

• Experienced in Shell scripting and command line utilities.

• Worked in the team and faced challenges during the project development and production process.

• Strong analytical, logical, communication and problem solving skills and ability to quickly adapt to new technologies through self-learning.

Education Details:

• Bachelor's degree in Computer Engineering, 2012, University of Computer Science, Havana, Cuba

Technical Skills:

Databases: Oracle 11g/12c, PostgreSQL 8.x/9.x, MySQL, SQL Server, SQLite, DB2, MariaDB, MS Access, Sybase, Cassandra, MongoDB.

ETL/Data Warehouse Tools: Ab Initio, ODI, Talend, Pentaho, SSAS, SSIS

Data Modeling: Erwin, Visual Paradigm, Crystal Reports

Tools: SSRS, MS Office suite (Word, Excel, MS Project and Outlook)

Operating Systems: Windows, Unix, Linux (Ubuntu, Debian)

Tools & Software: GIT, Subversion, AWS, Pgpool, Symmetry, SharePlex, Oracle GoldenGate, Reko

Project Execution Methodologies: Ralph Kimball and Bill Inmon, XP, SCRUM, Rational Unified Process (RUP)

Programming Languages: SQL, PL/SQL, T-SQL, DQL, C, Shell Scripting, Python, PHP

Protocols: HTTPS, HTTP, SSL, SMS, MMS, OpenSSL, SMTP, FTP

Professional Experience:

Company Maestranza – Santiago, Chile August 2016 to December 2017

IT Specialist

Responsibilities:

• Designed, integrated, and maintained logical data models for analytic and operational data management platforms.

• Served as a liaison between architectural oversight teams and business owners, providing technical guidance to project teams to facilitate requirements definition and system design.

• Ensured compliance with enterprise data structures and master data management strategies through ongoing communication and coordination with DBAs, application developers, and data stewardship teams.

• Provided technical solutions for SQL Server database issues associated with a local automotive client's implemented software package; worked with the client's vendor to resolve software integration challenges; provided consulting and technical guidance on networking options; implemented the client's electronic invoicing solution.

• Designed and deployed the client's website, and designed concepts for corporate branding, marketing materials, and publications.

• Provided ongoing technical expertise for an independent insurance agency client, including software acquisition, modification, and integration solutions, operational impact assessments for daily data storage and backup utilization, and best-practices guidance for software implementation and maintenance.

• Designed and configured different replication environments using Oracle GoldenGate.

• Provided maintenance services and error correction for websites.

• Constructed diagrams and graphs for statistical analysis of information; developed complex queries and maintained and configured database environments.

Environment: Oracle, Oracle GoldenGate, Erwin, SQL, PL/SQL, C, DQL, MS Excel 2007, PHP, Symfony, JavaScript, JSON, BSON and XML.

UCI - Havana, Cuba January 2015 to June 2016

Software Engineer

Description: UCI is a technology company headquartered in Havana, Cuba. UCI provides information technology and consulting services to businesses and governments.

Project: Development of a Replication Tool.

Responsibilities:

• Studied in depth the concepts related to replication, migration, and centralization of information across multiple data centers. Installed, configured, and evaluated different replication tools (Pgpool, Symmetry, SharePlex, MariaDB Galera, Oracle GoldenGate, Reko) in order to determine the functional and non-functional requirements for the replication tool to be developed.

• Proactively foster successful working relationships with all Agile project team members by encouraging open discussions and facilitating cooperative and timely consensus among stakeholders.

• Translated business requirements into conceptual, logical and physical data models using Erwin.

• Created tables, views, sequences, indexes, constraints and generated SQL scripts for implementing physical data model.

• Implemented triggers in C that manage the process of capturing information in an agile and efficient way using the libpq C library.

• Extensively developed aggregations, key performance indicators, and browsing perspectives.

• Worked on Shell scripting to support Unix platform.

• Created, maintained, and optimized database objects for multiple managers.

• Led design and code reviews.

• Developed and maintained the data dictionary to create metadata reports for technical and commercial purposes.

• Built complex SQL queries and functions associated with the business, interacting with the attributes and data collected in the System Catalogs and Information Schema.

• Developed and maintained metadata and metadata management strategies for the data model.

• Segmented information by criteria, guaranteeing better query performance and availability of information.

• Performed analysis, developed specifications, and designed and developed reports and dashboards using Crystal Reports, MS Excel, and other standard reporting tools.

• Involved in extensive data validation, writing several complex SQL and DQL queries; involved in back-end testing and worked on data quality problems.

• Implemented cryptographic functions and validation of the uniqueness of the information to be replicated using the pgcrypto and uuid-ossp modules.
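The pgcrypto and uuid-ossp extensions referenced above can be used along these lines (the table and its columns are illustrative):

```sql
-- Illustrative use of the pgcrypto and uuid-ossp extensions.
CREATE EXTENSION IF NOT EXISTS pgcrypto;
CREATE EXTENSION IF NOT EXISTS "uuid-ossp";

-- A SHA-256 fingerprint of a row's content, usable for change detection
-- before replicating (customers and its columns are hypothetical).
SELECT encode(digest(name || '|' || email, 'sha256'), 'hex') AS row_hash
FROM customers;

-- A globally unique identifier to tag replicated rows across data centers.
SELECT uuid_generate_v4();
```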

• Implemented internal SQL and PL/SQL functions to guarantee synchronization of information either asynchronously or synchronously, depending on client needs.

• Expert in data structure design and data modeling in Cassandra.

• Designed various tables based on requirements to minimize joins, and modified queries to perform client-side joins.

• Strong understanding of complex inner workings of Cassandra, such as the gossip protocol, hinted handoffs, read repairs.

• Good understanding of tombstones and compaction of SSTables.

• Experience in setting up Cassandra cluster.

• Strong working knowledge of CQL (Cassandra Query Language) and Materialized views on Cassandra file system.

• Designed the queries with different consistency levels based on the requirement.

• Worked on performance tuning of cluster using Cassandra Configuration file and JVM Parameters.

• Imported data from various sources into the Cassandra cluster using Java and APIs.
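A CQL sketch of this query-driven design, with the partition key chosen so the read needs no join and the consistency level set per request (all names are illustrative):

```sql
-- Illustrative CQL: one table per query, clustered for the access pattern.
CREATE TABLE readings_by_sensor (
    sensor_id uuid,
    logged_at timestamp,
    value     double,
    PRIMARY KEY (sensor_id, logged_at)
) WITH CLUSTERING ORDER BY (logged_at DESC);

-- In cqlsh, raise the read consistency before a stronger read:
--   CONSISTENCY QUORUM;
SELECT logged_at, value
FROM readings_by_sensor
WHERE sensor_id = 123e4567-e89b-12d3-a456-426655440000
LIMIT 10;
```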

• Conducted complex ad-hoc programming and analysis including statistical programming and analysis.

• Developed and updated a guide that simplifies the design of centralization and data integration, as well as the launch of applications into the production environment, validating it across 10 different projects with multiple data servers.

Environment: Oracle 11g, PostgreSQL 9.0, SQL Server 2008, Unix, Pgpool, Symmetry, MariaDB Galera, SharePlex, Oracle GoldenGate, Reko, Erwin, Shell Scripting, SQL, PL/SQL, C, DQL, libpq-C, Crystal Reports, MS Excel 2007, pgcrypto and uuid-ossp modules, Sybase IQ, JSON, BSON, XML, System Catalogs and Information Schema.

UCI - Havana, Cuba December 2013 to December 2014

Software Engineer

Project: Development of ERP software.

Responsibilities:

• Worked in data analysis, data profiles and data governance, identifying datasets, source data, source metadata, data definitions and data formats.

• Designed the Logical Data Model using ERWIN for each thematic area.

• Worked on SQL query scripts, complex stored procedures and routines, user-defined functions, database triggers and events, and views.

• Developed a roadmap and long-term data warehouse architecture, and designed and built the data warehouse framework.

• Involved in the analysis, design, testing and implementation of Business Intelligence solutions using Data Warehouse, ETL, OLAP, Client / Server applications.

• Managed the import of data from various data sources and transformations using Ab Initio.

• Designed and developed the architecture for a data services ecosystem encompassing relational and NoSQL technologies.

• Specified the general data architecture for all areas and domains of the company, including data acquisition, data warehouse, data migration, ETL and BI using Ab Initio.

• Installed and configured servers and clients using PostgreSQL; created and updated SQL databases.

• Advised on and implemented data governance to improve data quality and integrity, and monitored the collection and management of operational data in PostgreSQL.

• Developed stored procedures and triggers; provided information to resolve problems and data limitations.

• Responsible for implementing and scheduling jobs, alerts, and package maintenance.

• Created parameterized, cascading, cross-tab, and drill-down reports.

• Implemented an internal mechanism to monitor and prevent data corruption by intruders, guaranteeing the security and integrity of the information.

• Developed data warehousing, batch processing, click-stream analysis, data movement, parallel data cleansing and validation, data transformation and filtering, high-performance analytics, and real-time parallel data capture using Ab Initio.

• Managed user and server security by establishing secure connections using the SSL/TLS system variables and the encryption plugin API in the PostgreSQL manager.
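A minimal configuration sketch of the TLS setup described above; the certificate file names and the client rule are assumptions, not the actual production values:

```
# postgresql.conf - enable TLS (certificate file names are assumptions)
ssl = on
ssl_cert_file = 'server.crt'
ssl_key_file  = 'server.key'

# pg_hba.conf - require TLS for remote password-authenticated connections
hostssl  all  all  0.0.0.0/0  md5
```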

• Demonstrated competence in SQL across a range of managers (MySQL, MariaDB, DB2, SQL Server and Oracle); wrote and modified SQL*Loader and PL/SQL code including packages, materialized views, nested tables, triggers, and functions.

• Developed, coded, and debugged stored procedures and various other objects in Oracle, PostgreSQL, SQL Server, DB2 and other managers.

Environment: MariaDB, Oracle, MySQL, PostgreSQL, SQL Server, Unix, Shell Scripting, Ab Initio, NoSQL, ETL, OLAP, OLTP, SQL, PL/SQL, Erwin, JSON and XML.

Hotel Chains - Havana, Cuba February 2011 to October 2013

Software Engineer

Responsibilities:

• Developed the logical and physical models from the conceptual model using Visual Paradigm, based on analysis of business requirements.

• Obtained, installed, and upgraded databases using the MariaDB and PostgreSQL managers.

• Responsible for designing and implementing efficient stored procedures and packages in PDI.

• Designed a high-level ETL architecture for the general transfer of data from OLTP to OLAP.

• Used tools and methods to back up and restore databases in the MariaDB and PostgreSQL managers.

• Created and executed ETL packages to consolidate data from various data sources for different data loading operations across many applications.

• Determine the best approach to design data structures that support easy data consumption, trend identification and the ability to highlight exceptions through reporting tools and dashboards.

• Used reverse engineering for a wide variety of RDBMSs, including MS Access, Oracle and DB2, to connect to existing databases and create graphical representations using Visual Paradigm.

• Migrated data from PostgreSQL and MariaDB to the data warehouse using integration services to generate reports.

• Created and designed Data Sources and Data Source Views, and configured OLAP cubes (Star and Snowflake schemas) using Ab Initio.

• Implemented ETL techniques for data conversion, data extraction and data mapping for different processes and applications using Ab Initio.

• Worked extensively with the database to implement data cleanup and performance adjustment techniques in the MariaDB and PostgreSQL managers.

• Developed indexes, partitioned tables, rewrote queries, and tuned variables to optimize the filtering and query process in the MariaDB and PostgreSQL managers.

• Monitored the PostgreSQL server and enabled and used logs.

• Used the PostgreSQL Audit Plugin to record server activity in each client session, sending this information to a rotating log file or the local syslogd.

• Used Pgpool for replication and data clustering.

• Maintained data coherence by evaluating and updating physical and logical data models to support new and existing projects.

• Developed an understanding of the schedule of data from the originating system, and worked within these limitations to provide information in a timely manner.

• Resolved data type inconsistencies between the source systems and the target system using mapping documents and by analyzing the database through SQL queries.

• Generated DDL using forward engineering; worked on merging and complete comparison of the physical models with the PostgreSQL manager.

Environment: SQL, Visual Paradigm, XML, JSON, UML, MariaDB, MS Access, Oracle, MySQL, PostgreSQL, MariaDB Galera, MS Excel, PowerPoint, MS Word, Microsoft SQL, Pentaho Reporting and Ab Initio.

Cupet - Havana, Cuba March 2010 to January 2011

Data Scientist

Description: Cupet Corporation is responsible for the transportation and distribution of cargo throughout the country. Its main task lies in the distribution of oil and in the construction and maintenance of the associated facilities, constantly making decisions based on the behavior of certain factors to provide optimal service at all levels.

Responsibilities:

• As an architect, designed conceptual, logical and physical models using Visual Paradigm, and built Data Marts using hybrid Inmon and Kimball DW methodologies.

• Worked closely with companies, data management and suppliers to define data requirements.

• Worked with research, discovery and data mapping tools to scan each data record from several sources.

• Designed the Data Mart prototype and documented the result for the end user.

• Involved in the modeling of business processes using UML.

• Developed and maintained the data dictionary to create metadata reports for technical and commercial purposes.

• Created SQL tables with referential integrity and developed queries using SQL and PL/SQL.

• Performed data analysis and data profiling using complex SQL on various source systems, including Oracle and DB2.

• Implemented replication, encryption, aggregation, indexing and security of the information.

• Analysis, Design and Development of Data warehousing solutions using Ab Initio.

• Designed, coded, and unit-tested ETL packages using the Ab Initio suite.

• Created a Logical Design and Physical Design in Erwin and used Erwin for developing Data Model using Star Schema methodologies.

• Developed ETL jobs for automation and was responsible for job performance optimization using Ab Initio and Oracle.

• Controlled data cube development based on requirement prioritization and consolidation.

• Conducted learning sessions for business users, analysts, and the QA team to provide insight into the data model, and provided documentation.

• Developed several data models extracting and using data from various sources: DB2, Excel, files XML / JSON, SQLite and Oracle.

• Managed import data from various data sources, performed transformations using Ab Initio.

• Interaction with Business Analyst and other data architects to understand the needs and functionality of companies for various project solutions.

• Designed 3NF data models, OLTP systems and dimensional data models.

Environment: Ab Initio, PostgreSQL, OLTP, Oracle 10g, OLAP, DB2, metadata, MS Excel, Visual Paradigm, Rational Rose, PL/SQL, SQL, SQLite, XML, JSON, Tortoise SVN, etc.


