
Project Manager

Location:
Denver, CO, 80247
Posted:
November 02, 2010


Professional Summary:

. ** plus years of work experience in the IT industry.

. Design, development and DBA work on Teradata, including designing, developing and supporting Teradata features such as Teradata Stored Procedures, Teradata V2R5 Triggers and Teradata UDFs, with solid experience developing Teradata SQL.

. Good experience in Teradata DBA activities such as Teradata database installations, version upgrades, database maintenance, Archive/Recovery operations, and Teradata user and database creation. Created and managed user accounts, prepared scripts for monitoring the system, and automated ARCMAIN backups for the database (a minimal ARCMAIN sketch follows this summary). Installed Teradata drivers for the Teradata utilities, refreshed data using the FastExport and FastLoad utilities, and used the Teradata Administrator and Teradata Manager tools to monitor and control the system.

. Fully aware of the IUMB (Install, Upgrade, Migration and Backdown) procedures for V2R5.1 and V2R6 Teradata software on all flavors of UNIX and Windows platforms.

. Worked extensively on Teradata DBA activities such as Archive/Recovery, DBS installation/upgrade/maintenance, and creating Teradata users and databases.

. Led teams in the use of most of the Teradata utilities, including the backup and restore utilities (ARCMAIN/NetVault/NetBackup), FastLoad, MultiLoad, TPump, BTEQ, FastExport and PMON.

. Worked extensively in C, C++, Bourne Shell, UNIX internals, TCP/IP, socket programming, Perl and Teradata; well versed with the MS SQL Server, Sybase and Teradata RDBMSs. Strong in developing tools and system applications using UNIX internals, TCP/IP, socket programming and FTP. Worked on Windows NT, 95 and CE, SCO UNIX, HP-UX, Linux and SunOS 5.5 platforms. Conversant with Wipro's and Satyam's ISO 9001 quality procedures.

. Worked extensively with Teradata Manager, ARC, BTEQ, TDADMIN, SQLA, PMON, Priority Scheduler, the Teradata RDBMS, SQL, TPT, Teradata data modeling, Teradata client utilities, and backup and recovery utilities on operating systems such as UNIX MP-RAS, Solaris SPARC, HP-UX, SUSE Linux, .NET and Windows 2000, and on hardware platforms such as NCR 4400, NCR 4300, NCR 4100, Pentium and Itanium.
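For illustration only, an ARCMAIN archive script of the kind automated above might look like the sketch below. The TDP id, logon, database and file names are placeholders rather than details of any engagement in this resume, and the exact options depend on a site's backup standards.

    /* archive_edw.arc - minimal ARCMAIN archive sketch (illustrative names) */
    LOGON prodtdp/backup_user,backup_password;
    ARCHIVE DATA TABLES (EDW_PROD) ALL,
      RELEASE LOCK,
      FILE = EDW_PROD_FULL;
    LOGOFF;

A shell wrapper would then run it with something like: arcmain < archive_edw.arc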

Experiences:

Organization                                   Designation              Duration
Cognizant Technology (CTS)                     Manager                  Oct 2010 - Till Date
Everest Technology, USA                        Sr. Consultant           Jan 2010 - Sep 2010
ARTECH Information Systems LLC, USA            Sr. Consultant           Jan 2009 - Jan 2010
Wipro Technologies, USA                        Sr. Manager              April 2008 - Jan 2009
Wipro Technologies, USA                        Project Manager          Feb 2007 - April 2008
Satyam Computer Services Ltd., Hyderabad       Senior System Analyst    April 2005 - Feb 2007
Satyam Computer Services Ltd., Hyderabad       System Analyst           June 2003 - March 2005
Nelco Automation Ltd (a business associate     Associate System         01/2001 - 02/2003
of Tata Consultancy Services)                  Analyst
BAeHAL Software Ltd (a business associate      Software Engineer        06/1997 - 12/2000
of Pharma Systems Pvt. Ltd.)

Education

Degree                       University           Year of Passing
B.Tech (Computer Science)    Andhra University    07/1996

Certifications

. Teradata Basics V2R5 (NR0-011) - SR2061037

. Teradata Physical Implementation V2R5 (NR0-012) - SR2061037

. Teradata Administration V2R5 (NR0-014) - SR2061037

. Teradata SQL V2R5 (NR0-013) - SR2061037

. Teradata Design Architecture (NR0-015) - SR2061037

. Teradata Application Development (NR0-016) - SR2061037

. Teradata Certified Master V2R5

Experience:

Penn National Gaming Inc (with Cognizant)                         02/2010 - Till Date

Teradata Analyst & DBA - Data Warehouse

Penn National Gaming is a large casino gaming company providing services to gaming customers. I worked as a Sr. Teradata DBA / Application DBA / Architect to provide an EDW data mart solution for the marketing department to analyze campaign analytics together with customer analytics. Informatica 8.x/7.1.3/6.2 is used as the ETL tool, SQL Server and Teradata 12/6.2 are the databases, and Cognos is used for reports.

Contribution

. Worked with NetBackup/NetVault to take hot and cold backups.
. As Teradata DBA Lead, responsible for all DBA functions (development, test, production) on three platforms operating 24x7; platforms included a 4-node 5500 for production, development and testing.
. Set up the core processes required to source new data streams into the Teradata data mart.
. Participated in the design and development of a technical architecture aligned with the client's business objectives.
. Responsible for providing the best Teradata/Informatica loading strategies for ETL processing of data into the data mart.
. Involved in workload balancing through TASM.
. Performed query optimization and assisted ETL developers and users by analyzing DBQL data, explain plans and join strategies, and by partitioning large tables to improve SQL performance.
. Used the workload management tool and methodology to manage system workload and help the business team meet SLAs.
. Performed database health checks and tuned the databases using Teradata Manager.
. With in-depth expertise in the Teradata cost-based query optimizer, identified potential query bottlenecks in terms of query writing, skewed redistributions, join order, optimizer statistics and physical design considerations (PI/USI/NUSI/JI, etc.); in-depth knowledge of Teradata Explain and Visual Explain to analyze and improve query performance.
. Analyzed sessions with high spool usage, high CPU usage, high disk usage and high skew when alerts were received.
. Used database administration tools to analyze system status and performance.
. Analyzed explains, statistics and diagnostics to isolate SQL problems.
. Teradata SQL, OLAP functions, and load/unload utilities.
. Performed report development and MSTR architecture work; interacted with the business, SMEs and the client MSTR team; performed requirements gathering, translated requirements into technical requirements, and documented requirements/design.
. Did performance tuning, including collecting statistics, analyzing explains and determining which tables needed statistics; increased performance by 35-40% in some situations (a minimal COLLECT STATISTICS sketch appears after this list).
. MultiLoad, BTEQ; created and modified databases, performed capacity planning, allocated space, and granted rights on all objects within databases.
. Moved databases between machines.
. Installed patch sets and upgraded Teradata.
. Automated data warehouse and data mart refreshes using the transportable tablespace option.
. Worked on creating and managing partitions.
. Delivered new and complex, high-quality solutions to clients in response to varying business requirements; created and managed user accounts.
. Installed Teradata drivers for the Teradata utilities.
. Refreshed data using the FastExport and FastLoad utilities.
. Used the Teradata Administrator and Teradata Manager tools to monitor and control the system.
. Created and modified MultiLoad jobs for Informatica using UNIX and loaded data into EADW.
. Loaded data from various data sources and legacy systems into the Teradata production and development warehouses using BTEQ, FastExport, MultiLoad, FastLoad and Informatica.
. Worked as an on-site Teradata Lead DBA on DBA activities including implementation of data- and table-level security and setting up access rights and space rights for the Teradata environment; performance tuning, monitoring and reporting, and reviewing system utilization by user.
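As a minimal sketch of the statistics and explain work described in this list (the database, table and column names are illustrative placeholders, not objects from the client system):

    /* Collect statistics on the join/filter columns, then re-check the plan */
    COLLECT STATISTICS ON edw_prod.daily_txn COLUMN (cust_id);
    COLLECT STATISTICS ON edw_prod.daily_txn COLUMN (txn_dt);

    /* Confirm what the optimizer now sees */
    HELP STATISTICS edw_prod.daily_txn;

    /* Review join order and redistribution steps in the plan */
    EXPLAIN
    SELECT cust_id, SUM(txn_amt)
    FROM   edw_prod.daily_txn
    WHERE  txn_dt >= DATE '2010-01-01'
    GROUP  BY 1;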

AT&T (with Wipro)                                                 06/2009 - 02/2010

Sr. Teradata DBA / Application DBA / Teradata Architect

AT&T is a large telecommunications company providing services to wireless and wireline customers. I worked as a Sr. Teradata DBA / Application DBA / Architect to provide a BI data mart solution for the marketing department to analyze campaign analytics together with customer analytics. Informatica 8.x/7.1.3/6.2 is used as the ETL tool, Oracle 10g/9i and Teradata 12 are the databases, and MicroStrategy 8 is used for reports.

Contribution

. Worked with NetBackup to take hot and cold backups.
. As Teradata DBA Lead, responsible for all DBA functions (development, test, production) on three platforms operating 24x7; platforms included a 4-node 5500 for production, development and testing.
. Set up the core processes required to source new data streams into the Teradata data mart.
. Participated in the design and development of a technical architecture aligned with the client's business objectives.
. Responsible for providing the best Teradata/Informatica loading strategies for ETL processing of data into the data mart.
. Involved in workload balancing through TASM.
. Performed query optimization and assisted ETL developers and users by analyzing DBQL data, explain plans and join strategies, and by partitioning large tables to improve SQL performance.
. Used the workload management tool and methodology to manage system workload and help the business team meet SLAs.
. Performed database health checks and tuned the databases using Teradata Manager.
. With in-depth expertise in the Teradata cost-based query optimizer, identified potential query bottlenecks in terms of query writing, skewed redistributions, join order, optimizer statistics and physical design considerations (PI/USI/NUSI/JI, etc.); in-depth knowledge of Teradata Explain and Visual Explain to analyze and improve query performance.
. Analyzed sessions with high spool usage, high CPU usage, high disk usage and high skew when alerts were received.
. Used database administration tools to analyze system status and performance.
. Analyzed explains, statistics and diagnostics to isolate SQL problems.
. Teradata SQL, OLAP functions, and load/unload utilities.
. Performed report development and MSTR architecture work; interacted with the business, SMEs and the client MSTR team; performed requirements gathering, translated requirements into technical requirements, and documented requirements/design.
. Did performance tuning, including collecting statistics, analyzing explains and determining which tables needed statistics; increased performance by 35-40% in some situations.
. MultiLoad, BTEQ; created and modified databases, performed capacity planning, allocated space, and granted rights on all objects within databases.
. Moved databases between machines.
. Installed patch sets and upgraded Teradata.
. Automated data warehouse and data mart refreshes using the transportable tablespace option.
. Worked on creating and managing partitions (a partitioned-table sketch appears after this list).
. Delivered new and complex, high-quality solutions to clients in response to varying business requirements; created and managed user accounts.
. Installed Teradata drivers for the Teradata utilities.
. Refreshed data using the FastExport and FastLoad utilities.
. Used the Teradata Administrator and Teradata Manager tools to monitor and control the system.
. Created and modified MultiLoad jobs for Informatica using UNIX and loaded data into EADW.
. Loaded data from various data sources and legacy systems into the Teradata production and development warehouses using BTEQ, FastExport, MultiLoad, FastLoad and Informatica.
. Worked as an on-site Teradata Lead DBA on DBA activities including implementation of data- and table-level security and setting up access rights and space rights for the Teradata environment; performance tuning, monitoring and reporting, and reviewing system utilization by user.
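A minimal sketch of the table partitioning referred to in this list; the table, columns and date range are hypothetical and only illustrate a month-based partitioned primary index in Teradata:

    CREATE TABLE edw_prod.daily_txn
    ( txn_id   DECIMAL(18,0) NOT NULL,
      cust_id  INTEGER       NOT NULL,
      txn_dt   DATE          NOT NULL,
      txn_amt  DECIMAL(12,2)
    )
    PRIMARY INDEX (cust_id)
    PARTITION BY RANGE_N (txn_dt BETWEEN DATE '2010-01-01' AND DATE '2010-12-31'
                          EACH INTERVAL '1' MONTH);

    /* Queries that constrain txn_dt can then eliminate whole partitions */
    SELECT COUNT(*)
    FROM   edw_prod.daily_txn
    WHERE  txn_dt BETWEEN DATE '2010-06-01' AND DATE '2010-06-30';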

Xcel Energy & Utilities Company (With IBM) 01/2009 - 06/2009

Application DBA and Sr Teradata DBA

. Worked with NetBackup/TERA to take hot and cold backups.
. As Teradata DBA Lead, responsible for all DBA functions (development, test, production) on three platforms operating 24x7; platforms included a 4-node 5500 for production, development and testing.
. Did performance tuning, including collecting statistics, analyzing explains and determining which tables needed statistics; increased performance by 35-40% in some situations.
. MultiLoad, BTEQ; created and modified databases, performed capacity planning, allocated space, and granted rights on all objects within databases.
. Moved databases between machines.
. Installed patch sets and upgraded Teradata.
. Automated data warehouse and data mart refreshes using the transportable tablespace option.
. Worked on creating and managing partitions.
. Performed database health checks and tuned the databases using Teradata Manager.
. Delivered new and complex, high-quality solutions to clients in response to varying business requirements; created and managed user accounts (a database and user creation sketch appears after this list).
. Installed Teradata drivers for the Teradata utilities.
. Refreshed data using the FastExport and FastLoad utilities.
. Used the Teradata Administrator and Teradata Manager tools to monitor and control the system.
. Created and modified MultiLoad jobs for Informatica using UNIX and loaded data into EADW.
. Loaded data from various data sources and legacy systems into the Teradata production and development warehouses using BTEQ, FastExport, MultiLoad, FastLoad and Informatica.
. With in-depth expertise in the Teradata cost-based query optimizer, identified potential query bottlenecks in terms of query writing, skewed redistributions, join order, optimizer statistics and physical design considerations (PI/USI/NUSI/JI, etc.); in-depth knowledge of Teradata Explain and Visual Explain to analyze and improve query performance.
. Worked as an on-site Teradata Lead DBA on DBA activities including implementation of data- and table-level security and setting up access rights and space rights for the Teradata environment; performance tuning, monitoring and reporting, and reviewing system utilization by user.
. Acted as a liaison between the offshore and onsite teams.
. Also performed functions as Project Lead (teams of up to 15 people).
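A minimal sketch of the database and user administration work in this list; the names, space figures and password are placeholders, not values from any client environment:

    /* Carve perm and spool space out of the parent database */
    CREATE DATABASE edw_stage FROM dbc
      AS PERM = 50000000000, SPOOL = 20000000000;

    /* Batch user that owns no perm space of its own */
    CREATE USER etl_batch FROM edw_stage
      AS PASSWORD = etl_batch_01,
         PERM = 0, SPOOL = 10000000000,
         DEFAULT DATABASE = edw_stage;

    GRANT SELECT, INSERT, UPDATE, DELETE ON edw_stage TO etl_batch;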

Charles Schwab Co Inc.                                            02/2007 - Till Date

Application DBA and Teradata DBA

Contribution

. Worked with NetVault to take hot and cold backups.
. As Teradata DBA Lead, responsible for all DBA functions (development, test, production) on three platforms operating 24x7; platforms included a 17-node 5450 for production, development and testing.
. Did performance tuning, including collecting statistics, analyzing explains and determining which tables needed statistics; increased performance by 35-40% in some situations.
. MultiLoad, BTEQ; created and modified databases, performed capacity planning, allocated space, and granted rights on all objects within databases.
. Moved databases between machines.
. Installed patch sets and upgraded Teradata.
. Automated data warehouse and data mart refreshes using the transportable tablespace option.
. Worked on creating and managing partitions.
. Performed database health checks and tuned the databases using Teradata Manager.
. Delivered new and complex, high-quality solutions to clients in response to varying business requirements; created and managed user accounts.
. Installed Teradata drivers for the Teradata utilities.
. Refreshed data using the FastExport and FastLoad utilities (a FastLoad script sketch appears after this list).
. Used the Teradata Administrator and Teradata Manager tools to monitor and control the system.
. Created and modified MultiLoad jobs for Informatica using UNIX and loaded data into IDW.
. Loaded data from various data sources and legacy systems into the Teradata production and development warehouses using BTEQ, FastExport, MultiLoad, FastLoad and Informatica.
. With in-depth expertise in the Teradata cost-based query optimizer, identified potential query bottlenecks in terms of query writing, skewed redistributions, join order, optimizer statistics and physical design considerations (PI/USI/NUSI/JI, etc.); in-depth knowledge of Teradata Explain and Visual Explain to analyze and improve query performance.
. Worked as an on-site Teradata Lead DBA on DBA activities including implementation of data- and table-level security and setting up access rights and space rights for the Teradata environment; performance tuning, monitoring and reporting, and reviewing system utilization by user.
. Acted as a liaison between the offshore and onsite teams.
. Also performed functions as Project Lead (teams of up to 15 people).
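A minimal FastLoad script sketch of the data refreshes described in this list; the TDP id, logon, staging table and file path are placeholders, and a real job would add SESSIONS settings and error handling to match site standards:

    /* refresh_cust_stg.fl - FastLoad into an empty staging table */
    LOGON prodtdp/etl_batch,etl_batch_01;
    DATABASE edw_stage;
    DROP TABLE cust_stg_err1;
    DROP TABLE cust_stg_err2;
    SET RECORD VARTEXT "|";
    DEFINE cust_id   (VARCHAR(18)),
           cust_name (VARCHAR(60)),
           FILE = /data/extracts/customer.dat;
    BEGIN LOADING cust_stg ERRORFILES cust_stg_err1, cust_stg_err2;
    INSERT INTO cust_stg (cust_id, cust_name)
    VALUES (:cust_id, :cust_name);
    END LOADING;
    LOGOFF;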

Environment: Teradata, NetVault, TMGR, TDADMIN, PMON, SQLA, TSAW, TSWIZ, Priority Scheduler, BTEQ, FastLoad, MultiLoad, FastExport, TPump and DataStage; MP-RAS V2R6, HP-UX, MS Win 9x, Win NT 4.0, W2K Professional, Windows XP 32-bit, Erwin, PowerDesigner, W2K, DB2, Oracle

STANDARD BANK OF SOUTH AFRICA LTD 01/2006 - 02/2007

Teradata Lead DBA and Project Manager

Contribution

. Worked with NetVault to take hot and cold backups.
. As Teradata DBA Lead, responsible for all DBA functions (development, test, production) on three platforms operating 24x7; platforms included a 12-node 5400 for production, development and testing.
. Did performance tuning, including collecting statistics, analyzing explains and determining which tables needed statistics and secondary indexes; increased performance by 35-40% in some situations.
. MultiLoad, BTEQ; created and modified databases, performed capacity planning, allocated space, and granted rights on all objects within databases.
. Moved databases between machines.
. Installed patch sets and upgraded Teradata.
. Automated data warehouse and data mart refreshes using the transportable tablespace option.
. Worked on creating and managing partitions.
. Performed database health checks and tuned the databases using Teradata Manager.
. Delivered new and complex, high-quality solutions to clients in response to varying business requirements; created and managed user accounts.
. Installed Teradata drivers for the Teradata utilities.
. Refreshed data using the FastExport and FastLoad utilities.
. Used the Teradata Administrator and Teradata Manager tools to monitor and control the system.
. Created and modified MultiLoad jobs for DataStage using UNIX and loaded data into DataStage.
. Loaded data from various data sources and legacy systems into the Teradata production and development warehouses using BTEQ, FastExport, MultiLoad and DataStage (a BTEQ script sketch appears after this list).
. With in-depth expertise in the Teradata cost-based query optimizer, identified potential query bottlenecks in terms of query writing, skewed redistributions, join order, optimizer statistics and physical design considerations (PI/USI/NUSI/JI, etc.); in-depth knowledge of Teradata Explain and Visual Explain to analyze and improve query performance.
. Worked as an on-site Teradata Lead DBA on DBA activities including implementation of data- and table-level security and setting up access rights and space rights for the Teradata environment; performance tuning, monitoring and reporting, and reviewing system utilization by user.
. Acted as a liaison between the onshore and offshore teams.
. Also performed functions as Project Lead (teams of up to 15 people).
. Was responsible for certifying Teradata Manager against Teradata running on .NET.
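A minimal BTEQ script sketch of the batch loading described in this list; the logon, database and table names are placeholders, and error handling is reduced to a single return-code check per statement:

    .LOGON prodtdp/dba_batch,dba_batch_01
    DATABASE edw_prod;

    /* Re-run safe: clear today's slice before reloading it */
    DELETE FROM daily_balance WHERE load_dt = CURRENT_DATE;
    .IF ERRORCODE <> 0 THEN .QUIT 8

    INSERT INTO daily_balance (acct_id, bal_amt, load_dt)
    SELECT acct_id, bal_amt, CURRENT_DATE
    FROM   edw_stage.balance_stg;
    .IF ERRORCODE <> 0 THEN .QUIT 8

    .LOGOFF
    .QUIT 0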

Environment: Teradata, NetVault, TMGR, TDADMIN, PMON, SQLA, TSAW, TSWIZ, Priority Scheduler, BTEQ, FastLoad, MultiLoad, FastExport, TPump and DataStage; MP-RAS V2R6, HP-UX, MS Win 9x, Win NT 4.0, W2K Professional, Windows XP 32-bit, Erwin, PowerDesigner, W2K, DB2, Oracle

NCR USA                                                           12/2003 - 12/2005

Teradata Support Analyst and Teradata Lead DBA

Contribution

. Worked with ARC to take hot and cold backups.
. Tuned system init parameters.
. Moved databases between machines.
. Installed patch sets and upgraded Teradata.
. Automated data warehouse and data mart refreshes using the transportable tablespace option.
. Worked on creating and managing partitions.
. Performed database health checks and tuned the databases using Teradata Manager.
. Delivered new and complex, high-quality solutions to clients in response to varying business requirements; created and managed user accounts.
. Installed Teradata drivers for the Teradata utilities.
. Refreshed data using the FastExport and FastLoad utilities.
. Used the Teradata Administrator and Teradata Manager tools to monitor and control the system.
. Loaded data from various data sources and legacy systems into the Teradata production and development warehouses using BTEQ, TPump, FastLoad, FastExport and MultiLoad.
. With in-depth expertise in the Teradata cost-based query optimizer, identified potential query bottlenecks in terms of query writing, skewed redistributions, join order, optimizer statistics and physical design considerations (PI/USI/NUSI/JI, etc.); in-depth knowledge of Teradata Explain and Visual Explain to analyze and improve query performance.
. Worked as an offshore Teradata Lead DBA on DBA activities including implementation of data- and table-level security and setting up access rights and space rights for the Teradata environment; performance tuning, monitoring and reporting, and reviewing system utilization by user.
. Developed test cases for the new features.
. Performed product integration testing for these applications on W2K Advanced Server, W2K Pro and XP against V2R5.1 MP-RAS/W2K, V2R5.0 MP-RAS/W2K, V2R4.2 HP-UX/.NET and V2R4.1.x MP-RAS/W2K DBS platforms.
. Involved in fixing Discrepancy Reports (DRs) raised by customers as well as in-house.
. Prepared test reports with the results; was responsible for developing the Data Collector feature for Teradata Manager 6.0.
. Played an active role in testing the Teradata Manager 6.0 applications on W2K Professional, W2K Server and Windows XP 32-bit client platforms against Teradata V2R5.0 and Teradata V2R4.1.3.
. Acted as a liaison between the offshore and onshore teams.
. Was responsible for certifying Teradata Manager 6.0 against Teradata running on .NET.
. Execution and development of test cases, test plans and test reports for various Teradata Manager applications.

Environment: Teradata, ARC, TMGR, TDADMIN, PMON, SQLA, TSAW, TSWIZ, Priority Scheduler, BTEQ, FastLoad, MultiLoad, FastExport and TPump; n-node NCR Teradata V2R4.x/5.x/5.1.x on UNIX MP-RAS/Win 2K, MP-RAS V2R5.1 and V2R6, HP-UX, SUSE Linux, MS Win 9x, Win NT 4.0, W2K Professional, Windows XP 32-bit, Microsoft .NET 32-bit; Rational ClearCase, Rational Purify, Visual Studio Debugger; MP-RAS, W2K, .NET

Goldman Sachs, New York, USA                                      Jun 2003 - Dec 2003

Developer

Project Description:

Global Asset Servicing System used for processing corporate action notifications, elections and payments. It feeds off the EDS system.

Contribution:

. Creating new programs/jobs
. Corrective, adaptive, enhancement and preventive maintenance
. Documentation of programs/jobs
. Production support
. Modifying data in the database
. Unit testing and user acceptance testing

Environment: C, C++, Perl, AutoSys, Workshop and DBX; CVS, Sun Workshop (debugger), Sybase 11, SunOS 5.5, Sun SPARC

RADIXS, Singapore                                                 02/2003 - 07/2003

Module Leader

Contribution

. Analysis of the specifications provided by the clients.
. Design and development.
. Studied and analysed the work activities of the system.
. Coding using C, Bourne Shell script, IPC, multi-threading, socket programming and TCP/IP.
. The system works on an event-based mechanism combined with process-based and thread-based mechanisms; when a client requests MXI services, the main server handles the request.
. IPC mechanisms are used to synchronize process and thread creation.
. During integration, the control flow of information is handled using message queues, semaphores and shared memory for inter-process communication.
. Passing arguments between the platform interface and the server interface.
. Testing - unit testing and integration testing.
. Responsible for overseeing the quality procedures related to the project.
. Worked at the client site for implementation.

Environment: C, IPC, socket programming, TCP/IP, Oracle, Linux RH 7.1, HP-UX 10.0, HP 9000

Steltor, USA                                                      Aug 2002 - Jan 2003

Team Member

Contribution:

As a team member, responsible for:
. Studying and analysing the work activities of the system.
. Coding using C, Bourne Shell script and IPC (semaphores and shared memory).
. Testing - unit testing and integration testing.
. During integration, the control flow of information is handled using message queues, semaphores and shared memory for inter-process communication.

Environment: C, Bourne Shell, IPC (semaphores and shared memory), UNIX (HP-UX 10.0), Solaris, HP 9000

Fleet Financial Services                                          Sep 2001 - July 2002

Team Member

Contribution:

. Defining common data types and writing wrapper classes.
. Development of client wrappers for Component Manager and Craft Manager.
. Development of a File Transfer Agent using C.
. Involved in re-engineering several classes and extending their functionality.
. Testing - unit testing and integration testing.

Environment: C, C++, Bourne Shell, Oracle, UNIX (HP-UX 10.0), Solaris, HP 9000

AIG - Boston MA Jan 2001 - Aug 2001

Team Member

Contribution:

. Involved in code analysis for transition; analysis of the specifications provided by the clients.
. Impact analysis for enhancements.
. Coding and testing of the enhanced version.
. Client-side validations using Windows for C and Windows for Data.
. Responsible for overseeing the quality procedures related to the project.

Environment: Windows for C, Windows for Data, MS SQL Server 2000, Windows 95/98/NT, Solaris, Sun SPARC, Pentium PC

Defense Electronics and Research Laboratories, Hyderabad          Aug 1999 - Dec 2000

Team Member

Contribution:

. Analysis of the specifications provided by the clients.
. During integration, the control flow of information is handled using message queues, semaphores and shared memory for inter-process communication, with socket programming over TCP/IP for data transfer across the network; multi-threaded programming handles asynchronous data reported from external systems.
. As a team member, the work involved coding and testing.
. In coding, inter-process mechanisms are used for data transfer within the system, and sockets are implemented whenever required to interact with other (remote) entities.
. Documentation is done for all test cases, covering a wide range of input values and the corresponding output values.

Environment: C, socket programming using TCP/IP, IPC (message queues, semaphores, shared memory), Sun SPARC, Pentium PC, Bourne Shell, UNIX/Solaris

Nuclear Power Corporation, Mumbai                                 Jun 1998 - Jul 1999

Team Member

Contribution:

. Analysis of the specifications provided by the clients.
. During integration, the control flow of information is handled using socket programming over TCP/IP for data transfer across the network.
. As a team member, the work involved coding and testing.
. Documentation is done for all test cases, covering a wide range of input values and the corresponding output values.

Environment: C, socket programming using TCP/IP, Bourne Shell, UNIX/Solaris, Sun SPARC, Pentium PC
