JOSEPHINE PALENCIA
Honeywell International
Atlanta, GA 30308
*********.********@*********.***
Marietta, GA 30062
********@*****.***
www.linkedin.com/in/josephinepalencia/
Director, IT Operations
Multi-award-winning change driver with comprehensive experience leading organizational efforts to evaluate technology trends that impact large global enterprises. Spearheads the development and implementation of intensive computing, big data analytics, fast storage, networking, leading-edge high performance computing (HPC), and datacenter facilities that provide scientists with balanced, interconnected capabilities to model, simulate, and predict complex phenomena. Hires key personnel in the division and maintains sound management structures and practices while mentoring section staff and laying out their paths for professional development. Expert in providing technical leadership and guidance to a diverse staff of scientific and computational professionals, with the ability to direct projects and priorities among technical groups and issues, resulting in effective action. Identifies the future direction of complex interdisciplinary research programs at both the agency and national levels.
HPC GPU Technologies • Partnerships & Alliances • Global Infrastructure Management • Team Mentoring/Supervision • Strategic Planning & Execution • High Performance Computing (HPC/GPU) • AI/Deep Learning • Cloud • CyberSecurity • Export-Control, ITAR-Restricted Data • Air-Gap Compute Resources • Big Data Analytics • Modernizing Global, Corporate Enterprise IT • Cross-Agency Science & Research Collaborations • Technical Leadership • Systems and Storage Architecture • Project Management • Budget, Finance, Procurement • Roadmaps
PROFESSIONAL EXPERIENCE
Senior Manager, System Integration HPC 01/2017 – Present
Honeywell International, 715 Peachtree St, Atlanta, GA
Coordinate and supervise the functions of Honeywell's global HPC team across the US and India, modernizing the HPC and GPU computing infrastructure while steering the corporate vision and providing technical leadership.
Excel in maximizing the performance, scalability, and efficiency of key scientific, engineering, and analysis applications in aerospace and material design through the implementation and evaluation of systems and software for extreme-scale parallel computing.
Spearhead cloud integration of on-premises HPC infrastructure with Amazon AWS and Azure, supporting big data analytics, AI, deep learning, and quantum computing.
Achieve a 600% increase in Engineering Software Performance by coordinating the execution of the Application Acceleration Project.
Grow the annual O&M budget by identifying and developing new business opportunities, including $5M from 2019 to 2020.
Prepare and present a 5-year HPC roadmap to the executive leadership team along with detailed implementation plans.
Research Scientist II Faculty 05/2015 – 01/2017
Georgia Institute of Technology, Atlanta, GA 30332, Partnership for an Advanced Computing Environment
Served as a scientific computing consultant, piloting multiple functions encompassing data analytics, Hadoop (Hortonworks, Apache), Python, SQL, and bioanalytics.
Championed the design of a Hunk-Hadoop infrastructure that achieved $2M in cybersecurity cost savings for Georgia Tech in collaboration with PACE.
Recognized for expertise in gathering system logs from hundreds of remote systems on campus by exploring and identifying various data-transfer techniques.
Delivered an invited talk on "Cybersecurity Data Analytics on Hunk-Hadoop" at Georgia State University's Scientific Computing Day 2016 conference.
Senior Scientific Data Specialist / Senior HPC System Administrator 06/2005 – 03/2015
Carnegie Mellon University, Pittsburgh Supercomputing Center, Pittsburgh, PA 15213
Masterminded the process of offering PSC support for data-intensive science.
Boosted the efficiency of leading edge cyber-infrastructure systems through conceptualizing and initiating novel ideas, methods, algorithms, and codes.
Orchestrated the strategic planning and execution of various outreach functions focused on publishing and promoting scientific and technological accomplishments.
Led and supervised training and development programs as well as PSC workshops.
Forged and maintained ethical relationships with scientists in executing collaborative projects geared toward steering professional advancement through leading-edge data management and processing.
Evaluated proposals for resource grants and supported the writing of winning grant applications.
Acting Project Manager and Technical Lead, ExTENCI Lustre Project (OSG), NSF Grant 1007115
Acted as the Technical Lead for 8 institutions, including Pittsburgh Supercomputing Center, Fermi National Lab, University of Florida, and 7 other Universities across the State of Florida.
Set the project vision and direction, wrote and managed specific tasks, goals, and schedules, and generated technical progress reports for XSEDE.
Guided the publication of half a dozen first-authored peer-reviewed papers, posters, and presentations.
Presented talks on the project at local and international conferences and obtained an additional $12K in funding for PSC through the project extension.
Conceptualized and designed a secure, Kerberized Lustre filesystem over the wide area network (WAN) for Tier-1, -2, and -3 CERN sites and ported the CMS code to the WAN filesystem; deployed on virtualized and cloud platforms (Xen, KVM, OpenStack) and physical machines (UF, PSC, FIU, FSU, Fermilab) for applications including CMS, LQCD, and ATLAS.
Bioinformatics/Genomics
Oversaw GenePattern (Broad Institute), including Torque, SGE, Java, and R, as well as Galaxy (PSU), involving Python, NGS assembly tools, PostgreSQL, and assembly with Trinity.
Facilitated the design and development of PSC's genomics/bioinformatics web user interface VMs as well as expression analysis and annotation (Trinotate).
Data Consultant, Galaxy Bioinformatics for XSEDE
Utilized XSEDE's large-scale compute and storage facilities to extend PSU's Galaxy bioinformatics platform.
Provided strategic oversight for clusters, fast network interconnects (Myrinet, Quadrics, InfiniBand, 10/1 GigE), WAN filesystems (pNFS, GPFS-/wan, Lustre-/wan, Panasas, PVFS, AFS), schedulers (PBS/Torque, Maui/Moab, Slurm), Grid/Globus (TeraGrid, OSG), compilers (Intel, PGI), MPICH/MPICH2, and storage.
Kerberized Lustre TeraGrid & National Center for Biomed Research @ PSC
PSC Technical Lead, NSF Grant Prime Award OCI-0932251
Accelerated the development of the Kerberized Lustre filesystem over the TeraGrid WAN (PSC, SDSC) as the Lustre/Kerberos technical lead.
Established and managed the HPC systems used for biomedical research operations.
HPC Clusters managed at PSC
Streamlined the Helix 254-processor cluster (Lustre, InfiniBand network, Slurm) as well as the Axon 24-processor cluster (Lustre, Quadrics, PBS).
ADDITIONAL WORK EXPERIENCE
Adjunct Physics Faculty 09/2009 – 05/2011
Duquesne University, Physics Department, Pittsburgh, PA 15282
Beowulf Cluster Specialist / Principal Unix System Administrator
Raytheon, ITSS, Lanham, MD, NASA/Goddard Space Flight Center
Graduate Teaching/Research Assistant / Unix System Administrator 09/1995 – 01/2000
Drexel University, Department of Physics & Atmospheric Sciences
Graduate Student/PC Help Support 09/1992 – 09/1994
East Carolina University, Department of Physics, East Fifth Street, Greenville, NC 27858
Staff Physicist/Solar Observatory 10/1989 – 10/1991
Manila Observatory, Solar-Optical Division, Diliman, Quezon City, Philippines 1101
Instructor in Physics 06/1988 – 10/1989
University of the Philippines Diliman, National Institute of Physics, Katipunan Road, Diliman, Quezon City, Philippines
HONORS & AWARDS
Raytheon 2003 Excellence in Information Technology: Operational Excellence
Raytheon 2002 Excellence in Information Technology: Finalist
Raytheon Peer Award 2000
Manila Observatory Fellowship Award 1992 ($17,000)
ACTIVITIES & INTERESTS
Carnegie Mellon University Explorers Scuba Diving Chair and Founder 2008 – 2012
www.cmuexplorers.org/scubadiving/index.html
Pittsburgh Zoo & Aquarium: Volunteer Scuba Diver
EDUCATION
M.Sc. Analytics 2016 – Present
Georgia Institute of Technology, Atlanta, GA
Robotics course 2006
Carnegie Mellon University, Pittsburgh, PA
Ph.D. Nonlinear Physics (All But Dissertation) 1994 – 2000
M.S. Physics & Atmospheric Sciences
Drexel University, Philadelphia, PA
M.Sc. Physics 1992 – 1994
East Carolina University, Greenville, NC
B.Sc. Physics 1981 – 1988
University of the Philippines, Philippines
PAPERS & PRESENTATIONS
Palencia, J., Koza, K., Bright, N., Belgin, M., McNeill, Andre, “CyberSecurity Data Analytics on Hunk Hadoop,” 2016 Scientific Computing Day Conference, Georgia State University, Atlanta, GA, September 2016
Palencia, J., Ropelewski, A., “Real-Time Next-Generation Sequencing (NGS) in the Classroom using Galaxy,” XSEDE Extended Collaborative Support (ECSS) Symposium, Pittsburgh, PA (March, 2015)
https://www.xsede.org/web/guest/ecss-symposium
https://www.xsede.org/documents/10157/899621/ecss-galaxy-revised-v1.pdf/c479ace4-4374-4806-8fb9-0c9562e3a662
J. Taylor, A. Nekrutenko, N. Coraor, P. Blood, A. Ropelewski, Z. Zhang, J. Palencia, S. Sanielevici, J. Yanovich, R. Budden, "A Sustainable National Gateway For Biological Computation," XSEDE13 Proceedings of the 2013 XSEDE Conference, ACM, Chicago, IL, USA (2013) https://www.xsede.org/web/xsede13/presentations
G. Kaganas, J. L. Rodriguez, M. Cheng, P. Avery, D. Bourilkov, Y. Fu, J. Palencia, "Distributing CMS Data between the Florida Tier2 and Tier3 Centers using Lustre and Xrootdfs," Computing in High Energy and Nuclear Physics (CHEP) 2013, The Netherlands; Journal of Physics, Conference Series (2013) http://cds.cern.ch/record/1627823/files/CR2013_392.pdf
Palencia, J., Budden, R., Benninger, K., Bourilkov, D., Avery, P., Cheng, M., Fu, Y., Kim, B., Dykstra, D., Seenu, N., Rodriguez, J., Dilascio, J., Shrum, D., Wilgenbusch, J., Oliver, D., Majchzak, D., “Using Kerberized Lustre over the Wide Area Network for High Energy Physics Data,” XSEDE12 Proceedings of the 2012 XSEDE Conference, ACM, Chicago, IL, USA ©2012, ISBN: 978-1-60558-818-6; Proceedings of the (LUG) Lustre User Group 2012, Austin, TX, April 2012 http://www.opensfs.org/wp-content/uploads/2011/11/lug2012-v20.pdf
D. Bourilkov, P. Avery, M. Cheng, Y. Fu, B. Kim, J. Palencia, R. Budden, K. Sullivan, J. Scott, J. Rodriquez, D. Dykstra, N. Seenu, "Secure Wide Area Network Access to CMS Analysis Data using the Lustre Filesystem," Computing in High Energy and Nuclear Physics (CHEP) 2012, New York; Journal of Physics, Conference Series (2012)
D. Bourilkov, P. Avery, M. Cheng, Y. Fu, B. Kim, J. Palencia, R. Budden, K. Benninger, J. Kochmar, “Using Virtual Lustre Clients on the WAN for Analysis of Data from High Energy Experiments,” Computing in High Energy and Nuclear Physics (CHEP) 2012, New York; Journal of Physics, Conference Series (2012)
Palencia, Josephine, Budden, Robert and Sullivan, Kevin, “Kerberized Lustre 2.0 over the WAN,” TG10 Proceedings of the 2010 TeraGrid Conference, ACM, New York, NY ©2010, ISBN: 978-1-60558-81 http://www.teragrid.org/web/events/tg10/schedule
Palencia, Josephine; Budden, Robert; Sullivan, Kevin, “Kerberized Lustre 2.0 over the WAN,” Lustre User Group 2010, Aptos, CA, April 16, 2010 http://wiki.lustre.org/index.php/Lustre_User_Group_2010
Palencia, Josephine, “Cluster Management,” Bettis Workshop, Pittsburgh Supercomputing Center, Pittsburgh, PA, 2007, 2008
Palencia, Josephine, and Sherrer, Kip, "Interfacing with the Roomba", Meeting of the Minds Undergraduate Research Symposium, Carnegie Mellon University, 2006
S. Zhou, W. Kuang, W. Jiang, J. Palencia, G. Gardner, “High-Speed Network and Grid Computing for High-End Computation: Application in Geodynamics Ensemble Simulations,” Workshop on Component Models and Frameworks in High Performance Computing, 22-23 June 2005, Atlanta, GA
Palencia, Josephine, “Thunderhead: Design, Architecture and Applications: Current and Future Vision,” 2004 High Performance Computing Workshop, NASA/Goddard Space Flight Center, Greenbelt, MD, Nov 2004
Palencia, Josephine, “Computing with Thunderhead: An Introduction,” NASA Summer School for High Performance Computational Earth and Space Sciences, NASA/Goddard Space Flight Center, Greenbelt, MD, July 12-30, 2004
Palencia, Josephine and Simpson, Jeff, “BLISS - Beowulf for LISA Source Simulations – Prototype Architecture for Small-Scale High Performance Computing in Gravitational Wave Astrophysics,” Science Data Symposium, University of Maryland Conference Center, College Park, MD; NASA Space Science Data Center (NSSDC) Newsletter, May 2003 http://nssdc.gsfc.nasa.gov/nssdc_news/jun03/bliss.html
W. J. Campbell, S. Chettri, P. Coronado, J. E. Dorband, Josephine Palencia (RSTX), Udaya Ranawake (UMBC), J. Le Moigne, R. G. Lyon, G. Shirah, and J. Tilton, “Hyperspectral Imaging Mining, UAV's, Direct Satellite Broadcast, Data Fusion, Computational Optics, Scientific Visualizations and More,” NASA Goddard Space Flight Center, Greenbelt, MD
Dorband, J., Palencia, J., and Ranawake, U., “Commodity Cluster Computing at Goddard Space Flight Center,” ESTC Earth Science Technology Conference 2002, Pasadena, CA
Palencia, Josephine, “In Search of Chaos and Attractors in the Solar Cycle,” Master’s Thesis, East Carolina University, NC 1994