Pamela L. Bradley
14110 Teakwood Court
Disputanta, VA 23842
Cell – 804.586.3905
plbradley@yahoo.com

I am an IT professional with over 20 years' experience in various areas of the industry, the majority of which has been in Documentation, Process Improvement/Implementation, Quality Assurance (QA)/Testing, Project Management/Project Coordination, and Training. I currently hold a Department of Defense (DoD) Secret-level clearance.

SKILL SET SUMMARY:

Technical Writer/Business Systems Analyst/Quality Assurance – Testing:
• Technical Writer for the End User Manual for a United States Army Standard Army Management Information System (STAMIS) software application.
• Edited technical and functional design documents for accuracy, grammar, and standards compliance.
• Wrote/edited Requirements documents, User and End User Manuals, Lesson Plans, Test Cases, Test Plans, Implementation Plans, Test Reports, After Action Reports, Standard Operating Procedures (SOPs), How-To Guides, outgoing customer e-mails, help/web page text, and System Audit documentation.
• Process Modeling – Developed procedures documents and guidance for defect tracking, issue tracking, and requirements change requests based on existing methodologies.
• Wrote Statement of Work (SOW) documents.
• Workflow/Data Modeling – Documented workflows and data models using Visio.
• Designed and produced Excel spreadsheets according to the needs of various projects.
• Communicated business directives, goals, and needs to the technical team both verbally and in writing.
• Participated in code and technical design reviews.
• Responsible for designing and coordinating System, End-to-End, Regression, and User Acceptance testing (mainframe, remote workstation, server, and e-Commerce), contingency plans, and technical documentation provisions.
• Managed up to 20 testers to ensure all testing was complete and results were captured.
• Executed web-based testing on an e-Commerce site.
Where applicable, testing was done with ActiveX controls and JavaScript set both on and off. All tests were executed using:
o Operating Platforms – Win XP, Win 2000, Win 98, Win NT, Win 95, and Mac.
o Browsers – Internet Explorer 5, 6, and 7; Netscape Navigator 3, 4, and 7; and AOL (America Online, based on both IE and Mozilla). (Mozilla is now known as Firefox.)
• Performed Structured Query Language (SQL) data queries to validate test results.
• Provided daily test status reports to all project participants and management.
• Led daily test status and other project-related meetings.
• Defect Management – Tracked defects, issues, and change requests from inception through closure, including chairing defect review meetings.
• Requirements Analysis – Collected, analyzed, and documented the business requirements for each project.
o Participated in Joint Application Development (JAD) sessions.
o Gathered business and application functional requirements.
o Conducted analysis of business and geographic requirements.
o Identified technical requirements and application configuration requirements, and designed process flows.
o Developed and maintained the Requirements Traceability Matrix for numerous projects.
• Business Process Analysis – Determined business and user specifications to translate into application functionality.
• Developed and reviewed Business Use Cases.
• Assisted the technical team in translating application requirements into application architecture.
• Performed Risk Analysis and assessment.
• Trended issues to identify possible application enhancements.
• Defined technical strategies to meet long-term business objectives.
• Determined the scope of testing.
• Acted as the ongoing interface between business and technical teams.
• Conducted issue Root Cause Analysis investigations.

Process Engineering:
• Introduced and incorporated new service roles, responsibilities, and workflow processes based on the client support environment.
• Implemented defect-tracking procedures in accordance with the Capability Maturity Model Integration (CMMI) process.
• Implemented requirements and documentation change control procedures in accordance with the CMMI process.
• Consulted with the development and QA/testing teams on re-engineering and process improvement opportunities.
• Implemented the use of IEEE 829 standards for testing activities/documentation.
• Implemented the use of ISO documentation standards in all documentation, including User Manuals and Guides, technical documents, reports, etc.

Project Management/Project Coordination:
• Project Initiation and Planning.
• Project Plan Development.
• Project Scope Management – Developed timelines and ensured they were met.
• Negotiation and Conflict Resolution.
• Project Implementation.
• Project Termination Auditing and Closure.
• Provided full Software Development Lifecycle (SDLC) management of enterprise-wide business services integration. Activities included:
o Due diligence
o Client presentations
o Communications management
o Change and issue management
o Standard operating environment and procedures integration
o Quality assurance
o Classroom and train-the-trainer process/procedure training
o Post-integration support
• Developed and managed the utilization of best practices and standard operating procedures.
• Provided project progression/status reporting to the client and internal management.

IT Training Specialist:
• Conducted user training needs assessments.
• Developed and maintained Lesson Plans and User documentation.
• Conducted a User Needs Assessment, developed the training material, and provided train-the-trainer and user training in the creation of reports and other uses of INFORMIX Structured Query Language (SQL).
• Conducted and led Database Administrator, Network Administrator, and System Administrator training, followed by a period of on-site user monitoring.
• Performed as an analyst and senior trainer.
• Developed/oversaw development of training materials and End User Manuals for a UNIX-based operating system.
• Prepared and conducted vendor and proprietary software application training – platform, classroom, desk-side, and train-the-trainer – at various sites around the country and overseas.

Systems Architecture:
• Managed legacy data conversions.
• Performed total software builds (vendor and proprietary) of servers and workstations.
• Provided physical inventory/asset audit and data management.
• Established communications (LAN and modem) between remote workstations, servers, and the mainframe.
• Established Radio Frequency (RF) setup between scanners and workstations.
• Provided troubleshooting of and corrective actions to user files, both on-site and via remote connection.
• Conducted and led parameter maintenance workshops to determine user-specific settings.
• Conducted and led Database Administrator, Network Administrator, and System Administrator training, followed by a period of on-site user monitoring.
• Assisted users in establishing System Scheduler settings on an HP-UNIX mainframe.

Methodologies:
• IEEE 829 – the Institute of Electrical and Electronics Engineers (IEEE) standard, also referred to as the 829 Standard for Software Test Documentation.
• Joint Application Development (JAD).
• Capability Maturity Model Integration (CMMI).
• Total Process Improvement (TPI).
• International Organization for Standardization (ISO).
• Six Sigma.
• Lean-Agile

Tools:
• Microsoft Suite:
o Excel
o Word
o Project
o Outlook
o PowerPoint
o Access (as user)
• Visio 2000
• Mercury Suite:
o TestDirector (Mercury Quality Center) (advanced)
o QuickTest Pro (limited)
o WinRunner (limited)
• Caliber
• Rational TeamTest
• Remedy
• SnagIt 6 and 7
• TortoiseSVN 1.4.7
• Adobe Acrobat (.pdf)
• Dreamweaver
• Blocked Asynchronous Transmission (BLAST) protocol
• TCP/IP – Transmission Control Protocol/Internet Protocol
• RF Communications
• Project Assets Library (PAL)
• LiveLink/KnowledgeLink/ShareCenter
• Adobe FrameMaker (training)

RDBMS:
• INFORMIX (on UNIX)
• Oracle 6
• DB2

Languages:
• SQL (scripting)
• Java (code review)
• Visual Basic (VB) (code review)
• C and C++ (code review)
• COBOL (code review)

Platforms:
• Windows 2000, XP, NT, 98, 95
• Mac
• UNIX
o HP
o SCO
o Interactive
• Sun Solaris
• IBM

Market Sectors:
• Department of Defense (DoD)
• Banking/Finance
• E-Commerce

SOFT SKILLS:
Consistently achieve excellent ratings in the following areas:
• Communication – oral and written
• Ability to lead and take ownership of tasks/projects
• Team building and motivation
• Mentorship
• Organization skills
• Time management and task prioritization
• Flexibility and adaptability to change
• Multitasking
• Issue resolution and troubleshooting
• Professionalism – handles challenges tactfully and diplomatically
• Cross-team relationship building and partnering skills

RELEVANT EMPLOYMENT HISTORY and EXPERIENCE:

Northrop Grumman Corporation (May 2009 – Current)
Member of the Integration & Test team for the United States Army’s Property Book Unit Supply Enhanced (PBUSE) system.

Software application tester. Responsibilities include:
• Write all test scenarios for the PBUSE End User Manual (EUM) and On-line Help (OLH).
• Review and edit the Property Book and Unit level application functionality test scenarios.
• Perform testing of the PBUSE functionality.
• Input help desk tickets (HDT) into the customer’s issue tracking system when Graphical User Interface (GUI) or application functionality issues are identified.
• Participate in training of other PBUSE team members prior to the release of new functionality to the production users.

Technical Writer for the PBUSE EUM. Responsibilities include:
• Write and update the PBUSE EUM using MS Word and SnagIt (to capture screens) in IEEE format.
• Write and update the text used for the Help screens, on-line help (OLH), and data element help.
• Create data element help text as .html documents.
• Convert the MS Word documents to Adobe Acrobat (.pdf) format and add bookmarking.
• Update the documentation repository with the MS Word and .pdf documents using TortoiseSVN.
• Prepare the EUM Update Summary, consisting of all updates to the PBUSE EUM affecting the functionality of the PBUSE application. The document is included in the software release, allowing users to identify new/updated functionality without having to read through the entire Software Turnover list.
• Attend bi-weekly Baseline Configuration Control Board (BCCB) meetings to review suggested changes to the software and determine when those changes will be fielded.
• Review Software Build Update/Software Turnover documents to identify required PBUSE EUM text and/or figure updates resulting from new or changed application processes.
• Periodically monitor the “live” production system to identify application changes not on the Software Build Update documents and update the PBUSE EUM accordingly.
• Participate in Design Review Boards for new functionality.
• Review the Graphical User Interface (GUI) screens and application functionality to identify possible issues.
• Input help desk tickets (HDT) into the customer’s issue tracking system when GUI or application functionality issues are identified while reviewing the application functionality.
• Monitor the customer’s issue tracking system for documentation HDTs that may impact the PBUSE EUM.
• Work with the Training and Fielding team and Test team to identify possible updates to the PBUSE EUM.
• Facilitate EUM Peer Reviews [when required by the project Mission Assurance (MA) representative] and analyze findings.
• Perform edits on other project documentation as requested.

Allen Corporation of America (May 2007 – May 2009)
Worked for the Project Manager of the Logistics Information Systems (PM-LIS) at Northrop Grumman Corporation as the Lead Technical Writer for the United States Army’s Property Book Unit Supply Enhanced (PBUSE) End User Manual (EUM). Responsibilities included:
• Wrote and updated the PBUSE EUM using Microsoft Word and SnagIt (to capture screens) in IEEE format. Prior to my taking over as the lead technical writer of the EUM, it was not written in a standard format and continuity was lacking.
• Wrote and updated the text used for the Help screens and data element help.
• Created data element help text in .html documents.
• Converted the MS Word documents to Adobe Acrobat (.pdf) format and added bookmarking.
• Updated the documentation repository with the MS Word and .pdf documents using TortoiseSVN.
• Attended bi-weekly Baseline Configuration Control Board (BCCB) meetings to review suggested changes to the software and determine when those changes would be fielded.
• Reviewed Software Build Update documents to identify required PBUSE EUM text and/or figure updates resulting from new or changed application processes.
• Periodically monitored the “live” system to identify application changes not on the Software Build Update documents and updated the PBUSE EUM accordingly.
• Participated in Design Review Boards for new functionality.
• Reviewed the Graphical User Interface (GUI) screens and application functionality to identify possible issues.
• Input help desk tickets (HDT) into the customer’s issue tracking system when GUI or application functionality issues were identified.
• Monitored the customer’s issue tracking system for documentation HDTs that might impact the PBUSE EUM.
• Worked with the Training and Fielding team and Test team to identify possible updates to the PBUSE EUM.
• Participated in application software testing when requested.
• Facilitated EUM Peer Reviews [when required by the project Mission Assurance (MA) representative] and analyzed findings.
• Performed edits on other project documentation as requested.
• Performed as a software application tester as requested.

Global Consultants Inc. (GCI) (July 2006 – September 2006)
Consultant at Capital One as a Business Systems Analyst (BSA) on the Credit Risk Management – Information Technologies (CRM-IT) Team. I was assigned as the BSA on two projects.

The first project was an existing application, which used KnowledgeLink (LiveLink) as its platform and was in the maintenance and enhancement phase. When an issue was reported, I analyzed the requirements to determine whether the issue was a design defect or a functional issue. I monitored the issues to identify trends and presented them as possible future enhancements/requirement modifications. Issue tracking was done manually using an Excel spreadsheet. I tracked issues through resolution and identified stopgaps and opportunities to shorten the time to resolution. I instituted the following: a numbering system for ease of issue identification and to prevent duplicate entries of the same issue; the definition and use of priorities to assist the developers in issue resolution; and a Closed Issue worksheet added to the Excel spreadsheet to enable closed issues to be included in trend analysis. When the application owner was ready for an enhancement to be implemented, I assisted in writing the Change Requests.
The second project had completed the Feasibility phase and was in the Definition phase when I was assigned to it. The decision was made for the design, development, and maintenance of the application to be done outside of Capital One. I worked with the Project Manager to close the Definition phase, and thus the “implementation” project, so a new project could be opened using the Standard Design Methodology (SDM) guidelines for a project not using in-house resources. I assisted in writing the Statement of Work (SOW) to present to the company chosen to do the work.

Sapphire Technologies (April 2005 – September 2005)
Consultant at Capital One as a QA Manager/Test Lead on the Quality Services – Data Analytics Team. Responsibilities included:
• Oversaw testing activities of up to 7 testers.
• Reviewed project documentation, including Requirements.
• Estimated level of effort for assigned projects.
• Reviewed and approved documentation produced by the team.
• Interacted with project stakeholders to ensure proper testing coverage of requirements.
• Attended/conducted Test Entrance Criteria hand-off meetings and certified test readiness.
• Served as the Capital One testing organization’s primary contact with the vendor.
• Identified the scope of the regression effort for each assigned project/release.
• Accountable for ensuring deliverables were met.
• Managed and directed staff to achieve deliverables.
• Created and maintained project documentation.
• Communicated project status and issues at the vertical and horizontal levels.
• Controlled planning and execution of the project’s activities and resources to ensure that established cost, time, and quality goals were met.
• Coordinated project activities to ensure the project deliverables were met.
• Integrated project plan activities with other related project plans (Requirements, Development, Releases, Data Warehouse, Interfaces, etc.).
• Coordinated and led the weekly team meeting (developed agenda and meeting minutes).
• Adhered to Program Office requirements.
• Maintained the project Issues Log.
• Communicated project risks.
• Created roles and responsibilities for each team member.
• Resolved and escalated breakdowns as appropriate.
• Developed and executed test cases using TestDirector.
• Tracked and maintained test results using TestDirector.
• Tracked defects from initial identification through resolution using TestDirector.

IPC Technologies (July 2004 – December 2004)
QA Manager at Philip Morris USA (PMUSA) supporting the Export Logistics System (ELS) Replacement Project. Although contracted as the Test Lead Consultant, I also acted as a Technical Writer and Business Analyst. The Requirements Document developed by the project Subject Matter Expert (SME) was more of a talking paper: there was no distinction between requirements, design questions, and possible issues. I took the initiative to create two separate documents, an actual Requirements document and an Issues/Questions document.
• Requirements Document actions: pulled out the requirements and grouped them by business area; developed the numbering scheme to identify each requirement; identified those requirements that were unclear or ambiguous, in direct conflict with other requirements, or missing.
• Developed an Excel spreadsheet identifying Issues/Questions, divided between Technical and Business. This gave them visibility for tracking and allowed resource assignment for resolution. As the project progressed and testing began, I transitioned this into a means of identifying possible requirements changes and/or defects.
• Implemented the use of Mercury TestDirector for Test Set/Case/Scenario development and defect tracking.
• Provided desk-side training to the Project Manager and other team members in the use of TestDirector.
• Instructed the PMUSA Project Manager and SME in levels of test and testing methodologies to determine the testing requirements of the project.
• Used the projected implementation date to develop a Project Milestone Chart/Calendar to identify timelines and the suggested code cut-off date.
• Determined dates for all test deliverables for input into Microsoft Project.
• Ensured the project adhered to the PMUSA Software Development Life Cycle (SDLC) process.
• Managed the LiveLink project documentation repository.
• Developed a high-level test flow document for Interface testing.

Developed, maintained, and implemented the following:
• Requirements Traceability Matrix (RTM).
• Requirements Change Control Process.
• Change Control Log (changes were manually tracked).
• Test Incident/Defect Reporting and Tracking Procedures.
• System Development Plan (with minimal input from the PMUSA Project Manager).
• Communications Plan for test/project status.
• Test Plan identifying the Unit, Integration, and System Test activities.
• All test sets/cases/scenarios for System Test (provided Integration Test guidance).
• Detailed Test Descriptor document, by business flow, in Microsoft Word table format. (The Project Leadership Team, Business Owners, and other team members did not have access to TestDirector.)

Other responsibilities:
• Scheduled and led various review and status meetings.
• Determined the data setup required for all tests.
• Validated test results using SQL database queries.
• Tasked to review the Technical Project Plan, submitted by the project manager of the code development contractors, to ensure all requirements were addressed, and to group their deliverables into business areas, enabling the PMUSA Project Manager to track the progress of the developers. I was also tasked to map the business processes to the technical design deliverables to determine dependencies for use in determining the Deliverable Timeline.
• Tasked to ensure the code development contractors submitted the Code Review Checklist and maintained their System Design Specifications document; also tasked to review the document for errors and/or contradictions.
• Tasked to identify and list all new or modified application screens and menus.
• Tasked to write the User’s Guide/Handbook detailing the functionality and use (by business area) of the new and modified application processes.

Robert Half Technology (January 2004 – April 2004)
Technology Delivery Lead at Bank of America supporting the Automated Credit Application Processing System. Areas of responsibility included: coordinating all technology deliverables for the application; performing resource management for the current project; addressing project issues and design conflicts; managing identified issues to resolution; maintaining ongoing communication within the project team and support groups; and ensuring readiness of the development, testing, and training environments for the application.

Circuit City Stores, Inc. (June 2000 – September 2003)
Member of the Software Quality Assurance/Testing and Validation team, working on the e-Commerce QA/Testing team for the circuitcity.com web site. This 5-member team rotated Test Lead responsibilities.
• When acting as the Test Lead, I was responsible for managing the test activities of the team as well as up to 10 guest testers.
• Managed and executed System Test and provided guidance and support to developers during Integration Test for major site updates, most notably the ability to order music, movies, and games, and the ability to purchase and arrange home delivery of big-screen TVs.
• Assigned as the Test Lead on an extensive overhaul of the software application used by the Fulfillment Centers. System and Project Test Lead responsibilities included:
o Writing/editing Test Cases using TestDirector.
o Writing/editing Test Plans, Implementation Plans, outgoing customer e-mails, and help/web page text.
o Providing instructions to guest testers and, when needed, desk-side training in the use of TestDirector.
o Leading daily test status and other meetings.
o Defect tracking using TestDirector.
o Configuration Management.
o Implementing and fostering developer awareness of the division’s Capability Maturity Model Integration (CMMI) guidelines.
o Participating in Requirements, Code, and Design reviews.
o Developing and maintaining the Requirements Traceability Matrix for numerous projects.
• Implemented Best Practices methods within the e-Commerce QA/Testing team, including the development of Standard Operating Procedures (SOPs), writing How-To Guides, and development of the System Test Plan document.

TRW (January 1998 – April 2000)
Upon TRW being awarded the contract formerly held by CSC:
• Continued as a member of the Conversion Team and was assigned as the Lead Analyst of the Help Desk Team, providing technical support for all levels of SARSS.
• Performed as the Test Director for both SAAS-Mod and ULLS-A during the US Army Year 2000 (Y2K) Certification testing for the Project Manager, GCSS-Army. Testing included independent verification and validation of each system, as mandated by the Department of Defense. Developed the strategy and demonstrated the business process flow for selected operational scenario tests; developed the Y2K Certification Test Plan and Report IAW MIL-STD-498; ensured all aspects of Y2K Certification testing were coordinated and approved by government representatives; ensured all Certification test artifacts were captured, annotated, and compiled into a deliverable Y2K Certification Package for the Program Executive Office, Standard Army Management Information Systems (PEO STAMIS); prepared the inclusive Y2K Certification Package presented at the Certification Review Board (CRB) and presented the ULLS-A retest package.
Also provided strategy development assistance and documentation editing for all other Standard Army Management Information Systems (STAMISs) the team was responsible for. Assisted in preparing the Test Report for the Army-wide End-to-End Level 1 test for Y2K, and assisted in the development of Configuration Management (CM) procedures for follow-on testing to ensure the integrity of the Y2K-certified baselines. Also developed the format for the Memorandum for Record (MFR) used to respond to the analysis of proposed baseline changes to Y2K-certified systems.
• Conducted performance evaluation and assurance analysis for Y2K compliance of STAMISs for complex, large-scale U.S. Army logistics systems. Monitored STAMIS test procedures and provided assessment of the overall Y2K certification processes. Performed risk analysis and renovation design reviews for deficiencies encountered during system/subsystem compliance phases. Ensured end-user needs were met and that the century change had minimal impact on operations. Monitored Deployment Validation testing at End User operational test sites.
• Performed risk analysis and renovation design assessments to ensure application software for the Y2K-certified baselines remained compliant. Provided support to end users after Y2K implementation. At the client’s request, I was their representative on a “Tiger Team” formed to identify and correct a print problem between the Standard Army Retail Supply System – Level 1 (SARSS1) and the Intermec 4400 printer. Was the Lead Analyst ensuring the Y2K Configuration Management procedures were adhered to by the Standard Army Ammunition System – Modified (SAAS-Mod), Unit Level Logistics System – Aviation (ULLS-A), and Integrated Logistics Analysis Program (ILAP). Also provided guidance to the Lead Analysts for SARSS (all 3 levels), ULLS-S4, Standard Property Book System – Redesign (SPBS-R), and the Department of the Army Movements Management System (DAMMS).
• Worked as a sub-contractor to GRC International as the Supply Module Test Manager for the Global Combat Support System – Army (GCSS-Army) program. This position required training the other team members in the basics of the Army supply system as well as the functionality of the two systems being replaced by GCSS-Army. Responsible for writing the test cases for the Materiel Item Order component of the Supply module IAW IEEE standards. Also wrote Business Rules and Module/Design Specifications while working with the Software Design and Development Team.

Computer Sciences Corporation (CSC) (February 1996 – January 1998)
As a member of the Conversion Team:
• Performed technical support services for the Standard Army Retail Supply System (SARSS) for the US Army Information Systems Software Development Center, Fort Lee (SDC-L). Provided Database Management System (DBMS) consulting assistance and technical support to SARSS users, including maintaining communications connectivity. Conducted formal UNIX operating system, System Administration, and Non-Developmental Item (NDI) equipment training.
• Conducted formal training for personnel assigned as Network, Database, and Systems Administrators on the HP-K200 mini-mainframe and assisted users in setting up their TCP/IP and FTP communications tables to ensure uninterrupted data flow between all levels of the US Army logistics systems.
• Configured systems to meet requirements for operating SARSS at the Division and Supply Support Activity (SSA) levels.
• Configured Network Controllers and hand-held scanners to operate using Radio Frequency (RF) connections.
• Made corrections and updates to the technical End User Manuals (EMs) for SARSS-2AC/B and SARSS1.
• Prepared tailored instructional materials to assist in training and job performance. Communicated with customers concerning technical problems relating to SARSS.
• Provided on-site customer support, telephonic customer support, and general troubleshooting services for users.
• Performed site surveys to obtain data and provide customers with information regarding future conversions to SARSS.
• Took a great risk of personal injury to meet the client’s needs for on-site monitorship and technical support by volunteering for and participating in Joint Task Force Eagle (Operation Joint Endeavor: Bosnia and Croatia). This entailed attending and passing an Army-executed Situational Training Exercise (STX) in preparation for entering a country in conflict.

Information Technology Solutions (ITS) (March 1995 – February 1996)
While performing as a Unit Level Logistics System (ULLS) systems analyst supporting the Software Development Center – Lee (SDC-L), I developed and implemented a Requirements Traceability Matrix for the Unit Level Logistics System – S4/Ground/Aviation (ULLS-S4/G/A) baselines. Reviewed, updated, and made functional corrections to the system Functional Description (FD), End User Manuals (EMs), and System Support Manual. Prepared and conducted internal training for ULLS-G users as well as "train-the-trainer" training at various sites around the country. Developed a working knowledge of AdaSage. Performed troubleshooting and STAMIS software installation for ULLS-G on an MS-DOS-based system. Assisted in converting hard-copy documentation to on-line format using Hypertext.

Synoptic Systems Corporation (SSC) (November 1987 – March 1995)
• Performed as an analyst and senior trainer supporting the Software Development Center – Lee (SDC-L) Standard Army Retail Supply System (SARSS).
• As the Manager of the SARSS-2AC/B Documentation section, developed/oversaw development of SARSS-2AC/B training materials and End User Manuals for a UNIX-based operating system. Prepared conversion documentation and assisted in incorporating approved Software Change Package changes into the End User Manuals and training materials.
• Performed monitorship on a Sperry 5000 hardware platform for the first fielding of the SARSS-2AC/B Software Acceptance Test.
• Deployed and performed customer assistance in Southwest Asia during Operation Desert Shield. Responsible for maintaining the Engineering Change Package – Software (ECP-S) database in INFORMIX on a UNIX hardware platform and the ECP-S Tracking System (ETS) on an MVS/XA operating platform for all levels of SARSS.
• Managed a regular team of testers performing System and Regression testing for the Army’s automated supply system. On a regular basis, led a team of Major Army Command (MACOM) representatives from all over the world to conduct User testing. Conducted daily test status meetings and determined the scope of testing to occur the next day. Developed the Schedule of Events for all software quality testing and integrated testing within all levels of SARSS.
• Participated in the Signal Battle Lab at Ft. Gordon, GA, as the SARSS-2AC/B representative in a worldwide communication capability demonstration.
• Assisted in the development of executive and utility routines used in maintaining the BLAST communications software package, which allowed the End User Workstation (EUWS), an MS-DOS PC, to interface with a UNIX Sperry 5000 (later HP-UNIX), and with a SCO UNIX server on a PC and SCO UNIX on a Harris Night Hawk through a Local Area Network (LAN).
• Conducted a Needs Assessment of the users’ INFORMIX Structured Query Language (SQL) requirements. Developed and maintained SQL Lesson Plans and User documentation. In addition, provided train-the-trainer and user training in the creation of reports and other uses of SQL.