27 May 2008. Prepared for: Commander, MARCORSYSCOM, Program Manager, Advanced Amphibious Assault (PM AAA)



1.2 Business Approach

1.2.1 Contract Management


Stanley has a dedicated Contracts Administration (CA) staff to manage, maintain, monitor and provide contract-related reports to our customers. Our Program Managers (PMs) and Task Order Managers (TOMs) are responsible for interfacing with our corporate CAs to ensure compliance with contractual requirements. Stanley has developed Contract Management Plans (CMPs) for some of our customers, and we stand ready to provide the same service to PM AAA when required. Under this management structure, all contract modifications will be processed through the designated CA member for each Stanley contract.

1.2.2 Financial Oversight


Our PMs, TOMs and corporate representatives ensure transparency of our financial performance through effective application and management of automated financial and human resources tools, such as Cognos, Deltek, Employee Self Service (ESS), SharePoint portals, and various internal administrative tools. Our corporate representatives provide direct support to our project leadership to ensure sound financial management and reporting. The systems that we use provide capabilities for collaborative problem sharing, communications, invoice tracking, and document storage. Actual expenditures are promptly posted as they become available to our Accounts Payable department. This allows automatic infusion of expenditure data into the Billing Department using our DCAA-approved billing system. All invoices are reviewed and validated by our PMs to ensure accuracy of billing information.

1.2.3 Cost, Schedule and Performance Management


Our designated PM is Mr. Ed Tudela, who will be responsible for the contract and interfacing with Mr. Westerbeck, PM AAA Project Officer, for all contractual matters. Mr. Tudela will be supported by our corporate staff to ensure effective contract and financial management.

Our designated TOM is Mr. Clift Bergeron, who will be responsible for the day-to-day management of this task order and the performance of our team members, including subcontractors. Mr. Bergeron will report directly to Mr. Tudela and provide the front-line interface to Mr. Westerbeck for all technical matters of this task order to ensure effective and efficient cost, schedule and performance management.


1.3 Technical Approach


Team Stanley understands that an operationally effective and suitable EFV must be delivered in a timely and cost-effective manner to support the Marine Corps Expeditionary Maneuver Warfare (EMW) and Ship-to-Objective Maneuver (STOM) warfighting concepts. The EFVP1 (Personnel Variant) and EFVC1 (Command Variant) provide the means of tactical mobility for the Marine Rifle Squad and Commanders during the conduct of amphibious operations and subsequent ground combat operations ashore. The challenges of ensuring precise engineering, rigorous testing, effective and concise training, and maintainability continue through the System Development & Demonstration (SDD) phase of the EFV system acquisition life cycle. We envision continuing SDD challenges as the EFV end item and all its supportability, maintainability and reliability components evolve and mature.

Specific to this Statement of Work (SOW), Team Stanley is prepared to apply our proven capabilities in complex systems engineering, MPT analysis, and most recent FEA accomplishments. Our systems engineering approach is based on Institute of Electrical and Electronics Engineers (IEEE) industry standards, best practices recommended by the International Council on Systems Engineering (INCOSE), and DoD system acquisition guidelines. These include IEEE STD 1220-1998, Standard for Application and Management of the Systems Engineering Process; DoD 5000.2, Operation of the Defense Acquisition System; the DoD Certification and Accreditation Process Guidance (DIACAP); DoD Directive 8320.1, Data Management; and other systems engineering governance.

Our interdisciplinary approach enables us to focus on defining customer needs and required functionality early in the development cycle, documenting requirements, and proceeding through the system development life cycle while considering the complete solution. Our approach to meeting your FEA requirements will be based on the framework of Marine Corps Order 1200.13F, Marine Corps FEA Program, supported with lessons learned and best-practice methods drawn from our in-depth experience. The following sections provide our detailed approach to meeting the specific requirements of each task.

1.3.1 Systems Engineering Support [SOW 2.1.1]


Understanding

Team Stanley establishes a solid analytical framework to support multiple concurrent efforts in direct support of the development of EFV training system products to be deployed in a cohesive, integrated, and cost-effective manner. We understand this is a comprehensive management and technical effort that focuses on stakeholder collaboration for analyses of operational and maintenance tasks for front-end training development. We provide systems engineering support for the continuous integration of personnel, training, and human systems integration (HCI) requirements into compliant DoD SE processes. To this end, a system must be defined to reduce task times, number of procedures, and maintenance complexity by proactively establishing and executing information assurance (IA) strategies, plans and continuous IA support throughout the system life cycle. Information systems support, which leverages intranet portals and non-federated databases, must be aligned with MPT EFV training system product goals. Instructional system design (ISD) and information management for the training FEA and document generation supporting training system requirements analysis reporting are an integral part of the MPT training product strategy.



Approach

Our primary focus on this task is the systems engineering (SE) support activity, which provides a holistic view from the perspective of a comprehensive systems engineering management (SEM) framework, as depicted in Figure 1. This focus requires a systematic approach to ensuring that the fundamentals of SEM and the execution of systems engineering processes (SEP) are consistent with the DoD systems acquisition life cycle frameworks. Compliance with the applicable DoD and IEEE standards and guidelines, as well as the application of industry best business practices, are key factors in meeting PM AAA's task objectives.

Stanley will support the PM AAA System Engineer with primary emphasis on training systems’ technical framework and standards. We will ensure that the systems engineering and acquisition strategies and processes are consistent with PM AAA’s systems engineering and acquisition objectives within MPT.

Figure 1: Systems Engineering Integrated Activity


The application of the SEM framework involves a succession of decisions among various courses of action (COAs) through the SE Integrated Activity [Reference: DAU SE Fundamentals, January 2001] depicted in Figure 1, which embraces the system life cycle stages consistent with DoD system acquisition guidelines.

The primary goal of SE is to define and understand the requirements, constraints and design alternatives to support those decisions, coupled with oversight of their implementation. Furthermore, SE ensures that the diverse elements comprising a system are compatible and ready when needed.



In relationship with the system life cycle processes, SEM focuses on three principal dimensions of the SE integrated activity: 1) Development Phasing; 2) Life Cycle Integration; and 3) SEP.

  • Development Phasing – controls the design process and provides baselines that coordinate design efforts. This dimension correlates with the SDD phase of the DoD acquisition life cycle and involves the preliminary planning activity to:

  1. Ensure that all technical activities are identified and managed. This will involve an immediate assessment of all technical activities that are primarily focused on EFV training systems.

  2. Communicate the technical approach to the broad development team. We envision significant involvement with Integrated Product Teams (IPTs); technical working groups for logistics, FEA, training devices, and courseware; and external stakeholders. An effective Communications Plan may have to be developed to gain consensus on what, when, where and how to communicate the requirements of the EFV training systems developmental activity.

  3. Document decisions and technical implementation. A decision support system (DSS) must be available, supportable and maintainable to ensure effective data management (DM) of information crucial to the implementation of the EFV training systems. We envision an immediate assessment of the available DSS applications on the approved USMC software list. We will implement a DSS to support effective DM of all developmental activities associated with the EFV training systems. The DSS should effectively capture the decisions that support the developmental activities, which may include: engineering plans; schedules; technical plans; functional, physical, and system architectures; rationale for design decisions; trade-off analyses accomplished, including recommendations and impacts; effectiveness assessments and their outcomes; risk assessments and handling options; sketches and drawings; engineering changes; specification trees; specifications and configuration baselines; operational environment; manpower, personnel, training, and human engineering requirements and specifications; archival data; technical objectives, requirements, constraints, interfaces and risks; System Breakdown Structures (SBS); design models; test results; metrics; and any other data that allow for traceability of requirements throughout the functional and design architectures.

  4. Establish the criteria to determine how well the system development effort is meeting customer and management needs. We envision supporting the Test and Evaluation (T&E) activity to evaluate the effectiveness of the EFV training systems. This may include the development of Test Plans (TPs); IA strategies and plans in accordance with DIACAP guidelines; assessment of HCI issues; and overall support to EFV training systems testing and fielding activities.

  • Life Cycle Integration – involves the customers in the design process to ensure that the design solution is viable throughout the life of the system. This dimension overlaps all phases of the acquisition life cycle. It includes, but is not limited to:

  1. Planning associated with product and process development. As the EFV evolves through the SDD phase, plans are reassessed, revised and developed to meet the exit criteria of the SDD phase. This involves a myriad of development activities, including assessing the outcome of the Preliminary Design Review (PDR), developing associated engineering and acquisition documentation to support the Critical Design Review (CDR), and ensuring compliance with other DoD acquisition milestone documentation requirements. We envision our SE support focusing on reviewing the results of the PDR; determining the critical issues arising from it; evaluating whether those issues have been resolved, or whether an effective plan is in place to resolve them prior to the scheduled CDR; evaluating the planning for CDR; determining what key tasks must be accomplished to be ready for CDR; defining internal exit criteria for CDR; assessing whether the CDR criteria are reasonable to support EFV training systems production; and identifying the COAs and Plan of Actions and Milestones (POA&M) that must be accomplished prior to CDR, given the amount of time scheduled between CDR and production. After CDR, critical issues must be clearly identified and COAs planned to resolve them prior to scheduled production.

  2. Integration of multiple functional concerns into the design and engineering process. Multiple functional areas and stakeholders are interested in developing and producing world-class EFV systems, as well as in training users on how to use them. We envision a collaborative environment where government and contractor stakeholders share vital information to ensure that measures of effectiveness (MOEs) and measures of performance (MOPs) are incorporated into the design considerations.

  • Systems Engineering Processes – controls the SE activity from cradle to grave, as depicted in Figure 2, and includes eight SEPs—Requirements Analysis; Requirements Validation; Functional Analysis; Functional Verification; Synthesis; Design Verification; Systems Analysis; and Controls—all having multiple sub-processes. This dimension correlates with the IEEE standards and the DoD Systems Engineering Life Cycle (SELC) stages.

  • Since PM AAA is interested in implementing best business practices, we envision exercising SEP controls that include: Data Management; Configuration Management; Interface Management; Risk Management; Performance Management; Systems Analysis and Verification; Test Data; Requirements and Design Changes; Performance against Project Plans; Performance against Technical Plans; Product and Process Metrics; Specifications and Configuration Baselines; Requirement Views and Architectures; and Engineering and Technical Plans.

Figure 2: Typical System Engineering Life Cycle Stages



Process Execution

Within our approach are requirements to structure and facilitate process execution, coordinate stakeholder inputs, and ensure appropriate vetting at each level within the decision-making framework. This includes, but is not limited to, applying the results of analytical processes to effectively support recommendations and gain stakeholder concurrence. These efforts are supported with a sound analytical and engineering technical approach that provides for analyses and stakeholder vetting of recommendations at each step of the process. Results of the analysis will include thorough documentation of findings, interim briefings, and formal presentations with recommended changes. The principal analysis steps are to:




  1. Conduct an Operators Job & Task Analysis (JTA)—meet and collaborate with several SME track operators to determine collective and individual tasks required to operate the EFV.

  2. Conduct a Maintainers JTA—meet and collaborate with several SME track mechanics to determine collective and individual tasks required to maintain the EFV.

  3. Develop an Operator Master Task List (MTL)—determine each operator task's conditions, standards, performance steps, and priority.

  4. Develop a Maintainer MTL—determine each maintainer task's conditions, standards, performance steps, and priority.

  5. Perform Training Course Analysis for the operators—determine the optimal methodology to incorporate operator tasks developed within the MTL into learning objectives and lessons to be taught.

  6. Perform Training Course Analysis for the maintainers—determine the optimal methodology to incorporate maintainer tasks developed in the MTL into learning objectives and lessons to be taught.

  7. Develop an Operator Learning Analysis Report (LAR)—combine all operator related data into a single consolidated report.

  8. Develop a Maintainer LAR—combine all maintainer related data into a single consolidated report.
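The analysis pipeline above (JTA to MTL to Training Course Analysis to LAR) lends itself to a simple structured data model. The sketch below is purely illustrative: the class, fields, and sample tasks are our assumptions, not the program's actual tooling.

```python
from dataclasses import dataclass

@dataclass
class MTLEntry:
    """One Master Task List entry: task, conditions, standards,
    performance steps, and priority (hypothetical schema)."""
    task: str
    conditions: str
    standards: str
    performance_steps: list[str]
    priority: int  # 1 = highest priority

def build_lar(entries: list[MTLEntry]) -> dict:
    """Consolidate MTL entries into a single Learning Analysis Report
    structure, ordered so the highest-priority tasks surface first."""
    ordered = sorted(entries, key=lambda e: e.priority)
    return {"task_count": len(ordered), "tasks": [e.task for e in ordered]}

# Invented sample tasks, for illustration only
mtl = [
    MTLEntry("Operate driver's station", "Daylight, calm seas", "Per TM",
             ["Start engine", "Engage transmission"], 2),
    MTLEntry("Egress through troop hatch", "Night", "Within standard time",
             ["Unlatch hatch", "Open and exit"], 1),
]
lar = build_lar(mtl)
```

The same structure serves both the operator and maintainer pipelines; only the source task lists differ.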

EFV Training Product Support


Figure 3: Instructional Systems Design (ISD) Process

Team Stanley will use the ISD process depicted in Figure 3 to develop training curricula and other training materials, incorporating a systematic and proven approach to training and instructional design. The ISD model is an evolution of the proven methodology we have applied with success in all of our Team's courseware products. Our approach to the design and development of training products is based on a detailed understanding of the training environment, instructor and student needs, and a thorough understanding of the subject matter. While each phase of the ISD process represents a linear approach, we recognize that the phases are interactive and that feedback from or within each phase may require revisions in other phases. For example, we conduct evaluations of both the training material and the instructor after each training delivery, which we use to update training materials and to help our instructors improve training delivery. ISD is truly a system with inputs, outputs, and a feedback process.

This approach develops consistent business practices and acquisition processes for MPT to ensure current compliance with DoD systems engineering processes. ISD and information management for the training front-end analyses, document generation, and requirements analysis reporting are supported as an integral part of MPT training product strategy.


1.3.2 Front End Analysis [SOW 2.1.2]


Understanding

The EFV program is required to conduct a FEA that complies with Marine Corps Systems Command (MCSC) and Training & Education Command (TECOM) requirements. To meet this requirement, the EFV Program Office will utilize the ISD approach to make decisions about the “who, what, when, where, why and how” of training to support EFV. The ISD entails five basic steps:



  1. Baseline analysis – assess EFV human system requirements, compile a task inventory of all tasks associated with each requirement, identify tasks that need to be trained (needs analysis), build performance measures for the tasks to be trained, choose instructional media (classroom, on-the-job, self study, simulator, etc.), and estimate the associated training costs.

  2. Training system design – develop learning objectives (terminal and enabling) for each task, identify the learning steps required to perform each task, develop performance tests to show mastery of the tasks (e.g. written, hands on, etc.), establish entry skills that the marine must demonstrate prior to training, and finally sequence and structure the learning objectives.

  3. Training system development – identify activities that will help the marine learn the task, select the delivery media, develop instructional courseware, synthesize the courseware into a viable training program and validate the instruction to ensure it accomplishes goals.

  4. Training system implementation – create a management plan for conducting the training, then actually conduct the training.

  5. Evaluate the training system – review and evaluate the first four phases to ensure they are accomplishing what was expected, perform external evaluations (e.g. observe that the tasks that were trained can actually be performed in an operational setting), and revise the training system to improve it.
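As one concrete example of step 2 above (sequencing and structuring the learning objectives), prerequisite ordering can be treated as a topological sort over enabling objectives. The objectives and their dependencies below are hypothetical:

```python
from graphlib import TopologicalSorter

# Map each objective to the enabling objectives that must precede it
# (invented names, for illustration only)
enabling = {
    "Drive EFV ashore": {"Start engine", "Operate controls"},
    "Operate controls": {"Start engine"},
    "Start engine": set(),
}

# static_order() yields objectives with all prerequisites first
sequence = list(TopologicalSorter(enabling).static_order())
```

The resulting sequence guarantees a marine is never presented an objective before its enabling objectives, which is precisely the "sequence and structure" requirement of the design step.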

The FEA—also referred to as a Training System Requirements Analysis (TSRA)—focuses mainly on steps 1 – 3 above and therefore, the support that we provide under this task will also focus on these steps. As shown in Figure 4, the FEA generally progresses through four major analysis activities. The general flow of the analysis starts with the identification and verification of a training need (operator, maintenance and team), proceeds with the identification of alternative solutions, and culminates with the detailed specifications required for training acquisition.

Figure 4: Training Systems Requirements Analysis Process


The Training Needs Analysis identifies all tasks associated with the system, maps them to required skills and abilities, conducts a training gap analysis, establishes specific tasks that need to be trained and identifies candidate training media. One of the major components of the FEA is a Master Task List (MTL), which provides an analytical basis for task and skills analysis and is the foundation for a bottom-up build of training requirements. Results of the training needs analysis provide a baseline for the next analytical phase, which addresses media selection and cost.
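At its core, the training gap analysis described above reduces to a set difference between the skills each task requires and the skills already covered by existing training. A minimal sketch, with invented task and skill names:

```python
# Hypothetical mapping of system tasks to required skills
task_skills = {
    "Ship-to-shore transit": {"waterborne ops", "navigation"},
    "Troop hatch PM checks": {"hydraulics PM", "safety procedures"},
}

# Skills already covered by existing training (also invented)
covered_by_existing_training = {"navigation", "safety procedures"}

def training_gaps(task_skills, covered):
    """Return, per task, the required skills not yet trained.
    Tasks with any missing skill are candidates for the MTL."""
    gaps = {}
    for task, skills in task_skills.items():
        missing = skills - covered
        if missing:
            gaps[task] = missing
    return gaps

gaps = training_gaps(task_skills, covered_by_existing_training)
```

In practice the mapping comes from the JTA and source documents, but the gap computation itself is this simple; the analytical effort is in building accurate inputs.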

Media and Cost Analysis develops alternative options to provide the required training identified in the Training Needs Analysis. Various training system alternatives for meeting the training requirements are identified and described. These alternatives are formulated after conducting a media analysis, training technology assessment and a training effectiveness analysis. During these assessments, current and evolving instructional technologies are surveyed and their training capabilities and effectiveness are determined. A cost benefit analysis across multiple dimensions is performed on the alternatives. Return-on-investment (ROI) calculations are included when appropriate. Tradeoff areas may include cost and other resource requirements, estimated training effectiveness, engineering risk, schedule implications, MPT requirements, reliability, maintainability and safety considerations. This step concludes with an Instructional Media Requirements Document (IMRD), which includes a complete description of the alternatives and a recommended solution with supporting rationale. The IMRD is normally submitted to the sponsor for review and selection of the alternative.
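The cost-benefit comparison and ROI calculation described above can be sketched as follows; the alternatives, costs, and effectiveness scores are invented for illustration and are not program data:

```python
# Hypothetical training system alternatives with notional figures
alternatives = [
    {"name": "Full-motion simulator", "cost": 12.0e6,
     "effectiveness": 0.90, "annual_savings": 3.0e6},
    {"name": "Desktop trainer", "cost": 1.5e6,
     "effectiveness": 0.60, "annual_savings": 0.8e6},
]

def roi(alt, years=5):
    """Simple ROI over a fixed horizon: (total savings - cost) / cost."""
    return (alt["annual_savings"] * years - alt["cost"]) / alt["cost"]

# Rank alternatives by training effectiveness per dollar of life-cycle cost
ranked = sorted(alternatives,
                key=lambda a: a["effectiveness"] / a["cost"],
                reverse=True)
```

A real IMRD analysis would weigh additional tradeoff dimensions (engineering risk, schedule, MPT requirements, safety), but the effectiveness-per-cost ranking and ROI screen shown here capture the quantitative skeleton.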

Once a training alternative is selected and approved for implementation, a Training Device Requirements Document (TDRD) may be required by the sponsor for Program Objective Memorandum (POM) programming purposes. The TDRD summarizes the proposed training system, the training requirement, the training situation in which the system will be employed, and the resources required to develop, use, and maintain the training system. The TDRD typically focuses on training elements with high life cycle cost, such as simulators and the establishment of a new schoolhouse.

Finally, a Training System Functional Description (TSFD) is developed to define the basic physical and functional baseline requirements of training devices specified in the IMRD. It describes how the device will be developed, consistent with any known constraints on cost, producibility and supportability. The TSFD defines the device that will be delivered to the user and also includes information regarding the facilities and logistics elements necessary to support training. In its final form, an approved TSFD provides system specifications to support industry proposals to supply the device.

Technical Approach

It is anticipated that contractor ISD and Information Management support will encompass all phases, tasks and deliverables associated with an FEA as described above. The general technical approach for each specific FEA support requirement identified in the SOW is described in Table 1-1.



Table 1-1: FEA Approach to Requirements Mapping

Support Requirement

Technical Approach

Continue to develop and update the EFV Training Systems individual, crew, and collective task lists for the operator and maintainer. The documents shall identify tasks; task difficulty, importance and frequency; knowledge; skills; training location(s); perishability; and media recommendations.

  • Establish close working relationship with PM AAA, Fleet and stakeholder points of contact (POCs) and subject matter experts (SMEs)

  • Consolidate & manage source documents used to identify tasks and frequency

  • Utilize established methodology and/or SMEs to identify difficulty, importance, and knowledge, skills and abilities (KSAs)

  • Establish approved document formats that facilitate analysis, summary reporting and Enterprise access

Perform media selection analysis using the vehicle's individual, crew, and collective tasks and apply the media selection models to the corresponding individual, crew, and collective task media pool. This will ultimately produce the Instructional Media Requirements document (Media Selection Analysis), which will also include the Training System Alternatives to meet the training media requirements.

  • Use established media selection models and/or guidance (e.g. MIL-HDBK-29612-2A)

  • Utilize available DoD resources (e.g. websites, Predeployment Training Program (PTP) Toolkit, etc.)

  • If required, develop new media selection models – coordinate with TECOM

  • Conduct interim review of media analysis results with AAA, Fleet and stakeholder POCs and SMEs

  • Conduct analysis and document results in a way that facilitates IMR document production

Collect, aggregate and report elements of training simulator cost pertinent to training cost-benefit decision-making using a cost analysis model, and document these results in the Instructional Media Requirements Document. Using these cost analysis results as an input, a cost-benefit analysis of Training System Alternatives will be conducted.

  • Adapt DoD guidelines for an Analysis of Alternatives (AoA) to the Training System CBA

  • Promulgate Requests for Information (RFI) to industry if required to collect life-cycle cost data

  • Construct appropriate training system alternatives

  • Identify suitable training measures of effectiveness (MOEs) (suitability, adaptability, training time, etc.)

  • Screen initial alternative set based on approved criteria

  • Conduct effectiveness analysis on remaining alternatives

  • Estimate costs via approved cost model(s)

  • Compare training system alternatives on the basis of cost and effectiveness

Conduct a fidelity analysis of the selected Training System Alternative and document in the functional description section of the Instructional Media Requirements Document.

  • Identify specific capability requirements for each component of the selected alternative

  • Solicit input from future operational users

  • Conduct sensitivity analysis on training effectiveness of the selected alternative

  • Refine life cycle cost estimate to include component level

  • Develop functional descriptions for each component of the selected alternative (sub-system descriptions, employment concepts, limitations)

Support Embedded Training Requirements (ETR) Analysis.

  • Integrate identification of ETR into the MTL

  • Nominate identified tasks and performance objectives for ET based on selected properties (criticality to mission, perishability, etc.)

  • Utilize established computer-based tools

  • Document ETR and analytical basis

Support Master Task List Development.

  • Establish automatic links from supporting documents and lists to the MTL

  • Integrate all tasks, training and learning objectives and everything else that relates to training

  • Integrate capability to reflect changes and do “what-if” analysis

Perform various human factors analyses (HFA) of operational and maintenance tasks and develop Human Performance Measurement Specifications.

  • Design Systems individual, crew, and collective task lists to support HFA

  • Apply established HFA methodology, such as Probabilistic Risk Assessment (PRA), Technique for Human Error Rate Prediction (THERP) and Cognitive Reliability and Error Analysis Method (CREAM)

  • Identify and develop generic performance measures, then tailor to EFV based on task list

  • Develop HP specifications by relating performance measures to system performance requirements, such as Key Performance Parameters (KPPs)

  • Refine, change and document

Provide an Enterprise means of delivering FEA data so that users at various locations have real time access to the data. This effort shall be coordinated with PM AAA CIO to ensure consistency with Program IT objectives.

  • Assess current Enterprise and PM AAA IT capabilities

  • Develop FEA data delivery alternatives

  • Conduct feasibility study

  • Recommend delivery strategy/process

  • Implement approved delivery strategy/process

Provide FEA data in a format that will allow integration with other Logistics databases

  • Coordinate with PM AAA Logistics Lead

  • Identify current database capabilities/limitations

  • Identify FEA data base capability requirements

  • Recommend FEA data base format(s)

  • Implement approved formats

Staffing Approach

The ISD and a properly conducted FEA require a wide range of MPT, engineering, and other professional skills. While some skills will be required across the entire FEA process, others may be concentrated at certain phases. Table 1-2 below provides an assessment of the support staff skill sets required for each FEA support requirement identified in the SOW.



Table 1-2: FEA Staffing Support Matrix

Support Staff Skill Sets:

  • Manpower & Personnel
  • Training Systems Analysis
  • Human Factors Engineering
  • Human Resource Management
  • Performance Analysis
  • Anthropometry & Biomechanics
  • Industrial Safety
  • Applied Probability & Statistics
  • Modeling & Simulation
  • Data Management
  • Test & Evaluation
  • Sampling & Survey Development
  • Cost Analysis

General Support Tasks:

  • Develop / update task lists
  • Conduct media selection analysis
  • Collect & aggregate training system cost elements
  • Conduct CBA of training system alternatives (TSAs)
  • Conduct fidelity analysis of preferred TSA
  • Support embedded training (ET) requirements analysis
  • Support MTL development
  • Perform human factors analysis
  • Develop Human Performance Measurement Specifications
  • Provide enterprise means of delivering FEA data to users
  • Integrate FEA data into other logistics databases
Based on the range, depth and timing of required skill sets, establishing a fixed contract support staff over the entire FEA period of performance may be inefficient and sub-optimal. Therefore, rather than provide a fixed set of staff and skill sets, our approach is to provide a single, highly qualified, experienced and credible Full Time Equivalent (FTE) who is dedicated to supporting the MPT Division and focused on completion of the EFV FEA. This single FTE will then pull from a cadre of SMEs who can be assigned full time to the FEA for an extended duration, part or full time for shorter periods during selected analysis phases, or episodically as unique conditions of the FEA dynamics dictate. Given the emphasis on quality personnel in the SOW, this core of SMEs would be made up of personnel with advanced degrees and/or significant experience (more than 10 years) in the following disciplines:

  • Applied Mathematics
  • Computational Statistics
  • Computer Science
  • Cost Analysis
  • Human Resource Management
  • Human Systems Engineering
  • Industrial Safety
  • Information Technology
  • Manpower Personnel & Training
  • Operations Research
  • Personnel Management
  • Systems Engineering

As a further augmentation to the FTE + SME Core concept, our staffing approach also provides for “reach-back” support across a wider range of skills and talents, such as those listed below.

  • Computer Programming
  • Concept Development
  • Decision Support
  • Financial Management
  • Lean Six Sigma
  • Logistic Support
  • Modeling & Simulation
  • Network Systems Design
  • Occupational Health
  • President's Budget (PB)
  • Program Management
  • Programming/Optimization
  • Sampling & Surveys
  • System Acquisition
  • Test & Evaluation
  • Training Systems Design
  • Training Course Design
  • Joint Capabilities Integration Development System (JCIDS)

Finally, the FTE and SME Core along with individuals providing reach-back support to the FEA would have a significant military background and span all relevant USMC Officer Military Occupational Specialties (MOSs).



We envision that our FTE + SME Core team composition will be consistent across the period of performance with individual level of effort tailored to the FEA requirements. The following notional scenario illustrates how the FTE + SME Core + Reach-back approach would be applied to support the EFV FEA:

  • In response to a tasker from the FEA Coordinator to update the EFV MTL, the FTE assigns appropriate sections to three Core SMEs with experience in training systems development, USMC MOS nomenclature, and amphibious assault operations. One of the SMEs has become the MTL resident expert.

  • During review, the SMEs determine that: (1) preventive maintenance tasks associated with the rear troop hatch have been omitted; and (2) tasks flagged as EO candidates in the individual/crew/collective task lists do not roll-up into the MTL.

  • After further research, the assigned SME determines that no source documents exist to establish preventive maintenance requirements for the troop hatch. After acquiring PMS cards from the hatch supplier, the SME identifies specific task requirements and then enlists reach-back assistance to establish corresponding training and enabling objectives. The maintenance task list and MTL are updated to reflect the new information.

  • To address the task list roll-up issue, the FTE enlists reach-back assistance from a database expert. Working with the MTL SME, the database expert quickly establishes and tests linkages between the task lists, and in the process identifies ways to upgrade the capability of the task lists while also making them more compatible with potential future web enablement.

  • The FTE advises the FEA Coordinator that the MTL is up to date, describes the linkage enhancements that were made, and summarizes potential upgrade options. The FEA Coordinator requests a more detailed assessment of what it would take to web-enable the MTL.

  • The FTE forms a Tiger Team consisting of the MTL SME and IT reach-back support to conduct the assessment.
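
The roll-up defect in the scenario above is, at bottom, a referential-integrity problem between task lists and the MTL. As a purely illustrative sketch (the schema, table names, and sample tasks are hypothetical; the actual EFV task databases are not described in this document), the linkage and the audit for unlinked EO-candidate tasks could look like this:

```python
import sqlite3

# Hypothetical minimal schema: individual/crew/collective task-list entries
# roll up into the Master Task List (MTL) via a foreign-key linkage.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE mtl (mtl_id INTEGER PRIMARY KEY, title TEXT);
CREATE TABLE task_list (
    task_id      INTEGER PRIMARY KEY,
    source       TEXT,     -- 'individual', 'crew', or 'collective'
    title        TEXT,
    eo_candidate INTEGER,  -- flagged as an Enabling Objective candidate
    mtl_id       INTEGER REFERENCES mtl(mtl_id)  -- NULL = does not roll up
);
INSERT INTO mtl VALUES (1, 'Perform preventive maintenance');
INSERT INTO task_list VALUES
    (10, 'crew',       'Inspect rear troop hatch seals', 1, NULL),
    (11, 'individual', 'Lubricate hatch hinge',          0, 1);
""")

# Audit query: EO-candidate tasks that fail to roll up into the MTL.
orphans = conn.execute(
    "SELECT task_id, title FROM task_list "
    "WHERE eo_candidate = 1 AND mtl_id IS NULL"
).fetchall()
print(orphans)  # -> [(10, 'Inspect rear troop hatch seals')]
```

A query of this kind is what the database expert would run to detect flagged tasks that are missing their MTL linkage before repairing them.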

The FTE + SME Core + Reach-back staffing approach has recently been applied with great success to major acquisition programs, such as the KC-X (now known as the KC-45) tanker replacement program. We believe this approach will provide high-quality, cost-effective personnel with exactly the right skills at exactly the right time to optimally support the EFV FEA.

1.3.3 Technical Management and Administration [SOW 2.1.3]

1.3.3.1 Management Approach


Stanley will apply the Project Management Institute (PMI) best practices for managing complex projects. We use the PMI’s Project Management Body of Knowledge (PMBOK) Guide to ensure consistency, quality, accuracy, timeliness, and cost-effectiveness in project performance. In our experience, these practices yield results that meet or exceed contract requirements while ensuring high levels of customer satisfaction. Mr. Tudela is responsible for ensuring that the project adheres to PMBOK standards and that best practices are applied appropriately.

1.3.3.2 Project Control and Oversight


Within five days of contract award, Stanley will provide a detailed Work Breakdown Structure (WBS) detailing our Plan of Action and Milestones (POA&M) and quality control measures for the base year and two option years.

Our Technical Manager, Mr. Clift Bergeron, will be responsible for day-to-day control and oversight of project activities. For this project, we will collaborate with the Government to ensure that our project management practices align with and support the customer’s desired outcomes. Because Stanley’s management continually tracks program performance data, the customer can readily apply its demonstration and inspection surveillance methods to assess Stanley’s performance against the performance requirements.


1.3.3.2.1 Staffing

Stanley has selected three highly qualified individuals to support the PM AAA requirements. Table 1- summarizes our staffing plan and the allocation of labor hours to the SOW WBS.



Table 1-: Staffing Plan (labor hours by SOW task area)

Position | Name / Status | Role & Responsibility | 2.1.1 | 2.1.2 | 2.1.3 | Total | FTE
Principal Program Manager | Ed Tudela / Stanley Employee | Overall program management | 0 | 0 | 94 | 94 | 0.05
Technical Manager | Clift Bergeron / Stanley Employee | Day-to-day technical task order management | 562 | 702 | 140 | 1404 | 0.75
Systems Engineer (SE) | John Broyle / TMI Employee | Resident SE support | 1872 | 0 | 0 | 1872 | 1.00
SE Reach-back | Harrison, Nobles & Quensel / TMI Employee | SME support on SE and acquisition matters | 936 | 0 | 0 | 936 | 0.50
IA Specialist | Name / WBB Employee | Resident IA support | 400 | 0 | 0 | 400 | 0.21
FEA Lead Analyst | Name / WBB Employee | Resident FEA support | 0 | 1872 | 0 | 1872 | 1.00
MPT Analyst | Name / WBB Employee | | 0 | 1872 | 0 | 1872 | 1.00
FEA/MPT Reach-back | Name / WBB Employee | SME support | 0 | 1086 | 0 | 1086 | 0.58
PROGRAM TOTAL | | | 3770 | 5918 | 234 | 9922 | 5.30

Key responsibilities by position:

Principal Program Manager
  • Ensure effective contract management and timely delivery of cost, schedule and performance requirements.

Technical Manager
  • Manage and control day-to-day business and technical functions.
  • Serve as PM MCT liaison to all stakeholders.
  • Direct and monitor all project activities.
  • Participate in all business and technical forums.

Systems Engineer (SE)
  • Assess and solve complex engineering and technical problems.
  • Interpret requirements and develop specifications.
  • Plan, develop, or select potential technical solutions and acquisition documentation.
  • Exercise independent judgment on task execution.
  • Manage the systems engineering activities, processes and deliverables.
  • Serve as SME and technical liaison for systems engineering interface with other functional areas.

SE Reach-back
  • Assist the resident SE in solving complex engineering, technical and acquisition issues.

IA Specialist
  • Assess IA requirements.

FEA Lead Analyst
  • Information Assurance strategies, Information Assurance support.
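
The FTE column above appears to follow from each position’s total labor hours divided by a standard full-time work year of 1,872 hours (an inference from the figures in the table, not a stated contract assumption). A minimal sketch of that conversion:

```python
# Assumed conversion: 1 FTE = 1,872 labor hours per year. This constant is
# inferred from the table (e.g. 1872 hours -> 1.00 FTE, 936 -> 0.50),
# not stated anywhere in the proposal itself.
HOURS_PER_FTE_YEAR = 1872

def fte(hours: int) -> float:
    """Convert annual labor hours to full-time equivalents (2 decimals)."""
    return round(hours / HOURS_PER_FTE_YEAR, 2)

# Hours per position, taken from the staffing table.
staff_hours = {
    "Principal Program Manager": 94,
    "Technical Manager": 1404,
    "Systems Engineer (SE)": 1872,
    "SE Reach-back": 936,
    "IA Specialist": 400,
    "FEA Lead Analyst": 1872,
    "MPT Analyst": 1872,
    "FEA/MPT Reach-back": 1086,
}
for role, hrs in staff_hours.items():
    print(f"{role}: {fte(hrs):.2f} FTE")
```

Applying this conversion reproduces each per-position FTE value in the table (e.g. 94 hours yields 0.05 FTE and 400 hours yields 0.21 FTE).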
1.3.3.2.2 Project Deliverables

Stanley will coordinate project deliverable requirements with the Government to ensure proper allocation of resources and the effectiveness of the project schedule and cost. Based on the requirements specified in the SOW, we propose the following deliverables for this project:

  • Project Management Plan (PMP)—details the project scope; the WBS of all required tasks; project schedule with dependencies; organizational relationships; resources; roles and responsibilities; deliverables; tooling; costs; and risks associated with the project.

  • Quality Control Plan (QCP)—reflects the quality control measures contained in the QASP, as reconciled with the PMO prior to task order commencement, as well as Stanley corporate quality policies. Major Deliverables and supporting plans will be captured in the task POA&M and assigned to appropriate WBS elements for tracking. Deliverable planning, to include identification of schedule and success criteria, will be developed in coordination with appropriate government representatives. All deliverables will receive quality reviews by Stanley personnel prior to delivery to the government. Major Deliverables will receive formal quality reviews by an independent review team of senior Stanley managers prior to delivery.

  • Monthly Status Report (MSR)—ensures regular communication between our project team and our customers. Each report provides detailed schedule and cost performance information and metrics, risks and issues, work schedule updates, and inputs from other team members.

  • In Process Review (IPR)—provides an open forum for the PMO representatives to ask questions and make recommendations. Each month, the Stanley PM, Mr. Tudela, will coordinate, support, and conduct a formal status review of the work performed. As directed by the customer, our PM will conduct informal reviews, conferences, and audits. For each action item created, we will track the initiator, date assigned, responsible party, due date, date closed, and resolution.

  • POA&M—a management tool for specifying activities towards a target milestone, and tracking the progress of the project to ensure achievement of milestone objectives.