Systems Assurance A Complete Guide - 2020 Edition - Gerardus Blokdyk - Page 8

Contents

CRITERION #2: DEFINE

INTENT: Formulate the stakeholder problem. Define the problem, needs and objectives.

For each statement below, rate your agreement with the following: in my belief, the answer to this question is clearly defined.

5 Strongly Agree

4 Agree

3 Neutral

2 Disagree

1 Strongly Disagree

1. What are (control) requirements for Systems assurance Information?

<--- Score

2. How do you manage changes in Systems assurance requirements?

<--- Score

3. What baselines are required to be defined and managed?

<--- Score

4. How do you build the right business case?

<--- Score

5. How would you define the culture at your organization, how susceptible is it to Systems assurance changes?

<--- Score

6. What are the dynamics of the communication plan?

<--- Score

7. How is the team tracking and documenting its work?

<--- Score

8. Are there any constraints known that bear on the ability to perform Systems assurance work? How is the team addressing them?

<--- Score

9. How do you keep key subject matter experts in the loop?

<--- Score

10. How does the Systems assurance manager guard against scope creep?

<--- Score

11. Is there a clear Systems assurance case definition?

<--- Score

12. What Systems assurance services do you require?

<--- Score

13. Is the team adequately staffed with the desired cross-functionality? If not, what additional resources are available to the team?

<--- Score

14. Who defines (or who defined) the rules and roles?

<--- Score

15. Will team members regularly document their Systems assurance work?

<--- Score

16. Have the customer(s) been identified?

<--- Score

17. Do you all define Systems assurance in the same way?

<--- Score

18. How can the value of Systems assurance be defined?

<--- Score

19. Are there different segments of customers?

<--- Score

20. What constraints exist that might impact the team?

<--- Score

21. Are improvement team members fully trained on Systems assurance?

<--- Score

22. Is the team sponsored by a champion or stakeholder leader?

<--- Score

23. Is special Systems assurance user knowledge required?

<--- Score

24. How would you define Systems assurance leadership?

<--- Score

25. If substitutes have been appointed, have they been briefed on the Systems assurance goals and received regular communications as to the progress to date?

<--- Score

26. How and when will the baselines be defined?

<--- Score

27. Is Systems assurance required?

<--- Score

28. When is the estimated completion date?

<--- Score

29. Is the scope of Systems assurance defined?

<--- Score

30. Are audit criteria, scope, frequency and methods defined?

<--- Score

31. Are required metrics defined, what are they?

<--- Score

32. What is in scope?

<--- Score

33. How will variation in the actual durations of each activity be dealt with to ensure that the expected Systems assurance results are met?

<--- Score

34. What specifically is the problem? Where does it occur? When does it occur? What is its extent?

<--- Score

35. Are the Systems assurance requirements testable?

<--- Score

36. Why are consistent Systems assurance definitions important?

<--- Score

37. What scope should you assess?

<--- Score

38. What intelligence can you gather?

<--- Score

39. Are customer(s) identified and segmented according to their different needs and requirements?

<--- Score

40. What are the requirements for audit information?

<--- Score

41. Are the Systems assurance requirements complete?

<--- Score

42. What scope do you want your strategy to cover?

<--- Score

43. Are resources adequate for the scope?

<--- Score

44. Who are the Systems assurance improvement team members, including Management Leads and Coaches?

<--- Score

45. Is there regularly 100% attendance at the team meetings? If not, have appointed substitutes attended to preserve cross-functionality and full representation?

<--- Score

46. Is scope creep really all bad news?

<--- Score

47. What was the context?

<--- Score

48. What is the scope of Systems assurance?

<--- Score

49. What is the scope?

<--- Score

50. The political context: who holds power?

<--- Score

51. Is the current ‘as is’ process being followed? If not, what are the discrepancies?

<--- Score

52. Who is gathering information?

<--- Score

53. Does the scope remain the same?

<--- Score

54. Is full participation by members in regularly held team meetings guaranteed?

<--- Score

55. What happens if Systems assurance’s scope changes?

<--- Score

56. Who approved the Systems assurance scope?

<--- Score

57. What information do you gather?

<--- Score

58. Will team members perform Systems assurance work when assigned and in a timely fashion?

<--- Score

59. What is in the scope and what is not in scope?

<--- Score

60. How often are the team meetings?

<--- Score

61. How do you hand over Systems assurance context?

<--- Score

62. Is the improvement team aware of the different versions of a process: what they think it is vs. what it actually is vs. what it should be vs. what it could be?

<--- Score

63. When is/was the Systems assurance start date?

<--- Score

64. Will a Systems assurance production readiness review be required?

<--- Score

65. What key stakeholder process output measure(s) does Systems assurance leverage and how?

<--- Score

66. Is it clearly defined in and to your organization what you do?

<--- Score

67. What is the definition of Systems assurance excellence?

<--- Score

68. Do you have a Systems assurance success story or case study ready to tell and share?

<--- Score

69. How do you gather Systems assurance requirements?

<--- Score

70. Is there a completed SIPOC representation, describing the Suppliers, Inputs, Process, Outputs, and Customers?

<--- Score

71. Is there a completed, verified, and validated high-level ‘as is’ (not ‘should be’ or ‘could be’) stakeholder process map?

<--- Score

72. Is Systems assurance linked to key stakeholder goals and objectives?

<--- Score

73. How do you manage scope?

<--- Score

74. Is there a critical path to deliver Systems assurance results?

<--- Score

75. Are task requirements clearly defined?

<--- Score

76. Who is gathering Systems assurance information?

<--- Score

77. Has the direction changed at all during the course of Systems assurance? If so, when did it change and why?

<--- Score

78. What information should you gather?

<--- Score

79. What are the tasks and definitions?

<--- Score

80. Where can you gather more information?

<--- Score

81. Is data collected and displayed to better understand customers' critical needs and requirements?

<--- Score

82. Are stakeholder processes mapped?

<--- Score

83. Are different versions of process maps needed to account for the different types of inputs?

<--- Score

84. How will the Systems assurance team and the group measure complete success of Systems assurance?

<--- Score

85. What are the Systems assurance tasks and definitions?

<--- Score

86. Have all basic functions of Systems assurance been defined?

<--- Score

87. How do you gather the stories?

<--- Score

88. Is the team equipped with available and reliable resources?

<--- Score

89. What gets examined?

<--- Score

90. How did the Systems assurance manager receive input to the development of a Systems assurance improvement plan and the estimated completion dates/times of each activity?

<--- Score

91. Is the team formed and are team leaders (Coaches and Management Leads) assigned?

<--- Score

92. Are team charters developed?

<--- Score

93. Is the Systems assurance scope complete and appropriately sized?

<--- Score

94. How do you catch Systems assurance definition inconsistencies?

<--- Score

95. How do you think the partners involved in Systems assurance would have defined success?

<--- Score

96. Have specific policy objectives been defined?

<--- Score

97. Is the Systems assurance scope manageable?

<--- Score

98. What are the Roles and Responsibilities for each team member and its leadership? Where is this documented?

<--- Score

99. Does the team have regular meetings?

<--- Score

100. Has the Systems assurance work been fairly and/or equitably divided and delegated among team members who are qualified and capable to perform the work? Has everyone contributed?

<--- Score

101. Is there a Systems assurance management charter, including stakeholder case, problem and goal statements, scope, milestones, roles and responsibilities, communication plan?

<--- Score

102. Has a team charter been developed and communicated?

<--- Score

103. What critical content must be communicated – who, what, when, where, and how?

<--- Score

104. Has the improvement team collected the ‘voice of the customer’ (obtained feedback – qualitative and quantitative)?

<--- Score

105. How do you manage unclear Systems assurance requirements?

<--- Score

106. Are accountability and ownership for Systems assurance clearly defined?

<--- Score

107. Are all requirements met?

<--- Score

108. Has a high-level ‘as is’ process map been completed, verified and validated?

<--- Score

109. What is the worst case scenario?

<--- Score

110. Has anyone else (internal or external to the group) attempted to solve this problem or a similar one before? If so, what knowledge can be leveraged from these previous efforts?

<--- Score

111. Do you have organizational privacy requirements?

<--- Score

112. Have you defined all Systems assurance requirements first? How?

<--- Score

113. Is there any additional Systems assurance definition of success?

<--- Score

114. Is the work to date meeting requirements?

<--- Score

115. When are meeting minutes sent out? Who is on the distribution list?

<--- Score

116. What Systems assurance requirements should be gathered?

<--- Score

117. Has a Systems assurance requirement not been met?

<--- Score

118. What is a worst-case scenario for losses?

<--- Score

119. How was the ‘as is’ process map developed, reviewed, verified and validated?

<--- Score

120. What would be the goal or target for a Systems assurance’s improvement team?

<--- Score

121. What is the definition of success?

<--- Score

122. What are the compelling stakeholder reasons for embarking on Systems assurance?

<--- Score

123. What are the boundaries of the scope? What is in bounds and what is not? What is the start point? What is the stop point?

<--- Score

124. What is the context?

<--- Score

125. What defines best in class?

<--- Score

126. Has your scope been defined?

<--- Score

127. How can you redefine, in your favor, the criteria clients use to choose within your category?

<--- Score

128. Are roles and responsibilities formally defined?

<--- Score

129. Do the problem and goal statements meet the SMART criteria (specific, measurable, attainable, relevant, and time-bound)?

<--- Score

130. What customer feedback methods were used to solicit their input?

<--- Score

131. Is Systems assurance currently on schedule according to the plan?

<--- Score

132. Are approval levels defined for contracts and supplements to contracts?

<--- Score

133. What sort of initial information should you gather?

<--- Score

134. Have the customer needs been translated into specific, measurable requirements? How?

<--- Score

135. What is out of scope?

<--- Score

136. What is the scope of sensitive information?

<--- Score

137. What is the scope of the Systems assurance effort?

<--- Score

138. Has everyone on the team, including the team leaders, been properly trained?

<--- Score

139. Has a project plan, Gantt chart, or similar been developed/completed?

<--- Score

140. What is the scope of the Systems assurance work?

<--- Score

141. What are the rough order estimates on cost savings/opportunities that Systems assurance brings?

<--- Score

Add up total points for this section: _____ = Total points for this section

Divided by: ______ (number of statements answered) = ______ Average score for this section

Transfer your score to the Systems assurance Index at the beginning of the Self-Assessment.
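The scoring arithmetic above can be sketched in a few lines of Python. The ratings list here is a hypothetical example; substitute the 1-to-5 ratings you actually gave to the statements you answered.

```python
# Each answered statement is rated 1 (Strongly Disagree) to 5 (Strongly Agree).
# Hypothetical example: five statements answered.
scores = [5, 4, 3, 4, 2]

total = sum(scores)            # "Total points for this section"
average = total / len(scores)  # divided by the number of statements answered

print(f"Total: {total}, Average: {average:.2f}")  # Total: 18, Average: 3.60
```

The average, not the total, is what carries over to the Systems assurance Index, since sections contain different numbers of statements.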

