CRITERION #2: DEFINE:
INTENT: Formulate the stakeholder problem. Define the problem, needs and objectives.
In my belief, the answer to each question below is clearly defined:
5 Strongly Agree
4 Agree
3 Neutral
2 Disagree
1 Strongly Disagree
1. Who defines (or who defined) the rules and roles?
<--- Score
2. How and when will the baselines be defined?
<--- Score
3. Have the customer needs been translated into specific, measurable requirements? How?
<--- Score
4. Who is gathering Automated search and retrieval system information?
<--- Score
5. Will an Automated search and retrieval system production readiness review be required?
<--- Score
6. Is special Automated search and retrieval system user knowledge required?
<--- Score
7. How did the Automated search and retrieval system manager receive input into the development of an Automated search and retrieval system improvement plan and the estimated completion dates/times of each activity?
<--- Score
8. Is the scope of Automated search and retrieval system defined?
<--- Score
9. Are there any constraints known that bear on the ability to perform Automated search and retrieval system work? How is the team addressing them?
<--- Score
10. Is the improvement team aware of the different versions of a process: what they think it is vs. what it actually is vs. what it should be vs. what it could be?
<--- Score
11. What is in the scope and what is not in scope?
<--- Score
12. What are the record-keeping requirements of Automated search and retrieval system activities?
<--- Score
13. What information do you gather?
<--- Score
14. Has the Automated search and retrieval system work been fairly and/or equitably divided and delegated among team members who are qualified and capable to perform the work? Has everyone contributed?
<--- Score
15. How do you manage changes in Automated search and retrieval system requirements?
<--- Score
16. What is the scope?
<--- Score
17. What is the context?
<--- Score
18. What are the boundaries of the scope? What is in bounds and what is not? What is the start point? What is the stop point?
<--- Score
19. What are the dynamics of the communication plan?
<--- Score
20. How do you manage scope?
<--- Score
21. Are the Automated search and retrieval system requirements complete?
<--- Score
22. Is there regularly 100% attendance at the team meetings? If not, have appointed substitutes attended to preserve cross-functionality and full representation?
<--- Score
23. Has anyone else (internal or external to the group) attempted to solve this problem or a similar one before? If so, what knowledge can be leveraged from these previous efforts?
<--- Score
24. Where can you gather more information?
<--- Score
25. What are the rough order-of-magnitude estimates of the cost savings/opportunities that Automated search and retrieval system brings?
<--- Score
26. What system do you use for gathering Automated search and retrieval system information?
<--- Score
27. How will the Automated search and retrieval system team and the group measure complete success of Automated search and retrieval system?
<--- Score
28. What scope do you want your strategy to cover?
<--- Score
29. Who approved the Automated search and retrieval system scope?
<--- Score
30. Are different versions of process maps needed to account for the different types of inputs?
<--- Score
31. What is the scope of the Automated search and retrieval system work?
<--- Score
32. Why are you doing Automated search and retrieval system and what is the scope?
<--- Score
33. Is what you do clearly defined within and communicated to your organization?
<--- Score
34. What sources do you use to gather information for an Automated search and retrieval system study?
<--- Score
35. Has/have the customer(s) been identified?
<--- Score
36. What information should you gather?
<--- Score
37. Has a project plan, Gantt chart, or similar been developed/completed?
<--- Score
38. What constraints exist that might impact the team?
<--- Score
39. Is there a completed SIPOC representation, describing the Suppliers, Inputs, Process, Outputs, and Customers?
<--- Score
40. Who are the Automated search and retrieval system improvement team members, including Management Leads and Coaches?
<--- Score
41. Has a team charter been developed and communicated?
<--- Score
42. What are the tasks and definitions?
<--- Score
43. Are the required metrics defined, and if so, what are they?
<--- Score
44. How would you define the culture at your organization, and how susceptible is it to Automated search and retrieval system changes?
<--- Score
45. What are the compelling stakeholder reasons for embarking on Automated search and retrieval system?
<--- Score
46. Are audit criteria, scope, frequency and methods defined?
<--- Score
47. What Automated search and retrieval system requirements should be gathered?
<--- Score
48. Is the work to date meeting requirements?
<--- Score
49. What would be the goal or target for an Automated search and retrieval system improvement team?
<--- Score
50. How have you defined all Automated search and retrieval system requirements first?
<--- Score
51. What is the worst case scenario?
<--- Score
52. What scope should be assessed?
<--- Score
53. What is the scope of Automated search and retrieval system?
<--- Score
54. Are the Automated search and retrieval system requirements testable?
<--- Score
55. How can you redefine, in your favor, the criteria clients use to choose within your category?
<--- Score
56. Is Automated search and retrieval system required?
<--- Score
57. How do you gather the stories?
<--- Score
58. Has your scope been defined?
<--- Score
59. What is in scope?
<--- Score
60. Are all requirements met?
<--- Score
61. Why are consistent Automated search and retrieval system definitions important?
<--- Score
62. Is there a critical path to deliver Automated search and retrieval system results?
<--- Score
63. Is there a completed, verified, and validated high-level ‘as is’ (not ‘should be’ or ‘could be’) stakeholder process map?
<--- Score
64. When is/was the Automated search and retrieval system start date?
<--- Score
65. Has the direction changed at all during the course of Automated search and retrieval system? If so, when did it change and why?
<--- Score
66. Has a high-level ‘as is’ process map been completed, verified and validated?
<--- Score
67. What baselines are required to be defined and managed?
<--- Score
68. How will variation in the actual durations of each activity be dealt with to ensure that the expected Automated search and retrieval system results are met?
<--- Score
69. Has an Automated search and retrieval system requirement not been met?
<--- Score
70. What is the definition of Automated search and retrieval system excellence?
<--- Score
71. What are the roles and responsibilities of each team member and of the team's leadership? Where is this documented?
<--- Score
72. Have all of the relationships been defined properly?
<--- Score
73. Does the scope remain the same?
<--- Score
74. How do you gather Automated search and retrieval system requirements?
<--- Score
75. How was the ‘as is’ process map developed, reviewed, verified and validated?
<--- Score
76. What key stakeholder process output measure(s) does Automated search and retrieval system leverage and how?
<--- Score
77. Is Automated search and retrieval system linked to key stakeholder goals and objectives?
<--- Score
78. Are resources adequate for the scope?
<--- Score
79. What is the political context: who holds power?
<--- Score
80. Does the team have regular meetings?
<--- Score
81. Do you have an Automated search and retrieval system success story or case study ready to tell and share?
<--- Score
82. What Automated search and retrieval system services do you require?
<--- Score
83. How would you define Automated search and retrieval system leadership?
<--- Score
84. When is the estimated completion date?
<--- Score
85. Has the improvement team collected the ‘voice of the customer’ (obtained feedback – qualitative and quantitative)?
<--- Score
86. How does the Automated search and retrieval system manager guard against scope creep?
<--- Score
87. Do you all define Automated search and retrieval system in the same way?
<--- Score
88. Have specific policy objectives been defined?
<--- Score
89. How do you gather requirements?
<--- Score
90. Are there different segments of customers?
<--- Score
91. Do you have organizational privacy requirements?
<--- Score
92. What customer feedback methods were used to solicit their input?
<--- Score
93. How is the team tracking and documenting its work?
<--- Score
94. What was the context?
<--- Score
95. What are the core elements of the Automated search and retrieval system business case?
<--- Score
96. Are task requirements clearly defined?
<--- Score
97. What are the requirements for audit information?
<--- Score
98. What are the Automated search and retrieval system tasks and definitions?
<--- Score
99. Have all basic functions of Automated search and retrieval system been defined?
<--- Score
100. What gets examined?
<--- Score
101. Is there a clear Automated search and retrieval system case definition?
<--- Score
102. What knowledge or experience is required?
<--- Score
103. What happens if Automated search and retrieval system’s scope changes?
<--- Score
104. How can the value of Automated search and retrieval system be defined?
<--- Score
105. Are accountability and ownership for Automated search and retrieval system clearly defined?
<--- Score
106. How often are the team meetings?
<--- Score
107. What is the scope of the Automated search and retrieval system effort?
<--- Score
108. What is out of scope?
<--- Score
109. How do you manage unclear Automated search and retrieval system requirements?
<--- Score
110. What is the definition of success?
<--- Score
111. What are the Automated search and retrieval system use cases?
<--- Score
112. What is out-of-scope initially?
<--- Score
113. Is the Automated search and retrieval system scope manageable?
<--- Score
114. Has everyone on the team, including the team leaders, been properly trained?
<--- Score
115. Are approval levels defined for contracts and supplements to contracts?
<--- Score
116. How do you keep key subject matter experts in the loop?
<--- Score
117. Is there any additional definition of Automated search and retrieval system success?
<--- Score
118. Is Automated search and retrieval system currently on schedule according to the plan?
<--- Score
119. Are roles and responsibilities formally defined?
<--- Score
120. What specifically is the problem? Where does it occur? When does it occur? What is its extent?
<--- Score
121. What intelligence can you gather?
<--- Score
122. Is the Automated search and retrieval system scope complete and appropriately sized?
<--- Score
123. When are meeting minutes sent out? Who is on the distribution list?
<--- Score
124. What is a worst-case scenario for losses?
<--- Score
125. Who is gathering information?
<--- Score
126. Do the problem and goal statements meet the SMART criteria (specific, measurable, attainable, relevant, and time-bound)?
<--- Score
127. Is the current ‘as is’ process being followed? If not, what are the discrepancies?
<--- Score
128. What critical content must be communicated – who, what, when, where, and how?
<--- Score
129. How do you hand over Automated search and retrieval system context?
<--- Score
130. Is the team adequately staffed with the desired cross-functionality? If not, what additional resources are available to the team?
<--- Score
131. How do you catch Automated search and retrieval system definition inconsistencies?
<--- Score
132. If substitutes have been appointed, have they been briefed on the Automated search and retrieval system goals and received regular communications as to the progress to date?
<--- Score
Add up total points for this section: _____ = Total points for this section
Divided by: ______ (number of statements answered) = ______ Average score for this section
Transfer your score to the Automated search and retrieval system Index at the beginning of the Self-Assessment.
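For readers tallying their answers in a script rather than on paper, the arithmetic above is simply a sum over the answered statements divided by their count. Below is a minimal Python sketch; the function name and the sample scores are illustrative assumptions, not part of the book.

    # Minimal sketch of the section-scoring arithmetic (assumed helper name).
    def section_average(scores):
        """Average the answered statements; unanswered ones (None) are skipped."""
        answered = [s for s in scores if s is not None]
        if not answered:
            return 0.0
        total = sum(answered)           # total points for this section
        return total / len(answered)    # divided by number of statements answered

    # Example: three statements scored (5, 3, 4) on the 1-5 scale, one left blank.
    print(section_average([5, 3, None, 4]))  # -> 4.0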