Development Plan
This is our development plan. This document contains:
- Project planning, including a major task breakdown and time estimates.
- Risk analysis, including the risk analysis method applied, the risks identified, and risk mitigation strategies.
This section discusses planning and scheduling of our project.
2.1 Development Model
We discuss our development model and testing approach below. These decisions were critical, as they informed how we approached estimation and how we assessed risks.
2.1.1 Choice of Development Model
2.1.1.1. Code-Like-Hell Model
We aim to avoid the Code-Like-Hell model due to its known risks (e.g., poor efficiency, cost overruns, project delays). Although modest in size, this project is not small, and other development models, such as Waterfall and iterative models, should be considered. Below, we summarize the models we considered and our reasoning for leaning towards iterative models.
2.1.1.2. Waterfall Models
a. Overlapped Waterfall: not preferred
b. Waterfall with Subprojects: not preferred
c. Staged Delivery: not preferred
d. Design to Schedule: not preferred when appropriate priorities are already set
2.1.1.3. Iterative Models
a. Evolutionary Prototyping: the chosen model for this project
b. Evolutionary Delivery: preferred when there is continuous user engagement and feedback
c. Spiral Model: not preferred; too complicated for this medium size project
Kostas Kavoussanakis from EPCC mentioned in one of our Software Development tutorials that when a team has a loosely defined architecture and does not know exactly how to implement it, the team naturally gravitates towards an evolutionary prototyping development model. We are confident that this will be the case in our project. So, even if we go by the book and choose a model with presumably low risk, we will naturally drift towards the relatively riskier model of evolutionary prototyping. We would not drift towards the Code-Like-Hell model at this point, as we do have the architecture defined (i.e., our layered architecture).
Our evolutionary prototyping approach is very similar to evolutionary delivery. The difference is that evolutionary prototyping emphasizes the visuals. This is true in our case, as we developed a user interface prototype, which we will iterate on during the project. In evolutionary delivery, code that is unlikely to change comes first. This is not relevant in our case, as we have not written any code yet. Also, we expect to iterate on the design and code as much as needed to get the product working first, and then refactor as needed to improve the design, performance, and other qualities.
The spiral model is a complex iterative model normally meant for advanced software engineering professionals. The idea of the model is to reduce major risks in each iteration and to analyze the feasibility of the project. Feasibility analysis does not apply in our case, as we must complete the project. It could be argued that we are also attempting to reduce risk in each iteration. The caveat here is that our focus is on prioritized requirements using the MoSCoW method. While prioritizing requirements does reduce risk, the spiral model requires a much more nuanced assessment of risks, working from tail risks (with large impact) down to lower risks (with small impact). Intuitively, we aim to keep these risks in mind as we work on the project. The spiral model is not feasible for a modest project such as ours.
2.1.2 Testing
We will apply three forms of functional testing: (1) unit testing, (2) integration testing, and (3) system testing. We will also leverage one form of non-functional testing: usability testing. We reviewed our list of requirements and linked them to testable units. Given our preference for test-driven development (TDD), we believe this is the right approach.
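As a brief illustration of how TDD links a requirement to a testable unit, the sketch below pairs a test with a hypothetical search_game function. The names, signature, and data model are placeholders for illustration, not our actual code:

```python
import unittest

def search_game(collection, query):
    """Return the games in a collection whose title contains the query.

    Placeholder implementation: in TDD, the test class below would be
    written first, and this function coded afterwards to make it pass.
    """
    q = query.lower()
    return [game for game in collection if q in game["title"].lower()]

class TestSearchGame(unittest.TestCase):
    """Unit test written before the function it exercises."""

    def setUp(self):
        self.collection = [{"title": "Catan"}, {"title": "Gloomhaven"}]

    def test_match_is_case_insensitive(self):
        self.assertEqual(search_game(self.collection, "cat"),
                         [{"title": "Catan"}])

    def test_no_match_returns_empty_list(self):
        self.assertEqual(search_game(self.collection, "chess"), [])
```

A test like this would run under `python -m unittest`; integration and system tests would then exercise the same behaviour through the assembled layers.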
2.2 Tasks
We list below the tasks of this project, expected time, actual time, dependencies, and members assigned.
We have used estimation by analogy to estimate expected times for the project. We have provided both low and high estimates to account for uncertainties in executing the specific tasks [1]. Also, a risk buffer of 25% of the total time has been added to account for unpredictable events (i.e., "unknown unknowns") [2].
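The arithmetic behind the totals can be sketched as follows. The task figures here are illustrative, not the actual values from the tables below; only the 25% buffer rate is taken from our plan:

```python
def totals_with_buffer(low_estimates, high_estimates, buffer_rate=0.25):
    """Sum per-task estimates and add a risk buffer for 'unknown unknowns'.

    buffer_rate=0.25 mirrors the 25% convention used in this plan.
    """
    low, high = sum(low_estimates), sum(high_estimates)
    return round(low * (1 + buffer_rate), 1), round(high * (1 + buffer_rate), 1)

# Illustrative estimates for four tasks, in hours (not from Table 1).
lows = [2.0, 1.0, 2.0, 3.0]   # sums to 8.0
highs = [3.0, 1.5, 3.0, 4.5]  # sums to 12.0
print(totals_with_buffer(lows, highs))  # (10.0, 15.0)
```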
Table 1 below lists the task details of the first phase of our project: Requirements, Design, and Team. The tasks have been split into the five broad categories of the phase: Requirements Engineering, Software Design, User Interface Design, Team Structure and Role Assignment, and Miscellaneous. The Miscellaneous category includes supporting tasks, e.g., communication, reviews, and proofreading. As this phase of our project has been completed, we have provided the actual times taken to execute the tasks.
ID | Task | Expected Time Low (Hours) | Expected Time High (Hours) | Actual Time (Hours) | Dependencies | Members Assigned* |
---|---|---|---|---|---|---|
A | Requirements Engineering | |||||
1 | Read brief | 2.0 | 3.0 | 2.0 | - | TC, NS |
2 | List initial set of requirements | 2.0 | 3.0 | 2.0 | 1 | TC, NS |
3 | Discuss questions from brief with PO via email | 1.0 | 1.5 | 3.0 | 2 | TC, NS, AG |
4 | Research competitors to discover customer pain points | 2.0 | 3.0 | 4.0 | 2 | TC, NS |
5 | Prioritize requirements | 2.0 | 3.0 | 2.0 | 3, 4 | TC, NS |
6 | Draft section | 2.0 | 3.0 | 2.0 | 5 | TC, NS |
B | Software Design | |||||
7 | Gather technology stack | 1.0 | 1.5 | 1.0 | 1 | TC, NS |
8 | Analyze architectures | 2.0 | 3.0 | 3.0 | 5 | NS |
9 | Visualize system design | 1.0 | 1.5 | 2.0 | 8 | NS |
10 | Draw UML diagram | 2.0 | 3.0 | 2.0 | 8 | TC |
11 | Analyze implementation path | 1.0 | 1.5 | 2.0 | 8 | TC |
12 | Draft section | 2.0 | 3.0 | 4.0 | 9, 10, 11 | TC, NS |
C | User Interface Design | |||||
13 | Create sitemap | 2.0 | 3.0 | 2.0 | 8 | NS |
14 | Create UI/UX wireframes | 4.0 | 6.0 | 4.0 | 13 | NS |
15 | Draft section | 2.0 | 3.0 | 2.0 | 14 | NS |
D | Team Structure and Role Assignment | |||||
16 | Analyze members, division of labor, and communication process | 2.0 | 3.0 | 2.0 | 8 | TC, NS |
17 | Draft section | 1.0 | 1.5 | 2.0 | 16 | TC, NS |
E | Miscellaneous | |||||
18 | Communicate | 5.0 | 7.5 | 7.0 | 1 | TC, NS, AG |
19 | Research | 8.0 | 12.0 | 12.0 | 1 | TC, NS |
20 | Review wiki | 2.0 | 3.0 | 2.0 | 6, 12, 15, 17 | NS |
21 | Format wiki | 1.0 | 1.5 | 1.0 | 6, 12, 15, 17 | NS |
22 | Proofread wiki | 2.0 | 3.0 | 2.0 | 6, 12, 15, 17 | NS |
23 | Submit wiki | 1.0 | 1.5 | 1.0 | 22 | TC, NS |
24 | Complete WebPA forms | 1.0 | 1.5 | 1.0 | 23 | TC, NS |
25 | Review feedback from PO | 1.0 | 1.5 | 2.0 | 23, 24 | TC, NS |
26 | Risk buffer | 12.8 | 19.1 | - | - | TC, NS, AG |
Total | Total Time | 64.8 | 97.1 | 69.0 | - | - |
[*] Members Assigned: [TC] Tom Chan, [NS] Nabil Shadman, [AG] Dr Alistair Grant (also referred to as Product Owner or PO).
Table 1: Task details of the Requirements, Design and Team phase (Phase 1).
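The Dependencies column in Table 1 defines a directed acyclic graph over tasks. As a sketch of how such an ordering can be checked mechanically, the snippet below feeds tasks 1-6 of category A into Python's standard-library `graphlib` (Python 3.9+):

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Tasks 1-6 of category A in Table 1, as task -> set of prerequisites.
deps = {
    1: set(),     # Read brief
    2: {1},       # List initial set of requirements
    3: {2},       # Discuss questions from brief with PO via email
    4: {2},       # Research competitors
    5: {3, 4},    # Prioritize requirements
    6: {5},       # Draft section
}

# static_order() raises CycleError if the dependencies are inconsistent,
# so this doubles as a sanity check on the plan.
order = list(TopologicalSorter(deps).static_order())
print(order)  # one valid execution order, e.g. [1, 2, 3, 4, 5, 6]
```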
Table 2 below lists the tasks of the current phase (i.e., second phase) of our project: Development Plan. The tasks have been grouped across its broad categories: Planning and Scheduling, Risk Analysis, and Miscellaneous. As this phase is still ongoing at the time of writing, we have not reported on actual times.
ID | Task | Expected Time Low (Hours) | Expected Time High (Hours) | Actual Time (Hours) | Dependencies | Members Assigned |
---|---|---|---|---|---|---|
A | Planning and Scheduling | |||||
1 | Analyze development model | 1.0 | 1.5 | - | - | TC, NS |
2 | List tasks of the entire project | 3.0 | 4.5 | - | 1 | NS |
3 | Analyze expected times, dependencies, and assignments | 3.0 | 4.5 | - | 2 | NS |
4 | Draw Gantt charts | 3.0 | 4.5 | - | 3 | NS |
5 | Analyze changes to design and team structure | 1.0 | 1.5 | - | 4 | NS |
6 | Draft section | 2.0 | 3.0 | - | 5 | NS |
B | Risk Analysis | |||||
7 | Examine risk analysis methods | 2.0 | 3.0 | - | 1 | TC |
8 | Identify risks | 2.0 | 3.0 | - | 7 | TC |
9 | Analyze risk mitigation strategies | 2.0 | 3.0 | - | 8 | TC |
10 | Draft section | 2.0 | 3.0 | - | 9 | TC |
C | Miscellaneous | |||||
11 | Communicate | 3.0 | 4.5 | - | - | TC, NS, AG |
12 | Research | 4.0 | 6.0 | - | - | TC, NS |
13 | Review wiki | 2.0 | 3.0 | - | 6, 10 | TC, NS |
14 | Format wiki | 1.0 | 1.5 | - | 6, 10 | TC, NS |
15 | Proofread wiki | 2.0 | 3.0 | - | 6, 10 | TC, NS |
16 | Submit wiki | 1.0 | 1.5 | - | 15 | TC, NS |
17 | Complete WebPA forms | 1.0 | 1.5 | - | 16 | TC, NS |
18 | Review feedback from PO | 2.0 | 3.0 | - | 16, 17 | TC, NS |
19 | Risk buffer | 9.3 | 13.9 | - | - | TC, NS, AG |
Total | Total Time | 46.3 | 69.4 | - | - | - |
Table 2: Task details of the Development Plan phase (Phase 2).
Table 3 lists the tasks of the next and final phase of our project: Code Prototype, Usability, Code and Project Evaluation. The tasks are grouped across the broad categories of Product Prototype, Usability Test Plan, Usability Analysis, Prototype and Project Evaluation, and Miscellaneous. As mentioned above, the requirements of our project have been included here with the appropriate mappings to their tasks. Within the Product Prototype category, the tasks have been further grouped by MoSCoW priority level: Must have, Should have, and Could have. The requirements that fall under "Won't have" have been omitted, as we are not executing those requirements.
ID | Requirement | Task | Expected Time Low (Hours) | Expected Time High (Hours) | Actual Time (Hours) | Dependencies | Members Assigned |
---|---|---|---|---|---|---|---|
A | Product Prototype | ||||||
A1 | Must have | ||||||
1 | Allow user to search for games | Code test for search_game function in user profile | 1.0 | 1.5 | - | - | NS |
2 | Code search_game function in user profile | 1.0 | 1.5 | - | 1 | NS | |
3 | Code game_search_bar in user profile | 1.0 | 1.5 | - | 2 | NS | |
4 | Allow user to add games | Code test for add_game function in games | 0.5 | 0.8 | - | - | TC |
5 | Code add_game function in games | 0.5 | 0.8 | - | 4 | TC | |
6 | Code add_game button in games | 0.5 | 0.8 | - | 5 | TC | |
7 | Allow user to update games | Code test for update_game function in user profile | 0.5 | 0.8 | - | - | NS |
8 | Code update_game function in user profile | 0.5 | 0.8 | - | 7 | NS | |
9 | Code update_game button in user profile | 0.5 | 0.8 | - | 8 | NS | |
10 | Allow user to delete games | Code test for delete_game function in user profile | 0.5 | 0.8 | - | - | TC |
11 | Code delete_game function in user profile | 0.5 | 0.8 | - | 10 | TC | |
12 | Code delete_game button in user profile | 0.5 | 0.8 | - | 11 | TC | |
A2 | Should have | ||||||
13 | Allow user to search for games she does not have | Code test for search_game function in games | 0.5 | 0.8 | - | - | NS |
14 | Code search_game function in games | 0.5 | 0.8 | - | 13 | NS | |
15 | Code game_search_bar in games | 0.5 | 0.8 | - | 14 | NS | |
16 | Allow user to read reviews of games from user community* | Code test for read_review function in games | 0.5 | 0.8 | - | - | TC |
17 | Allow user to read reviews of games from websites (e.g. board, rpg and card game review blogs, and published online magazines)* | Code read_review function in games | 0.5 | 0.8 | - | 16 | TC |
18 | Code read_review section in games | 0.5 | 0.8 | - | 17 | TC | |
19 | Allow user to add her own review* | Code test for post_review function in games | 0.5 | 0.8 | - | - | NS |
20 | Allow user to share house rules and clarifications for games* | Code post_review function in games | 0.5 | 0.8 | - | 19 | NS |
21 | Allow user to embed FAQs and errata documents from publishers* | Code post_review button in games | 0.5 | 0.8 | - | 20 | NS |
22 | Allow user to rate games | Code test for rate_review function in games | 0.5 | 0.8 | - | - | TC |
23 | Code rate_review function in games | 0.5 | 0.8 | - | 22 | TC | |
24 | Code rate_game in forum in games | 0.5 | 0.8 | - | 23 | TC | |
25 | Allow user to see average ratings of games | Code test for average_rating function in games | 0.5 | 0.8 | - | - | NS |
26 | Code average_rating function in games | 0.5 | 0.8 | - | 25 | NS | |
27 | Code average_rating graphic in games | 0.5 | 0.8 | - | 26 | NS | |
28 | Allow user to see recommendations of new games based on her reviews, collection, and popular games in her geographical area | Code test for recommend_games function in user profile | 2.0 | 3.0 | - | - | TC |
29 | Code recommend_games function in user profile | 2.0 | 3.0 | - | 28 | TC | |
30 | Code recommend_games section in user profile | 2.0 | 3.0 | - | 29 | TC | |
31 | Allow user to search for new players to play with based on custom specification (e.g. collection, interest) | Code test for search_player function in players | 0.5 | 0.8 | - | - | NS |
32 | Code search_player function in players | 0.5 | 0.8 | - | 31 | NS | |
33 | Code player_search_bar in players | 0.5 | 0.8 | - | 32 | NS | |
34 | Allow user to search for clubs in a custom geographical area | Code test for search_club function in clubs | 0.5 | 0.8 | - | - | TC |
35 | Code search_club function in clubs | 0.5 | 0.8 | - | 34 | TC | |
36 | Code club_search_bar in clubs | 0.5 | 0.8 | - | 35 | TC | |
37 | Allow user to view average price of a new or a used game from different sources (e.g. Amazon, Geek Market) | Code test for average_price function in games | 0.5 | 0.8 | - | - | NS |
38 | Code average_price function in games | 0.5 | 0.8 | - | 37 | NS | |
39 | Code average_price section in games | 0.5 | 0.8 | - | 38 | NS | |
40 | Allow user to go to game seller's website from the game record | Code test for link_seller function in games | 0.5 | 0.8 | - | - | TC |
41 | Code link_seller function in games | 0.5 | 0.8 | - | 40 | TC | |
42 | Code link_seller field in games | 0.5 | 0.8 | - | 41 | TC | |
A3 | Could have | ||||||
43 | Allow user to rate games publicly or privately | Code test for review_visibility function in games | 0.5 | 0.8 | - | - | NS |
44 | Code review_visibility function in games | 0.5 | 0.8 | - | 43 | NS | |
45 | Code review_visibility radio button in games | 0.5 | 0.8 | - | 44 | NS | |
46 | Allow user to view games in her collection based on specific criteria or categories (e.g. number of players, game type) | Code test for filter_games function in user profile | 1.0 | 1.5 | - | - | TC |
47 | Code filter_games function in user profile | 1.0 | 1.5 | - | 46 | TC | |
48 | Code filter_games dropdown in user profile | 1.0 | 1.5 | - | 47 | TC | |
49 | Allow user to search for games based on specific criteria (e.g. number of players, game type) | Code test for filter_games function in games | 0.5 | 0.8 | - | - | NS |
50 | Code filter_games function in games | 0.5 | 0.8 | - | 49 | NS | |
51 | Code filter_games dropdown in games | 0.5 | 0.8 | - | 50 | NS | |
52 | Create a system to accept or ignore game reviews based on a selection criteria | Code test for filter_review function in games | 2.0 | 3.0 | - | - | TC |
53 | Code filter_review function in games | 2.0 | 3.0 | - | 52 | TC | |
54 | Code filter_review in frontend in games | 2.0 | 3.0 | - | 53 | TC | |
55 | Create a weighting system to compute ratings | Code test for weigh_rating function in games | 2.0 | 3.0 | - | - | NS |
56 | Code weigh_rating function in games | 2.0 | 3.0 | - | 55 | NS | |
57 | Code weigh_rating in frontend in games | 2.0 | 3.0 | - | 56 | NS | |
B | Usability Test Plan | ||||||
58 | Analyze size and demographic of test cohort | 2.0 | 3.0 | - | - | TC, NS | |
59 | Analyze usability testing procedures | 2.0 | 3.0 | - | 58 | TC, NS | |
60 | Document plan | 2.0 | 3.0 | - | 59 | TC, NS | |
C | Usability Analysis | ||||||
61 | Execute usability test | 4.0 | 6.0 | - | 60 | TC, NS | |
62 | Analyze data | 2.0 | 3.0 | - | 61 | TC, NS | |
63 | Document analysis | 2.0 | 3.0 | - | 62 | TC, NS | |
D | Prototype and Project Evaluation | ||||||
64 | Evaluate prototype | 2.0 | 3.0 | - | 63 | TC, NS | |
65 | Evaluate project | 2.0 | 3.0 | - | 64 | TC, NS | |
66 | Document evaluations | 2.0 | 3.0 | - | 65 | TC, NS | |
E | Miscellaneous | ||||||
67 | Communicate | 8.4 | 12.6 | - | - | TC, NS, AG | |
68 | Research | 33.0 | 49.5 | - | - | TC, NS | |
69 | Document code | 13.2 | 19.8 | - | - | TC, NS | |
70 | Review code | 6.6 | 9.9 | - | - | TC, NS | |
71 | Refactor code | 13.2 | 19.8 | - | - | TC, NS | |
72 | Run tests | 6.6 | 9.9 | - | - | TC, NS | |
73 | Debug code | 16.5 | 24.8 | - | - | TC, NS | |
74 | Review wiki | 2.0 | 3.0 | - | - | TC, NS | |
75 | Format wiki | 1.0 | 1.5 | - | - | TC, NS | |
76 | Proofread wiki | 2.0 | 3.0 | - | - | TC, NS | |
77 | Submit code and wiki | 2.0 | 3.0 | - | 76 | TC, NS | |
78 | Complete WebPA forms | 1.0 | 1.5 | - | 77 | TC, NS | |
79 | Review feedback from PO | 2.0 | 3.0 | - | 78 | TC, NS | |
80 | Risk buffer | 43.1 | 64.7 | - | - | TC, NS, AG | |
Total | Total Time | - | 215.6 | 323.4 | - | - | - |
[*] Tasks that fulfill multiple requirements are grouped together.
Table 3: Task details of the Code Prototype, Usability, Code and Project Evaluation phase (Phase 3).
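The priority grouping in Table 3 can also drive execution order directly, since we start from the highest priority and work down. A sketch of ordering a backlog by MoSCoW level (the task names are placeholders):

```python
# MoSCoW levels in descending priority; "Won't have" items are excluded
# from the backlog altogether, as in Table 3.
MOSCOW_ORDER = {"Must have": 0, "Should have": 1, "Could have": 2}

backlog = [
    ("Rate games", "Should have"),
    ("Search for games", "Must have"),
    ("Filter collection", "Could have"),
    ("Add games", "Must have"),
]

# Stable sort: tasks keep their listed order within the same level.
ordered = sorted(backlog, key=lambda task: MOSCOW_ORDER[task[1]])
print([name for name, _ in ordered])
# ['Search for games', 'Add games', 'Rate games', 'Filter collection']
```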
2.3. Gantt Charts
We visualize the tasks of the three phases that we have detailed above in three Gantt charts below. Instead of visualizing all the tasks and their dependencies, we chose to keep the charts simple by only visualizing the broad categories of tasks. This allows us to read the chart better as too many tasks and arrows of dependencies clutter the chart and make it difficult to read. We used Lucidchart to create the Gantt charts [4].
As the developers in the team are collaborating on all aspects of the project, there is no clear division of labor across these broad categories. Instead, we have colored the bars to demonstrate which developer has the overall lead on the specific category. We discuss this team structure further below in Section 2.4. Also, we have decided to visualize the time in weeks to account for variability of the estimated times. The weeks have been modeled using the weekly timetable of the Software Development course.
Figure 1: Gantt chart of Phase 1 of the project.
Figure 2: Gantt chart of Phase 2 of the project.
A point to be noted for Figure 3 below is that we are aware that we may not be able to accomplish all of the requirements of the product due to a reduced team. To ensure high quality of the product, we aim to start from the highest priorities and work our way down to lower priorities. We will stop all Product Prototype activities (including documenting, reviewing, refactoring, and debugging) at the end of Week 12 to keep adequate time for Prototype and Project Evaluation in Week 13. The Usability Test Plan will be carried out in parallel with Product Prototype until the end of Week 11. In Week 12, we will execute Usability Analysis with what we have developed so far in our prototype at that point in time. With this approach, we ensure that enough resources have been made available to all moving parts of Phase 3.
Figure 3: Gantt chart of Phase 3 of the project.
2.4 Changes to Design and Team Structure
Here, we discuss changes to the design and the team structure since the first phase of our project.
2.4.1 Design changes
At the time of writing, the layered architecture we defined in Phase 1 of the project remains the same. However, we are discussing with our Product Owner (PO) the possibility of reconsidering a microservices architecture, following the PO's recommendation in the most recent review. A microservices architecture may be suitable for us, as we already leaned towards it when we diagrammed our system design in the first iteration. We will have more details on this shortly after our discussion.
2.4.2 Team Structure changes
Our team members remain the same since Phase 1. As mentioned before, one of our developers has been inactive since the beginning of the project. We will continue to assume that this will be the case for the rest of the project.
However, we have changed how we view accountability since Phase 1. In Phase 1, we viewed accountability in terms of the parts of the architecture of the product (i.e., Backend, Frontend, AI, and Testing). As we realized this project goes beyond just the product, we now view accountability in terms of the broad categories of tasks. In Phase 3, these are Product Prototype, Usability Test Plan, Usability Analysis, and Prototype and Project Evaluation. So, accountabilities have been assigned and visualized using these categories in the Gantt charts above. With this approach, we ensure that all parts of the project have appropriate leadership.
In this section, we discuss risk analysis of our project. We divide our analysis into two parts: Risk Assessment and Risk Control.
3.1 Risk Assessment
In this section, we identify risks that we may encounter during the development cycle. We adopted the risk categorization from the six dimensions of software risk (Wallace et al., 2004) [3].
We adopted a qualitative approach to analyzing the risks below, due to the difficulty of quantifying risks in this project. There are two dimensions to consider: probability and impact. We classified each risk on four levels of probability and impact, assigning scores ranging from 1 to 4.
- Probability: Unlikely [1], Possible [2], Probable [3], Certain [4]
- Impact: Negligible [1], Moderate [2], Severe [3], Catastrophic [4]
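These two scales combine into the four priority buckets used in Section 3.2. A minimal sketch of that classification rule (the thresholds match Section 3.2; the function name is our own):

```python
def risk_bucket(probability, impact):
    """Map 1-4 probability/impact scores to the priority buckets of Section 3.2."""
    if probability >= 3 and impact >= 3:
        return "high probability-high impact"
    if probability <= 2 and impact >= 3:
        return "low probability-high impact"
    if probability >= 3:  # here impact <= 2
        return "high probability-low impact"
    return "low probability-low impact"

# Example: conflicts between users scores [Probability: 3; Impact: 4].
print(risk_bucket(3, 4))  # high probability-high impact
```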
3.1.1 User
We have not directly contacted the users of this product: gamers who want to manage their game collections online and explore new games based on their interests. However, we discussed with the Product Owner what the users need and how they will use our product. In this scenario, we list below some of the common user-related risks for this product.
- Users' resistance to change [Probability: 2; Impact: 2]
  Gamers might be loyal to existing websites such as BoardGameGeek for managing their game collections and might refuse to try a new platform, i.e., our product.
- Conflicts between users [Probability: 3; Impact: 4]
  Gamers may have different preferences towards games, and they can freely write reviews and give ratings. It is possible that some gamers may write irritating or insulting comments about a game, which may upset other gamers and provoke retaliation. The forum may become chaotic and hostile.
- Users not committed to the product [Probability: 1; Impact: 2]
  Users may not be loyal to our product; they can shift their preferences rapidly and switch to another platform in the market.
- Lack of cooperation from users [Probability: 1; Impact: 2]
  Users may not cooperate with other users and may violate the code of conduct of the platform.
3.1.2 Requirements
Similar to the risks related to users, we engineered the requirements based on the product description as well as discussions with the PO. If the PO changes any requirements, or if a requirement is misunderstood, the risks under this category increase.
- Continually changing requirements [Probability: 2; Impact: 3]
- Requirements not adequately identified [Probability: 1; Impact: 3]
- Unclear requirements [Probability: 1; Impact: 3]
- Incorrect requirements [Probability: 1; Impact: 3]
3.1.3 Project Complexity
As we stated previously, we have decided to keep the development simple, easy, and testable. We aim to lower the complexity of the project. However, various technologies, tools, and technical skills are involved in our design, and there are risks associated with these.
- Project involves the use of new technology [Probability: 2; Impact: 2]
- High level of technical complexity [Probability: 3; Impact: 3]
- Immature technology [Probability: 1; Impact: 2]
3.1.4 Planning and Control
As mentioned above, we have an initial plan for product development with estimated times to complete each phase. However, the actual times may deviate widely from the plan for several reasons, e.g., over-optimistic scheduling or insufficient resource allocation. Any risk of this kind is grouped under this category.
- Lack of effective project management technology [Probability: 2; Impact: 2]
- Project progress not monitored closely enough [Probability: 3; Impact: 3]
- Inadequate estimation of required resources [Probability: 2; Impact: 3]
- Project milestones not clearly defined [Probability: 2; Impact: 2]
- Ineffective communications [Probability: 1; Impact: 3]
3.1.5 Team
The risks associated with the team are worth discussing here, as our team has only 3 developers, with one developer having been inactive from the start. Moreover, the developers have different experience and backgrounds relevant to the project, so the division of labor should be evaluated carefully; there are specific risks related to the team.
- Inexperienced team members [Probability: 4; Impact: 2]
- Team members lack specialized skills required by the project [Probability: 3; Impact: 2]
- Frequent conflicts between developers [Probability: 1; Impact: 2]
3.1.6 Organizational Environment
This type of risk is not necessarily applicable here due to the modest nature of the project with only two active developers, who are not part of a large and complex organization.
3.2 Risk Prioritization
After identifying all the risks and analyzing their probability and impact, we prioritized them in the order below.
3.2.1 High probability-high impact [Probability >= 3; Impact >= 3]
- Conflicts between users [Probability: 3; Impact: 4]
- High level of technical complexity [Probability: 3; Impact: 3]
- Project progress not monitored closely enough [Probability: 3; Impact: 3]
3.2.2 Low probability-high impact [Probability <= 2; Impact >= 3]
- Continually changing requirements [Probability: 2; Impact: 3]
- Unclear requirements [Probability: 1; Impact: 3]
- Incorrect requirements [Probability: 1; Impact: 3]
- Requirements not adequately identified [Probability: 1; Impact: 3]
- Inadequate estimation of required resources [Probability: 2; Impact: 3]
- Ineffective communications [Probability: 1; Impact: 3]
3.2.3 High probability-low impact [Probability >= 3; Impact <= 2]
- Inexperienced team members [Probability: 4; Impact: 2]
- Team members lack specialized skills required by the project [Probability: 3; Impact: 2]
3.2.4 Low probability-low impact [Probability <= 2; Impact <= 2]
- Users' resistance to change [Probability: 2; Impact: 2]
- Users not committed to the product [Probability: 1; Impact: 2]
- Lack of cooperation from users [Probability: 1; Impact: 2]
- Project involves the use of new technology [Probability: 2; Impact: 2]
- Immature technology [Probability: 1; Impact: 2]
- Lack of effective project management technology [Probability: 2; Impact: 2]
- Project milestones not clearly defined [Probability: 2; Impact: 2]
- Frequent conflicts between developers [Probability: 1; Impact: 2]
3.3 Risk Control
Here, we discuss how to control each risk that we have identified, based on its priority. There are several techniques to handle risks, and we cover only some of them. In addition, risk management is not a one-off activity; we will keep reviewing and updating the risk assessment and control regularly throughout the development cycle.
3.3.1 High probability-high impact [Probability >= 3; Impact >= 3]
- Conflicts between users [Probability: 3; Impact: 4]
  i) Proposed solution: risk control
  ii) Details: it is impossible to restrict users' freedom to express their opinions via reviews or ratings, so we cannot avoid this specific risk. Instead, we should deploy risk control measures such as adding an administrator role to the forum, or establishing rules for the usage of the forum.
- High level of technical complexity [Probability: 3; Impact: 3]
  i) Proposed solution: buy information about the risk
  ii) Details: given that the high complexity of the project mostly comes from the recommendation engine using machine learning technology, we will allocate more time for researching existing libraries that can help us implement the recommendation engine with reduced risk.
- Project progress not monitored closely enough [Probability: 3; Impact: 3]
  i) Proposed solution: risk control
  ii) Details: we are fully aware of the severe consequences of not monitoring progress closely: the project would be delayed and would fail to meet the deadline established by the Product Owner. The solution is to arrange more regular checkpoints to share what we are currently working on. If anything becomes a bottleneck, we can identify root causes and discuss them with the PO as early as possible.
3.3.2 Low probability-high impact [Probability <= 2; Impact >= 3]
- Continually changing requirements [Probability: 2; Impact: 3]
- Unclear requirements [Probability: 1; Impact: 3]
- Incorrect requirements [Probability: 1; Impact: 3]
- Requirements not adequately identified [Probability: 1; Impact: 3]
  i) Proposed solution: risk monitoring
  ii) Details: the four risks above are all related to requirements. When requirements change, or are not correctly understood and identified, the project could be impacted significantly; in the worst case, the project is cancelled. To tackle these risks, we need to review the requirements frequently and raise any confusion about them with the PO. Working closely with the PO can reveal insights that enable us to tackle issues as soon as possible.
- Inadequate estimation of required resources [Probability: 2; Impact: 3]
  i) Proposed solution: risk control
  ii) Details: for this project, there is a risk of underestimating the resources we need to complete the requirements. Resources are not limited to hardware and software, e.g., a web hosting server or licenses for certain software, but also extend to human resources and the time allocated for the project. To handle this, we need flexibility in the schedule and in resource allocation so that we are not affected much by significant deviations from the estimates.
- Ineffective communications [Probability: 1; Impact: 3]
  i) Proposed solution: risk monitoring
  ii) Details: the probability of ineffective communication is low so far, as we cooperate closely within the team and with the PO. We will keep monitoring our communication process throughout the development cycle to avoid severe impacts from this type of risk, e.g., mismatched requirements or changes of design made without communication.
3.3.3 High probability-low impact [Probability >= 3; Impact <= 2]
- Inexperienced team members [Probability: 4; Impact: 2]
- Team members lack specialized skills required by the project [Probability: 3; Impact: 2]
  i) Proposed solution: risk avoidance and risk publicization
  ii) Details: as mentioned previously, we were originally a team of three members, but one member has been inactive. Having only two members increases the probabilities of the risks above. To tackle these, we have decided to keep the project simple and within our scope of experience. Although this may sacrifice the chance of using new technology to better fit the requirements, we should avoid accumulating risk in the current situation. In addition, we notified the PO at an early stage and have his consent to keep two members developing the product. These decisions were made to minimize surprises in case this type of risk occurs.
3.3.4 Low probability-low impact [Probability <= 2; Impact <= 2]
- Users' resistance to change [Probability: 2; Impact: 2]
- Users not committed to the product [Probability: 1; Impact: 2]
- Lack of cooperation from users [Probability: 1; Impact: 2]
- Project involves the use of new technology [Probability: 2; Impact: 2]
- Immature technology [Probability: 1; Impact: 2]
- Lack of effective project management technology [Probability: 2; Impact: 2]
- Project milestones not clearly defined [Probability: 2; Impact: 2]
- Frequent conflicts between developers [Probability: 1; Impact: 2]
  i) Proposed solution: risk assumption and risk documentation
  ii) Details: under this category, the probabilities and impacts are relatively low compared to those of the other categories. We understand that these events can still happen, and we need to be prepared to act promptly. Also, we will document the risks proactively and review each of them regularly as part of our risk management discipline.
[1] https://www.agiledrop.com/blog/making-accurate-estimates-software-development
[2] https://www.scnsoft.com/blog/software-development-time
[3] http://www-public.telecom-sudparis.eu/~gibson/Teaching/Teaching-ReadingMaterial/WallaceKeilRai04.pdf
[4] https://lucid.co/