“No plan survives contact with the enemy.” The phrase is credited to Field Marshal Helmuth von Moltke the Elder. Mike Tyson said much the same thing with characteristic ferocity: “Everyone has a plan until they get punched in the mouth.”
Plans are necessary, but they must be adaptable, living organisms. Once we understand the objectives of a project or define the scope of the next release, we cannot fall into the trap of assuming that everything will go as planned, because it will not. The plan must be adapted as new information accumulates.
The basis of all probabilistic planning is a sufficient data set and a predictable flow supported by consistent metrics. Achieving this is a virtuous circle: because we want better and more reliable metrics, we refine the tasks, describe them in more detail, and try to match them to our historical capacity, which in turn yields better metrics, and so on.
Monte Carlo simulation is a statistical technique for solving complex problems by generating many random samples. Its name is inspired by the gambling at the Monte Carlo casino, since the method relies on random-number generation. When we want to make predictions based on probability, we must assume that many possible scenarios can occur. In this simulation, a mathematical model of the system under analysis is built; then repeated iterations are run using random numbers to simulate the variability and uncertainty present in the problem.
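To make the idea concrete before applying it to planning, here is a minimal sketch of the technique on the classic textbook problem of estimating π: we sample random points in the unit square and count how many fall inside the quarter circle. The function name and sample count are our own choices for illustration.

```python
import random

def estimate_pi(n_samples: int = 100_000) -> float:
    """Estimate pi by sampling random points in the unit square
    and counting the fraction that lands inside the quarter circle."""
    inside = sum(
        1 for _ in range(n_samples)
        if random.random() ** 2 + random.random() ** 2 <= 1.0
    )
    return 4 * inside / n_samples

print(estimate_pi())  # close to 3.14; varies slightly per run
```

The same recipe, a model plus many random draws plus an aggregate of the outcomes, is what we will apply to delivery forecasting.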
Applied to planning, it takes the team’s average throughput and its standard deviation, runs a large number of iterations, and produces an estimate at each percentile of interest.
In a project, one-day accuracy is sufficient, so we can obtain reliable results with limited data collection; it does not matter whether a deliverable lands at 10 a.m. or at noon.
Let’s say we have 30 tasks in a new piece of functionality and a target date one month out. Today being June 12, the simulation tells us we have a 75% chance of finishing by July 12; if we want to take a risk we can plan to the 50th percentile, or be conservative and take the 95th percentile, a week later. Running several simulations, we might obtain a mean of 12.5 days to complete a task, with a standard deviation of 1.5 days. This gives an estimate of the likely completion time and also quantifies the variability in the process.
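A scenario like the one above can be sketched as follows. The throughput figures (roughly one task per day on average, with a standard deviation of 0.4) are hypothetical values chosen for illustration, not data from the article; each iteration samples daily throughput until the 30 tasks are done, and the sorted results yield the percentile dates.

```python
import random

def simulate_completion_days(n_tasks: int, tp_mean: float, tp_std: float,
                             iterations: int = 10_000) -> list[float]:
    """For each iteration, sample daily throughput from a normal
    distribution until n_tasks are done; record the days needed."""
    results = []
    for _ in range(iterations):
        done, days = 0.0, 0
        while done < n_tasks:
            days += 1
            done += max(0.0, random.gauss(tp_mean, tp_std))
        results.append(days)
    return sorted(results)

def percentile(sorted_samples: list[float], p: float) -> float:
    """Return the value below which p% of the samples fall."""
    idx = min(len(sorted_samples) - 1, int(len(sorted_samples) * p / 100))
    return sorted_samples[idx]

# Hypothetical throughput: ~1 task/day on average, std dev 0.4
runs = simulate_completion_days(n_tasks=30, tp_mean=1.0, tp_std=0.4)
for p in (50, 75, 95):
    print(f"P{p}: done within {percentile(runs, p):.0f} days")
```

Reading the output is the risk conversation itself: the P50 date is the gamble, the P95 date is the conservative commitment, and the gap between them is the variability we report.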
In summary, Monte Carlo simulation allows you to estimate the likely completion date of a task based on the generation of random samples that model the uncertainty in the completion time of that task.
With this data in hand, we must track the progress of our engagement to re-evaluate the initial assertion: we re-run the simulation at regular intervals and make adjustments before a deviation materializes, or at least report it to stakeholders in time. A forecast only takes past results into consideration and, as every investor knows, past results do not guarantee future results. There may be team departures, unidentified blockages, poorly mitigated risks, or any other eventuality that forces us to re-plan, but every decision will be justified and backed by evidence. This is what we call a data-driven decision-making process.
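The periodic re-run can be as simple as folding newly observed throughput into the history before the next simulation. The function name and the sample figures below are hypothetical, meant only to show the mechanics of updating the inputs.

```python
import statistics

def reforecast(history: list[float], new_samples: list[float]) -> tuple[float, float]:
    """Fold newly observed daily throughput into the history and
    return the updated mean and standard deviation that the next
    Monte Carlo run should use."""
    combined = history + new_samples
    return statistics.mean(combined), statistics.stdev(combined)

# e.g. after a week in which a teammate left and throughput dipped:
mean, std = reforecast([1.1, 0.9, 1.0, 1.2], [0.6, 0.5])
print(f"updated mean={mean:.2f} tasks/day, std={std:.2f}")
```

Feeding the updated mean and standard deviation back into the simulation is what turns a one-off estimate into a living forecast.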
This iterative probabilistic planning model is beneficial even when our containment or corrective measures fail to change an undesirable trend, because we can communicate that trend in advance and leave everyone room to maneuver. Acting this way, using metrics as a window of transparency, fosters a climate of trust with the client and reinforces our reputation as professionals.
Probabilistic planning is therefore the ideal alternative to classic waterfall estimation, which treats development as a deterministic, linear, machine-like process. We use data generated in the past to extrapolate a prediction of what may happen in the future.
Monte Carlo simulations expose the risks involved in making commitments. The result is feedback-loop-driven planning for managing realistic goals in an agile environment: plan, check, re-evaluate, re-plan.
Image: Pexels | Engin Akyurt