Learn about GanttNavigator

PROMOTES the use of dynamic STOCK & FLOW models to improve the performance of the PROJECT ECOSYSTEM

“Basically, we can interfere with any process, once we have studied it
long enough to understand how it works mechanistically”

Özlem Türeci and Uğur Şahin, in Joe Miller, "The Vaccine That Changed the World – Our Battle to Defeat the Pandemic", p. 300, Mondadori

GanttNavigator is aimed at Graduates, Researchers and Professionals who are interested in planning and control issues in the Project Management of large infrastructure projects.

GanttNavigator aims to improve the implementation performance of large infrastructure projects through:

  • realistic planning and effective project control.
  • a systems approach based on the use of dynamic Stock & Flow models.

GanttNavigator offers some examples of Stock & Flow models from real cases (lessons learned) to demonstrate the potential of the methodology.
GanttNavigator provides an opportunity for Project Managers and Planners to share their experiences of working within large infrastructure projects to evaluate the applicability of the Stock & Flow methodology.

What GanttNavigator offers

Training and Modeling

Through presentations and targeted training, we can illustrate the potential of simulation for studying and solving recurring Project Management problems.

Lessons Learned

We can analyze (under strict constraint of confidentiality) your ongoing or completed projects to assess their critical issues and improve their future performance.

Sharing

We can analyze the specific Project Management issues posed to us and evaluate the possibility of modeling them for quantitative simulation of the optimal solution strategy (or strategies).

Portfolio


Eng. G. Iovino - Founder

Design is a complex, highly iterative process. Its growth and consolidation cannot be separated from the requirements of functional suitability and of physical consistency of the interfaces among all the elements that are part of it. The website mentions models designed to take into account two fundamental design processes in particular:

  • The development and consolidation of the project pipeline.

  • The iterative cycle of finite-resource document approval.


Procurement of the relevant parts of the infrastructure to be implemented is a key success factor of the Project. To be effective, i.e., to contain time and cost with imposed quality, it is necessary to know the key steps in articulating the Procurement process, the actors involved, and in particular the timeline required by the process, which must be synchronized with the project-construction timeline. The Procurement process and its timeline are mentioned in the website.


Construction is a deterministic and seemingly simple, straightforward process. However, analysis based on real experiences shows the importance of the dynamic aspect of the process and the key role of certain parameters: obviously the Productivity of the allocated resources, but also their speed of Mobilization and Demobilization, which is not infinite as is often implicitly assumed. The website mentions models designed to take into account not only the above parameters but also two other important factors that condition the success of the project, namely: the configuration of the facilities to be built and the "proactivity" of the Project Manager in controlling the implementation (PMKS model).


Studies and Presentations for:

What people say about us

mail Oct 25, 2024.

"(Your Ganttnavigator blog is) very interesting"
Giorgio Locatelli
Full Professor of Complex Project Business - Politecnico di Milano and Editor in Chief of the Project Management Journal

mail Jun 23, 2024

"The idea of applying automation to the control of a Project is brilliant: the advantages are clear and important. A tool of this type can certainly be of great help to: 1. Avoid starting with an overly optimistic schedule, 2. Correct the course promptly with the necessary actions..."
A. Fusar Poli
engineer - expert on large projects

mail Feb 17, 2024

"(Your video n. 3 using the Earned Value Method is) very interesting, and it seems to me to have a strong and flexible impact from a didactic point of view."
M. Balduccini
MaBa Consulting
School of Aerospace Engineering (since 1926) – Rome - Italy

mail Feb 7, 2024

"I’m always pleasantly surprised by your elaborations which demonstrate how much you are dedicating yourself to the development of techniques to improve Project Management Control."
R. Fidora

Technical Director of Projecta Monitoring (project planning and control company)

mail Oct 6, 2023

"Your proposal (for an article on System Dynamics in construction) is very valuable and was much appreciated"
P. Mella
Former Professor of Business Administration, University of Pavia
Editor-in-Chief of the journal "Economia Aziendale Online"

Business and Management Science International Quarterly Review

mail Sep 17, 2021

"(Your study on dynamic simulation of the construction process) seems to be going in the right direction."
J. Lyneis

Professor

Massachusetts Institute of Technology (MIT), Boston (USA) 

mail Apr 12, 2021

"Your article is very interesting and opens up scenarios for further investigation."


L. Bonamoneta
Engineer
Head of the Project Management Commission in the civil and environmental field at the Order of Engineers of the Province of Rome

mail Mar 17, 2022

"Thunderstruck on the road to Damascus" (by your article)... "you really did a great job."


A. Pennati
Engineer
Project Manager and Project Planner at Enel

The adaptive construction model concerns a single Work Package of the project, i.e. a part of the construction works that has a uniform estimated productivity for the entire process, in the case of the example this is equal to 10 items per person per month.

It is also assumed that the 1200 items to be assembled are small enough for the "granular" hypothesis to be valid, i.e. the progress function can be approximated as continuous.

In the case of videos 1 and 2, a linear planned progress function was adopted; this could mean that the structure presents a constant working front, as happens for structures with one-dimensional geometry such as a chimney or a pipeline. We will see later that planned progress functions with trends other than linear are possible.

Having said this, the actual progress at a given time step (in the videos there is a step every 3 months) allows the model to calculate an actual productivity that it will use for the final forecast. Naturally, this is only one of the possible hypotheses for making a final forecast; others are possible.

The final forecast can also be made with a direct manual calculation; this is indicated by the thin blue line, which is shown for immediate verification and comparison.
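The "direct manual calculation" can be sketched as follows. This is an illustrative reconstruction, not the model's actual code: the function name and the review numbers (a check after 3 months with 10 persons) are hypothetical, while the Work Package data (1200 items, planned productivity 10 items per person per month) come from the videos.

```python
# Hypothetical sketch of the direct final forecast (the thin blue line):
# measure the actual productivity from progress to date, then project the
# remaining duration assuming the same crew keeps that productivity.

TOTAL_ITEMS = 1200          # scope of the Work Package
PLANNED_PRODUCTIVITY = 10   # items per person per month (estimated)

def direct_forecast(months_elapsed, items_done, persons_allocated):
    """Return (actual_productivity, forecast_months_to_finish)."""
    # Actual productivity measured over the elapsed period
    actual_productivity = items_done / (persons_allocated * months_elapsed)
    remaining = TOTAL_ITEMS - items_done
    # Months still needed at the measured productivity with the same crew
    months_to_finish = remaining / (persons_allocated * actual_productivity)
    return actual_productivity, months_to_finish

# Example review: after 3 months, 10 persons have assembled only 180 items
prod, to_go = direct_forecast(3, 180, 10)
print(prod)   # 6.0 items per person per month
print(to_go)  # (1200 - 180) / (10 * 6) = 17.0 more months
```

With the planned productivity of 10 the same crew would have assembled 300 items in 3 months; the measured 6 items per person per month pushes the total duration from 12 to 20 months unless resources are increased.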

However, the model does not use the direct formula: it uses the PID regulation function, which simulates the trend of the resources that would need to be followed to cancel the progress error between planned and actual.

This allows the calculation to be generalized to any planned progress trend, which could also be S-shaped, quadratic, etc.

Note that the Proportional, Integral and Derivative parameters of the regulation function are not constant: they are calculated by the program from the physical parameters of the problem on a case-by-case basis, hence the term ADAPTIVE. You can therefore change the N, P, T data of the Work Package and the regulation will adapt the PID to the new problem.

The graphs also show a magenta line. It calculates the final resources assuming that productivity always equals the initially planned value. This assumption is less realistic, yet it is very widespread in the programs in current use.
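The PID regulation idea can be sketched as a toy simulation. This is illustrative only, not GanttNavigator's actual algorithm: the gains Kp, Ki, Kd are arbitrary constants here, whereas the adaptive model recomputes them from the Work Package data; the numbers reuse the example (1200 items, planned 10 items per person per month, what-if actual productivity of 6, linear baseline).

```python
# Toy PID regulation (assumed gains, not the adaptive ones): the controller
# adds resources to cancel the error between planned and actual progress.

TOTAL_ITEMS = 1200
MONTHS = 24                 # simulation horizon, months
DT = 0.25                   # time step, months
ACTUAL_PRODUCTIVITY = 6.0   # items per person per month (what-if value)
PLANNED_RATE = 1200 / 12    # linear baseline: 100 items per month

Kp, Ki, Kd = 0.5, 0.1, 0.0  # hypothetical PID gains

done = 0.0
resources = 10.0            # initial crew sized on the planned productivity
integral, prev_error = 0.0, 0.0
t = 0.0
while done < TOTAL_ITEMS and t < MONTHS:
    planned = min(PLANNED_RATE * t, TOTAL_ITEMS)
    error = planned - done                 # positive when behind schedule
    integral += error * DT
    derivative = (error - prev_error) / DT
    prev_error = error
    # PID output converts the progress error into extra persons to allocate
    resources = max(0.0, 10.0 + Kp * error / ACTUAL_PRODUCTIVITY
                    + Ki * integral / ACTUAL_PRODUCTIVITY + Kd * derivative)
    done += resources * ACTUAL_PRODUCTIVITY * DT
    t += DT

print(round(done), round(t, 2), round(resources, 1))
```

The crew grows until the assembly rate matches the baseline, so the project still closes near the planned date, at the cost of more man-months; the point of the adaptive scheme is that the gains need not be hand-tuned for each Work Package.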

Finally, a few words on the “moral” of video 2. When you are late, it seems easy to catch up but this is a false perception. Here is an example:

Let's take the case of a car with which we intend to travel a certain distance L at an average speed of 150 km/h. After half the distance, we record an actual average speed of 100 km/h. We wonder what average speed we would have to reach in the second half of the route to get back to the 150 km/h average we wanted.

I will give the solution later but it is not obvious and clearly shows the difficulty of recovering construction times.

Let's continue from the "moral" of the simulation reported at the end of video 2, namely that:

"when you are late with the realization it seems easy to catch up but this can be a false perception".

To explain the concept we have taken the following example (1):

Let us take the case of a car with which we intend to travel a certain distance of L km at an average speed of 150 km/h. After half the distance, we record an actual average speed of 100 km/h. We ask ourselves what average speed we would have to reach in the second half of the route to return to the average speed of 150 km/h that we had initially set.

The solution to the problem is that the average speed to be maintained in the second half of the route is 300 km/h. The mathematical proof is reported at the bottom of this note. The interesting fact is that the majority of people answer 200 km/h (because the arithmetic average of 100 km/h and 200 km/h equals 150 km/h), but this is not the correct answer. I ran the test myself with some colleagues and verified that many of them give the wrong answer.

On the other hand, many of us have experienced this false perception with our car's navigator. Upon departure, we get an estimated arrival time. During the journey we accumulate delays and consequently the arrival time is postponed. If we try to increase our speed to recover, the attempt often fails because recovery is “slow” and sometimes it happens that we reach the maximum speed limit.

The reason for this false perception can be argued, but the fact is that it practically always occurs even during construction.

The conclusion is that the project needs to get off to a good start with realistic planning, which will have to be updated with a new realistic forecast in the event of delays during construction. Better not to believe in miraculous recoveries…

Later on, I will explain my idea of realistic planning.


Here is the proof of the average speed problem:

Vav = L/T
Vav = L/(t1 + t2), t1 and t2 being the travel times for the first and second half of L.
We have t1 + t2 = (L/2)/V1 + (L/2)/V2, V1 and V2 being the average speeds over the first and second half of L.
Vav = L/((L/2)/V1 + (L/2)/V2)
Vav = 2*V1*V2/(V1 + V2); carrying out the steps and solving for V2 we obtain:
V2 = Vav*V1/(2*V1 - Vav) = 150*100/(2*100 - 150) = 300 km/h

(1) Problem taken from the book "L'algoritmo del parcheggio" by Furio Honsell – Mondadori (2007)
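The result can be checked numerically. The helper below simply restates the formula derived above; the route length of 600 km is an arbitrary choice for the sanity check.

```python
# Numerical check of the average-speed result derived above.

def second_half_speed(v_avg_target, v1):
    """Speed needed over the second half of the route to restore v_avg_target."""
    return v_avg_target * v1 / (2 * v1 - v_avg_target)

v2 = second_half_speed(150, 100)
print(v2)  # 300.0 km/h

# Sanity check: total time over a 600 km route equals L / v_avg_target
L = 600.0
t_total = (L / 2) / 100 + (L / 2) / v2   # 3 h + 1 h
print(t_total, L / 150)                  # both 4.0 h
```

Note also what the denominator says: if the first-half average drops to half the target or below (V1 ≤ 75 km/h here), the time budget is already exhausted and no finite speed can restore the average, which is exactly the "miraculous recovery" trap.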

Video 3 describes the simulation of a project with the same scope of work as the previous videos, i.e. the assembly of 1200 items in 12 months with resources whose physical productivity is 10 items per person per month.
The two main differences from the previous simulations are a non-linear baseline (an S-curve) and the use of the Earned Value Method metrics to evaluate the final forecasts. The project is controlled through an adjustment of the resources aimed at maintaining the pre-established execution time of 12 months, even if the resources have an actual productivity different from the estimated one or there are other project constraints, such as a maximum number of resources that can be allocated.
The S-shaped baseline indicates that the assembly speed varies over time; assuming an almost constant physical productivity, this means that the resources vary over time, with a peak corresponding to the maximum assembly speed which, in this case, is equal to 30 units. The average resources are always equal to 10 units, and in fact the total work is worth 120 man-months (= 10 units x 12 months).
The what-if analysis is performed to evaluate the impact on resources if the actual productivity equals 6 items per person per month.
The simulation calculates how many man-months will be needed (obviously in the productivity ratio, therefore a total of 120 x 10/6 = 200 man-months) and also their distribution over time which, in this case, leads to a peak of 46 units. Note that the CPI (Cost Performance Index) and SPI (Schedule Performance Index) reflect this trend over time.
The calculation should be repeated at each state of progress, as in the example in video 1. The calculation made at the beginning of the project can serve to define the risks and therefore the contingencies to be foreseen at the start.
The peak of 46 units could create problems, so the what-if analysis can be repeated by setting a constraint on the maximum value of the resources. Setting the limit at 24 units, it is still possible to complete the project on time.
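The Earned Value metrics mentioned above can be sketched as follows. This is an illustration, not the video's actual code: the logistic shape of the S-curve and the review numbers are assumptions, chosen so that the what-if productivity of 6 instead of 10 shows up directly in the indices.

```python
import math

# Illustrative Earned Value sketch: an assumed logistic S-curve baseline for
# 1200 items in 12 months, with CPI and SPI computed at a progress review.
# Costs are expressed in man-months for simplicity.

TOTAL_ITEMS = 1200
PLANNED_MONTHS = 12
PLANNED_PRODUCTIVITY = 10   # items per person per month
BUDGET = TOTAL_ITEMS / PLANNED_PRODUCTIVITY   # 120 man-months (BAC)

def s_curve(t):
    """Planned fraction complete at month t (logistic shape, an assumption)."""
    return 1 / (1 + math.exp(-(t - PLANNED_MONTHS / 2)))

def evm(t, items_done, man_months_spent):
    pv = BUDGET * s_curve(t)                  # Planned Value
    ev = BUDGET * items_done / TOTAL_ITEMS    # Earned Value
    ac = man_months_spent                     # Actual Cost
    return ev / ac, ev / pv                   # CPI, SPI

# What-if review at month 6: productivity turned out to be 6, so the
# 60 man-months spent so far earned only 360 items instead of 600.
cpi, spi = evm(6, items_done=360, man_months_spent=60)
print(round(cpi, 2), round(spi, 2))  # 0.6 0.6
```

At mid-project both indices equal the productivity ratio 6/10 = 0.6, which is consistent with the final forecast of 120 x 10/6 = 200 man-months.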

The moral of the video is that the control system acts not only when the baseline is linear, as in videos 1 and 2, but also with any non-linear baseline. Attention in the planning phase can therefore shift to a correct definition of the baseline and of the productivity of the resources, because then, in the implementation phase, the regulation provides information on how to overcome unforeseen events during construction by acting on resources (and costs). The following videos will suggest how to trace the baseline correctly according to the new Ganttnavigator method.

Description of the features of the Ganttnavigator Video

The "Ganttnavigator" video was created by Powersim S.A. based on an idea by G. Iovino as a demonstration of the possibility of planning and controlling an infrastructure project with the methodology based on the Stock and Flow model. The project model presents four Input-Output possibilities, which correspond to four sections:
  • Overview section. In this section, the data for each Work Package of the project are provided to the model. These data allow the model to calculate the expected trend of the assembly resources and the relative progress of the assembly work. The model can include different WPs (Civil, Piping, Structures, Electrical and I&C, etc.). Each of them can be characterized by its own structural topology, which conditions the progress of the work fronts and, consequently, can influence the progress of the planned resources and of the assembly work.
  • Project Network section. In this section, the finish-to-start links of the WPs that are part of the overall work are defined in a way very similar to that of the classic Gantt bar chart.
  • Costs section. In this section we analyze the direct costs, i.e. those linked to man hours spent and to be spent, and the indirect costs, i.e. those linked to the duration of the construction site.
  • Uncertainty section. In this section the program performs the risk analysis of the project. The characteristic parameters (for example assembly productivity) can be made to vary randomly (Monte Carlo method) to evaluate the variability of costs that can be expected with the relative probabilities. An innovative feature of the Ganttnavigator method is the possibility, for different project progress states in which the “actual” situation differs from the “planned” one, to activate the control through feedback on the progress error that acts on the resources. In this way the Project Manager receives information on how to act in order to recover any delays.
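The Monte Carlo idea of the Uncertainty section can be sketched as follows. This is a hypothetical toy model, not the program's implementation: the uniform ±30% spread on productivity and the cost unit are assumptions; the Work Package data reuse the running example.

```python
import random

# Toy Monte Carlo risk analysis: vary the assembly productivity randomly and
# observe the resulting spread of total cost (here, total man-months).

TOTAL_ITEMS = 1200
PLANNED_PRODUCTIVITY = 10.0   # items per person per month
COST_PER_MAN_MONTH = 1.0      # arbitrary cost unit

random.seed(42)               # reproducible runs

def simulate_cost():
    # Assumed distribution: productivity varies +/-30% around the plan
    productivity = random.uniform(0.7, 1.3) * PLANNED_PRODUCTIVITY
    man_months = TOTAL_ITEMS / productivity
    return man_months * COST_PER_MAN_MONTH

runs = sorted(simulate_cost() for _ in range(10_000))
p50 = runs[len(runs) // 2]          # median cost
p90 = runs[int(len(runs) * 0.9)]    # 90th-percentile cost
print(round(p50, 1), round(p90, 1))
```

The gap between the median and the 90th percentile is one way to read the contingency to be foreseen at the start; the deterministic plan (120 man-months) sits near the median, while the tail is noticeably more expensive because cost varies as the reciprocal of productivity.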

The Problem of planning mega infrastructure

The purpose of planning a project is to make reliable predictions about its future development. Not only, obviously, predicting when the project will end and the cost at which it will end, but also predicting the achievement of important intermediate milestones.

The typical intermediate planning milestones consist of the delivery to the construction site, and the positioning there, of large components such as boilers, tanks, reactors, turbogenerators, etc. The forecasts relating to these intermediate objectives are important because they influence the activities carried out by third parties.

The need to make reliable forecasts is common to many complex everyday fields, such as economics and meteorology. In both cases, making reliable predictions requires knowing the laws that govern the economic and atmospheric systems, respectively, as well as having powerful calculation means. Weather forecasts, for example, require not only knowledge of fluid mechanics and thermodynamics and of the physical properties of the fluids involved (air, and water in its solid, liquid and vapor states) but also powerful computers. This is why weather forecasts have improved over time: today they are reliable up to five or more days ahead.

It therefore seems rather unrealistic to expect to carry out the planning of a large project without knowing the "physics of the construction process" and without having adequate calculation means. By "physics of the construction process" we mean the set of laws that regulate the movement of resources, their presumed and actual productivity, etc., the topological characteristics of the structures to be assembled, etc…

Unfortunately, today planning is almost exclusively empirical, based on the experience of previous results obtained on similar projects. Often projects are not even sized in terms of resources, and the result is that 90% of megaprojects end late and over budget (the so-called "iron law" of megaprojects). You can then imagine the result when it comes to creating a first-of-a-kind work.

Planning with the systems in use (Gantt bar charts) includes several simplifications:

  • Infinite resources
  • Instant mobilization and demobilization of resources
  • Open chain process without feedback i.e. “waterfall”
  • Fixed productivity

With these approximations, the forecasts make sense only for the following three months (the "three-month look-ahead"), which is nothing compared to the years of duration of a megaproject. We then have to proceed with continuous updates up to the completion date.

This situation seems to persist and to involve even the major works we hear about: the "Messina" bridge, nuclear plants, radioactive waste repositories, etc.

In this difficult situation for clients and companies, not even rules or guides help. In fact, I am not aware of any standard that establishes minimum quality requirements for project planning. Anything can be presented under the term "time schedule": from a scheme with a few activities and no resources up to hundreds of "untied" activities with incomprehensible contents.

Engineers should foster rules, such as technical guides, suggesting a minimum quality level for preparing megaproject time schedules based on qualified standards.