
The modern world, and the yacht-building industry in particular, is driven by a relentless pursuit of efficiency. Having come of age during an era of rapid technological advancement, we have grown accustomed to viewing “efficiency” as an unquestionable good.
While engineers and economists strive to optimize performance, mathematicians and systems theorists offer a more sobering perspective.
Efficiency
It is a well-established principle that as a system’s efficiency increases, its fragility often rises in tandem. In other words, efficiency frequently comes at the expense of robustness. Take, for example, the standing rigging of most modern sailing yachts. We see increasingly tall masts, swept-back spreaders, a minimal number of shrouds, and a single forestay. This configuration yields high efficiency: superior boat speed, tighter pointing angles and precise sail trim. Contrast this with the typical 1970s-era rig, where shrouds stabilized the mast in multiple directions, masts were keel-stepped, and multiple stays – such as inner forestays or baby-stays – were commonplace.
The Yogi case
Consider a circus yogi who, to the applause of the crowd, lies upon a bed of a thousand nails. Captivated by the desire to make his act more “efficient” and spectacular, he gradually removes the nails. By the end of the season, only one solitary, fatal nail remains. The result is predictable: the yogi is no more.
The Mathematics of the Verdict
From a systems perspective, this outcome is explained by the narrowing of the survival corridor. When we optimize a system for a specific task – such as sailing fast within a narrow range of wind angles – we strip away everything deemed “superfluous.” At sea, however, the superfluous is often synonymous with survivability.
An efficient system performs brilliantly within a narrow envelope of conditions. A robust system performs acceptably across a broad spectrum. The former is brittle; the latter is resilient. The former thrives in the presence of White Swans, while the latter is designed to survive a Black one – a rare but high-impact outlier.
The Single Point of Failure
During the 1950s and 1960s, as modern aviation and aerospace engineering matured, the concept of the Single Point of Failure (SPOF) emerged. This classification applied to any component whose failure would result in the loss of the entire system. Identifying such points allowed engineers to redesign systems around redundancy.
A classic example comes from aircraft flight controls. As aircraft grew in size, direct mechanical linkages became impractical and were replaced by hydraulics. A centralized hydraulic system was efficient, but a single leak anywhere in the circuit could turn an aircraft into a falling brick. The solution was segmentation: independent hydraulic circuits, each with its own pump, powering individual control surfaces. By partitioning the system, engineers effectively added nails back to the bed – introducing redundancy and improving the yogi’s chances of survival.
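The arithmetic behind segmentation is worth making explicit. A minimal sketch, assuming independent failures and an invented leak probability (not real aviation data): a single circuit fails with probability p, while three independent circuits all fail together only with probability p³.

```python
# Illustrative sketch: why partitioning defeats a Single Point of Failure.
# The leak probability below is an invented figure, not real aviation data.
p_leak = 0.01  # assumed chance that any one hydraulic circuit springs a leak

# One centralized circuit: one leak anywhere disables all control surfaces.
p_fail_central = p_leak

# Three independent circuits, each with its own pump: control is lost
# only if all three fail at once (assuming independent failures).
p_fail_segmented = p_leak ** 3

print(f"centralized: {p_fail_central}")        # 0.01
print(f"segmented:   {p_fail_segmented:.0e}")  # 1e-06
```

Under the independence assumption, every added circuit multiplies the failure probability by another factor of p – which is exactly the sense in which each extra nail keeps the yogi alive.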
The Equation of Checks and Balances
Modern technology offers advanced materials and powerful computational tools. Progress is inevitable, but it is rarely linear. It is driven by competing forces – technical, economic and commercial. Ultimately, every boat is a complex equation in which technology, seamanship and marketing are tightly interwoven.
There is no single correct solution – only an endless series of trade-offs. Every adjustment within this equation carries a cost. Turning the efficiency dial to its maximum inevitably reduces robustness. The issue is not progress itself, but that the full balance of the equation is often obscured. Lightweight structures and performance gains are highlighted, while the reduction in survivability under extreme conditions is seldom discussed.
Mathematics does not demand that we abandon modern solutions. It simply makes the price explicit. In this equation, human life remains a variable – one that depends directly on how much redundancy we choose to retain on board.

For want of a nail the shoe was lost.
For want of a shoe the horse was lost.
For want of a horse the commander was lost.
For want of a commander the army was lost.
For want of an army the kingdom was lost.
— Traditional English verse, late 18th–early 19th century
This is known as positive feedback, sometimes called a “chain reaction”, in the chain a – b – c – d: a slight increase in “a” produces a bigger increase in “b”, then a still bigger one in “c”, and a really hefty jump in “d”.
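The amplification along the chain can be sketched numerically (the gain at each link is an invented figure, purely for illustration):

```python
# Positive feedback as a chain reaction: each link of a - b - c - d
# amplifies the disturbance it receives. Gains are invented for illustration.
gains = {"b": 2.0, "c": 3.0, "d": 5.0}

signal = 1.0  # a slight initial increase in "a"
for stage, gain in gains.items():
    signal *= gain
    print(f"increase at {stage}: {signal}")
# a one-unit bump in "a" has grown 30-fold by the time it reaches "d"
```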
I am a mathematician, and my work is related to the stabilization of dynamical systems – that is, to the use of negative feedback control.
What does optimality have to do with this? Consider the following. Suppose that we produce nails in forges in an optimal way, so that on average there are just enough for all the horses, and optimality is pushed to the limit in the sense that a slight increase in demand immediately causes a shortage. If the shortage cannot trigger a positive feedback loop, there is nothing terrible about this: after some time the supply will be restored. If, however, it can, then the absence of a single nail can lead to a catastrophe.
The presence of such positive feedback in a system is associated with the existence of hidden hyperbolic points within it. In other words, everything seems to be going well, yet small deviations can trigger abrupt changes and even catastrophic consequences. I have called such hidden hyperbolic points in a system the black rabbits of Fibonacci. The name, chosen by analogy with black swans, is explained by the following simple mathematical construction.
Suppose that the state of your system is described by a single number, initially between minus one and one. Assume that if the state exceeds 2, this is already bad and has catastrophic consequences for the system. If, on the other hand, the state is 0, the system is in homeostasis – its equilibrium state.
Suppose that the first two states of the system are 1 (positive) and −0.618 (negative). The dynamics are extremely simple: each state is the sum of the previous two.
For example, the third state of the system is 0.382 = 1 − 0.618, and the fourth is −0.236 = 0.382 − 0.618. Continuing in the same spirit, one can see that from the eleventh step onward the state of the system differs from zero by about 1% or less. Homeostasis. And this continues for several steps. So one might calm down.
However, it is too early to relax. Already at the thirteenth step a qualitative change becomes noticeable: from then on, all states are positive. By the twenty-sixth step the state exceeds 2, and after a dozen more steps it flies off toward infinity.
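The iteration is easy to reproduce. A minimal sketch: the starting value −0.618 is a three-decimal approximation of −1/φ, the stable direction of this saddle map, and it is the tiny residual error along the unstable direction, growing by a factor of φ ≈ 1.618 per step, that eventually blows the system up.

```python
# "Black rabbits of Fibonacci": each state is the sum of the previous two.
# -0.618 approximates -1/phi, the stable direction of this hyperbolic map;
# the residual error along the unstable direction grows like phi**n.
states = [1.0, -0.618]
while abs(states[-1]) <= 2:  # iterate until the catastrophe threshold
    states.append(states[-1] + states[-2])

for step, x in enumerate(states, start=1):
    print(f"step {step:2d}: {x:+.4f}")
```

With these starting values the state shrinks to about 1% of its initial size by step 11, turns permanently positive at step 13, and crosses the threshold of 2 at step 26; a more precise approximation of −1/φ would only postpone the blow-up.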
The system was pregnant with a crisis that had not been visible before.
What does this mean in the context of safety on a sailing yacht?
Nodes whose failure can trigger a chain reaction should be built not optimally but with a double or triple margin of safety, depending on the consequences their failure may cause. In addition, one should avoid, by every possible means, steering the system’s trajectory near its hyperbolic points.
In other words, the yacht should not be subjected to excessive loads.
And finally, safety should be approached from the point of view of dynamics. In any situation, one must anticipate the worst that can happen and always keep a plan B in reserve – and, better still, an additional plan C.