Decision-making under uncertainty

By David Silverman

A woman walking into a digitally rendered maze

We live in an uncertain world, and decision-makers in business and government face unknowns ranging from tomorrow's weather to unfolding events like technological revolutions, COVID-19 and Brexit.

Imperial researchers are working with companies to face these unknowns. They are developing innovative tools to maximise performance even under the uncertain conditions of the real world.

TL;DR

No time? Here's the digested read

Businesses and governments often use models, optimisation algorithms and other tools in their short- and long-term decision-making to minimise costs or maximise economic and societal benefits.

However, important variables, such as the future demand for a product, and the relationships between them are often uncertain.

A plan that should enable high performance in theory can go badly wrong when assumptions prove false or we encounter unexpected events ranging from traffic jams to rare events like pandemics.

At Imperial, researchers are working with companies to improve decision-making under uncertainty. They are developing tools and techniques that can:

  • Yield plans that can be easily recovered when the world does not turn out as expected (e.g. in scheduling Royal Mail deliveries)
  • Represent events as probabilities rather than certainties to increase expected returns or reduce downside risk (e.g. in seed production, macroeconomic policymaking, portfolio optimisation, new energy systems, and healthcare scheduling during the COVID-19 pandemic)
  • Support phased decision-making (e.g. in major engineering projects).

The researchers are combining their academic expertise with the sectoral insights provided by industry partners to create practical solutions to real-world problems.

Most Fortune Global 500 companies and governments, and many smaller companies, use modelling, algorithms and other kinds of advanced number-crunching to guide operational planning and to inform strategic policies and long-term investments. This can improve performance and bring benefits to society.

But a plan that is optimal in principle can lead to disaster when the world does not behave as expected: a single traffic delay can turn a finely tuned delivery schedule on its head, while the introduction of a new technology can leave a multibillion-dollar engineering project redundant.

At Imperial, researchers in a range of disciplines are working with industry partners to develop techniques and technologies to improve decision-making under uncertainty. Their solutions come in a variety of shapes, aiming sometimes to ensure plans can be changed, sometimes to guarantee positive outcomes when they cannot be changed, and in other cases to increase certainty over time. The researchers are pursuing real-world applications of the work in a variety of industries and problem spaces.

This article will explore some theoretical approaches to decision-making under uncertainty and their applications in areas that include delivery schedules, seed production, macroeconomics and finance, healthcare scheduling during the COVID-19 pandemic, new energy systems and major engineering projects.

Scheduling

Professor Ruth Misener, an optimisation expert in Imperial’s Department of Computing, regularly works with industry partners on optimisation under the uncertain conditions of the real world.

“Optimisation is a form of decision-making where basically you say, ‘I want to do as well as possible’, maybe ‘I want to make as much money as possible’, and you recognise that the world is full of constraints, for example, you only have a certain amount of money to invest,” she explains. To make an optimal decision, you take these constraints and parameter values such as future customer demand and run them through an algorithm that tells you which course of action will best achieve your objective.

“The problem,” Professor Misener continues, “is that if you don’t have an understanding embedded into your optimisation that the world is an uncertain place, then you’re probably going to get a bogus solution that is going to hurt you once you implement it. If you explicitly take into account that you’re not quite sure what’s going to happen, you can get answers that are actually robust to things going wrong.”


One industry partner Professor Misener’s group has regularly worked with is the Royal Mail. In one project, Professor Misener, researcher Dr Dimitrios Letsios and MSc student Natasha Page worked with the company to help it optimise scheduling for its last-mile deliveries. A Royal Mail delivery office makes thousands of deliveries each day, and these are divided into jobs of varying lengths that each consist of multiple deliveries in a certain locale. The researchers were asked to develop an algorithm that could determine which van should complete each job and in which order, under the constraint that only a set number of vans can depart at once, with the goal of minimising the number of vans required and the makespan (the time it takes to complete all jobs).

Portrait of Professor Misener

Professor Ruth Misener in the Department of Computing. Photo: Tempest Photography


With hundreds of jobs to schedule and an indeterminate number of vans, the number of possible solutions is vast, and the computing time required to run an algorithm that could identify the mathematically optimal solution would be prohibitive day to day. The researchers therefore identified heuristics (rules of thumb) that would allow the algorithm to quickly compute a solution that is near optimal.

The algorithm they developed first calculates the minimum number of vans required to deliver the mail in a sufficiently short makespan, then applies a heuristic called long-short mixing to schedule jobs, scheduling long jobs on some vans in parallel to short jobs on other vans to avoid bottlenecks caused by too many vans departing at once. Referring to historical data from three Royal Mail delivery offices, the researchers showed that the algorithm could have allowed the offices to carry out timely deliveries with significantly fewer vans than they actually used, assuming they had known in advance how long each job would take.


In the real world, however, unforeseen events like spikes or downticks in demand, staff absence and traffic delays make some jobs take longer than expected, while others are unexpectedly quick. The researchers showed that without optimising for uncertainty, an on-the-day change of plan as simple as the cancellation of a single job can turn a near-optimal schedule into one that is dramatically suboptimal and requires substantial rescheduling to restore it to near optimality.

Professor Misener observes that major schedule changes on the day are not practical in the fast-moving environment of a delivery office. “If you’re a postal worker, and you’re told the night before what you’re going to be doing, you don’t want the story changed on you during the day,” she says. “Last minute changes cause chaos and this is not really a way to lead an organisation filled with people. One of the things we aimed for in our solution was to avoid changing the schedule too much.”

The researchers therefore used an optimisation technique that can yield plans that can be recovered with minimal rescheduling when they break down. The technique, known as lexicographic optimisation, breaks a primary optimisation task down into separate sub-tasks for the algorithm to carry out in a particular order of priority. In this case, they designed the algorithm to optimise the full set of jobs by minimising the makespan on the first van, such that the time the first van takes to complete its jobs must be as short as possible and the time taken by all the other vans must be shorter still. Then, holding the makespan for the first van fixed, the algorithm reoptimises so that the second van’s makespan is as short as possible and the others shorter still, and repeats this for each van.

The researchers demonstrated with reference to the historical data that this approach yields a plan that is not only near optimal if there are no disturbances but can also be tweaked in response to disturbances with minimal changes while remaining relatively close to optimal.

The simple reason that lexicographic optimisation works here is that it ensures each van has a progressively greater capacity to take on extra jobs, meaning less busy vans are available to pick up the slack from busier ones if deliveries are delayed. More fundamentally, lexicographic optimisation makes the schedule more robust because it adds an internal structure: if the first van’s deliveries are delayed, jobs are still distributed between the second, third and fourth vans in a structured way that is conducive to restoring near optimality with minimal reoptimisation on the day. In 2019 the Royal Mail used the work to begin a review of its small van fleet in delivery offices across the UK.
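The idea can be sketched in a few lines of Python. This toy brute-force version (not the researchers’ actual algorithm, which relies on heuristics to scale to hundreds of jobs) picks, among all possible assignments of jobs to vans, the one whose per-van workloads, sorted in descending order, are lexicographically smallest – that is, the busiest van finishes as early as possible, then the next busiest, and so on:

```python
from itertools import product

def lexicographic_schedule(jobs, n_vans):
    """Among all assignments of jobs to vans, return the one whose per-van
    workloads, sorted in descending order, are lexicographically smallest."""
    best_key, best_assignment = None, None
    for assignment in product(range(n_vans), repeat=len(jobs)):
        loads = [0.0] * n_vans
        for duration, van in zip(jobs, assignment):
            loads[van] += duration
        key = tuple(sorted(loads, reverse=True))   # busiest van first
        if best_key is None or key < best_key:
            best_key, best_assignment = key, assignment
    return best_key, best_assignment

# Six jobs (durations in hours, invented for illustration) shared by three vans
loads, assignment = lexicographic_schedule([4, 3, 3, 2, 2, 1], 3)
print(loads)   # a perfectly balanced split exists here: (5.0, 5.0, 5.0)
```

Because each van’s makespan is pinned down before the next is optimised, the resulting schedule has the structured slack the article describes: a delayed job can usually be moved to a less-loaded van with few other changes.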

Photo: Royal Mail employees sort parcels into red delivery destination trolleys at the company's Mount Pleasant postal sorting office. Credit: Chris Ratcliffe/Bloomberg via Getty Images

Working with probabilities

One reason the approach we looked at in the last section is elegant is that it does not require us to know the probability of unforeseen events occurring. However, Professor Nilay Shah, head of Imperial’s Department of Chemical Engineering, points out that in many contexts – in particular, those where it is not possible to build in a recovery phase – it is useful to incorporate probabilities into decision-making.

“Some of the biggest things that come up are demands for products, so you’re trying to plan some manufacturing, or you’re deciding how big a facility to build, and you just don’t know what the demand will be”, he says. “The other thing is prices: you’re producing something that could be as simple as electricity or hydrogen and you don’t know what price you’ll get. You’re trying to make a decision about what to do given these uncertainties.”

The solution can be illustrated using a classical example from inventory management. A newspaper vendor must decide how many copies to purchase each day in the face of uncertain demand, knowing that any unsold newspapers at the day’s end will be worthless. A naïve solution would be to take the average number of copies sold each day and purchase that many. However, if the cost price is £1 and the sale price £1.50, this would probably be suboptimal: each unsold copy loses £1, while each missed sale forgoes only 50p of profit, so the cost of buying too many newspapers is greater than the opportunity cost of buying too few.

portrait of Professor Shah

Professor Nilay Shah OBE, Head of the Department of Chemical Engineering


A better approach is to use historical data and informed guesswork to formulate a probability distribution that can be represented as a curve on a graph with newspapers demanded on one axis and a probability between 0 and 1 on the other. Using this information, the vendor can then draw a curve for each potential inventory volume (say, 100, 150 and 200 newspapers) with probability on one axis and the other now showing overall profit or loss. By comparing these curves, the vendor can identify the inventory volume that will maximise their expected value (the average profit they would make if the day were replayed an infinite number of times). It also gives them the option of choosing a purchase volume that minimises their possible loss. This is a simple case of an approach to decision-making under uncertainty called stochastic optimisation.
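As a concrete sketch (with invented numbers): suppose demand is equally likely to be anywhere between 100 and 200 copies. Computing the expected profit for every candidate purchase volume shows that the optimum lies below the average demand of 150, precisely because overbuying costs more than underbuying:

```python
import statistics

cost, price = 1.0, 1.5        # £1 to buy a copy, £1.50 to sell it
demands = range(100, 201)     # each demand level assumed equally likely

def expected_profit(q):
    # average, over all demand scenarios, of revenue minus purchase cost
    return statistics.mean(price * min(q, d) - cost * q for d in demands)

best_q = max(range(100, 201), key=expected_profit)
print(best_q)   # 133: below the mean demand of 150
```

This is the classic newsvendor result: with a 50p profit per sale against a £1 loss per unsold copy, the optimal purchase sits at roughly the one-third point of the demand distribution rather than at its average.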

Photo: A news kiosk in Lisbon, Portugal. In reality, newspapers are often sold to vendors on a sale or return basis because it is in the publisher's interest to err toward offering too many copies for sale rather than too few. Credit: zulufriend via Getty Images

Black and white photo of a news vendor standing inside a kiosk surrounded by magazines and newspapers

Seed production

Professor Shah and colleagues in the Department of Chemical Engineering apply stochastic optimisation to a variety of real-world business and engineering contexts.

In one project, Professor Shah’s master’s student, Yanbin Zhu, applied the technique to seed production using data from seed manufacturer Syngenta. With the global population increasing, there is a greater need for more efficient agriculture. Most of the world’s crops come from seeds, which are often supplied to farmers by commercial seed manufacturers. Mr Zhu and colleagues developed a model that could allow a seed manufacturer to plan seed production at regional level.

The company needs to decide which seeds to plant in which regions, accounting for several variables. These include expected yield (which varies by variety, region and season), demand (which varies by variety and season) and the costs of growing, processing, and transport. Using a deterministic model that did not account for uncertainty, the researchers produced an optimal plan that did better at minimising cost and land-use than a real plan provided by Syngenta and that took much less time to devise.


When they introduced a way of accounting for uncertain yields, factoring in overproduction and underproduction as costs (either in additional production cost or the opportunity cost of failing to meet demand) they found the model considerably reduced the overall cost in situations where the yield was low, with a relatively small increase in cost in the event of an average or high yield.

Photo: Bags of Syngenta AG Golden Harvest brand hybrid seed corn sit stacked in a barn in Princeton, Illinois. Credit: Daniel Acker/Bloomberg via Getty Images

Piled-up sacks of corn seed produced by Syngenta

Macroeconomics

We are often uncertain not only about future events but also about the laws (for example, economic laws) that determine whether one event will follow another. Plans can go wrong not only because we fail to foresee future events, but also because we hold incorrect assumptions about the effects our decisions will have.

This is one kind of uncertainty addressed by Emeritus Professor Berc Rustem, leader of Imperial’s Computational Optimisation Group, which he founded over 30 years ago, helping establish a research programme that is still pursued at Imperial and other universities.


In the 1970s, Professor Rustem began work on a long-running research programme to optimise macroeconomic policy with control theorist Professor John Westcott, macroeconomist Professor Maurice Peston (later Lord Peston) and Labour MP Dr Jeremy Bray, with funding from research councils (first the ESRC and later the EPSRC). The team advised policymakers in Parliament and HM Treasury and provided written evidence to the House of Commons Treasury Committee.

portrait of Professor Rustem

Professor Berc Rustem in the Department of Computing


At first, the group produced their own macroeconomic models – sets of assumptions about how the economy works that could be used to help drive optimisation algorithms. However, there was at this time increasing debate among economists about what sort of model to adopt. When Margaret Thatcher became prime minister, she favoured monetarism, which states that the best way to promote economic growth is by using the interest rate to regulate the money supply, over the longer-established demand-side economics, which calls on the government to use spending to regulate demand.

“There was huge controversy, and it became political,” says Professor Rustem. “We therefore abandoned modelling ourselves and focused on using existing models, and let that economic fight take place among the modellers.”

The team began to use models produced by the London Business School, the National Institute of Economic and Social Research, and HM Treasury. “Which model to choose became a real problem,” says Professor Rustem. “When you apply your policy, assuming a monetarist model is a fair representation of the economy gives you one result. If you choose another model and apply your policy, you get another result. This is when we started using robust optimisation. What we did was to combine all these models together in one framework and aimed to decide on a policy that would take into account all possible models and minimise the downside of choosing the wrong model.”
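In stylised form, the min-max idea is simple: score each candidate policy under every rival model, then choose the policy whose worst score is best. The policies, models and numbers below are invented purely for illustration:

```python
# Growth outcomes (invented) predicted for each policy by each rival model
outcomes = {
    "tighten": {"monetarist": 2.0, "demand_side": -1.0},
    "loosen":  {"monetarist": -0.5, "demand_side": 1.5},
    "neutral": {"monetarist": 0.5, "demand_side": 0.5},
}

# Robust choice: the policy whose worst outcome across models is best
robust_policy = max(outcomes, key=lambda p: min(outcomes[p].values()))
print(robust_policy)
```

Neither extreme policy is chosen, because each performs badly if the other camp’s model turns out to be the fair representation of the economy: the robust policy minimises the downside of picking the wrong model.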

Prime Minister Margaret Thatcher at a circular table next to other representatives at an EEC summit

Prime Minister Margaret Thatcher looks on in December 05, 1986 as she chairs the EEC Economic Summit. Photo: Steve Wilkinson/AFP via Getty Images.


Finance

After the project ended in 1991, Professor Rustem began applying related techniques in chemical engineering and finance. In the latter case, he has focused on portfolio optimisation, the task of deciding which assets to invest in. This, he explains, uses historical data about how assets have risen or declined in value to optimise future returns and manage risk. The latter involves taking account of the uncertainty (e.g. standard deviation, downside risk) gleaned from historical data.

An issue here is that we need to consider the kind of worst-case scenario that happens from time to time, such as the 2008 financial crisis. If you base your decisions on the historical change in the value of an asset over the last few years, this may not help protect you against the consequences of a sudden crash.

“Sometimes the worst case happens, and you need to be prepared for it. You can’t always assume that the worst case will happen and stop investing entirely. At the same time, you need to also think deep in the back of your mind that the worst case can and does happen from time to time,” says Professor Rustem. “All this leads to robust optimisation technology that takes into consideration worst-case events, or worst-case probabilities, in the future.”


To protect against the worst case, his portfolio optimisation algorithms make use of competing scenarios, analogous to the competing macroeconomic models built into his earlier work. Instead of computing the optimal outcome under a single scenario, they compute an optimal outcome across the range of scenarios that may come to pass, aiming to minimise the loss incurred in the worst case.
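A minimal sketch of the worst-case idea, with two assets, two invented scenarios and a grid search (real portfolio optimisers solve this with mathematical programming over many assets and scenarios):

```python
# Per-asset returns (equity, bond) under two invented scenarios
scenarios = [
    (0.10, 0.01),    # bull market: equities do well, bonds barely move
    (-0.05, 0.02),   # bear market: equities fall, bonds hold up
]

def worst_case_return(w_equity):
    """Portfolio return in the least favourable scenario for this weighting."""
    w_bond = 1.0 - w_equity
    return min(r_e * w_equity + r_b * w_bond for r_e, r_b in scenarios)

# Choose the equity weight that maximises the worst-case return
grid = [i / 100 for i in range(101)]
best_w = max(grid, key=worst_case_return)
print(best_w, worst_case_return(best_w))
```

The robust choice is a small but non-zero equity allocation: enough to benefit if the bull scenario materialises, but not so much that the bear scenario is ruinous.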

Professor Rustem emphasises that this is not a counsel of despair against the possibility of understanding how future events will unfold. Using a refined version of this approach, he computes the probability distribution for a range of scenarios, allowing, for example, a 70% probability that one model or scenario will prove correct and a 30% probability that another will. This makes it possible to reconcile trust in models with a healthy degree of robustness.

Photo: The HM Treasury building in Westminster, London.

Exterior of HM Treasury building in Westminster, London

Unknown probabilities

While one would ideally make use of probability distributions to account for uncertainty whenever they are available, a key problem is that they can be very hard to accurately compute.

Wolfram Wiesemann, Professor of Operations Research in the Imperial College Business School, specialises in applying optimisation under uncertainty in a variety of business domains. He uses a method that allows decision-makers to benefit from probability distributions while avoiding the headaches and pitfalls inherent in accurately computing them.

portrait of Professor Wiesemann

Professor Wolfram Wiesemann in the Imperial College Business School


Professor Wiesemann says there are three major obstacles to computing accurate probability distributions, which apply equally whether we are calculating these distributions ourselves or using machine learning:

  • First, the past data that decision-makers largely rely on is not always a good guide to what will happen in the future. In particular, we don’t usually have enough historical data to avoid underfitting – the mistake of assuming that events which happen for complex reasons have simple causes – or overfitting – the mistake of assuming that events happen for reasons that are more context-specific than they really are.
  • Second, the number of parameters that must be incorporated into a model makes the task computationally very challenging. The problem, known as the curse of dimensionality, is that building a probability distribution for the cost of manufacturing a product, for example, requires probability distributions for multiple parameters (for instance the cost of labour and of various raw materials), and each additional parameter multiplies the number of future possibilities for which we must compute a probability, so the effort grows exponentially with the number of parameters.
  • Third, decision-making happens in a dynamic environment, and when we put our plan into action, we receive new data that may call into doubt the accuracy of our original probability distributions. Professor Wiesemann gives the example of flu vaccinations: every few months, health authorities must estimate which flu strains are likely to be most prevalent in the coming season and formulate a vaccine, which takes time. The longer they wait, the better they know which flu strains are dominant, but the less time they have to manufacture the vaccine, so they are forced into an uncomfortable trade-off between certainty and time. It is important for the earlier decisions to account for the richer information base that future decisions can be based on.

Because calculating a probability distribution is very hard and prone to giving an inaccurate result, Professor Wiesemann suggests something quite surprising – namely that we abandon the attempt to compute accurate probability distributions.

His approach is to optimise on the assumption that the true probability distribution may differ as far as is conceivable from the one modelled. He then selects the plan which, on this basis, achieves the best possible outcome under the worst anticipated probability distribution. Rather than attempting to get the probability distribution exactly right, which often cannot be done, this approach minimises the downside risk of formulating an incorrect probability distribution.
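A toy sketch of this distributionally robust idea, with invented demand scenarios and a simple "box" of allowed distributions (the real methods use much richer ambiguity sets and convex optimisation): each scenario probability may drift up to `delta` away from its nominal estimate, and we pick the order quantity whose expected profit under the worst such distribution is highest.

```python
def worst_case_expectation(profits, nominal, delta):
    """Minimum expected profit when each scenario probability may move up to
    `delta` from its nominal value (floored at 0) while still summing to 1."""
    lo = [max(0.0, p - delta) for p in nominal]
    hi = [p + delta for p in nominal]
    probs = lo[:]                      # start every scenario at its lower bound
    budget = 1.0 - sum(probs)          # probability mass left to allocate
    # an adversary piles the remaining mass onto the lowest-profit scenarios
    for i in sorted(range(len(profits)), key=lambda j: profits[j]):
        add = min(hi[i] - probs[i], budget)
        probs[i] += add
        budget -= add
    return sum(p * v for p, v in zip(probs, profits))

demands = [80, 100, 120]               # possible demand levels (invented)
nominal = [0.2, 0.5, 0.3]              # estimated, possibly wrong, probabilities
cost, price = 1.0, 1.5

def scenario_profits(q):
    return [price * min(q, d) - cost * q for d in demands]

best_q = max(demands,
             key=lambda q: worst_case_expectation(scenario_profits(q), nominal, 0.1))
print(best_q)
```

The chosen quantity hedges against the nominal probabilities being wrong, without retreating all the way to the most conservative decision.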

Healthcare scheduling during the pandemic

Professor Wiesemann is working with colleagues from Medicine, including the team led by Professor Neil Ferguson, whose work has informed the COVID-19 policy of governments worldwide, on models to help identify the optimal way to schedule and prioritise elective treatments during the COVID-19 pandemic.

“At the moment [January 2021], people measure governments by their performance on the COVID-19 pandemic. One number that people compare is how many new infections and how many people died. What we haven’t focused on sufficiently is the number of deaths that we will have due to postponed cancer treatments and all the other treatments that will be put on hold,” says Professor Wiesemann.

“In this project we try to take an objective approach and say we don’t care if someone comes into hospital because of cancer, because of COVID-19, whatever it is – our sole objective is to save lives, within the resources we have.”


The models they have produced categorise patients by condition (COVID-19 and other conditions such as cancer) and age group and suggest how healthcare resources should be scheduled under scarcity to minimise the years of life lost, using Professor Wiesemann’s methods for dealing with uncertainty to allow for uncertainty about variables like patient numbers, treatment duration and prognosis.

The research has been used so far to highlight to policymakers the need to factor in the negative impact of delaying elective treatments for conditions such as cancer during the UK’s healthcare emergency in winter 2020/21, when some hospitals are full and treatments temporarily rationed. The research has been covered by, among others, the Daily Mail, the Financial Times and France24.

“Fortunately, in normal times, this should not be a substantial issue with the NHS,” says Professor Wiesemann. “If you think now about developing countries, unfortunately this situation of resource scarcity is in some cases more the norm than the exception. We hope that our solution approach for the pandemic will also be applicable to developing countries in non-pandemic regimes.”

Photo: Clinical staff wear personal protective equipment as they care for patients at the Intensive Care unit at Royal Papworth Hospital in Cambridge, on 5 May 2020. Credit: Neil Hall / POOL / AFP via Getty Images.

Clinical staff wearing PPE in an intensive care unit

Renewable energy

Professor Wiesemann is collaborating with another leading expert in decision-making under uncertainty, Dr Panos Parpas, Reader in Computational Optimisation, on research into new energy systems. They are part of a large project led by Professor Tim Green, Co-Director of the Energy Futures Laboratory and funded by the EPSRC, that seeks to model and optimise the roll-out of low carbon energy systems, in particular solar and wind energy.

“One important issue with renewable energy is that it is inherently uncertain. We can have cloudy days, in which case we have less sun, and we may have more or less wind. So the provision of energy becomes uncertain,” says Professor Wiesemann. “That’s fundamentally different to our energy systems in the past. We had nuclear, coal and gas power plants, and they worked. They needed some fault resistance, so if a power line goes down the whole energy network doesn’t go down, but you did not require a sophisticated probabilistic treatment for that. Now it’s different, suddenly uncertainty becomes a major player.”


To offset shortfalls in wind and sun, conventional energy needs to play a role in the system. But we also need to plan investments in an energy system years ahead. Here, the problems Professor Wiesemann identified with probability distributions appear: “First, we cannot form an accurate probability distribution for wind and sun five years from now. Second is the curse of dimensionality — we would have to keep track of wind and sun in 50 or 100 places in the UK. Third is dynamism – we’d better anticipate in our choices today that in ten years’ time we will know much more about demand and the technology that will be available.”

For this reason, Professor Wiesemann does not aim to compute an accurate probability distribution. He works on identifying the plan that incurs the minimal loss should it turn out to be maximally wrong.

A landscape covered with solar panels with a wind turbine in the foreground

Longer term decision-making

We have seen that it is an advantage to have a plan that can be easily adapted when unforeseen events occur, and where this is not possible, to use probability distributions to appropriately balance reward with downside risk. For longer-term decision-making, another valuable tool – which can be combined with both these approaches – is to carry out a plan in phases, adapting it as more information becomes available. The key here is to build in flexibility, the ability to adapt to a changing environment, and recognise the right time to adapt.

Dr Michel-Alexandre Cardin, a senior lecturer in the Dyson School of Design Engineering, develops decision-making techniques for major engineering projects such as the building of new aerospace, building, energy and transportation systems. These require financial investments in the millions or billions and significant physical resources. They are also highly vulnerable to a changing natural environment, economy and technological landscape, and events like the COVID-19 pandemic.

Decision-makers sometimes tie themselves into major projects without building in flexibility, because they are attracted by the economies of scale achieved by committing all at once to significant investments, or they are confident about future projections of major uncertainty drivers like price or demand. However, recent studies have shown that building in flexibility can improve expected value to investors or society by 10–30% and sometimes more.

Dr Cardin says: “Incorporating flexibility into decision-making when we design critical systems is hugely important not only for maximising returns on investment, but also for making society more resilient to extreme situations like pandemics and terror attacks. It also supports sustainability by making better use of resources.”

Dr Cardin offers the example of the Iridium satellite phone network, which launched 66 satellites in a single year in the 90s, anticipating major demand just as land-based mobile phone networks entered the mainstream. The unexpectedly low demand, combined with early engineering decisions that meant the satellites could not adjust their orbits, meant revenue failed to cover the huge $4 billion investment – and ultimately led to the company’s bankruptcy, despite its award-winning technology.



This situation could have been mitigated, Dr Cardin says, if Iridium had adopted a flexible capacity expansion strategy: launching the satellites in phases and designing them to change orbital configuration in space in line with new information on demand and coverage requirements. Many other engineering projects tell a similar story, for example the IUT Global waste-to-energy system in Singapore and the Ghost Cities of China.

“The mission is to create a new mindset,” says Dr Cardin. “We need to design systems in a way that enables flexible decision-making to face future uncertainty, threats, and opportunities.”

Real options analysis, a technique that became popular in the 80s and 90s, offers decision-makers a way to value a flexible project. A real option is the right, but not the obligation, to take an investment decision such as expanding or reducing capacity, or delaying or abandoning investment. To analyse real options, we break a project down into decision-making phases and, given the current state of the project (e.g. its capacity) and the expected future realisations of uncertain variables such as demand, determine the value of exercising each option.

This analysis can sometimes show that the value of expanding a system’s capacity in phases, for example the Iridium network, is greater than the value of expanding all at once. The phased approach can be more valuable because it makes it possible to expand capacity only when needed and to expand more than originally planned if conditions are favourable. Moreover, the analysis may value the project as a whole more highly than a valuation that does not break the project down into phases, an approach that might even give the project a negative expected value by failing to incorporate the value of being flexible and leaving options open.
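The comparison described above can be sketched in a few lines of code. The sketch below is a minimal two-phase toy model with purely illustrative numbers (none of them come from the article): demand turns out high or low with some probability, and we compare committing full capacity upfront against a phased build-out in which the second phase is an option exercised only once demand is known.

```python
P_HIGH = 0.5           # probability demand turns out high (assumed)
FULL_COST = 100.0      # cost of committing full capacity upfront
PHASE_COST = 60.0      # cost per phase; two phases cost more overall,
                       # reflecting lost economies of scale
REV = {                # revenue by (capacity, demand) -- illustrative numbers
    ("full", "high"): 220.0,
    ("full", "low"): 80.0,
    ("half", "high"): 110.0,
    ("half", "low"): 70.0,
}

def value_all_at_once() -> float:
    """Expected value of committing the whole investment at once."""
    expected_revenue = (P_HIGH * REV[("full", "high")]
                       + (1 - P_HIGH) * REV[("full", "low")])
    return expected_revenue - FULL_COST

def value_phased() -> float:
    """Expected value when the second phase is an option: after demand
    is observed, expand only if expanding beats staying at half capacity."""
    def best(demand: str) -> float:
        expand = REV[("full", demand)] - PHASE_COST  # pay for phase 2
        stay = REV[("half", demand)]
        return max(expand, stay)   # exercise the option only if worthwhile
    expected_payoff = P_HIGH * best("high") + (1 - P_HIGH) * best("low")
    return expected_payoff - PHASE_COST  # phase 1 is paid upfront

print(value_all_at_once())  # 50.0
print(value_phased())       # 55.0
```

With these assumed numbers the phased plan is worth more even though building in two phases costs more in total: the option to walk away from phase 2 under low demand more than pays for the lost economies of scale.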

A limitation of the classical real options approach is that it assumes that if a parameter such as the price of the product we are selling increases by £10 and then decreases by £10 in the next time period, the value of the asset or project is the same as if the price had decreased and then increased by the same amount. This is often a bad assumption in industrial contexts, because it does not account for path dependencies, such as the fact that you are likely to expand capacity when the price is high, and that such an expansion cannot always be reversed without significant cost in a downturn.

Portrait of Dr Cardin

Dr Michel-Alexandre Cardin in the Dyson School of Design Engineering. Photo: Jason Alden

Dr Cardin has developed a version of real options analysis better suited to industry practice, which uses decision rules. These are rules that specify the best time to change the system to capitalise on an upside opportunity or reduce exposure to downside risk, so as to improve expected value. By using decision rules, we can account for path dependency by modelling the decisions that will be made in each phase and reflecting these in the values of the choice sets available in subsequent phases. This is computationally complex, and he is currently exploring the role that deep reinforcement learning could play in building and analysing effective models more efficiently than humans can.
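A decision-rule analysis of this kind is typically evaluated by simulation. The sketch below is not Dr Cardin's method, just a minimal illustration of the general idea under assumed dynamics: prices follow a random walk, the decision rule is "expand capacity the first time the price exceeds a threshold", expansion is irreversible, and we pick the threshold with the best Monte Carlo estimate of expected value. All numbers and the price model are hypothetical.

```python
import random

random.seed(0)  # reproducible paths for this illustration

def simulate_path_value(threshold: float, periods: int = 20) -> float:
    """Value of one simulated price path under the decision rule:
    'expand the first time price exceeds `threshold`'. Because the
    expansion is irreversible, the payoff is path-dependent."""
    price = 10.0
    expanded = False
    value = 0.0
    capacity_small, capacity_large = 1.0, 2.0
    expansion_cost = 40.0
    for _ in range(periods):
        price = max(0.0, price + random.gauss(0.0, 2.0))  # assumed random walk
        if not expanded and price > threshold:
            expanded = True
            value -= expansion_cost       # sunk, irreversible cost
        capacity = capacity_large if expanded else capacity_small
        value += price * capacity         # per-period operating profit
    return value

def expected_value(threshold: float, n_paths: int = 2000) -> float:
    """Monte Carlo estimate of expected value for a given decision rule."""
    return sum(simulate_path_value(threshold) for _ in range(n_paths)) / n_paths

# Tune the decision rule by comparing candidate thresholds:
candidates = [8.0, 12.0, 16.0, 20.0]
best = max(candidates, key=expected_value)
```

Searching over rule parameters this way scales poorly as rules and uncertainties multiply, which is one motivation for the reinforcement learning direction mentioned above.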

In addition to financial decision-making, Dr Cardin examines the engineering decisions – including the mechanical nuts and bolts – that need to be considered at the outset of a project to build flexibility into the design of a system: for Iridium, for instance, each satellite would have needed to be designed to change orbit. His approach is in line with the Dyson School of Design Engineering’s focus on systems thinking, which considers product design and engineering not only from the point of view of user needs, but also from the point of view of decision-makers’ needs and the other components of the system in which a product is embedded. He is currently working with private equity firms in infrastructure, who need better data-driven decision support tools to enable them to make best use of their resources and increase value.

“The mission is to create a new mindset,” Dr Cardin says. “We need to design systems in a way that enables flexible decision-making to face future uncertainty, threats, and opportunities. Think of future buildings – these will need to embed more flexibility so we can adapt to future health crises like the COVID-19 pandemic, or climate change. A lot of these decisions will not be possible unless systems are appropriately designed from the outset to build in flexibility.”

Photo: The Delta-II Rocket, carrying the Iridium network's first five satellites, lifts off on 4 May 1997 in California. The satellite phone network was launched just as land-based mobile phone networks were entering the mainstream, leading to unexpectedly low demand. Credit: STR/AFP via Getty Images.

A rocket launching into space

Partnerships for better decision-making

“Decision-makers in business and government can benefit from the advanced tools and techniques for decision-making under uncertainty under development at universities,” says Professor Misener.

At Imperial, there is a strong focus on working with industry partners to solve decision-making challenges. “Achieving an optimal outcome in a complex environment is challenging, whatever field you are working in, and decision-makers in business and government can benefit from the advanced tools and techniques for decision-making under uncertainty under development at universities,” says Professor Misener.

“But techniques from fields like maths and computing are only useful if combined with a very strong understanding of the application area and the on-the-ground realities decision-makers face. This is why I and my academic colleagues at Imperial really value our partnerships and connections with industry. By lending us their sectoral expertise, they are helping us to develop solutions that make a real impact, and we are helping them find practical solutions to the challenges they are facing.”

This feature was produced by the Enterprise Division at Imperial College London, which offers businesses access to Imperial’s world-leading academic community and new technology.

It accompanies a 25 February 2021 event on decision-making under uncertainty by Imperial Business Partners, an Enterprise service that guides members through Imperial’s innovation landscape.