In order to develop plans and recommend actions in support of strategic goals, national security professionals need the ability to anticipate the impacts of change in their external environment. The planner’s task is complicated by the fact that from the vantage of the present, there are many possible impacts of change. In a laboratory, variables can be titrated precisely and outcomes predicted; in the national security environment, variables are dynamic and complex, and outcomes are the product of emergent interactions among people, institutions, and systems. The exact path of these interactions is inherently nonlinear and difficult to predict.
The national security strategist is thus also in need of specialized thinking skills to help him or her mentally model uncertainty and grasp the nonlinear and complex pathways of change. These thinking skills do not come naturally to the modern American military education system, which valorizes an Enlightenment-inspired scientific approach and has historically focused on teaching critical thinking skills. Such skills are valuable when a problem is well defined and it is possible to identify its component parts, evaluate evidence, and generate solutions. However, they are not sufficiently robust to address situations that are as ambiguous, loosely bounded, and complex as the possible futures of national security.
In contexts of uncertainty, another set of skills—those contained in the strategic foresight toolkit—is required. Arguably, this requirement is especially vital today: technological advancements and their unevenly distributed but powerful effects, climate change, and social change are unfolding at a challenging pace in our interconnected global system. Black swans, cascading problems, and uncertainty stemming from interconnections abound. The stakes are high for anticipating and planning effectively for the potential impacts of change.
By way of example, imagine you are a strategist in the 1970s seeking to understand the implications of the newly created Internet. Its early architects did not view Internet protocols as a potential locus of national security threat because they assumed that small communities of mutually trustworthy academics would be the most likely users of the future Internet. Critical thinking would not assist you in generating scenarios of the possible futures of the Internet, let alone conceiving of it as the foundational infrastructure of future human institutions. In open-ended situations such as the future of a new technology or institution, systems thinking and frameworks to help structure imaginative and expansive exploration of the implications of change are required. Strategic foresight supplies these frameworks.
This article makes a two-pronged argument. First, strategic foresight, a discipline I describe in more detail below, provides the mindsets and frameworks vitally needed to plan amid uncertainty.1 Strategic foresight should be taught and used more widely in the national security space. Second, where foresight is being taught and used (interest in it has recently surged), there are opportunities to improve its application and better serve planning staffs and decision-makers.
What Is Strategic Foresight?
Strategic foresight is an interdisciplinary domain that draws on cybernetics and systems thinking, management sciences, sociology, data science, cognitive psychology, and creative thinking, among others. Anticipatory thinking to support decision-making is its essence. The individual who invests time in learning how to think like a futurist emerges with an appreciation for the cognitive barriers faced by the human brain when it attempts to envision the future and will be well-practiced in holistic, synthetic, analytic, and creative ways of thinking. Organizations that adopt foresight practices to help them identify trends at an early stage and adapt or innovate to leverage those trends are in stronger competitive positions than those that do not. This value is demonstrable: A recently completed longitudinal study of large European firms demonstrated that those incorporating foresight into their strategic planning realized significant gains in both profit and market capitalization over the long term.2 Management science has revealed that systematically scanning the peripheral environment for weak signals of change can help people and institutions prepare for otherwise unexpected events.3
Foresight is not an unknown quantity in the U.S. national security space, yet it has waxed and waned as a discipline of interest. Following sustained enthusiasm from the end of World War II through the early 1990s, interest languished as the dramatic events of the moment—the fallout from the demise of the Soviet Union, the 9/11 attacks, the 2008 financial crisis—took center stage. Recently, strategic foresight has reappeared on the radar. The growing number of conference sessions, professional education opportunities, and pursuits such as science fiction writing contests designed to trigger creative thinking about the future attest to this rise in interest. This is all good news, and one hopes that leaders in all relevant institutions will continue to grow their support for fostering successors skilled at thinking both systematically and creatively about an uncertain future.
Yet enthusiastic support, while necessary, is not sufficient to create a future-minded national security workforce. It is possible to use strategic foresight well or badly. In the national security community today, there is room for improvement. Strategic foresight activities are often brought into classrooms and conference rooms in ways that are superficial. A quick exercise in scenario-building, for example, may give participants the satisfaction that they have engaged in strategic foresight. But when conducted superficially, such activities typically become exercises in reinforcing rather than challenging preexisting ideas about what the future will be like. To be clear, superficiality is never intentional. Instead, urgent pressure to produce activities leads course or activity facilitators to use frameworks and ideas that are the easiest to access instead of those that are the most appropriate. Popular ideas and activities circulate through the national security educational community uncritically, so that rough usage in one place is replicated in another, and it is difficult to get new thinking in the door.
As the history of national security community engagement with foresight demonstrates, thinking creatively about the future is a cultural challenge. Large bureaucracies, such as the Department of Defense, are often resistant to change and to reckoning with the fact that conditions for success in the future may be different from those of today. Institutional proclivities can shape and constrain the imagination that is required to develop insights into the future of a profoundly complex, changing, and uncertain world.
To take one example, futurism is frequently presented in mainstream culture as primarily associated with technological innovation. This is a narrow use of the strategic foresight skillset; technology is only one of the drivers of complex social events such as war. When they assume, rather than interrogate, a high-tech future, military participants in strategic foresight foreclose the opportunity to identify signals of change and development across the spectrum of human activity. This has in the past led to institutional blindness to signals of change in societies that might produce low-tech, asymmetric approaches to armed conflict. One of the key tenets of foresight is that it is imperative to explore not only the most likely future but also a range of possible futures. It is in this arena that potential black swans lurk.
The popular premise that future wars will take place in megacities (with more than 10 million inhabitants) offers another example of how a selective use of the tools of strategic foresight can narrow strategic vision precisely where it would be useful to expand it. The war-in-megacities scenario is grounded in trend information related to urban growth. By some accounts, there will be at least 50 such cities by 2050.4 So it is reasonable to project that at some point, warfighters will probably fight in a megacity. However, strategists who halt their exploration of the future with scenarios based on the extrapolation of current trends alone are underutilizing the tools of strategic foresight. To use the foresight toolkit more comprehensively and effectively, planners must also think deeply and creatively about the possible, a much wider and more complex world of potential than that of the merely probable. This is not easy; it takes intellectual rigor and self-knowledge to explore trends that may violate one’s institutional worldview. How could a war unfold in a nonurban area, especially in a world that is primarily urban? What assumptions are held today about what a city is and looks like? Will other emergent realities—for example, about the way people communicate and work or about how climate change and weather evolve—change the ways that cities develop in the future?
Venturing answers, however exploratory, to questions that probe beyond the boundaries of current expectations could help reduce strategic surprise in the future and prompt innovative thinking in advance of the unexpected. Strategic foresight offers compelling frameworks for asking these sorts of questions, and the frameworks themselves are not fancy or difficult to understand. This is all the more reason why advancing the understanding of strategic foresight as a discipline and a strategy support tool is not only a good idea but also a clear and simple route to creating opportunities for asking difficult questions about the potential future at a time when such questions are critical.
Strategic foresight functions best as a normal, integrated element of an organization’s planning cycle. This cycle will typically include horizon scanning (also called environmental scanning) for early indicators of change, the integration of early signals into existing forecasts, impact assessments, and a decision-making process that uses insights of foresight to inform action.
Historically notable examples demonstrate the power of this activity. The most famous is likely that of Royal Dutch Shell, whose pioneering use of scenarios made them a common tool in the foresight kit. The oil industry historically forecast its future needs on the presumption of steadily growing demand and opportunities to locate supply. In the 1970s, Royal Dutch Shell recognized that geopolitical developments (such as the newly formed Organization of the Petroleum Exporting Countries) could lead to a serious disruption in oil supply, transforming what had until then been a buyer’s market into a seller’s market. As a result of its readiness to take this scenario seriously, the company was prepared for the 1973 oil embargo and recovered with greater speed than its industry peers.5
In the United States, the coordinated effort to prepare for potential disruptions related to the Y2K “bug” offers a powerful example of the role strategic foresight can play in raising awareness and addressing potential crises. In 1998, the World Future Society (a nonprofit organization for futurists) began working with the White House, United Nations coordination groups, and others to anticipate and address potential Y2K issues in the United States. Most of its efforts went into “real-time networking and swift decision-making,” but the group also raised awareness at a 1998 conference on the consequences that could unfold without further attention.6 Failures of foresight are similarly dramatic, as the many well-known anecdotes of corporations and retailers that failed to recognize the potential impacts of technological and cultural trends, such as online shopping or streaming video, attest.
In the spirit of supporting this capacity, the remainder of this article offers a brief account of the role that foresight has played in military planning, followed by recommendations for advancing its implementation in military education today.
The history of foresight in the U.S. national security environment is offered here to rebut the pervasive idea among national security professionals that the United States cannot be good at long-term strategy or planning. (This idea is often justified by reference to the United States as a young country, as compared to China, a country perceived to be strong at long-term planning because it has a long history and a centrally controlled government.) This is clearly a discussion that deserves its own time and place; what can be stated here is that military futurists have played a critical role in creating some of the foundational techniques and ideas of foresight, which offer an alternative history of successful and thoughtful exploration of potential futures. It also helps to throw into relief some of the cultural tendencies that might have helped planners in the past but that might be hindrances today.
A quick survey of the history of strategic foresight as a coherent management planning discipline often begins with the example of the U.S. Air Force. After World War II, under the direction of General Henry H. “Hap” Arnold, the Service took the first steps to connect U.S. military planning with long-term scientific and technological developments. In order to organize resources and investments, Arnold commissioned a major study titled Toward New Horizons that projected future technology needs for the Air Force. The planning momentum was maintained by standing up Project RAND (short for “research and development”), known today as the RAND Corporation, which became the military’s go-to think tank for long-term questions and also the home of some of the country’s most prominent futurists during the Cold War.
This story of foresight’s foundations in the United States encapsulates the spirit of the American brand of foresight: a triumphal and empowered energy, a focus on technology as the key critical driver of future events, and a positivist view of the future as knowable and manageable. In the ensuing decades, this foundational vision of the postwar American future infused planning activities and also a particularly American mindset about how to think about “the future” in the abstract.
In the 1970s, the ideas of previously obscure futurists gained popularity, most notably as a result of Alvin Toffler’s bestselling book, Future Shock. These ideas trickled into the executive offices of both government leaders and major corporations. Long-range planning and the basic tenets of foresight were accompanied by a spirit of openness and an exploratory readiness to consider the potential that more than one future might emerge. At the same time, voices of warning also called on political and military leaders to adapt U.S. planning processes to a world that was becoming more complex and interconnected. Projects such as the Department of Defense Office of Net Assessment, which was established in 1973 to assess the impact of converging macro-trends, were attuned to the need to assess complex environments.
Some of the most forceful notes of warning can be found in a 1987 volume titled Creating Strategic Vision: Long-Range Planning for National Security.7 This compilation of essays outlining the various techniques of strategic foresight was offered as an antidote to the “pragmatic, fragmented, short-term” tendencies that were presumed to characterize the American way of leadership.8 Much of this critique from a generation ago about the short-term nature of U.S. strategy has become dogma today. When I introduced the work to a cohort of flag officers in an advanced training course recently, they readily warmed to the thesis that the United States is inherently poor at long-term thinking and needs to do a better job.
Also in the late 1980s, the U.S. Army War College introduced a new course titled Futures: Creating Strategic Visions.9 The goal of the course was to provide promising future leaders with the creative thinking skills required to envision and communicate alternative futures in an executive setting. Alternative futures, in this context, refers to the practice of recognizing that more than one future is possible and that one’s own present-day decisions help to shape the future. The course was notable for stressing creativity as a teachable skill and for proposing that the future may unfold in many possible ways.
And there the enthusiasm stops. There is little documentary evidence in the 1990s of the creative, open-ended energy that suffused futures work in the 1980s. Indeed, the signs point in the opposite direction. The 2004 Strategic Leadership Primer published by the Department of Command, Leadership, and Management of the Army War College, while retaining the language of strategic vision and the future, presents the concept quite differently than it was understood in the 1980s.10 Drawing grimly on President George W. Bush’s 2004 remarks that the Nation’s “terrorist enemies have a vision,” the document calls for a countervailing one: an overarching summation of what “ought to be,” subject to the ends-ways-means logic of strategy creation and capable of being summarized in a pithy image or phrase—vision, in other words, as a tagline. Little could be further from the late 1980s promotion of strategic vision as an empowering, adaptive capacity to think creatively and imagine alternative futures.
A decade later, as the mood of crisis that permeated the “hot” years of the war on terror waned, foresight activities once again emerged into national security and Federal Government consciousness. Today, we can find a Federal Foresight community of interest sharing activities across the government in the shape of formal educational opportunities, such as the Army War College futures seminar titled What Kind of Army Does the Nation Need in 2035 and Beyond; the commitment to develop an entire course on foresight at the Army Command and General Staff College; and hands-on long-term planning experiments such as the Air University’s Blue Horizons program.11 Beyond formal education, there are forums such as the periodic conferences and online community of the Mad Scientists, sponsored by the Army Training and Doctrine Command, and various think tank conferences and events. This upsurge of interest, coupled with forays by different parts of the military into more wargaming, red-teaming, and activities structured according to design theory, suggests that this is a favorable moment to advocate on behalf of not simply quantity but also higher quality. Here are five recommendations for its achievement.
Five Recommendations to Maximize the Benefits of Foresight
Embrace Analytic Holism. The U.S. military typically privileges technological innovations as the key driver of the future, which reflects a deeply embedded tendency in American culture and history. This is problematic in several respects, all of which distort the ability to accurately assess the evidence about potential contexts of future conflict.
First, technological change does not take place in a vacuum, but at the intersection of other human institutions and drivers of change. While there is a need for pure technological forecasting in weapons development and other related areas, this work will not generate scenarios of potential future conflict. It will only produce scenarios of future weapons systems and other related technologies.
Analytic holism is a concise directive reminding participants in futures work to keep a wide range of drivers of change in mind. A traditional place to start is with the drivers encapsulated in the acronym STEEP—society, technology, environment, economics, politics. There are others, of course: cultures, demographics, media, and legal systems, to take a few obvious examples.
Change in a complex, open system, such as the international system, will occur at the intersection of developments in these areas. War and conflict, as quintessentially social events, are always shaped by developments in these areas, even when technology on the battlefield is of the essence. If planners do not look at their surrounding environment as holistically as they possibly can, they risk not seeing or recognizing signals that are eminently available for analysis and thus losing the opportunity to consider how to avoid being surprised by them. One sobering example from this century is the media sophistication of the planners of the al Qaeda attacks in 2001. If the national security community had been better prepared to see how, in the 1980s and 1990s, satellite television and the advent of the Internet affected social interactions around the world, it could have reduced the unwarranted surprise that “low-tech” cultures could use new media in sophisticated ways.
An even more sophisticated step in this arena will be for strategic foresight projects to start acknowledging the fundamental transformations in the global economic, political, and social systems being wrought by the ongoing evolution of digital technologies. As many commentators have noted, all of humanity is in the first stages of a new era grounded in digital infrastructure.12 When technological innovations on this scale become ubiquitous and accepted, they actually become less notable in themselves as features of our world. Take, for example, electricity. Although not everyone has electricity, its ubiquity is a critical factor in explaining human behavior. The world is on the way to a digital ubiquity (even though not everyone will have access to digital tools), and it is at the point of ubiquity that nontechnological drivers of change become vitally important to explore in order to posit potential future environments.
Rather than highlighting technological drivers of change and treating other drivers as “soft” or less real, strategic foresight project leaders should frame explorations of the future holistically and with a strong eye to ways in which people, collectively and individually, drive emergent and unexpected system behavior. This nuanced approach can improve the accuracy of insights into potential futures and potential surprises, even in high-tech battle space environments.
[Photo: Lagos, Nigeria, June 23, 2011 (Wikipedia). Demographers project that more than 70 percent of the world’s population will live in cities, many of them coastal, by 2050; the potential for instability and strife caused by humanitarian or other disasters in megacities makes it necessary to consider them as potential future battlegrounds.]
Adopt a Shared Lexicon Across the Government. Foresight terminology can be confusing. Not only does it include a number of terms of art that also appear in our everyday language (such as foresight, uncertainty, and prediction), but there are also differences among futurists and other disciplines in the ways they use these same words. While I might use the word predict in a loose and general sense to indicate my effort to explain my subjectively developed insights into how the future might unfold (“Here’s how I predict the long-term impacts of negotiations over the Arctic on both trade and culture,” for example), many practitioners in the strategic foresight community use the concept of prediction to refer to the narrow capacity to identify exactly what will happen, to a degree that is typically available only under strictly controlled experimental conditions. To add to this difficulty, many terms are somewhat similar in everyday usage (“forecasting the weather” and “predicting the weather” point to the same general idea for most purposes). Similar lexical and conceptual confusion abounds in the national security community and between different projects.
A clear and relatively simple route to orienting defense practitioners around foresight work is to develop an authoritative lexicon and to educate people across the government to use it as a reference. Other dictionaries of terms have been created—most notably by the government of Singapore, whose civil service does use the lexicon—and these and many other resources are available on the Internet for anyone’s reference.13 However, as a glance at the Singapore lexicon shows, such dictionaries are reflections of the context and priorities of their governments. A U.S. lexicon may share terminology as it is used by futurists around the world, but it will be a more authoritative resource for American professionals if it is composed with the United States in mind. Such a project will engender other benefits as well; it will create a clear point of reference for developing institutional knowledge across Services and agencies and, simplest of all, introduce conceptual clarity into the disparate activities of different actors.
“Get on the Balcony.” The title of this recommendation borrows from the advice offered in the 1990s by management strategists Ronald Heifetz and Donald Laurie to corporations facing emerging business conditions requiring novel forms of behavior and new ways of defining and achieving success.14 Heifetz and Laurie suggest that rather than offering solutions in such situations, leaders should galvanize adaptation to these new conditions by safely exposing employees to the challenges facing them and supporting the development of new behavioral models.
To this end, Heifetz and Laurie encouraged leaders to learn not only to view their organizations from the “field of play,” where they are a part of the day-to-day work of their team, but also to “get on the balcony.” From the rafters, high above the game itself, leaders can see not only competitors and the dynamics of doing business side by side with their colleagues but also the larger dynamics of the system—how different parts of the organization work together, and how they interact and intersect with the world beyond. Observations made from the balcony can provide powerful insights into the dynamics of the wider system and introduce opportunities to find “leverage points . . . to intervene” in the system, as the esteemed systems thinker Donella Meadows characterized the opportunity.15 Strategic foresight education and activities offer an appropriate venue for this exploratory way of seeing the world. First, holistic vision and systems thinking are intrinsic to foresight; only by seeking signals of potential change throughout the system, and beyond one’s typical domain, will one find the potential surprises and opportunities that offer competitive advantage.
This recommendation is especially salient for leaders in the U.S. national security community seeking to grapple with how to influence future events in the emerging and not yet fully understood geopolitical circumstances of the 21st century and beyond. In a rough analogy to the sports teams that serve as models for adaptive leadership in Heifetz and Laurie’s work, institutions whose work is national defense tend to see the world in terms of opposing teams. This is reasonable; it is their job. The field of play is the space from which members of the institution seek to see threats and potential adversaries.
When the world and national situations are in flux, however, this view will not provide a sufficiently comprehensive view of the evolving system—in this case, the global geopolitical, economic, and social systems. Leaders who can “get on the balcony” to view the larger context of change will see the system from an unusual vantage point that highlights flows, connections, and feedback loops not only beyond but also between parts of the U.S. defense establishment and other actors, whether these are militaries, corporations, global nonprofits, or any of the other institutional actors who make up the world.
Incorporate Complexity Thinking into Foresight Activities. Foresight and the study of complex systems arose from similar and even intertwined conceptual movements in the 20th century, and both futurists and complexity scientists draw inspiration from some of the same people—for example, Jay Forrester and Donella Meadows, whose research used computer modeling in the 1970s to explore the intricate relationships between such large-scale systems as human societies and the planet’s ecological systems. The interdisciplinary science that emerged in the late 1970s recognized that some systems cannot be reduced to their component parts; rather, the intricate collective behavior of the system as a whole emerges from the interactions of many small, simple actions.
Despite these early connections with foresight, the potential contributions of complexity thinking to more effective foresight work are too often given short shrift in contemporary education and activities in the defense context. The technical specificity of terminology used by complexity thinkers, such as complexity and uncertainty, is instead reduced to brisk contextual commentary presented as self-evident: the world is more complex and uncertain than in the past. Once past these observations, military foresight classes and seminars typically return to the comfortably reductionist space of a future battlefield projected as more or less walled off from the other systems with which it interacts. This means that the fullest spectrum of potential scenarios that could be explored as elements of future conflict is left unexplored, since war, as a social institution, resides within and interacts with other systems.
Incorporating instruction in complexity thinking could produce nuanced scenarios of possible futures and therefore result in higher quality planning. While this is not the place to elaborate in depth on complexity thinking and complex systems, we can note that a deep dive into the conceptual lexicon of complex systems, applied to the global system, can help strategists and planners to visualize militaries as the porous systems they are and to map their interactions, both in and out of war, with the systems around them. Such an activity in the run-up to the second Gulf War would have usefully mapped the military, industrial, national, and social systems that could be expected to interact in the event of war.
Start Early to Build a Culture of Adaptive Leaders. This recommendation could not be simpler. Foresight mindsets and tools are too important to leave until the last moment, when a Servicemember or civilian equivalent has already become a flag officer, which is when many are first exposed to them. Foresight, in one sense, is a habit of mind, a way of seeing the world in which we question our assumptions, view events holistically, and seek out the interconnections between them. These are exactly the habits of mind required of the adaptive, agile thinkers who will be needed in the future. Developing an educational “ladder” that begins with habits of mind that prepare emerging leaders to think like futurists, and that continues through advanced opportunities to apply those thinking skills to the open-ended challenges of the future, has the potential to advance the overall strategic capacity of the military.
There could not be a more auspicious time to institutionalize more deliberate, speculative, and imaginative approaches to thinking about potential futures of violent conflict and its management, prevention, and resolution. The world appears to be at a pivotal moment, and the need for excellent leadership on the world stage is strong. Societies worldwide are just beginning to experience the transformational effects of the shift from an industrial to a digital world and stand just as dramatically on the brink of the potent effects of climate change, demographic shifts, and cultural swings. There can be little doubt that emerging environments producing social stress, violent conflict, or significant displacement will have novel characteristics and the potential to look quite different from those for which the military typically prepares. In light of the acknowledged need for an increasingly adaptive and future-focused force, it is important to encourage the burgeoning interest in the future. Yet how this future focus is encouraged and what activities are undertaken to explore it are just as important. In this realm, there is currently room for more reflection and improvement.
Copyright 2019 Amy Zalman All rights reserved.