Power and vulnerability in structural analysis
July 21, 2017
Walking through design offices and flipping through books and archives, we’re often struck by glimpses of images that aren’t immediately decipherable — the one above, for instance. Our What’s That? series tells their stories.
Neither an avant-garde abstraction nor a QR code, the image above depicts a stiffness matrix, the basic building block of structural analysis software.
Modern analysis programs have empowered engineers to design increasingly ambitious structures. Recent decades have seen the completion of remarkable buildings that would have been unachievable without advanced computational analysis. Landmark projects like Beijing’s CCTV Headquarters stand as testament to these programs’ capabilities.
The complexity of today’s structural analysis programs tends to obscure their inner workings, even to the engineers who use them every day. As a result, it can be difficult for design teams to critically assess the software’s performance.
In an attempt to bridge this knowledge gap, Serguei Bagrianski, an architectural designer in Arup’s Toronto office, recently coauthored a textbook with Princeton University’s Jean H. Prévost. Based on an undergraduate course that Dr. Prévost taught at Princeton for over 15 years, An Introduction to Matrix Structural Analysis and Finite Element Methods chronicles the evolution of the computer code behind modern structural analysis.
The history of structural analysis
To understand how stiffness matrices became so prominent in modern structural analysis, we need to go back more than three centuries.
Elasticity, a fundamental principle of structural mechanics, was first described by Robert Hooke in 1676 as an anagram: “ceiiinosssttuv.” Daring his contemporaries to decipher the code, Hooke revealed the answer in 1678: “Ut tensio, sic vis,” which translates from Latin to “as the extension, so the force.”
In establishing a linear relationship between force and the deformation of a given object, Hooke’s law articulated the mechanical performance of trusses, beams, and frames.
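That linear relationship can be sketched in a few lines of code. The numbers below — a steel rod’s modulus, area, and length — are hypothetical values chosen for illustration, not figures from the article:

```python
# Hooke's law: force is proportional to extension (F = k * d).
# Hypothetical steel rod: E = 200 GPa, A = 0.001 m^2, L = 2 m.
E = 200e9   # Young's modulus, Pa
A = 0.001   # cross-sectional area, m^2
L = 2.0     # rod length, m

k = E * A / L   # axial stiffness, N/m
d = 0.001       # imposed extension, m (1 mm)
F = k * d       # resulting axial force, N (about 100 kN here)
```

Doubling the extension doubles the force — "as the extension, so the force."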
Prior to the Industrial Revolution, builders had limited need for Hooke’s law. Traditional masonry and timber structural systems were designed using empirical knowledge and rules of thumb rather than analysis. But when mass production of iron began late in the 18th century, the material’s incredible strength attracted the interest of builders eager to explore more complex structures.
In order to realize these new systems, building designers developed a variety of rigorous analytical techniques. However, these techniques could be used to analyze only determinate systems: that is, structures with no redundancy, where every element is essential. Although builders of the day had limited appreciation for the benefits of redundancy, today’s engineers recognize that it is a prerequisite for safe, resilient infrastructure.
It wasn’t until the 1930s that a technique began to take shape for the analysis of indeterminate systems: redundant structures where some elements could be removed and the structure would remain stable. This period marked the emergence of matrix structural analysis (MSA), which allowed designers to analyze a complex structure by mathematically stitching together its simple component parts. MSA relied on the invention of stiffness matrices: mathematical entities that precisely express the elastic behavior of structural elements.
It’s no coincidence that MSA emerged at the same time that Alan Turing was laying the foundations for the programmable computer. Assembling and manipulating stiffness matrices requires an overwhelming number of repetitive calculations. Although these calculations were too laborious for hand computation, they proved ideally suited for computers.
MSA made it possible to analyze structures of any complexity — as long as the contributing element stiffness matrices were available. (An element stiffness matrix describes the behavior of one element in a structure, whereas a global stiffness matrix describes the behavior of the entire structure.)
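The “stitching” of element stiffness matrices into a global one can be sketched with the simplest possible elements — springs in a line. This is a minimal illustration of the assembly idea, not code from the book, and the function names are my own:

```python
# Assemble a global stiffness matrix from element stiffness matrices --
# a minimal sketch of the MSA assembly step, using 1D springs.

def element_stiffness(k):
    # 2x2 stiffness matrix for a spring of stiffness k between two nodes
    return [[k, -k], [-k, k]]

def assemble(springs, n_nodes):
    # springs: list of (node_i, node_j, stiffness) tuples
    K = [[0.0] * n_nodes for _ in range(n_nodes)]
    for i, j, k in springs:
        ke = element_stiffness(k)
        for a, p in enumerate((i, j)):
            for b, q in enumerate((i, j)):
                K[p][q] += ke[a][b]   # stitch element terms into place
    return K

# Two springs in series: node 0 -- node 1 -- node 2
K = assemble([(0, 1, 100.0), (1, 2, 200.0)], 3)
```

Where two elements share a node, their stiffness terms simply add — which is why structures of any complexity can be assembled from simple parts.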
Unfortunately, the elements that could be derived using classical mechanics were limited to trusses, beams, and frames — what engineers refer to as discretized systems. MSA was simply inadequate for analyzing continuous surfaces such as membranes, plates, and shells.
The 1950s witnessed a critical “aha!” moment, as engineers recognized that the solution to this dilemma lay in learning to live with error.
This breakthrough occurred with the development of the finite element method (FEM). FEM provided a necessary complement to the universal framework of MSA: it defined a universal method for generating elements.
FEM effectively allowed a continuous structure to be analyzed as a mesh of discrete elements.
To understand how this works, it’s important to know that the behavior of continuous structures like the plate below is governed by something called continuum mechanics, which transforms the simple algebraic expressions of classical mechanics into partial differential equations (PDEs).
As any engineering student will confirm, solving PDEs is no easy task. Even for a simply supported plate, arriving at an equation for the deformed shape requires several dozen pages of algebra:
Any minor modification to the plate’s geometry — a duct opening or a skewed edge, for example — invalidates this solution, necessitating another volume of derivations. Consequently, analytical solutions are impossible (or at least impractical) to obtain for most engineering applications.
FEM bypasses this hurdle by solving PDEs approximately rather than exactly. To control the error inherent in an approximation, FEM offers a key property: the more elements employed, the smaller the error.
Although using a fine mesh might seem like the obvious way to achieve better approximations, computational burden grows dramatically with the number of elements used. Even with today’s computational power, analysis of complex structures typically takes hours and sometimes even days. The art of finite element analysis lies in finding the optimal balance between accuracy and computation time.
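The refinement-versus-error relationship can be illustrated with a simple analogy of my own (not from the book): approximating a circle’s area with inscribed regular polygons. More, smaller “elements” mean less error — at the cost of more computation:

```python
import math

# Analogy for mesh refinement: approximate a unit circle's area (pi)
# with inscribed regular n-gons. More sides (elements) -> less error.

def polygon_area(n):
    # area of a regular n-gon inscribed in a circle of radius 1
    return 0.5 * n * math.sin(2 * math.pi / n)

# relative error shrinks as the "mesh" is refined
errors = [abs(math.pi - polygon_area(n)) / math.pi
          for n in (4, 16, 64, 256)]
```

Each quadrupling of the element count cuts the error by roughly a factor of sixteen here — but real finite element models pay for refinement with rapidly growing solve times, which is exactly the tradeoff engineers must manage.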
The engineer’s perspective
Therein lies the power and vulnerability of modern analysis. MSA and FEM, respectively, contribute the computational framework and element library for a universal method of analysis. Yet the approximation that enables this power necessarily introduces the vulnerability of error.
Modern analysis software has enabled the design of extremely complex structures, but it hasn’t lessened the engineer’s burden of ensuring the competency of the analysis.