12/07/2016, Michael Heflin, CEO, Sensuron
“Do you know what the secret of life is?” Curly asks from atop his horse, cigarette bouncing in his mouth with each syllable. “This.” He holds up one finger and stares at Mitch.
“Your finger?” Mitch inquires.
“One thing. Just one thing. You stick to that and the rest don’t mean [anything].”
This famous scene from the 1991 comedy “City Slickers” has inspired individuals and organizations to find their “one thing.” When it comes to innovation, however, it is not about “just” one thing.
There is no doubt that computer models have greatly accelerated the design process and produced significant cost savings compared with relying entirely on prototypes. By using simulations to estimate the behavior of components and assemblies, engineers can reduce the number of physical tests required to launch a new product. In doing so, organizations have shifted product development costs to earlier in the process, when CAD work and simulations are done, rather than waiting until the end to run many physical tests.
Industries with extremely high time-to-market pressures and significant development costs, such as automotive and aerospace, already rely heavily on computer simulations to cut time to market and maximize the ROI of development dollars. Historically, computer simulations have advanced greatly in data detail, while sensing technologies have largely stood still. The success of computer models, combined with the slow rate of innovation in sensing, puts today's organizations at risk of stagnating innovation. Without better integration between computer models and physical testing, organizations become overly dependent on models for design, which can lead to costly product recalls or product failures.
In the automotive industry, computer simulations have become an essential part of vehicle design, and the data detail they capture has grown significantly, further enhancing their utility. And yet, as Tim Stubbs, general manager of the Advanced Structural Dynamics Evaluation Collaborative Research Centre at the University of Leicester, observes, the data detail of physical testing is often minimized rather than advanced.
Physical testing, on the other hand, has not kept up with the improvement in detail in the same way. Test engineers often seek to get the minimum data detail possible in each test. As a result, they often have to go back and repeat a test to either verify the results or get better data. Improving the data detail of simulations without advancing the detail of physical testing technology will cause the expanding role of models in the design process to plateau, and can lead to expensive product failures.
Computer simulations have a symbiotic relationship with the physical tests that produce experimental data: the accuracy of a model depends entirely on data inputs about how materials behave under different conditions. Allowing testing technology to lag behind computer models therefore limits the role of those models unnecessarily and can lead to costly product failures in the field.
Finite element models, for example, rely on statistical assumptions to determine how a design will behave in a real-world environment, but reality often does not match the assumptions made by models for a variety of reasons. Minute differences in the machining process, tiny cracks in the material, retrofitting situations, and unexpected environmental impacts can cause loads to be distributed across a design differently than a model predicted.
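A toy calculation makes the point concrete. The sketch below uses a simple one-dimensional linear-elastic relation (strain = F / (A x E)) to show how a small, unmodeled machining deviation in cross-sectional area shifts the real strain away from what the model predicts; all numbers here are hypothetical, chosen only for illustration.

```python
# Illustrative only: how an unmodeled machining deviation shifts real strain
# away from a model's prediction. All numbers are hypothetical.

def axial_strain(force_n: float, area_m2: float, youngs_modulus_pa: float) -> float:
    """Uniaxial linear-elastic strain: stress / E, where stress = F / A."""
    return force_n / (area_m2 * youngs_modulus_pa)

E_ALUMINUM = 69e9          # Pa, nominal Young's modulus for aluminum
FORCE = 10_000.0           # N, applied axial load

nominal_area = 1.0e-4      # m^2, cross-section assumed by the model
actual_area = 0.97e-4      # m^2, 3% undersize from a machining deviation

predicted = axial_strain(FORCE, nominal_area, E_ALUMINUM)
measured = axial_strain(FORCE, actual_area, E_ALUMINUM)

print(f"predicted strain: {predicted:.2e}")
print(f"actual strain:    {measured:.2e}")
print(f"error: {abs(measured - predicted) / predicted:.1%}")
```

Even this idealized case shows a roughly 3 percent discrepancy from a single tolerance deviation; real parts stack many such deviations at once.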
Similarly, models cannot take into account varying weather conditions. There are simply too many variables to take into consideration for a model to predict how a design will behave in all conditions. Although certainly not foolproof, physical tests help engineers account for some of these unforeseen variables that FE models don’t predict. Since computer models are dependent on inputs from experimental data, failing to advance the data detail of testing technology while rapidly advancing the detail of computer models undermines the utility of models in the long run.
The proliferation of new materials across industries presents additional challenges for computer models. If engineers were to continue using the same materials in designs for years to come, the role of physical tests would certainly decrease; however, new materials are constantly being developed as customer and market demands shift over time.
For example, the aerospace and automotive industries have rapidly adopted composite materials for their excellent specific strength. One challenge this presents for computer simulations is predicting the behavior of materials that lack significant characterization data relative to materials like aluminum, steel, or plastics, which have long been used in a variety of designs.
Validating computer models for composite materials is critical to the design process. For a model to be validated, a test must be conducted to verify the predictions of the model. When conducting physical tests to measure strain and load distribution, engineers focus on critical points identified by the computer model. Strain gauges are applied and information about the critical points is collected. This process lends itself to only monitoring for expected problems, which can create serious blindspots in the testing process. Without a comprehensive view of real-world data, a model could be validated when in reality it is inaccurate.
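The blind spot in point-based validation can be sketched in a few lines of code. In this hypothetical example (gauge names, strain values, and the 5 percent tolerance are all invented), the model "passes" validation because it agrees with measurements at every instrumented point; any strain concentration between the gauges never enters the check at all.

```python
# Hypothetical sketch of point-based model validation: gauges sit only at the
# model's predicted critical points, so agreement there can "validate" a model
# even if behavior elsewhere diverges. All values are invented.

TOLERANCE = 0.05  # accept <= 5% deviation between model and measurement

# Strain (microstrain) predicted by the model at its critical points.
predicted_at_gauges = {"gauge_A": 1200.0, "gauge_B": 950.0, "gauge_C": 1100.0}

# What the strain gauges actually measured at those same points.
measured_at_gauges = {"gauge_A": 1180.0, "gauge_B": 970.0, "gauge_C": 1130.0}

def validate(predicted: dict, measured: dict, tol: float) -> bool:
    """Model 'passes' if every instrumented point agrees within tolerance."""
    return all(abs(measured[g] - p) / p <= tol for g, p in predicted.items())

print("model validated:", validate(predicted_at_gauges, measured_at_gauges, TOLERANCE))
# A strain concentration between gauge_A and gauge_B would never enter this
# check at all -- the blind spot described above.
```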
The issue with legacy sensing technologies like strain gauges and thermocouples is not a matter of accuracy, but rather, one of data detail. For the most part, traditional sensors only detect single points of information, limiting the insight they can provide into how a design is performing in the real world. As a result, engineers have to rely on models to tell them where to place sensors when performing a physical test. Although this process is good enough for materials that are well understood, it can be detrimental for designs that incorporate new materials. Design engineers in many industries are looking for and adopting newer sensing technologies that provide better data than legacy sensors.
While strain gauge technology has remained largely static since its invention decades ago, newer technologies such as fiber-optic sensing have evolved to accurately measure full strain fields and temperature distributions, in some cases using a single, hair-like fiber-optic cable. Unlike legacy technology, which only collects data at critical points, fiber-optic sensing platforms can collect spatially continuous strain and temperature data. For model validation, this lets engineers observe how a structure behaves both at critical points and everywhere in between.
For example, fiber-optic sensors can be embedded into composite materials during layup in order to better understand the curing process. A common flaw, such as wrinkling in one of the layers of material, produces strain within the material that is almost always unaccounted for and not well understood. Since there is little data about how strain is distributed inside composite materials, computer models are missing critical data inputs—and as a result, are less accurate.
Legacy technologies like strain gauges can detect residual strain in composites, but only if the strain field reaches the surface and a gauge happens to be placed in exactly the right place. On the other hand, spatially continuous sensing technologies like fiber-optic sensing can measure entire strain fields—capturing data about critical points and everywhere in between. Additionally, as mentioned above, sensing fiber can be embedded in composite materials so that internal strain can be observed, as well.
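The contrast between point gauges and spatially continuous sensing can be illustrated on a synthetic strain field. In the sketch below, the strain profile, gauge positions, and the location of the hidden concentration are all invented: a narrow strain peak between two gauge locations is invisible to the gauges but obvious in the distributed trace.

```python
# Sketch contrasting point gauges with spatially continuous sensing on the
# same synthetic strain field. The profile and sensor positions are invented.

import math

def strain_field(x: float) -> float:
    """Synthetic strain (microstrain) along a 1 m part: a smooth background
    plus a narrow concentration centered at x = 0.37 m (e.g., a hidden flaw)."""
    background = 800.0 + 200.0 * math.sin(math.pi * x)
    concentration = 900.0 * math.exp(-((x - 0.37) / 0.02) ** 2)
    return background + concentration

# Legacy approach: gauges only at model-identified critical points.
gauge_positions = [0.25, 0.50, 0.75]
gauge_readings = {x: strain_field(x) for x in gauge_positions}

# Distributed fiber: a reading every 2 mm along the whole length.
fiber_positions = [i / 500 for i in range(501)]
fiber_peak_x = max(fiber_positions, key=strain_field)

print("max strain seen by gauges:", max(gauge_readings.values()))
print("fiber locates peak near x =", fiber_peak_x, "m")
```

The gauges report a benign maximum of about 1000 microstrain, while the distributed trace localizes a peak nearly twice that at a position no gauge covers.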
The design process is typically considered to end once a product is tested and shipped to customers. However, the dawn of the Internet-of-Things (IoT) era provides ways for engineers to collect data about their designs long after the first customer receives the product. In reality, the design process for a product never ends.
Let’s use a bridge as an example: A bridge that is instrumented with a structural health monitoring system that obtains spatially continuous data can provide design engineers with high-quality data about how the bridge is behaving in the real world. Point sensors cannot provide enough insight about a bridge to truly reap the benefits from this type of process. Legacy sensors rely on models to dictate where they need to be placed, but will miss changes in a bridge’s behavior that occur in unexpected places.
These unexpected changes are often what lead to catastrophic failure, and they are exactly the data that computer models most need to become more accurate. In addition to helping prevent catastrophic failures as a bridge nears the end of its design life, information about how the structure behaved over time would be invaluable for improving computer models. This kind of continuous design process will not succeed without advancing the data detail of sensing technologies.
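One simple way to act on continuous monitoring data is to compare the latest distributed strain profile against a baseline recorded at commissioning and flag any location that drifts past a threshold, including locations the original model never marked as critical. The sketch below is a minimal, hypothetical version of that idea; the segment values and 10 percent threshold are invented.

```python
# Hedged sketch of continuous structural health monitoring: flag any segment
# whose strain drifts past a threshold from its baseline. Data are synthetic.

BASELINE = [1000.0, 1005.0, 998.0, 1002.0, 1001.0, 999.0]  # microstrain per segment
THRESHOLD = 0.10  # flag > 10% drift from baseline

def flag_drift(baseline: list, latest: list, threshold: float) -> list:
    """Return segment indices whose strain drifted past the threshold."""
    return [
        i for i, (base, cur) in enumerate(zip(baseline, latest))
        if abs(cur - base) / base > threshold
    ]

# A later survey: segment 4 -- not a model-predicted hot spot -- has drifted.
latest = [1002.0, 1008.0, 1001.0, 1003.0, 1150.0, 1000.0]

print("segments needing inspection:", flag_drift(BASELINE, latest, THRESHOLD))
# -> [4]
```

Each flagged segment is both an inspection trigger and a data point that can be fed back to improve the model.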
Computer models and physical tests go hand in hand. Neither is sufficient without the other; the path forward requires better integration between the two, not further separation. Advancing the data detail of sensing technologies requires some upfront investment, but it pays substantial returns. As you plan future product development efforts, do you care more about incremental short-term gains or substantial long-term advantages? It is, after all, human nature to seek immediate rewards even when far larger long-term benefits are available.
Those concerned with establishing long-term competitive advantages over short-lived, incremental ones will adopt more robust physical testing technologies like fiber-optic sensing. The symbiosis between physical testing and computer models is not likely to disappear. In fact, it will only grow stronger.