Big data features not only large volumes of data but also data with complicated structures, and this complexity imposes unique challenges for big data analytics. The issue at hand is how to link typical new data elements of big data, as covariates, to traditional reliability responses such as time to failure, time to recurrence of events, and degradation measurements. New methods such as deep learning, text mining, and multivariate degradation models are currently being explored to use big data for reliability applications. These new methods can form the basis for new reliability propositions such as use-based insurance. The basis for this presentation is a paper by William Meeker and coworkers in which new reliability methods for using big data are introduced. At TNO we are currently working on Digital Twins for Smart Manufacturing, a topic closely related to the use of big data for reliability in industrial environments.
Jan Eite Bullema, TNO