[Ontario Technologist, Feb 2020]

In the 21st century, big data has emerged as one of the most disruptive forces in the marketplace, prompting businesses to develop new technologies, skills and tools to drive innovation. As a society, we’ve now entered an era where devices, products and services are connected and interact with each other, generating massive amounts of data. This large volume of information (structured and unstructured), often referred to as “big data,” reveals hidden patterns, unknown correlations, market trends, customer preferences and internal processes, all of which help businesses make more informed decisions. While big data is most often associated with digital marketing and social media research, the truth is it touches many industries and a wide variety of applications.

Many organizations are turning to data-driven solutions to solve issues that affect productivity, influence capital project outcomes, reduce risk and increase customer value. To achieve this, they are modernizing their operating models with more sophisticated data analytics applications and initiatives. In 2016, Skanska, a construction and development company, began using sensors to track employee, equipment and material movement. This initiative revealed that workers on-site were walking up to six miles a day to procure equipment and materials. Using this data, the company recognized the value of reconfiguring its job sites and daily workflow to save time and, ultimately, money.

Whatever the industry, big data requires an advanced processing framework to store and manipulate information at a scale that outpaces traditional data processing, which cannot handle complex, voluminous and fast-moving data sets. This adjustment to a new way of doing things has prompted a process evolution. And, within the next five years, organizations must revolutionize their existing systems – and learn what works and what doesn’t – to stay competitive in the marketplace. Every project an organization undertakes involves collecting, organizing and analyzing significant amounts of data while simultaneously juggling other business priorities.

These projects require an eye for detail that only modern data management can provide. For example, building information modelling uses data analytics and network collaboration to create more efficient processes. JE Dunn Construction partnered with Autodesk to build LENS, a real-time system that uses data-driven predictive modelling to create a custom visualization technology. LENS speeds up the design process by allowing owners to see and shape the project design in its early stages, avoiding wasted time later on.

Depending on the organization, data measurement and management will vary based on industry, workflow and processes. Parameters such as scale, materials and subcontractors can also vary significantly from project to project, making it difficult to establish benchmarks. However, this common problem can be avoided by looking at what you already have, which is often held in isolation by the business department or division that collected it. A recent report from Fails Management Institute revealed that 30 per cent of North American companies are using applications that don’t integrate with one another. The report also showed that unstructured data captured from materials such as blueprints, time cards, emails and PDFs leaves 49 per cent of firms transferring data manually between applications.

With an updated data system, organizations will find they have a lot more data than they realize, everything from finances to accumulated summaries from previous projects. The best way to leverage data that is collected or archived is to conduct a proper audit of the organization’s analytics. This audit will uncover critical insights that accelerate, affirm and improve the decision-making process, refining daily operations and helping the organization reach its annual targets. Plus, data analytics helps project teams evaluate market conditions, portfolio composition and individual project performance.

Finally, organizations should think about their company culture and the way information flows, from marketing insights to day-to-day processes. It’s a common problem for experience to be valued over empirical data, yet experience can degrade over time, leading to a breakdown of internal knowledge. Finding out where and how information is stored may prove difficult, but it is essential in preventing information paralysis.

The realization that records are information, that all information is data, and that all data can be organized and ultimately interpreted to yield new insights will help companies embrace process evolution and, in turn, improve their bottom-line profits.