When technology encourages the ostrich policy

Written by Wilfried Kirschenmann, on 06 February 2018

Through three examples, this brief article aims to remind us that implementing new tools does not exempt us from investigating the root causes of problems and seeking to address them at the source.

Enterprise Social Networks

The first example concerns enterprise social networks. In recent years, many companies have deployed social networks at various scales, and there are countless articles recounting the failures of these projects. The cause most often cited is a lack of attention to how employees actually use these tools. In my view, the problem runs deeper, and to understand it one must ask why companies implement enterprise social networks in the first place. I will criticize two of the most common reasons:

  • To enable some level of cross-functionality within the company.
  • To facilitate knowledge capitalization.

In neither of these cases is the enterprise social network the solution. If there is an issue with cross-functionality, the solution will be organizational (agile organization, liberated organization, and other forms of adhocracy) or human (serendipity spaces, etc.). Within such an organizational solution, the enterprise social network may well have a role to play, but only if it fits into a broader framework. If the aim is to promote knowledge capitalization by encouraging employees to publish their work, again, the answer does not lie in the social network; it lies precisely in encouraging employees to publish. The platform is secondary, and there are many alternatives (blogs, dedicated websites, SharePoint, etc.). Some companies offer incentives for publications; others fund trips to conferences; still others set them as personal objectives for the year. The right solution is the one that fits the company's culture, and if it requires a tool, that tool must be part of a more comprehensive solution.

Data Lakes

The second example concerns data lakes, the storage foundation of Big Data. To understand the mistake made by many decision-makers, one must understand the challenges of storing data before the advent of NoSQL. At that time, there were essentially two types of storage: files and structured data in databases. In the latter case, one of the difficulties was defining a data model, evolving it (along with the data!), and defining the associated reference data. The major innovation of the data lake approach was to eliminate the need to define a model for the stored data, since these tools can store data in any format. However, this does not eliminate the need to think about the company's data model and reference data. Their primary purpose is not to define the storage format; that is merely a side effect. Their primary purpose is to support exchanges within the company and to ensure that a figure has the same meaning for everyone: they define a shared semantics for the company's data. Beyond enabling an aggregated view at the enterprise level, this shared vision is also an accelerator of any IT transformation: once the semantics and data models are defined, a functional approach suffices to keep pace with process evolution.

In many cases, by making the visible part of the data-management iceberg easy to handle, the technology has led companies to overlook these challenges. It is precisely this lack of enterprise-level thinking that is now the main cause of failure when moving from a proof of concept to a production application. Designing data repositories remains necessary: previously, they defined the data format; today, they document the data, which is indispensable for using and exchanging it.
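To make this concrete, here is a minimal sketch of what such documentation might look like once the storage format no longer carries it. Everything in it is hypothetical (the `FieldDefinition` structure, the field names, the owners); the point is simply that the shared meaning of each field is recorded somewhere explicit, next to the schema-less storage.

```python
from dataclasses import dataclass

# Hypothetical catalog entry: the data lake stores raw files in any format,
# so the shared meaning of each field must be documented separately.
@dataclass(frozen=True)
class FieldDefinition:
    name: str     # technical name as it appears in the raw files
    meaning: str  # agreed business definition, shared across the company
    unit: str     # unit of measure, so a figure means the same thing to everyone
    owner: str    # team accountable for this definition

# Example entries for a fictional sales dataset dropped into the lake.
catalog = [
    FieldDefinition(
        name="rev",
        meaning="Recognized revenue for the order, excluding taxes",
        unit="EUR",
        owner="Finance",
    ),
    FieldDefinition(
        name="qty",
        meaning="Number of units shipped (not ordered)",
        unit="items",
        owner="Supply chain",
    ),
]

for field in catalog:
    print(f"{field.name}: {field.meaning} [{field.unit}] (owner: {field.owner})")
```

Whether these definitions live in code, a wiki, or a dedicated data-catalog tool matters far less than the fact that they exist and are shared.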

Numerical Simulation and HPC

Finally, the last example concerns numerical simulation and HPC, with a particular focus on computing architectures. For various reasons, many researchers and engineers design their codes around these architectures and lose sight of the overall problem. In most HPC applications, the challenge is either to reduce execution time or to tackle more complex problems. Achieving this does not always require turning to a bigger supercomputer: that approach is reductionist and overlooks other important levers, such as the mathematical model and the solution algorithms used. Many studies show that these two elements have contributed more to computing capabilities over the past 30 years than the raw increase in supercomputer capacity, which doubles every 18 months. For example, so-called reduced basis methods have made it possible to reduce the dimension of problems while guaranteeing accurate results. One can even find several examples where better results are obtained on less powerful infrastructures. The example presented by Salli Moustafa is quite striking. Another example, from the field of oil exploration, is available here.
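To illustrate the mechanics behind this claim, here is a toy sketch of a Galerkin reduced-basis projection in Python with NumPy. All the data and dimensions are invented for the example, and a real reduced-basis method would select its snapshots carefully and come with rigorous error estimators; the sketch only shows how a large system can be replaced, online, by a much smaller one.

```python
import numpy as np

rng = np.random.default_rng(0)

# Full-order problem: A x = b with A symmetric positive definite.
# (Toy data; a real code would assemble A from a discretized physical model.)
n = 1000
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)          # SPD by construction

# Offline stage: solve the full problem for a few representative
# right-hand sides ("snapshots") and orthonormalize the solutions.
k = 10
B_snap = rng.standard_normal((n, k))
X_snap = np.linalg.solve(A, B_snap)
V, _ = np.linalg.qr(X_snap)          # reduced basis, n x k

# Online stage: for a new right-hand side, solve only a k x k system.
b = B_snap @ rng.standard_normal(k)  # new case similar to the snapshots
y = np.linalg.solve(V.T @ A @ V, V.T @ b)
x_rb = V @ y                         # reduced-basis approximation

x = np.linalg.solve(A, b)            # full-order reference
print("relative error:", np.linalg.norm(x - x_rb) / np.linalg.norm(x))
```

Here each new 1000 x 1000 solve is replaced by a 10 x 10 one, with the expensive full-order solves confined to a one-off offline stage: an algorithmic gain that simply buying a bigger machine would not deliver.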

These examples illustrate a headlong rush in which technology serves as a pretext for not addressing the fundamental issues. Of course, technology can be part of the solution, but it should not become the entire solution. In some cases, implementing new technologies can reveal new usages that address the initial problem. For example, collaborative tools that let several people work on the same document simultaneously have changed the way we work in teams; but let us not forget that they were designed to bring about this new way of working. In general, the solution to seek is the new usage, and its emergence often requires more than the mere deployment of a technological tool. In a company, it usually also involves support and training; in everyday life, marketing and communication can play that role, as the change in habits induced by smartphones illustrates.

The fact that these examples occur on such a large scale shows that these problems (human interactions within an organization, enterprise data management, numerical simulation) are inherently complex... And that is good news: at a time when we seek to use artificial intelligence to automate our recurring activities, here are areas where human creativity still has its place!