March 2018 | 1026 words | 4-minute read
Data should be seen as a strategic asset and, as with oil, there is much to be distilled from it.
When I joined Tata Consultancy Services 30 years ago, data resided in large mainframes, in the sanctum sanctorum of the IT department. Hardly anyone had access to these machines. Data was collected and stored in neat rows and columns. Only a limited set of reports could be run on the data collected, and that seemed to satisfy business functions.
Data is not so well behaved today. It floods through our servers in multiple formats. Users of the internet alone generate 2.5 quintillion bytes of data each day, and ever more streams in from connected devices and from people themselves. Genomes, for instance, are a big data source. Valuable insights are buried in the text, audio and visual data emerging from diverse sources. This is a big challenge, but with each passing year, we are able to handle data better and use it meaningfully. A few examples:
The digital twin
Manufacturing operations are fairly well automated today. However, automation systems deployed across the enterprise are not completely integrated. Optimising operations to meet targets still depends on the knowledge, skill and intuition of plant engineers. Let us take the case of emission control in a thermal power plant. The plant control system enables the tweaking of several parameters that impact emissions: coal flow, air flow, temperature or pressure. A boiler commissioning expert, after conducting a number of tests on the boiler for a given type of coal, determines the most suitable operating conditions for controlling emissions. If the type of coal changes, it becomes necessary to carry out dozens of tests on the boiler. This is an expensive, time-consuming effort.
Creating a digital replica of the boiler, or of the whole power plant, from past operational data makes it possible to run various scenarios and determine the best operating conditions for new types of coal. The digital twin runs in tandem with the physical entity and keeps learning from incoming data through advanced artificial intelligence (AI) algorithms. This lets the organisation run fewer physical tests, spend less effort analysing results, and thus create savings.
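The idea can be sketched in miniature: fit a surrogate model to historical operating records, then search the model for a good operating point instead of testing the physical boiler. Everything below is illustrative — the air-flow figures, emission values and quadratic relationship are invented for the sketch, not taken from any real plant.

```python
# Toy digital-twin sketch: fit a surrogate (quadratic least squares) to
# hypothetical past boiler runs, then "run scenarios" on the surrogate
# by grid-searching air flow for minimum predicted NOx emissions.

# Hypothetical (air_flow, NOx emissions) pairs from past operation.
history = [(300.0, 210.0), (320.0, 195.0), (340.0, 185.0),
           (360.0, 190.0), (380.0, 205.0)]

def fit_quadratic(points):
    """Least-squares fit of y = c0 + c1*x + c2*x**2 via the normal equations."""
    sx = [sum(x ** k for x, _ in points) for k in range(5)]      # sums of x^0..x^4
    sy = [sum(y * x ** k for x, y in points) for k in range(3)]  # sums of y*x^0..x^2
    A = [[sx[i + j] for j in range(3)] for i in range(3)]
    b = sy[:]
    # Gaussian elimination with partial pivoting on the 3x3 system.
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        b[i], b[p] = b[p], b[i]
        for r in range(i + 1, 3):
            f = A[r][i] / A[i][i]
            for c in range(i, 3):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    coeffs = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):  # back substitution
        coeffs[i] = (b[i] - sum(A[i][j] * coeffs[j] for j in range(i + 1, 3))) / A[i][i]
    return coeffs

coeffs = fit_quadratic(history)

def predict(x):
    """Surrogate model: predicted NOx at a given air flow."""
    return coeffs[0] + coeffs[1] * x + coeffs[2] * x * x

# Scenario runs on the twin: evaluate candidate air flows cheaply in software.
candidates = [300 + 2 * k for k in range(41)]  # air flows 300..380
best_air = min(candidates, key=predict)
print(f"best air flow ~ {best_air}, predicted NOx ~ {predict(best_air):.1f}")
```

A real twin would use far richer models (coal flow, temperature, pressure, learned from streaming data), but the shape is the same: the expensive physical experiment is replaced by cheap queries against a model that keeps learning from operations.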
New material matters
The World Steel Association states that there are about 3,500 grades of steel. And yet, industries keep demanding new ones: the automobile industry is looking for advanced high-strength steels that are lighter, the medical industry asks for martensitic steel for precision instruments, and so on. Creating materials to customer requirements involves many types of data and many teams working closely together.
‘Integrated Computational Materials Engineering’ (ICME) is a method that brings together the key players: the materials engineer, the researcher, the product developer, the manufacturing engineer, and design and application engineers. A central digital platform helps them collaborate and exchange data, information and knowledge from different systems. The platform has building-block models for a variety of foundational engineering design problems and the knowledge pertaining to their design processes: product components, materials, manufacturing processes, physics-based simulation models, models learnt from data, etc. It is designed to make simulations easy to set up and run. A well-designed ICME platform drastically reduces the time and effort spent and yields better-quality materials.
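The building-block idea can be illustrated with a minimal sketch: models contributed by different teams are registered under names on a shared platform and chained into a workflow, without each team needing to know the others' internals. The model names, formulas and constants below are invented for illustration — they are not real metallurgical models.

```python
# Hypothetical sketch of an ICME-style model registry: each team registers
# its building-block model; a workflow chains them by name.

registry = {}

def model(name):
    """Decorator that registers a building-block model under a name."""
    def register(fn):
        registry[name] = fn
        return fn
    return register

@model("composition_to_microstructure")
def microstructure(carbon_pct):
    # Placeholder physics (illustrative only): finer grains with more carbon.
    return {"grain_size_um": 50.0 / (1.0 + 4.0 * carbon_pct)}

@model("microstructure_to_strength")
def strength(micro):
    # Hall-Petch-style placeholder: strength rises as grain size falls.
    return 150.0 + 600.0 / micro["grain_size_um"] ** 0.5

def run_workflow(carbon_pct):
    """Chain registered models: composition -> microstructure -> strength."""
    micro = registry["composition_to_microstructure"](carbon_pct)
    return registry["microstructure_to_strength"](micro)

print(f"predicted yield strength: {run_workflow(0.2):.0f} MPa")
```

The point of the pattern is the interface, not the physics: a physics-based simulation, a model learnt from data, or a lookup of experimental results can each sit behind a registered name, and the workflow stays the same.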