March 2018 | 1026 words | 4-minute read
Data should be seen as a strategic asset and, as with oil, there is much to be distilled from it.
When I joined Tata Consultancy Services 30 years ago, data resided in large mainframes, in the sanctum sanctorum of the IT department. Hardly anyone had access to these machines. Data was collected and stored in neat rows and columns. Only a certain number of reports could be run on the type of data collected, and that seemed to satisfy business functions.
Data is not so well behaved today. It floods into our servers in multiple formats. Internet users alone generate 2.5 quintillion bytes of data each day, and still more streams in from devices and from people themselves; genomes, for instance, are a big data source. Valuable insights lie buried in the text, audio and visual data emerging from these diverse sources. This is a big challenge, but with each passing year we are able to handle data better and use it more meaningfully. A few examples:
The digital twin
Manufacturing operations are fairly well automated today. However, automation systems deployed across the enterprise are not completely integrated. Optimising operations to meet targets still depends on the knowledge, skill and intuition of plant engineers. Let us take the case of emission control in a thermal power plant. The plant control system enables the tweaking of several parameters that impact emissions: coal flow, air flow, temperature or pressure. A boiler commissioning expert, after conducting a number of tests on the boiler for a given type of coal, determines the most suitable operating conditions for controlling emissions. If the type of coal changes, it becomes necessary to carry out dozens of tests on the boiler. This is an expensive, time-consuming effort.
Creating a digital replica of the boiler, or of the whole power plant, from past operational data makes it possible to run various scenarios and determine the best operating conditions for new types of coal. The digital twin runs in tandem with the physical entity and keeps learning from its data through advanced artificial intelligence (AI) algorithms. This helps the organisation run fewer physical tests, reduce the effort spent analysing results, and thus save money.
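The scenario-sweep idea can be sketched in a few lines. This is a hypothetical illustration, not a real plant model: the `predicted_emissions` function and its operating point are invented stand-ins for a surrogate model that would, in practice, be learnt from historical boiler data.

```python
# Hypothetical sketch of a data-driven boiler "digital twin".
# predicted_emissions is an invented stand-in for a model trained
# on past operating data; the numbers here are illustrative only.
from itertools import product

def predicted_emissions(coal_flow, air_flow, temperature):
    """Toy surrogate: penalises deviation from an assumed clean-burn point."""
    return ((coal_flow - 40) ** 2
            + 0.5 * (air_flow - 300) ** 2
            + 0.01 * (temperature - 850) ** 2)

def best_operating_point(coal_flows, air_flows, temperatures):
    """Scenario sweep: query the twin instead of running boiler tests."""
    return min(product(coal_flows, air_flows, temperatures),
               key=lambda p: predicted_emissions(*p))

point = best_operating_point(range(30, 51, 5),
                             range(280, 321, 10),
                             range(800, 901, 25))
print(point)  # → (40, 300, 850)
```

The sweep evaluates every candidate setting against the learnt model, so each new coal type costs a cheap computation rather than dozens of physical boiler tests.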
New material matters
The World Steel Association states that there are about 3,500 grades of steel. And yet, industries keep demanding new grades: the automobile industry is looking for advanced high-strength steels that are lighter, the medical industry asks for martensitic steel for precision instruments, and so on. Creating materials to customer requirements involves many types of data, and many teams working closely together.
‘Integrated computational materials engineering’ (ICME) is a method that brings together key players: the materials engineer, the researcher, product developer, manufacturing engineer, and design and application engineers. A central digital platform helps them collaborate and exchange data, information and knowledge from different systems. The platform has building-block models for a variety of foundational engineering design problems and the knowledge pertaining to their design processes: product components, materials, manufacturing processes, physics-based simulation models, models learnt from data, etc. It is designed to make simulations easy to run. A well-designed ICME platform drastically reduces the time and effort spent and produces better quality material.
As the world gets increasingly power hungry, there is a need to generate, distribute and monetise electricity in an optimal way. The electricity value chain is tightly coupled; any fluctuation in weather, demand or supply can ripple through the entire chain. Consumption patterns, dependent on region and time, vary widely. Energy companies would like to understand the best bidding strategy in such volatility, and tools that guide this decision-making are growing more sophisticated. The entire electricity value chain can now be modelled using AI-based systems. With increased usage, these platforms become more knowledgeable and efficient, and will help power producers make the best use of their assets.
Even as Amazon Echo and Cortana are making waves among consumers, text-based conversational agents, or ‘chatbots’, are gaining in popularity. For instance, HR departments would find chatbots to be good assistants that can answer basic queries on policies such as leave eligibility and insurance entitlement. Chatbots built on ‘deep learning’ models process and answer employees’ questions posed in natural language. For the employee, it feels like having a conversation with a human. The bot keeps learning through interactions, enriching its knowledge base.
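The question-to-answer loop behind such a bot can be sketched minimally. This toy uses simple word overlap where a production chatbot would use deep-learning language models, and the FAQ entries (including the leave figure) are invented for illustration.

```python
# Hypothetical sketch of a retrieval-based HR chatbot.
# Bag-of-words overlap stands in for the semantic matching a
# deep-learning model would do; the FAQ content is invented.
FAQ = {
    "how many days of annual leave am i eligible for":
        "Employees accrue 24 days of leave per year.",
    "what does the health insurance policy cover":
        "The policy covers hospitalisation for you and your dependents.",
}

def answer(question):
    """Return the answer for the stored question that best overlaps the query."""
    q_words = set(question.lower().split())
    best = max(FAQ, key=lambda known: len(q_words & set(known.split())))
    return FAQ[best]

print(answer("how many leave days do i get"))
# → Employees accrue 24 days of leave per year.
```

Swapping the overlap score for embeddings from a trained language model, and logging unanswered questions back into the FAQ, is what lets a real bot keep enriching its knowledge base through interactions.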
The profusion of data available is helping AI algorithms mature. Our search inputs, coming at roughly 2 million records per minute, help Google refine its search algorithms. Similarly, voice-based assistants such as Siri improve with the increased voice interactions they process. Once trained on good quality data, AI techniques help applications enrich themselves by learning from new data. Robust AI-based applications are invaluable for managers who make crucial decisions based on countless parameters. They can save costs and aid in making more informed, less risky decisions. They improve operational efficiency and customer experience.
AI-based platforms find application in every industry: pricing products in an omni-channel retail environment; optimising store space for maximum returns; controlling fraud with speech biometrics for banks and high-security environments; image-recognition applications that can be embedded in robots and drones; autonomous cars; unravelling genomic data and personalising medicine; even in training strategies for sportspeople.
AI offers exciting possibilities. As its fuel, data is gaining in value. Data trading platforms are already in use. Copenhagen — to reach its goal of being a carbon neutral city by 2025 — has launched the City Data Exchange as a software-as-a-service solution that allows the sale, purchase and sharing of a wide variety of data from multiple sources.
Every company has a wealth of data. You have to mine it for insights, just as usable products are distilled from crude oil. You have to guard it, because every breach will tell you how precious your data is. Data is a strategic asset; you just have to make it perform!
Author Ananth Krishnan is executive vice president and chief technology officer at Tata Consultancy Services.