Watson cognitive computing brings new thinking to IoT data analytics

 

For years the prevailing mentality was to reduce the data flowing into the data center, but the Internet of Things (IoT) demands more, more, and more still. Withholding information sells IoT systems short; actively seeking it out, on the other hand, invites challenges that even the most seasoned data scientists may never have seen before.

In this interview, Chris O’Connor, General Manager of Watson Internet of Things Offerings at IBM, discusses how the power of cognitive computing is being harnessed through the company’s Watson platform – now exposed to developers through a set of application programming interfaces (APIs) – to turn the IoT data deluge into increasingly valuable insights.

For those unfamiliar, can you briefly describe Watson, and then fill us in on what it’s been up to since its Jeopardy! days?

O’CONNOR: Watson is a true learning platform. It interacts with its environment, and the environment is both data and the use of that data. It then learns what to expect from watching that information. So it might identify key patterns that are dynamically changing. It may help you understand circumstances that you think you’ve never seen before but actually have, based on relevant points of information that you wouldn’t be able to see otherwise because the fog of information would have obscured your view. And when working directly around runtimes or processes, it enables you to learn how to tune those processes to run better.

If you put 15 million text pages into Watson, over time it will learn where the best answer is for the next person who comes along and asks that question. But it won’t just learn it by studying that everyone went there; it will actually learn it by where the answer probability was right. It will help tune in and provide local intelligence to you, which is why it’s so strong in medicine. There are so many millions of medical records out there today and no doctor can consume them all, but if you can collectively learn from the set of cases, you can present the physician dealing with symptoms a learned diagnosis that comes from some of the best minds on the planet as a cognitive realization of how it’s put together. So it’s far more meaningful than indexed databases and tables and straight-line analytics.

In a nutshell, Watson from Jeopardy! has evolved immensely. If you think about the Jeopardy! timeframe, a literal room full of compute power was needed to get that cognitive intelligence to come to fruition. Those types of capabilities can now go down into packages the size of small shoeboxes. Advances in our understanding of the programming and the compute power needed, along with the continual advancement of compute power itself, allow the industry to take what used to be large, room-size efforts and consolidate them down into highly repeatable modules of hardware/software technology. That really provides the next step of what Watson’s been up to, which is that Watson can now start to exist in the cloud.

Think about going out to cloud data centers and putting Watson population centers into those data centers, and sharing the capabilities so people can subscribe and run a Watson service for their own tuning, whether that’s legal work or in healthcare, or whether that’s around the Internet of Things or some other industry that they imagine. Watson then sits as a service. We’ve put a service-oriented architecture on top of it, and Watson now sits as a set of APIs that you can call, whereas before you programmed Watson for a single purpose. Those APIs are specific to certain types of actions – some might be around voice, others might be around machine language, some might be around video, some might be around unstructured text processing – and all of these are now separate APIs that you can call, start to use yourself, and train so you can run your own “education” program on a fresh Watson for your own purposes.
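To make that API model concrete, the sketch below shows one way a developer might call a Watson-style text-analysis service over REST from Python. The endpoint URL, credentials, and feature names are placeholders for illustration only; the actual service names, payloads, and authentication should be taken from IBM’s Bluemix documentation.

```python
# Minimal sketch of calling a Watson-style REST API for unstructured text
# analysis. The URL, credentials, and feature names below are placeholders
# for illustration, not the exact IBM service contract.
import requests

WATSON_URL = "https://example-watson-instance.cloud.example.com/v1/analyze"  # hypothetical endpoint
API_KEY = "YOUR_API_KEY"                                                      # hypothetical credential

def analyze_text(text):
    """Send a block of unstructured text to a text-analysis API and return the parsed JSON."""
    response = requests.post(
        WATSON_URL,
        auth=("apikey", API_KEY),
        json={"text": text, "features": {"keywords": {}, "concepts": {}}},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    print(analyze_text("Bearing temperature on press 14 exceeded tolerance twice this shift."))
```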

How do you see Watson’s cognitive computing playing out in the context of the billions of data streams generated by the IoT?

O’CONNOR: It really plays out in a variety of ways. If we hook up a camera to an IoT device and watch the performance of some machine by making that camera a data feed, you now have the device’s information and you also have the information of what the camera says it’s doing. Those often are the exact same things, but sometimes they’re not. And sometimes there are nuances around the device that also can get captured by the camera that can help provide information as to why performance anomalies are happening – somebody may have left a wrench, somebody might have put a coffee cup down, somebody might have moved another machine too close and it added too much heat. All of those things become a data feed off the camera, and Watson is able to process the device data and the camera data, and, in its simplest form, learn about what’s normal and identify what the variants are and also allow you to do variance understanding or matching around certain events. That’s a very simple example around Watson.
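As a rough illustration of that “learn what’s normal, flag the variance” idea, here is a minimal sketch that keeps a rolling baseline over a device reading and cross-checks it against a camera-derived estimate. The field names, window size, and 3-sigma threshold are assumptions made for the example, not anything specific to Watson.

```python
# Minimal sketch: learn a rolling baseline from device telemetry and flag
# readings that drift outside it or disagree with a camera-derived value.
# Thresholds and field names are illustrative assumptions.
from collections import deque
from statistics import mean, stdev

class BaselineMonitor:
    def __init__(self, window=100, sigma=3.0):
        self.readings = deque(maxlen=window)  # recent device readings
        self.sigma = sigma

    def update(self, device_value, camera_value):
        """Return True if this observation pair looks anomalous."""
        anomalous = False
        if len(self.readings) >= 10:
            mu, sd = mean(self.readings), stdev(self.readings)
            # Device reading drifts outside the learned band...
            if abs(device_value - mu) > self.sigma * sd:
                anomalous = True
            # ...or the camera-derived estimate disagrees with the device.
            if abs(device_value - camera_value) > self.sigma * sd:
                anomalous = True
        self.readings.append(device_value)
        return anomalous

monitor = BaselineMonitor()
samples = [(70.1, 70.3), (70.4, 70.2), (70.2, 70.5)] * 10 + [(78.9, 70.4)]
for device_temp, camera_temp in samples:
    if monitor.update(device_temp, camera_temp):
        print("variance detected:", device_temp, camera_temp)
```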

If you have a large manufacturing floor, you probably do have 15 million pages of text associated with every machine there. We all know that a manufacturing floor is supposed to run as an orchestra, where parts are put together and move down a line and eventually you build a finished good out of what happened on that manufacturing floor. So, if you have all of these machines that build something with 15 million pages of manuals underneath them and there are symptoms that come out in your end devices, they might be the result of tolerances, they might be heat statistics, they may be signatures, or it may even be the look and feel of the device itself at a base level. If you want to go back and query what those 15 million pages would have told you about the device and which ones could be relevant to why that color came out the way it did or why the tolerance was there, it’s a lot of learned knowledge applied against a lot of pages to make that happen manually. In Watson you’re able to start to draw corollaries and look through that unstructured information in a way that lets the actual device performance be compared to what the manuals say, and bring the manuals into a learning-oriented environment that allows you to look through that information quickly. And as you see common symptoms and use the system over time, it becomes your source of expertise to help keep producing whatever you’re making and to keep the manufacturing floor, which is itself an IoT “thing,” moving efficiently.
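One way to picture the manual-matching step is plain text retrieval: rank manual pages by similarity to an observed symptom. The snippet below uses TF-IDF with scikit-learn as a stand-in for Watson’s much richer unstructured-text learning; the sample pages and symptom description are invented.

```python
# Minimal sketch of ranking unstructured manual pages against an observed
# symptom using TF-IDF similarity. A stand-in for cognitive text analysis;
# the pages and symptom below are invented examples.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

manual_pages = [
    "If paint color drifts, check booth temperature and nozzle tolerance.",
    "Heat signatures above 80 C indicate bearing wear on the conveyor motor.",
    "Calibrate torque tolerance on station 4 after every 10,000 cycles.",
]

symptom = "finish color came out uneven and the booth ran hot"

vectorizer = TfidfVectorizer(stop_words="english")
page_vectors = vectorizer.fit_transform(manual_pages)
symptom_vector = vectorizer.transform([symptom])

# Rank pages by cosine similarity to the symptom description.
scores = cosine_similarity(symptom_vector, page_vectors)[0]
for score, page in sorted(zip(scores, manual_pages), reverse=True):
    print(f"{score:.2f}  {page}")
```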

You mentioned the various Watson APIs. Are they available as part of IBM’s cloud platforms?

O’CONNOR: The Watson APIs are all available on Bluemix, and there is a wide variety of them – some related to the Internet of Things, some not. Inside of the IBM Watson IoT Platform (formerly IoT Foundation) we have included a subset of those that are relevant, so if you become an IBM Watson IoT Platform client, the Watson APIs come pre-integrated as a feature you can use, the same way that The Weather Company data is a feature you can choose to use inside our platform.
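For orientation, a device typically feeds the IBM Watson IoT Platform over MQTT before any of those analytics come into play. The sketch below uses the paho-mqtt client with placeholder organization, device, and token values; the broker host, port, and topic layout follow the platform’s historically documented MQTT conventions and should be verified against current IBM documentation.

```python
# Minimal sketch of a device publishing a telemetry event to the IBM Watson
# IoT Platform over MQTT. All identifiers and the token are placeholders;
# check IBM's current docs for the exact connection details.
import json
import paho.mqtt.client as mqtt

ORG = "yourorg"            # placeholder organization ID
DEVICE_TYPE = "press"      # placeholder device type
DEVICE_ID = "press-14"     # placeholder device ID
TOKEN = "YOUR_AUTH_TOKEN"  # placeholder auth token

# paho-mqtt 1.x-style constructor; 2.x additionally requires a CallbackAPIVersion argument.
client = mqtt.Client(client_id=f"d:{ORG}:{DEVICE_TYPE}:{DEVICE_ID}")
client.username_pw_set("use-token-auth", TOKEN)
client.connect(f"{ORG}.messaging.internetofthings.ibmcloud.com", 1883)

# Publish a JSON telemetry event that platform analytics (and the
# pre-integrated Watson APIs) can consume downstream.
payload = json.dumps({"d": {"temperature": 72.4, "vibration": 0.03}})
client.publish("iot-2/evt/status/fmt/json", payload)
client.disconnect()
```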

If you think about what you want to have happen, just imagine an automobile driving down a road and it’s snowing outside. That automobile should not only know about itself, but also the conditions around it. So it ought to have an idea of forecast, barometric pressure, whether there are icy conditions that exist, if it’s going to keep snowing, if the type of snow is going to change. In the cloud we can then compare the capabilities and tell all devices of the same type what they’re dealing with. As you hit certain conditions driving down that road perhaps with a certain type of tires and then lose control, you should be able to learn that event and then tell all cars with the same set of tires about that condition before it happens again because someone’s already learned about it. That’s the type of environment we see going forward, and we see that example playing out in many, many industries. It plays out on the manufacturing floor, in oil and gas refineries, in aircraft machinery. We see the combination of more information gaining value.

We spent 20 years teaching people to reduce the data in the data center; with Watson and what we’re doing with IoT we’re teaching people to bring in more information and learn from the integration of the data types. That’s a big change in thought process. We’ve got four Watson APIs in the IBM Watson IoT Platform right now that we think are the strongest use cases – text, video, voice or audio, and machine-to-machine. We think those first four are legitimate starting points for anyone who’s thinking about doing an IoT platform, but we’ll be adding more.
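To picture the cloud side of the connected-car example above, here is a minimal sketch of a fleet advisory service: a traction-loss event learned from one vehicle is pushed to every vehicle that shares the same tire type. The classes and fields are purely illustrative and do not represent an IBM API.

```python
# Minimal sketch of fleet-wide learning: one vehicle reports a traction-loss
# event, and an advisory goes to every vehicle with the same tire type.
# All structures here are illustrative, not an IBM interface.
from dataclasses import dataclass

@dataclass
class Vehicle:
    vin: str
    tire_type: str

@dataclass
class TractionEvent:
    tire_type: str
    road_segment: str
    condition: str   # e.g. "wet snow, -2 C"

class FleetAdvisor:
    def __init__(self, vehicles):
        self.vehicles = vehicles
        self.learned_events = []

    def report(self, event):
        """Record a traction-loss event and notify matching vehicles."""
        self.learned_events.append(event)
        for v in self.vehicles:
            if v.tire_type == event.tire_type:
                print(f"advisory to {v.vin}: caution on "
                      f"{event.road_segment} ({event.condition})")

fleet = FleetAdvisor([Vehicle("VIN001", "all-season"),
                      Vehicle("VIN002", "all-season"),
                      Vehicle("VIN003", "winter")])
fleet.report(TractionEvent("all-season", "Route 9 mile 12", "wet snow, -2 C"))
```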

Sidebar | Cognitive computing market to grow as industry extracts value from intelligent systems of systems

As part of its study on the Digital Universe, research firm IDC predicts that 44 zettabytes of content will be generated worldwide by 2020. By that time only about half of the data created will be relevant to or actionable by Internet of Things (IoT) systems, but in the ensuing five years those percentages are projected to grow exponentially (Figure 1).

Figure 1

The availability of more IoT data will provide organizations with the opportunity to gain increased insight into intelligent systems architectures through the use of analytics. Moving forward, this trend will position analytics as a key differentiator for IoT businesses, as “The value of the IoT solution is proportional to the speed and number of times the original IoT data is analyzed,” says Vernon Turner, Senior Vice President and Research Fellow for the Internet of Things at IDC. “The more IoT data is churned or read, the more value and innovation happens.”

Where traditional big data analytics provides some worth, cognitive computing leverages machine learning and deep learning technology that applies analytics to semi-structured and unstructured data sets quickly and at massive scale. The result is high-value systems of intelligence that accentuate actionable IoT data, using “cloud platforms that broker new and different data sources and have interfaces that support an open data platform ‘funnel’ to feed the beast,” Turner adds.

Based on these developments, IDC forecasts spending on cognitive systems to reach $31.3 billion in 2019, growing at a compound annual growth rate (CAGR) of 55 percent over the next five years. For more information, visit www.idc.com.

How important will cognitive computing be in the IoT moving forward?

O’CONNOR: If you look forward, the purpose of everything in the Internet of Things is to drive a comfort around better business and better lifestyle. To the degree that that is a learned environment, you get better business and better lifestyle as a continuously improving element versus having to wait for technology jumps. If you look back over time, you’ll see that there have been multiple technology jumps, but we think that with a cognitive set of capabilities out there it’s a much smoother and straightforward transition that businesses and individuals go through as the world transforms around us.

Quite simply, it’s the next big shift. It’s about enabling businesses to operate more smoothly and in a more educated way, and for individuals it will play out in an infinite number of ways in our lifestyles and creature comforts.

IBM Watson Internet of Things

www.ibm.com/internet-of-things

@IBMIoT

LinkedIn: www.linkedin.com/company/ibm-internet-of-things

Facebook: www.facebook.com/IBMIoT

Google+: plus.google.com/+IBMInternetofThings1

YouTube: www.youtube.com/channel/UCFNoGF7Ea-FfmAjfK4ReFpA