
Cognitive Computing

‘The talk is going to be about the cognitive computing era,’ says Dr Guruduth Banavar, addressing the topic of his 2017 BCS/IET Turing Lecture. Over the past few years, he explains, we’ve witnessed the establishment of a new era in computing – the age of machine learning. And, as we move into this new age, the resulting technical, professional and societal changes will be profound.

Rounding off his summary, Dr Banavar asserts: ‘It means having a very different relationship with machines. We’ll need to start getting used to having machines with us, to having natural conversations with them, and get used to the idea that they’ll be doing a lot of tasks in every part of our lives.’

Dawn of a third age
If you’re a student of such things, the Tabulating Systems Era began in the early 1900s and ran to the 1950s. The Programmable Systems Era – the if-then epoch – began in the 1950s and has served us well. It’s the foundation of much of the digital world that surrounds us.

A career in technology
Born in India, Dr Banavar spent the first half of his life there before moving to the United States. ‘I did my graduate school in the US and, after my PhD, I joined IBM at the TJ Watson Research Centre,’ he recalls. ‘Since then, I’ve held a number of very interesting roles at IBM.’

The motivation to speak
So, why did Dr Banavar take on the challenge of speaking at the 2017 Turing Lecture? ‘Turing is one of my heroes,’ he enthuses. ‘His vision of what computers can do… things like the Turing Test. It measures the limits of what computers can do. These things have always been a guiding light and are very relevant to my work.’

Meeting a thinking machine
IBM’s work, and a big part of Dr Banavar’s career, have been focused on Watson – a machine learning and natural language processing platform named after IBM’s founder, Thomas John Watson Senior.
Watson gained prominence in the popular consciousness when it won the US game show Jeopardy!. The 2011 victory was an important proof of concept and, since then, Watson has developed many new skills.

What can cognitive do?
Cognitive computing platforms like Watson, Dr Banavar is keen to point out, aren’t intended to replace workers. Rather, the cognitive computing revolution is all about computers and humans working together.

How does Watson work?
To achieve all of this, Watson is constantly ingesting huge amounts of knowledge. When the system is asked a question, it finds answers that are likely to be correct by exploring this ever-growing corpus of information.
‘When a prescribed time limit has passed,’ Dr Banavar explains, ‘the system stops all the algorithms that are looking for answers and presents all of the likely solutions in a probabilistic fashion. We then use a number of different scoring techniques. They look at the accuracy of the inference and the credibility of sources. At the end of that process is a set of answers with confidence levels.’
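
To make the generate-and-score idea concrete, here is a minimal sketch of that kind of pipeline: candidate answers are gathered from a corpus until a time limit is reached, each is scored for inference accuracy and source credibility, and the results are returned with confidence values. The corpus, scorers and weights below are illustrative stand-ins, not Watson’s actual internals.

```python
# Illustrative sketch of a generate-and-score question-answering loop.
# The corpus, scorers and time limit are stand-ins, not Watson's real internals.
import time

def candidate_answers(question, corpus):
    """Very naive candidate generation: any corpus entry sharing a keyword."""
    keywords = set(question.lower().split())
    for entry in corpus:
        if keywords & set(entry.lower().split()):
            yield entry

def score_inference(question, candidate):
    """Placeholder for an inference-accuracy scorer (0..1)."""
    overlap = set(question.lower().split()) & set(candidate.lower().split())
    return len(overlap) / max(len(question.split()), 1)

def score_source_credibility(candidate, credibility):
    """Placeholder for a source-credibility scorer (0..1)."""
    return credibility.get(candidate, 0.5)

def answer(question, corpus, credibility, time_limit=1.0):
    """Collect candidates until the time limit, then rank by combined confidence."""
    deadline = time.monotonic() + time_limit
    scored = []
    for cand in candidate_answers(question, corpus):
        if time.monotonic() > deadline:   # stop all answer-seeking at the limit
            break
        confidence = 0.5 * score_inference(question, cand) + \
                     0.5 * score_source_credibility(cand, credibility)
        scored.append((cand, confidence))
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

corpus = ["Alan Turing proposed the Turing Test in 1950",
          "Watson won Jeopardy! in 2011"]
print(answer("Who proposed the Turing Test?", corpus, credibility={}))
```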

Politicians are learning from social media

Demos is a cross-party research organisation and it has brought its investigative skills to bear on social media. The work led, ultimately, to the creation of Demos’ Centre for Analysis of Social Media (CASM) – a partnership with the University of Sussex.

From CASM came Method 52, a piece of machine-learning software that brings social media analytics and social research in line with academic standards – rigorous standards that government is willing to listen to. And to help politicians understand its research, Demos has worked with BCS to create an innovative and visual dashboard system.

‘Around five years ago we watched the rise of social media and we thought this would be a potentially useful tool through which we can understand some of the issues that we’re already dealing with,’ said Demos’ Alex Krasodomski-Jones, recalling the decisions that led ultimately to Method 52’s creation.

The idea seemed logical and the necessary tools appeared to already exist. ‘Coke was interested in how many times you click on its site. Nike was interested in sentiment about its shoes online,’ he recalled.

Demos explored these internet marketing insights and the tools that were used to create them. But, when it spoke to politicians, they said: ‘It’s great. It’s modern. It’s up-to-date. It’s shiny. But I can’t use this stuff. We’re not interested in the same insights Nike is interested in. We don’t know how accurate this data is and we don’t know who is represented. We don’t know who is speaking and we don’t know what’s going on inside the technology.’

Researcher and machine in unison
Five years in the making, CASM’s Method 52 is a human-assisted machine learning platform. It works hand-in-hand with researchers and can be used to aid political insight by looking for patterns in huge data sets of tweets. These sets, Krasodomski-Jones explained, can consist of millions of tweets – so large that they are impossible for a human to grapple with alone.
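
Method 52’s internals are not public, but the human-assisted approach can be sketched generically: a researcher hand-labels a small sample of tweets, a classifier learns from those labels, and the trained model is then applied to a data set far too large to read manually. The example below uses scikit-learn with made-up tweets and labels, purely for illustration.

```python
# Generic sketch of human-assisted tweet classification (not Method 52 itself):
# a researcher hand-labels a small sample, a classifier learns from it, and the
# model is then applied to a data set far too large to read manually.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Small, hand-labelled sample provided by the researcher (labels are illustrative).
labelled_tweets = [
    ("The new housing bill is a disaster", "negative"),
    ("Really encouraged by today's debate", "positive"),
    ("Great to see cross-party support for the NHS", "positive"),
    ("Another broken promise from the government", "negative"),
]
texts, labels = zip(*labelled_tweets)

model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(texts, labels)

# The trained model can now be pointed at millions of unlabelled tweets.
unlabelled = ["Broken promises again on housing", "Encouraged by the NHS announcement"]
print(list(zip(unlabelled, model.predict(unlabelled))))
```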

A focus on Twitter
For the purpose of dealing with UK politics, Demos looks primarily at Twitter. This, of course, leads to a natural question: is Demos looking at a representative cross-section of society? Twitter is, after all, much loved by the London media set, politicians and opinion makers. Could looking at the microblogging site lead to some kind of bias?

Refining the tool
Despite Demos being very rigorous in its approach to analysing and understanding trends on social media, there was still work to be done. Working with BCS, Demos developed a dashboard tool that makes Method 52’s findings easier to understand. Politicians, Alex Krasodomski-Jones explained, can be notoriously time-poor.

Engineering intelligent networks

According to Gartner’s forecast on public cloud services, end-user spending on public cloud services is expected to record a compound annual growth rate of 17.7 per cent from 2011 through 2016.
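
As a quick sanity check on what that rate implies (the arithmetic below is ours, not Gartner’s): compounding at 17.7 per cent over the five years from 2011 to 2016 roughly doubles spending.

```python
# What a 17.7% compound annual growth rate implies over 2011-2016 (five years of compounding).
cagr = 0.177
years = 5
multiple = (1 + cagr) ** years
print(f"End-user spending grows by a factor of about {multiple:.2f}")  # roughly 2.26x
```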

This creates a tremendous opportunity for broadband carriers to expand and enhance their networks to better support and offer cloud services. In fact, many ISPs around the world have already started offering cloud services to their residential and enterprise customers.

However, revenue decline, decreasing profitability and explosive traffic growth on existing networks hamper service providers as they strive to innovate and differentiate themselves from competitors.

Service providers want to offer cutting-edge and personalised cloud solutions to their customers while also looking to improve operational efficiency, accelerate network deployment and lower total cost of ownership. The challenge of migrating existing network architectures, management systems and policy frameworks is affecting providers around the world as they work feverishly to ramp up to this new area of opportunity and demand.

Cloud services are generally understood as being combinations of communications, storage and computing services that enable convenient, on-demand access to a shared pool of configurable, rapidly provisioned resources.

These cloud computing environments require networks that can cope with high levels of traffic, as well as frequently changing types and patterns of traffic. Many of the concepts inherent in cloud services are not new, but they are becoming more economically feasible and more attractive to IT with advances in technology and market developments.

In considering cloud services provided in the context of multiservice broadband networks, there are a number of approaches that could be taken, each introducing potential new cloud-service-related functions.

With the creation of the Broadband Forum’s Cloud Intelligent Broadband Network (CIBN) project, service providers are being given the tools to take advantage of these market developments – helping them migrate to a cloud-supporting network, reduce costs and enhance revenue opportunities. The goal of this project is to provide the industry with the specifications needed to capitalise on the cloud service opportunity, ensuring the delivery of new services without cannibalising older ones.

This two-phased cloud project is focused on transitioning the multiservice broadband network to address cloud requirements, taking a holistic approach that covers not only the architecture but also management, policy control and the quality of experience (QoE) of cloud service offerings, while leveraging technologies such as SDN and virtualisation.

Progress in SDN and virtualisation is coming fast and furious as customers increasingly push operators to find ways to provide more bandwidth and deploy applications faster. The world is becoming increasingly software-centric and virtualised, and getting the most efficiency and value out of data centres, along with their seamless connectivity and interoperability with evolving network operator infrastructures, is the Broadband Forum’s top work in progress.

Big data trends

Big data has a huge crossover with the business intelligence world, but analyst firm Forrester recently commented that most organisations still analyse structured and unstructured data in silos, using different tools and serving different use cases. ‘The techniques used, such as statistical analysis, machine learning, natural language processing and artificial intelligence, are now bringing text analytics closer to the world of business intelligence,’ they say.

Burgeoning technologies such as the internet of things (IoT), with their obvious connection to big data through the sheer volume of data they can produce, mean that there will be some tensions in how that data is used. Forrester mentions in a recent paper that ‘enterprise applications must handle IoT data in two ways: 1) analyse large volumes to find patterns and insights that can be valuable in the future and 2) perform streaming analytics to glean immediate, actionable insights.’
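
Forrester’s two modes can be illustrated with a toy example: a batch pass over stored sensor readings to establish a pattern, and a streaming check that acts on each new reading as it arrives. The sensor, values and threshold below are invented purely for illustration.

```python
# Toy illustration of the two modes for handling IoT data (values are invented):
# 1) batch analysis of a stored volume of readings to find patterns, and
# 2) streaming analysis that acts on each reading as it arrives.
from statistics import mean

stored_readings = [20.1, 20.4, 21.0, 35.7, 20.8, 20.5]  # e.g. a temperature sensor's history

# 1) Batch: look back over the accumulated data for longer-term patterns.
baseline = mean(stored_readings)
print(f"Batch insight: average reading {baseline:.1f}")

# 2) Streaming: glean an immediate, actionable insight from each new reading.
def on_new_reading(value, baseline, threshold=5.0):
    if abs(value - baseline) > threshold:
        print(f"Streaming alert: reading {value} deviates from baseline {baseline:.1f}")

for value in [20.6, 36.2, 20.9]:   # simulated live feed
    on_new_reading(value, baseline)
```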

2015 will be the year, say the likes of PricewaterhouseCoopers (PwC), that analysing the data collected via sensors from the internet of things really kicks in. But as more data is collected, processes will need to be decentralised to provide agility in its use. In its ‘Guts & Gigabytes’ survey, PwC showed that 41 per cent of British executives use their intuition and experience in the decision-making process, but only 23 per cent use data and analytics.

Getting big impact from big data
Earlier this year McKinsey Quarterly ran an item on the efforts that organisations are making to turn their use of big data and data analytics into large-scale benefits. It notes that the obstacles include a lack of willingness among leaders to invest in analytics, and the fact that, when analytics are undertaken, users are either unable to understand the results correctly or lack the confidence to implement the changes they suggest. The piece goes on to suggest ways to make an organisation more receptive to analytics through change management and redefining jobs where needed.

As with their pioneering use of 3D printing, Formula One racing teams are making good use of big data analytics and have been for some time. The technical demands are high, with hundreds of sensors providing thousands of data points for analysis. In Formula One these include such things as tyre pressure and fuel-burn efficiency, which have to be collected in real-time for very quick analysis by race engineers on site. A Forbes piece by Frank Bi looks at how Red Bull Racing and driver Sebastian Vettel used big data in 2012 to fix damage to his car during a pit stop, leading to a race strategy change that helped Vettel win the world championship.
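
The kind of real-time telemetry check described above might look something like the following sketch, which derives per-lap fuel burn from raw samples and flags a suspicious tyre-pressure reading for the engineers. The figures and thresholds are invented, not real race data.

```python
# Hypothetical F1-style telemetry check: derive fuel-burn per lap from raw sensor
# samples so race engineers can compare laps quickly (illustrative figures only).
telemetry = [
    # (lap, fuel_remaining_kg at end of lap, tyre_pressure_psi)
    (1, 98.4, 21.3),
    (2, 96.7, 21.2),
    (3, 95.1, 21.0),
    (4, 93.2, 20.4),   # a pressure drop might hint at damage, as in the Red Bull example
]

previous_fuel = 100.0
for lap, fuel, pressure in telemetry:
    burn = previous_fuel - fuel
    print(f"Lap {lap}: burned {burn:.1f} kg, tyre pressure {pressure} psi")
    if pressure < 20.5:
        print(f"  -> flag lap {lap} for engineers: tyre pressure below expected range")
    previous_fuel = fuel
```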